Evolutionary vs. traditional database design

DBA fights back

In essence, there’s a story here that many database developers will recognize. Rapid changes to the schema are forced upon the development team, and the resources (time and effort) required to implement those changes properly are not made available. In vain they complain that such changes will have a detrimental effect on the structure and viability of the database over time. Each time they are told “Yes, yes; thank you very much for your input. We agree that sustainability over time is very important to the business, but this is an exception and you will rush this one change through for us. There will be time later to document the changes and rethink the overall design, so don’t worry about it.” Eventually the structure becomes so gnarly that any change causes more problems than it fixes.
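To make that “gnarly structure” concrete, here is a minimal sketch in Python (using SQLite) of the kind of overloaded column that rushed, undocumented changes tend to leave behind. The orders table, its column names and its magic status values are all invented for illustration:

    import sqlite3

    # Hypothetical example: after several rushed changes, one column now
    # carries three unrelated meanings, and every query must know all of them.
    con = sqlite3.connect(":memory:")
    con.execute("""
        CREATE TABLE orders (
            id     INTEGER PRIMARY KEY,
            -- Originally just 'open'/'closed'; rushed change #1 added
            -- 'CREDIT-HOLD'; rushed change #2 reused it to flag EU
            -- orders as 'closed-EU'. None of this was ever documented.
            status TEXT NOT NULL
        )
    """)
    con.executemany("INSERT INTO orders (status) VALUES (?)",
                    [("open",), ("closed",), ("CREDIT-HOLD",), ("closed-EU",)])

    # A 'simple' report query now needs a special case for every rushed
    # change, and the next one will quietly break it again.
    closed = con.execute("""
        SELECT COUNT(*) FROM orders
        WHERE status IN ('closed', 'closed-EU')
    """).fetchone()[0]
    print(closed)  # 2

Each individual change here was trivial; it is their accumulation, unrecorded and unreviewed, that makes the structure unmaintainable.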

I’m not in any way trying to excuse these practices; I’m only trying to explain why we see so many poorly structured, poorly maintained databases. It is also clear that many EDBD supporters come from the application development world, and it is the slow evolution of traditionally designed databases that really drives them crazy. If it’s any consolation, it drives me crazy too, which is why I have so much sympathy with them; I believe they are trying to address a real problem.

So, have we proved that the TDBD process needs to be replaced?

No, I don’t believe that the existence of badly designed and/or maintained traditional databases proves that the current design process is flawed. Indeed, we can argue that the existence of successful TDBD projects helps to support the view that it isn’t the process that is flawed but the implementation.

However, this certainly doesn’t mean that I have somehow proved that TDBD is superior to EDBD. So the next question that needs to be addressed is:

If all other factors were equal, which is more likely to succeed, EDBD or TDBD?

This is an important question, and the correct answer is “I don’t know”. No-one knows. Lots of people have opinions, but no-one really knows. One reason for the doubt is the sample size: there is a very large number of examples of TDBD (both well and poorly executed) and very few examples of EDBD, of either sort.

But something we are sure about is that many factors interact to make a database design project (of either flavour) succeed or fail. These factors include, but are certainly not limited to:

  1. The intelligence of the designers and development team
  2. Their motivation
  3. The resources they are given
  4. The methodology they adopt
  5. How closely that methodology is followed

What we are really arguing about here is how important point 4 is when compared to the rest. My personal belief, based on my experience, is that the other factors listed here play a very important part in the success of a database design project and that point 4 is less crucial. Your mileage may vary, but if I am correct, then it follows that changing the methodology is one of the least effective ways of fixing the problem.

To look at this another way, does anyone want to try an EDBD project using poorly motivated, untrained, under-resourced people?

What should we do to address the problem of poor database design?

Address the other issues, of course:

  • Employ highly intelligent people for both design and maintenance.
  • Work hard to train and motivate them.
  • Give them the resources to follow the chosen methodology properly.
  • Do not force them to implement changes without proper consideration.
  • Ensure that they do not bog down the change management process in too much red tape.

I think the solution sounds easy, so why is it so rarely implemented in practice? Because it is, in fact, rather hard. Intelligent people are difficult to recruit, managers who can motivate well are rare, and it costs money to provide the resources. So people cut corners. This is the reality. And none of it will change if we move to EDBD.

So, what about EDBD?

I have tried not to turn this article into an attack on EDBD because I think that doing so is counterproductive. However, I also think it’s worth mentioning one issue that does concern me: TDBD essentially centralises the ER modelling, while EDBD essentially decentralises it.

Why do I think that this is a problem? Well, there will always be a tension within the requirements of the enterprise, which simultaneously demands that we:

  1. make changes to the database yesterday in order to accommodate the changing business processes.
  2. provide clear, consistent analytical information that spans years.

In order to ensure the latter, we need a clear overview of the data structure and what the data ‘means’. This is one of the many reasons why the TDBD process not only centralises the ER modelling but also considers design in terms of the user, logical and physical models. This isn’t ‘busy work’; it is vital to enable us to keep track of both the data and, equally importantly, its meaning.
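As a hypothetical illustration of what happens without that central overview (all table and column names here are invented), consider two application teams evolving the schema independently. A minimal sketch in Python with SQLite:

    import sqlite3

    # Hypothetical: two teams, each evolving the schema on their own,
    # record the 'same' fact in incompatible ways.
    con = sqlite3.connect(":memory:")
    con.executescript("""
        -- Team A (2012): country stored as ISO two-letter codes.
        CREATE TABLE sales_2012 (amount REAL, country TEXT);
        INSERT INTO sales_2012 VALUES (100.0, 'GB'), (250.0, 'DE');

        -- Team B (2014): renamed the column and switched to full names.
        CREATE TABLE sales_2014 (amount REAL, region TEXT);
        INSERT INTO sales_2014 VALUES (300.0, 'United Kingdom'), (80.0, 'Germany');
    """)

    # An analytical query that spans years must now reconcile two
    # encodings of one concept: exactly the drift that a centrally
    # maintained logical model is there to prevent.
    iso = {"United Kingdom": "GB", "Germany": "DE"}
    totals = {}
    for amount, country in con.execute("SELECT amount, country FROM sales_2012"):
        totals[country] = totals.get(country, 0.0) + amount
    for amount, region in con.execute("SELECT amount, region FROM sales_2014"):
        code = iso[region]
        totals[code] = totals.get(code, 0.0) + amount
    print(totals)  # {'GB': 400.0, 'DE': 330.0}

Here the mapping between the two encodings is small and obvious; in a real enterprise, after a decade of independent changes, reconstructing what the data ‘means’ can be the hardest part of any analytical project.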

My impression is that EDBD has its roots in the application developer community. At worst, application developers tend to see the application as king and the database as an inconveniently complex and obstructive repository in which they are occasionally forced to store data. They tend to favour processes that support option 1.

TDBD has its roots in the database developer community. At worst, database developers see the database as a temple and applications as annoying processes that, unless watched closely, will trash the temple, despoil the data and mangle the meaning. They tend to favour processes that support option 2.

Our job, no matter what our background, is to use our common sense and provide the best balance we can between these two options.

Dr. Mark Whitehorn has been developing databases for over 20 years. During that time, he has published 9 books and 2 million words in well over a thousand articles. In the commercial world, he works as a database and data warehouse consultant. In the academic world, he holds an honorary position at Dundee University, where he teaches advanced data handling techniques. He also holds a research position at Cambridge University, where he applies these techniques to improving our understanding of how Darwin developed the theory of evolution (somewhat ironically, given the title of this article).
