
Time to reject traditional database techniques?

'Big' data and the BI challenge


Mainstream database management system (DBMS) technology faces a challenge from new approaches that reject the relational model. The battleground is set to be the market for business intelligence based on very large databases.

Some main players in DBMS software are already jockeying for position with revamped database products aimed at recapturing ground lost to newer products. Recently Microsoft unveiled Kilimanjaro, the next massively scalable version of its SQL Server with a strong BI flavor, while database market number-one Oracle joined forces with Hewlett-Packard to launch its Exadata storage grid.

Both announcements shared a common theme: how to make huge volumes of data readily available to power business intelligence applications.

And the numbers in question are huge. We are, of course, familiar with "huge" numbers in these times of billion-dollar banking-industry bailouts. But even those figures are dwarfed by the numbers of transactions pouring into some company databases and the amount of storage needed to accommodate them.

Back in January, Google reckoned it processed 20 petabytes of data a day - a number that has doubtless grown significantly since. And even lower down the scale, LGR Telecommunications is reported to be adding 13 billion records each day to a 310 terabyte data warehouse system and expects its petabyte of disks to double in the next year.
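A quick sanity check makes the LGR growth claim plausible. The article gives no record size, so assume roughly 200 bytes per call record (an illustrative assumption, not a reported figure); 13 billion records a day then works out to about a petabyte a year, consistent with the expectation that its disk estate will double:

```python
# Back-of-envelope check on the reported ingest rate. The 200-byte
# record size is an assumption for illustration only.
RECORDS_PER_DAY = 13_000_000_000
BYTES_PER_RECORD = 200
DAYS_PER_YEAR = 365

yearly_bytes = RECORDS_PER_DAY * BYTES_PER_RECORD * DAYS_PER_YEAR
yearly_pb = yearly_bytes / 1e15  # decimal petabytes
print(f"~{yearly_pb:.2f} PB ingested per year")
```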

Although such huge volumes are still unusual, it will not be long before even relatively small organizations regard terabytes and petabytes of data as commonplace. If they want to make practical use of that data in business intelligence applications, they will find their traditional relational DBMS technology stretched.

Cracks in the edifice

It is not only the logistics of storing and managing such enormous amounts of data that pose a big challenge to DBMS builders. There is also the problem of giving users access to the data in a form in which it might actually be useful. User queries have grown more complex, and the limitations of traditional access methods based on Structured Query Language (SQL) have been exposed.

The cracks in relational DBMS and the inadequacies of SQL were highlighted in a paper called The End of an Architectural Era, presented at the conference on Very Large Databases (VLDB) in September 2007. The collaborative work of several DBMS gurus - including Ingres/Postgres originator Michael Stonebraker - the paper declared the relational model obsolete and argued that alternative approaches were better suited to today's data management and access problems.
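One of the paper's central arguments is that a one-size-fits-all, row-oriented engine loses to engines specialized per workload - for instance, column stores for the analytic scans that dominate business intelligence. A toy sketch of the layout difference (hypothetical data, not drawn from the paper):

```python
# Toy illustration of row-oriented vs column-oriented storage.
# An analytic query such as "total revenue" needs only one attribute,
# so a column layout reads just that column rather than whole records.

# Row-oriented: each record stored together, as an OLTP engine would.
rows = [
    {"id": 1, "region": "EU", "revenue": 120.0},
    {"id": 2, "region": "US", "revenue": 340.0},
    {"id": 3, "region": "EU", "revenue": 75.0},
]

# Column-oriented: one contiguous sequence per attribute.
columns = {
    "id":      [1, 2, 3],
    "region":  ["EU", "US", "EU"],
    "revenue": [120.0, 340.0, 75.0],
}

# Same answer either way, but the column store touches 3 values
# while the row store touches 3 full records.
total_row_store = sum(r["revenue"] for r in rows)
total_col_store = sum(columns["revenue"])
print(total_row_store, total_col_store)
```

At warehouse scale the difference is not cosmetic: scanning one column instead of every row can cut I/O by the table's width, which is precisely the workload the BI products discussed here are chasing.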


