Software seer shows companies path to cheaper databases

DBSeer cuts through clouds with predictive database performance model

Anticipating where bottlenecks will develop in a live database is one of the most bankable skills a self-respecting database administrator can have, yet researchers may now have figured out a set of algorithms that can do this automatically.

The DBSeer predictive modeling method, described in two academic papers authored by researchers at MIT and Microsoft, gives companies a way to model the ins and outs of their particular database so they can save on data center infrastructure and avoid downtime.

The DBSeer modeling method helps administrators spot DB problems without having to manually test out different configurations of the database under different load environments, the researchers write (PDF).

Its creators hope DBSeer can deal with the main shortcoming of running a database-as-a-service in an on-premise virtualized environment: the isolation of compute power, per-VM billing, and the lack of information about the underlying infrastructure make tuning a database in the private cloud "more challenging than in conventional deployments."

"You can now answer many questions about your database that were previously only answered through 'try it and find out for yourself'," the lead author of the papers, Barzan Mozafari, tells The Register via email.

"Now in many cases we can predict what will happen without actually trying those configurations out. This can dramatically reduce the cost of testing and deploying your database configuration."

So far, the researchers have created an implementation of DBSeer that can help model performance for transactional MySQL workloads, but they believe it can be extended to other databases as well.

The system has proved so efficient that it has already piqued the interest of Teradata, which has tasked several of its engineers with the job of porting the DBSeer algorithm to its own software.

The system works by observing query-level logs and the OS statistics generated by a live database management system.

"It's a non-intrusive approach, i.e. it doesn't require modifying the database engine," Mozafari says. "It simply observes the load that comes into the database and the performance and resource consumption of the database and tries to understand the relationship between the two."

This allows DBSeer to model the CPU, RAM, network, disk I/O, and number of acquired locks per table, for various MySQL configurations.
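The core idea can be sketched in a few lines: learn a statistical relationship between an observed workload metric (from the query log) and an observed resource metric (from OS statistics), then use the fitted model to predict resource use for a load level that was never actually run. The snippet below is a minimal illustration of that approach, not DBSeer's actual code, and the numbers are hypothetical.

```python
# Minimal sketch of DBSeer-style modelling (illustrative, not the
# real implementation): fit a linear model from transactions/sec
# to CPU utilisation, then predict CPU at an untested load level.

def fit_linear(xs, ys):
    """Ordinary least-squares fit of y = a*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# Hypothetical observations from a live system.
tps = [100, 200, 300, 400, 500]        # transactions per second
cpu = [12.0, 21.5, 31.0, 41.0, 50.5]   # CPU % at each load level

a, b = fit_linear(tps, cpu)

# Predict CPU% at 800 txns/sec -- a configuration never tested.
predicted_cpu = a * 800 + b
```

Real workloads are rarely this linear, which is why DBSeer also models disk I/O, RAM, network, and lock contention separately, but the non-intrusive observe-then-fit principle is the same.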

To test the algorithm, the researchers generated 20 mixtures of the Transaction Processing Performance Council's TPC-C benchmark with different ratios of transaction types. The average error rates of DBSeer's predictions ranged between 0 and 25 percent; its I/O model performed best, with an average margin of error of 1 percent.
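The error metric reported here is a straightforward average relative error between predicted and measured resource use. A quick sketch of how such a figure is computed (the values below are made up for illustration, not the paper's TPC-C results):

```python
# Average relative error, in percent, between a model's predictions
# and the measured values -- the style of metric the evaluation cites.

def avg_relative_error_pct(predicted, measured):
    errs = [abs(p - m) / m for p, m in zip(predicted, measured)]
    return 100.0 * sum(errs) / len(errs)

# Hypothetical disk-I/O figures: each prediction is off by 1%.
measured_io  = [1000.0, 2000.0, 4000.0]   # IOPS actually observed
predicted_io = [1010.0, 1980.0, 4040.0]   # model's predictions

err = avg_relative_error_pct(predicted_io, measured_io)  # 1.0 (percent)
```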

With error margins that low, we can see why Teradata would be interested in porting the technology to work with its own software.

The researchers are due to deliver a further paper (draft PDF here) at the SIGMOD conference in June in New York, which will give further information on how to apply DBSeer to performance and resource modeling in highly concurrent OLTP workloads.

The researchers hope that DBSeer can be extended to still other databases, including NoSQL ones.

"Row-store (NoSQL) ones are much simpler to model/predict because they are more linear (due to lack of locking) than a traditional transactional DB," Mozafari says.

If technologies like DBSeer are adopted, companies will be able to automate some of the tasks done by DBAs and make sure they're not provisioning more hardware for their databases than they actually need.

What has got El Reg's database desk all a-flutter is the thought of DBSeer being integrated into an off-premise rentable cloud, like, say, Amazon Web Services.

This would give database developers a technology that could give them real anticipated I/O performance for an off-site database, and go some way toward solving the numerous reliability concerns people have over running a database in the cloud. ®
