Software seer shows companies path to cheaper databases

DBSeer cuts through clouds with predictive database performance model

Anticipating where bottlenecks will develop in a live database has long been one of the most bankable skills a self-respecting database administrator can have, yet researchers may now have devised a set of algorithms that can do it automatically.

The DBSeer predictive modeling method, described in two academic papers authored by researchers at MIT and Microsoft, gives companies a way to model the ins and outs of their particular database so they can save on data center infrastructure and avoid downtime.

The DBSeer modeling method helps administrators spot DB problems without having to manually test out different configurations of the database under different load environments, the researchers write (PDF).

Its creators hope DBSeer can address the main shortcoming of running a database-as-a-service in an on-premise virtualized environment: the isolation of compute power, per-VM billing, and the lack of information about the underlying infrastructure make tuning a database in the private cloud "more challenging than in conventional deployments."

"You can now answer many questions about your database that were previously only answered through 'try it and find out for yourself'," the lead author of the papers, Barzan Mozafari, tells The Register via email.

"Now in many cases we can predict what will happen without actually trying those configurations out. This can dramatically reduce the cost of testing and deploying your database configuration."

So far, the researchers have created an implementation of DBSeer that can help model performance for transactional MySQL workloads, but they believe it can be extended to other databases as well.

The system has proved so efficient that it has already piqued the interest of Teradata, which has tasked several of its engineers with the job of porting the DBSeer algorithm to its own software.

The system works by observing query-level logs and the OS statistics generated by a live database management system.

"It's a non-intrusive approach, i.e. it doesn't require modifying the database engine," Mozafari says. "It simply observes the load that comes into the database and the performance and resource consumption of the database and tries to understand the relationship between the two."

This allows DBSeer to model the CPU, RAM, network, disk I/O, and number of acquired locks per table, for various MySQL configurations.

To test the algorithm, the researchers generated 20 mixtures of the Transaction Processing Performance Council's TPC-C benchmark with different ratios of transaction types. The average error rates of DBSeer's predictions ranged between 0 and 25 percent, and its I/O model performed best, with an average margin of error of 1 percent.
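For readers wondering what an "average margin of error" of that kind means in practice, it is typically the mean relative gap between predicted and observed values. A minimal sketch with invented numbers (the paper's own evaluation methodology is more involved):

```python
# Hypothetical example of computing an average relative prediction
# error from predicted vs. observed resource measurements.
predicted = [10.0, 20.0, 31.0, 39.0]   # model's predictions (made up)
observed  = [10.2, 19.5, 30.0, 40.0]   # actual measurements (made up)

# Relative error of each prediction against what really happened
errors = [abs(p - o) / o for p, o in zip(predicted, observed)]
avg_error_pct = 100 * sum(errors) / len(errors)
```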

With margins that low, we can see why Teradata would be interested in porting the technology to work with its own software.

The researchers are due to deliver a further paper (draft PDF here) at the SIGMOD conference in June in New York, which will give further information on how to apply DBSeer to performance and resource modeling in highly-concurrent OLTP workloads.

The researchers also hope that DBSeer can be extended beyond MySQL to other databases, including NoSQL ones.

"Row-store (NoSQL) ones are much simpler to model/predict because they are more linear (due to lack of locking) than a traditional transactional DB," Mozafari says.

If technologies like DBSeer are adopted, companies will be able to automate some of the tasks done by DBAs and make sure they're not provisioning more hardware for their databases than they actually need.

What has got El Reg's database desk all a-flutter is the thought of DBSeer being integrated into an off-premise rentable cloud, like, say, Amazon Web Services.

This would give database developers a technology that could give them real anticipated I/O performance for an off-site database, and go some way toward solving the numerous reliability concerns people have over running a database in the cloud. ®
