Software seer shows companies path to cheaper databases

DBSeer cuts through clouds with predictive database performance model

Anticipating where bottlenecks will develop in a live database has long been one of the most bankable skills a self-respecting database administrator can have, yet researchers may now have figured out a set of algorithms that can do it automatically.

The DBSeer predictive modeling method, described in two academic papers authored by researchers at MIT and Microsoft, gives companies a way to model the ins and outs of their particular database so they can save on data center infrastructure and avoid downtime.

The DBSeer modeling method helps administrators spot database problems without having to manually test different configurations under different loads, the researchers write (PDF).

Its creators hope DBSeer can address the main shortcoming of running a database-as-a-service in an on-premise virtualized environment: the isolation of compute power, per-VM billing, and the lack of information about the particulars of the underlying infrastructure make tuning a database in the private cloud "more challenging than in conventional deployments."

"You can now answer many questions about your database that were previously only answered through 'try it and find out for yourself'," the lead author of the papers, Barzan Mozafari, tells The Register via email.

"Now in many cases we can predict what will happen without actually trying those configurations out. This can dramatically reduce the cost of testing and deploying your database configuration."

So far, the researchers have created an implementation of DBSeer that can help model performance for transactional MySQL workloads, but they believe it can be extended to other databases as well.

The system has proved effective enough that it has already piqued the interest of Teradata, which has tasked several of its engineers with porting the DBSeer algorithm to its own software.

The system works by observing query-level logs and the OS statistics generated by a live database management system.

"It's a non-intrusive approach, i.e. it doesn't require modifying the database engine," Mozafari says. "It simply observes the load that comes into the database and the performance and resource consumption of the database and tries to understand the relationship between the two."

This allows DBSeer to model CPU, RAM, network, and disk I/O usage, as well as the number of locks acquired per table, for various MySQL configurations.
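
To make the general idea concrete, here is a minimal sketch, in Python, of what a workload-to-resource model can look like. It is not DBSeer's own code: the log format, transaction types, and numbers are made up, and a single linear fit stands in for the richer models described in the papers.

    # Toy illustration of workload-to-resource modeling (not DBSeer itself).
    # Assumption: each row is a one-minute window with counts of each
    # transaction type and the CPU seconds measured for that window.
    import numpy as np

    # Columns: [new_order, payment, stock_level] transactions per window
    observed_mix = np.array([
        [120,  80, 10],
        [200, 150, 20],
        [ 90,  60,  5],
        [300, 220, 30],
    ], dtype=float)

    # CPU seconds consumed in each window (made-up measurements)
    observed_cpu = np.array([14.2, 24.9, 10.1, 37.5])

    # Fit a linear model: estimated CPU cost per transaction of each type
    per_txn_cost, *_ = np.linalg.lstsq(observed_mix, observed_cpu, rcond=None)
    print("Estimated CPU seconds per transaction type:", per_txn_cost)

    # Predict CPU usage for a transaction mix that was never actually run
    future_mix = np.array([250, 180, 25], dtype=float)
    print("Predicted CPU seconds:", future_mix @ per_txn_cost)

The appeal of this kind of approach is that everything in the sketch comes from logs and statistics the system already produces, which is why no changes to the database engine are needed.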

To test the algorithm, the researchers generated 20 mixtures of the Transaction Processing Performance Council's TPC-C benchmark with different ratios of transaction types. The average error rates of DBSeer's predictions ranged from 0 to 25 percent, and its I/O model performed best, with an average margin of error of 1 percent.
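
The margin-of-error figures are, in essence, a comparison of predicted against measured values. A quick sketch of how a relative-error percentage of that sort is computed, using made-up figures rather than the papers' data:

    # Average relative prediction error, in percent (illustrative numbers only)
    predicted = [105.0, 212.0, 318.0]   # e.g. predicted disk I/O per window
    measured  = [100.0, 210.0, 320.0]   # what the live system actually did

    errors = [abs(p - m) / m * 100 for p, m in zip(predicted, measured)]
    print("Per-window error (%):", [round(e, 1) for e in errors])
    print("Average error (%):", round(sum(errors) / len(errors), 1))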

With error rates that low, we can see why Teradata would be interested in porting the technology to work with its own software.

The researchers are due to deliver a further paper (draft PDF here) at the SIGMOD conference in New York in June, which will detail how to apply DBSeer to performance and resource modeling in highly concurrent OLTP workloads.

The researchers hope that DBSeer can be extended to still other databases, including NoSQL ones.

"Row-store (NoSQL) ones are much simpler to model/predict because they are more linear (due to lack of locking) than a traditional transactional DB," Mozafari says.

If technologies like DBSeer are adopted, companies will be able to automate some of the tasks done by DBAs and make sure they're not provisioning more hardware for their databases than they actually need.

What has got El Reg's database desk all a-flutter is the thought of DBSeer being integrated into an off-premise rentable cloud, like, say, Amazon Web Services.

This would hand database developers a technology that could tell them the I/O performance to expect from an off-site database, and go some way toward easing the numerous reliability concerns people have about running a database in the cloud. ®
