GPUs slick up with oil sleuths

Mind-boggling data streams

I stopped by the Oil & Gas track at the 2010 GPU Tech conference this morning and learned quite a bit about the key drivers on the exploration side of the industry. I already knew the key drivers on the distribution side of the business - potato chips, watery fountain drinks and herbal energy pills - but that was presumably covered in a different break-out session. In this session, the speaker, from the exploration arm of oilfield services giant Schlumberger, did a great job of laying out the big picture and relating it to the company's computing challenges.

It breaks down like this: we're hungry for oil and they need to find more of it. The costs of finding and extracting black gold have escalated as the easy stuff lying around near the surface has already been found. While there is plenty of oil out there, it's either still hiding, buried beneath deep oceans, or trapped under piles of rock. Finding it and pulling it out is where computers come in handy.

I've heard the term seismic processing for years and understand the concept: you send waves into the ground and measure how long they take to bounce off subsurface rock layers and hit receivers located somewhere else. Do this enough and you build up a good picture of what sits in the various strata beneath the surface. The more waves you send, and the higher their frequency, the better the picture - but this tends to send the amount of data you end up processing through the roof.
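To make the idea concrete, here's a toy sketch of the simplest version of the trick - one wave, one reflector, straight down and back. The velocity figure is illustrative, not from the talk; real seismic codes invert millions of such traces across tens of thousands of sensors.

```python
# Toy version of the seismic idea: a wave goes down, reflects off a
# rock layer, and comes back. If you know the wave speed in the
# medium, the two-way travel time tells you how deep the layer is.
# Velocity value below is illustrative, not from the talk.

def reflector_depth(two_way_time_s: float, velocity_m_s: float) -> float:
    """Depth of a reflector from one vertical-incidence measurement."""
    # The wave covers the down-and-back distance, so halve it
    return velocity_m_s * two_way_time_s / 2.0

# A two-second round trip at ~2,000 m/s puts the reflector about 2km down
print(reflector_depth(2.0, 2000.0))  # 2000.0
```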

For example, the average ship running seismic gear carries between 20,000 and 25,000 sensors, and you typically use several ships in concert to survey an area. A single run yields anywhere from 50 to 200TB of data and takes five to seven days of solid processing on a large number of systems to get results. Ramp up the resolution, and it can take 15,000-20,000 compute nodes running for days or weeks to complete the job.
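As a rough back-of-envelope check on those figures - assuming, say, three ships at the top end of the sensor count and the 200TB total, neither of which the talk specified as a combination - the per-sensor share of a run works out to a couple of gigabytes:

```python
# Back-of-envelope on the survey figures quoted above. The ship count
# is an assumption; the sensor count and data volume are the upper
# ends of the ranges from the talk.
ships = 3
sensors_per_ship = 25_000
run_total_tb = 200

per_sensor_gb = run_total_tb * 1024 / (ships * sensors_per_ship)
print(f"~{per_sensor_gb:.1f} GB per sensor per run")  # ~2.7 GB per sensor per run
```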

The competitive advantage for the surveying company comes from delivering high-quality results quicker than the next guy. Computing power is critical in winning that race. These oil and gas guys are brand agnostic to the extreme - they buy what yields the best price/performance (with an emphasis on performance) at any given time. Sometimes that means Intel, sometimes it means AMD - but right now, it means GPUs. Lots of GPUs, in fact.

Between June and October 2009, they almost doubled their overall capacity by adding GPUs, and since then have doubled it again. They've seen roughly a six-fold reduction in overall cost and a five-fold increase in performance on their algorithms. According to the speaker, they didn't have much trouble porting their code to CUDA or tuning it for performance. Their analytical tools are a fairly limited set, and all are embarrassingly parallel, making them a near perfect fit for the GPU computing model.

Getting such an unqualified endorsement for GPU computing isn't surprising at the GPU Technology Conference, right? But it's a more compelling story when it comes from real-world practitioners rather than marketing slide monkeys or coin-operated sales people. ®
