Supercomputing in retail?
What are they getting out of it?
I got a call recently from a reporter who confronted me with an interesting question. He had attended SC04 in Pittsburgh, and one of his key takeaways from the show was that retailers were on the cusp of adopting HPC-style computing to crunch their data and improve both top-line and bottom-line results. He asked me: “Has HPC made it into retailing?” and followed up with “If not, why not?” and “If so, what are they getting out of it?”
His questions launched me into my whole “Age of Analytics” speech, in which I lay out the macroeconomic forces that will fuel a big move toward predictive analytics, not only in retailing but in almost every industry. Globalization and the associated increase in competitive pressures (lower margins, reduced barriers to entry, global sourcing, increased buyer power, etc.) will require businesses to use every asset at their disposal to defend their existing markets and discover profitable new opportunities.
One of the most underutilized assets in business today is the massive pile of data that companies have gathered and stored. Combining that data with other datasets can yield incredibly valuable market and trend insights. In terms of computational requirements, the resulting workloads are essentially the same as traditional HPC workloads. Some companies, including retailers like Walmart, are already doing this in a big way, while others are just beginning to dip their toes in.
I’ll be writing more about this trend from both business and technical perspectives in the future, but for a preview – and a good overview of the topic – read this article: “Second look at HPC: Is retail ready for supercomputing?”
Is your company looking at adopting predictive analytics? Or are you doing it already? Let us know in the comments section below…