The F# language is in customer technology preview this week as part of the Visual Studio beta. Windows HPC Server 2008 R2, which is required for the integration with Excel, is in beta now as well, and it includes enhancements to the scheduler and tuning of the Message Passing Interface (MPI) stack used to create an HPC cluster. The MPI stack has optimizations for the latest processors from Intel and Advanced Micro Devices, better MPI debugging, and enhanced support for the Remote Direct Memory Access (RDMA) protocol over Ethernet and InfiniBand.
"We're just as fast as Linux on microkernel and other benchmarks, which is a big change for us over the past three years," says Mendillo, who added that the HPC business was "growing extremely fast" but would not quantify that. "We are getting a lot of consideration on deals we didn't see a year and a half ago. We are getting considered half the time now."
Part of this is because there are new companies (particularly in life sciences) and new users (particularly in financial services) who have no experience with Linux and don't want any. They are growing up out of their workstations and into HPC clusters, and they want to keep their data on Windows servers, use Active Directory, and share the results of their calculations with tools like SharePoint.
To help make a Windows HPC Server cluster less intimidating, Microsoft's System Center tools have been tweaked in the upcoming R2 release to do a better job of deploying a cluster; Mendillo says that System Center can now deploy a 1,000-node cluster in four to five hours. Microsoft has also created a tool nicknamed "The Lizard," short for the Linpack Performance Wizard, which takes the smarts of Microsoft's best HPC techies and encapsulates them in a set of wizards that automatically optimize a cluster to run the Linpack Fortran benchmark. The idea is that if you tune for the benchmark, your parallel applications will run better, too.
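To give a sense of what there is to tune: a standard Linpack (HPL) run is driven by a short input file, and a handful of its parameters account for most of the performance. The excerpt below follows the stock HPL.dat layout, with annotations added; the values are purely illustrative, not tuned for any particular cluster.

    1            # of problems sizes (N)
    120000       Ns    (problem size: sized so the matrix nearly fills memory)
    1            # of NBs
    192          NBs   (block size: matched to cache and the BLAS library)
    1            # of process grids (P x Q)
    8            Ps    (process grid rows)
    16           Qs    (grid columns: P x Q must equal the MPI process count)

Finding good values for N, NB, and the P x Q grid by hand normally takes many trial runs, which is presumably the drudgery the wizard automates.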
Mendillo would not say when Microsoft would fold the recently acquired Star-P application parallelization tools into its stack, but it looks like the tools will go into Windows HPC Server first, not Visual Studio. Mendillo said that the key people from Interactive Supercomputing are now working in the Microsoft "nerd center" across the street from the Massachusetts Institute of Technology in Cambridge, and that Star-P will be in technology preview with a future release of HPC Server sometime next year.
Microsoft's Windows HPC Server still didn't rank very high on the Top 500 supercomputing list that came out this week at SC09, but the University of Southampton popped onto the list at number 74, running the R2 beta on a 66.8-teraflops cluster built from IBM's iDataPlex iron with Xeon 5500 processors.
Expect Microsoft to have a much better showing soon, though. The Tokyo Institute of Technology has just inked a deal to build a second-generation cluster based on blade servers from Sun Microsystems, GPU co-processors from Nvidia, and Windows HPC Server 2008 R2. This machine, named Tsubame 2, will replace Tsubame 1, which comprised Sun Opteron blades and ClearSpeed math co-processors and was rated at 87 teraflops. Tsubame 2 will weigh in at a much heftier 3 petaflops of peak performance and will almost certainly rank in the top five when it is delivered in the spring of 2010. It may even take the top spot. ®
How hard can it be? It's mostly VB6-based, so it's pretty simple. You'd just have to simulate the Excel calls and compile it as a VB6 program.
I agree in principle with people who say these should be proper programs or database-backed apps rather than spreadsheets, but if you think that's likely to happen, you've never worked in an office.
My last task at my previous job was using VBA code in Access to automatically reformat and import a billion and one Excel spreadsheets, and then every subsequent one. Why couldn't the data be fed straight into Access, or the forms be changed to make it easier? "It's been like this for 15 years, why change it?"
Then more VBA was required to run the results of an Access query through a string of Excel spreadsheets (I couldn't just incorporate the code into Access or change the Excel code to make it easier, as "it's always been like this") before dumping the results via VBA into a template form in Word (again, set in a decade's worth of stone) and telling Outlook to put it into a (standard) email. Users select the form they want to process, click "process", and are presented with the finished form for checking; even the standard email is written for them.
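For anyone who hasn't had the pleasure, the sketch below gives a rough idea of the glue involved. It is a hypothetical illustration, not the commenter's actual code: the file names, query, and bookmark are all invented, but the Access-to-Excel-to-Word-to-Outlook relay is the same shape.

    ' Hypothetical sketch of the Access VBA relay described above.
    ' All names (qryResults, calc.xls, template.dot, the Result bookmark)
    ' are invented for illustration.
    Public Sub ProcessForm(ByVal recordId As Long)
        Dim xl As Object, wd As Object, ol As Object
        Dim wb As Object, doc As Object, mail As Object

        ' 1. Push the Access query result through the legacy Excel workbook.
        Set xl = CreateObject("Excel.Application")
        Set wb = xl.Workbooks.Open("C:\legacy\calc.xls")
        wb.Worksheets("Input").Range("A2").Value = _
            DLookup("Amount", "qryResults", "ID=" & recordId)
        xl.Calculate

        ' 2. Drop the calculated result into the frozen Word template.
        Set wd = CreateObject("Word.Application")
        Set doc = wd.Documents.Add("C:\legacy\template.dot")
        doc.Bookmarks("Result").Range.Text = _
            wb.Worksheets("Output").Range("B2").Value

        ' 3. Hand the finished form to Outlook as the standard email,
        '    left open so the user can check it before sending.
        Set ol = CreateObject("Outlook.Application")
        Set mail = ol.CreateItem(0) ' 0 = olMailItem
        mail.Subject = "Processed form " & recordId
        mail.Body = doc.Range.Text
        mail.Display

        ' Tidy up the invisible Office instances.
        wb.Close SaveChanges:=False
        xl.Quit
        doc.Close SaveChanges:=False
        wd.Quit
    End Sub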
A complete masterpiece of code, if I'm honest. It took bloody ages, but it's all documented, commented, fault-tolerant, and as efficient as it can be given what I had to work with. The whole process took a minute to run rather than an hour of manually typing and re-typing information, but I've been told it's now being ignored and they're doing it manually again because "it's always been done like that" and they "don't trust the computer to do it right", even though they get to verify the form as correct (there haven't been any problems so far). Plus they cleared a lot of their departmental work backlog, meaning there's less work, meaning there'll probably be layoffs as they just don't need as many people.
That's what you've got to compete with if you're proposing moving things to a proper system: decades of frozen-solid company dogma, people throwing ice over any unfreezing efforts to try to keep their technically unnecessary jobs, and Excel being "free" (it's already on their computers, whereas this new technology means paying money; why do that when Excel's free?).
Any man who can push a full database and proper-program-based system through a well-established company, and keep them using it, is truly a man to be reckoned with.
Folks build huge, inefficient spreadsheets for job security. They like being kept around to "tweak" the piles of shit as they crash a lot.
This is a hate of mine, and I know it.
One of my school projects was to build an application (ticketing system) in Excel.
It was a NIGHTMARE. We were meant to do it by recording every action; pah... open up the source and edit it directly, sorted (see the sketch below).
So, so slow for the most basic things, and it's not flexible... it's shit.
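That "open up the source" trick is worth spelling out for anyone who hasn't seen recorded VBA. A minimal, hypothetical pair for a ticketing sheet: the recorder narrates every click through Select and Selection, while the hand-edited version addresses the ranges directly.

    ' What the macro recorder produces: every click becomes a Select.
    Sub Recorded()
        Range("A1").Select
        Selection.Value = "Ticket ID"
        Range("B1").Select
        Selection.Value = "Status"
    End Sub

    ' The same thing after opening the source: direct, and no Selects.
    Sub Edited()
        Range("A1").Value = "Ticket ID"
        Range("B1").Value = "Status"
    End Sub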