Microsoft vs. Teradata

Data Warehousing – there really isn't just one answer

Microsoft’s approach

The problem that Microsoft elected to solve was that of producing a multi-dimensional database engine that was fast and that also cured the OLAP data explosion problem. This is another non-trivial problem, but solving it, and using the resulting technology in the data marts, automatically solves Points 4 & 5 in our wish list. The data can be aggregated in advance, and it is that pre-computation which gives the blistering speed that’s required. In addition, multi-dimensional data means that users automatically get a hierarchical, dimensional and measured view of the data.
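As a toy illustration of why pre-aggregation delivers that speed, the sketch below (in Python, with an invented fact table and figures) rolls the data up once, at load time; every subsequent query is then a single lookup rather than a scan of the raw rows:

```python
from collections import defaultdict

# Hypothetical fact table for illustration: (region, product, year, sales).
FACTS = [
    ("EMEA", "Widget", 2004, 120.0),
    ("EMEA", "Gadget", 2004, 75.5),
    ("APAC", "Widget", 2005, 210.0),
    ("APAC", "Gadget", 2005, 99.9),
]

# Do the expensive work once, at load time: roll the facts up
# to the (region, year) level.
cube = defaultdict(float)
for region, product, year, sales in FACTS:
    cube[(region, year)] += sales

def total_sales(region, year):
    """Answer from the pre-built aggregate: one lookup, no table scan."""
    return cube.get((region, year), 0.0)

print(total_sales("EMEA", 2004))  # 195.5
```

A real multi-dimensional engine does this at vastly greater scale and sophistication, but the principle is the same: the expensive aggregation is paid for before anyone queries.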

On the other hand, Microsoft’s approach means that you essentially accept that load times will be slower and auditing more of a challenge because of the proliferation of extra copies of data in the data marts. You also accept that the process will burn up more disk space.
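To see where the disk space goes, it helps to run the explosion arithmetic: every subset of the dimensions is a candidate pre-computed aggregate (a ‘cuboid’), and the worst-case cell counts multiply. A rough Python sketch, with cardinalities invented purely for illustration:

```python
from itertools import combinations
from math import prod

# Assumed dimension cardinalities -- illustrative numbers only.
DIMS = {"region": 50, "product": 10_000, "customer": 1_000_000, "day": 3_650}

# Each subset of dimensions is one possible pre-computed aggregate;
# its worst-case size is the product of the member cardinalities
# (the empty subset is the single grand-total cell).
total_cells = sum(
    prod(DIMS[d] for d in subset)
    for r in range(len(DIMS) + 1)
    for subset in combinations(DIMS, r)
)
print(f"{total_cells:,} potential aggregate cells")
```

With just these four dimensions the fully materialised total already runs to quadrillions of cells, which is why an engine that ‘cures’ the explosion, by storing only the aggregates that are populated or worth keeping, matters so much.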

However, supporters of this approach argue that the first three wish list points are not, in practice, much of an issue: disk space and CPU cycles are cheap, auditing can be automated, and Microsoft is developing techniques such as proactive caching that essentially compensate for the delays in organising the data, bringing real-time analysis ever closer.
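Proactive caching, in outline, keeps answering queries while the aggregates are being refreshed. The class below is a conceptual Python sketch only, not Analysis Services’ actual mechanism or API: answers normally come from a pre-aggregated cache; when the source data changes, the cache is marked stale, and until it is rebuilt queries fall back to the relational source, so results stay current:

```python
class ProactiveCache:
    """Conceptual sketch of proactive caching -- not real SSAS code."""

    def __init__(self, source_rows):
        self.source = list(source_rows)        # (key, value) fact rows
        self.cache = self._aggregate()         # MOLAP-style aggregate
        self.stale = False

    def _aggregate(self):
        agg = {}
        for key, value in self.source:
            agg[key] = agg.get(key, 0) + value
        return agg

    def source_changed(self, new_rows):
        self.source = list(new_rows)
        self.stale = True                      # invalidate, rebuild later

    def rebuild(self):                         # a real engine does this
        self.cache = self._aggregate()         # in the background
        self.stale = False

    def query(self, key):
        if self.stale:
            # Fall back to scanning the source (the slower, 'relational'
            # path) so the answer reflects the latest data.
            return sum(v for k, v in self.source if k == key)
        return self.cache.get(key, 0)

pc = ProactiveCache([("EMEA", 10), ("APAC", 5)])
print(pc.query("EMEA"))                        # 10, from the cache
pc.source_changed([("EMEA", 10), ("APAC", 5), ("EMEA", 2)])
print(pc.query("EMEA"))                        # 12, from the source
pc.rebuild()
print(pc.query("EMEA"))                        # 12, from the fresh cache
```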

So, which is better?

One point is reasonably clear. If you have a need for a BI system that holds an awesomely large set of data, you will certainly be talking to Teradata. The company can field an impressive list of customers in the ‘monstrously, overwhelmingly, huge’ category. So, if we are simply going to rate the two strategies on ‘My BI system can be bigger than yours’ then Teradata wins.

But such a rating is nonsense for most enterprises. By definition, the average enterprise has an average BI requirement, and both Microsoft and Teradata can provide a solution here. (Actually, assuming the skewed distribution that probably exists, we could even say that the modal company has a below-average requirement, but let’s not get picky.) So both of these BI vendors have an appropriate technical solution for most companies and, in practice, there seems genuinely to be very little overlap. Hermann Wimmer (Teradata’s Vice President of EMEA) told me that Teradata tends to focus only on the largest companies. Microsoft’s mantra has, for years, been “BI for the masses”.

In terms of the technologies, it is tempting to extrapolate that Microsoft couldn’t solve the problem of analytical access to relational data and therefore chose to ‘work around’ it. This is doubtless an oversimplification because, whilst it is true that this particular problem is known to be difficult to solve, it was also known to be soluble by the time Microsoft took a serious commercial interest in BI (Teradata had already solved it). So, given its huge resources, Microsoft could have cracked the problem. In the same way, I have no doubt that Teradata could ‘do’ a multi-dimensional database engine if it elected to address the problem.

In addition, Teradata’s systems have always been ‘reassuringly expensive’. So Microsoft may well have rejected the highly specialised solution (that works for all conceivable sizes of data) and elected to pursue a line that offers a much more cost-effective solution for the majority of potential customers.

The bottom line is that while Teradata’s solution fits all sizes, and may sometimes be the only feasible one, Microsoft’s is likely to be much more cost-effective for the majority.

PS

I am quite well aware that the relational model is a logical model and that it is therefore nonsense to imply that relational structures are inherently slow for the simple reason that the model says nothing about implementation on disk. The reason for the poor analytical performance of relational systems lies in the way that most RDBMS engine designers have elected to store their data structures on disk; it doesn’t lie with the relational model itself. Nevertheless, it remains true that on comparable hardware, analytical access to multi-dimensional data is usually orders of magnitude faster than the same access to data stored in the current crop of mainstream relational engines.
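A crude way to demonstrate that it is the physical layout, not the model, that costs you (a toy Python comparison, nothing like a real engine) is to answer the same analytical query against row-shaped records and against a single contiguous column of the measure:

```python
import random
import time

N = 1_000_000

# Row-oriented layout: every "row" drags all of its attributes along,
# so an analytical query touches far more data than it needs.
rows = [
    {"id": i, "region": i % 50, "desc": "x" * 20, "sales": random.random()}
    for i in range(N)
]

# Column-oriented layout: the measure sits contiguously on its own.
sales_column = [row["sales"] for row in rows]

t0 = time.perf_counter()
total_from_rows = sum(row["sales"] for row in rows)
t1 = time.perf_counter()
total_from_column = sum(sales_column)
t2 = time.perf_counter()

print(f"row scan:    {t1 - t0:.3f}s")
print(f"column scan: {t2 - t1:.3f}s")
print(f"same answer: {total_from_rows == total_from_column}")
```

The answer is identical either way; only the time taken differs. The gap in this toy is modest, but in a disk-bound engine, where the row layout forces whole pages of irrelevant attributes through the I/O system, it widens to the orders of magnitude described above.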
