Real-time data warehousing

A change in the air?

Comment Data warehousing has historically been regarded as an environment for analysing historic data, either to understand what has happened or, more recently, to try to predict what will happen: for example, to identify customers likely to churn.

However, there are also a number of environments in which you need to combine this historic information with real-time transactional data in order to make operational business decisions: to detect fraud, to support anti-money laundering, and so on.

Traditionally (if we can use that term for something that has only come to the fore in the last few years), the way to implement this sort of capability has been through real-time data warehousing. This is what Teradata refers to as active data warehousing.

Real-time data warehousing is typically implemented by using change data capture to grab relevant data after it is committed to the transactional database(s), then trickle-feeding that information to the warehouse via a message queue or similar mechanism. In the warehouse, the new information is combined with historic data and, if relevant thresholds are crossed, alerts can be generated, rules triggered and action taken.
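As a rough illustration of that trickle-feed pattern, the following sketch simulates it end to end: a committed change is captured, placed on a queue, merged with historic totals on the warehouse side, and checked against a threshold. All the names, figures and data structures here are invented for illustration; a real deployment would use CDC tooling and a proper message broker, not an in-process queue.

```python
import queue

# Stand-ins for warehouse history and an alerting threshold (illustrative values).
HISTORIC_SPEND = {"cust-1": 900.0}
ALERT_THRESHOLD = 1000.0

def capture_committed_change(q, change):
    """Change data capture: push a committed row onto the feed queue."""
    q.put(change)

def process_feed(q, alerts):
    """Warehouse side: merge each delta with history, fire an alert on a breach."""
    while not q.empty():
        change = q.get()
        cust = change["customer"]
        total = HISTORIC_SPEND.get(cust, 0.0) + change["amount"]
        HISTORIC_SPEND[cust] = total
        if total > ALERT_THRESHOLD:
            alerts.append((cust, total))

q, alerts = queue.Queue(), []
capture_committed_change(q, {"customer": "cust-1", "amount": 150.0})
process_feed(q, alerts)
print(alerts)  # the running total of 1050.0 crosses the 1000.0 threshold
```

Note how many hops the data makes before the rule fires: commit, capture, queue, warehouse merge. That latency chain is exactly what the argument below takes issue with.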

The question I want to pose is whether this is actually the best way to implement this sort of capability. It seems to me that an event processing engine is a more appropriate technology to start from.

Event processing engines do not need you to commit data to a database before you start to act on it; many of them can process deltas (changes to data) directly, so you don't need any change data capture; and you don't have any issues with getting data onto a message queue. The only potential problem is with the historic data that you may need from the data warehouse.
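To show the contrast, here is a minimal sketch of the event-engine approach: each delta is evaluated the moment it arrives, against in-memory state, with no commit, no CDC and no queue on the hot path. The `EventEngine` class and its method names are purely illustrative, not any vendor's API.

```python
from collections import defaultdict

class EventEngine:
    """Toy event engine: keeps running state and applies one rule per delta."""

    def __init__(self, rule):
        self.state = defaultdict(float)  # in-memory running totals per key
        self.rule = rule

    def on_delta(self, delta):
        """Apply a change record directly as it arrives; return the rule result."""
        key = delta["customer"]
        self.state[key] += delta["amount"]
        return self.rule(key, self.state[key])

# Rule: alert once a customer's running total exceeds 500 (illustrative figure).
engine = EventEngine(rule=lambda cust, total: total > 500)
print(engine.on_delta({"customer": "a", "amount": 300}))  # False: 300
print(engine.on_delta({"customer": "a", "amount": 300}))  # True: 600 > 500
```

The decision happens in the same process that receives the delta, which is where the latency advantage over the trickle-feed architecture comes from.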

Most event processing engines have some sort of associated data store, though this is usually only for storing the events that have been captured. Some companies have specific capabilities for pattern recognition with these event stores. However, for real-time data warehousing you may well need to access historic information that would not form part of this database. So, vendors provide facilities to extract information from other sources (including data warehouses) where appropriate.

Now, consider the data needed from the warehouse in these sorts of circumstances. Are these large ad hoc or complex queries? No. In order to build the sort of workflow-style decisioning that is central to active and real-time data warehousing you have to know what you are looking for. In other words, queries are pre-built and the data sets you are going to be exploring are known.

This leads to two possible scenarios. In the first, the dataset you need to explore is relatively limited, in which case why not replicate a copy of the data into the event processing database and build the rules on that platform? If you truly want real-time performance, this will be much faster than the "traditional" approach to real-time data warehousing.
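A minimal sketch of this first scenario, under stated assumptions: a small slice of warehouse history (here, a per-customer average) has already been replicated into the engine's local store, so the rule runs entirely in-process with no round trip to the warehouse. The dictionary, rule and figures are all hypothetical.

```python
# Replicated ahead of time from the warehouse (illustrative data).
LOCAL_REPLICA = {
    "cust-1": {"avg_txn": 40.0},
    "cust-2": {"avg_txn": 900.0},
}

def flag_unusual(event, factor=5.0):
    """Rule: flag a transaction far above the customer's historic average,
    using only the locally replicated history."""
    history = LOCAL_REPLICA.get(event["customer"])
    if history is None:
        return False  # no history replicated for this customer
    return event["amount"] > factor * history["avg_txn"]

print(flag_unusual({"customer": "cust-1", "amount": 250.0}))  # True: 250 > 200
print(flag_unusual({"customer": "cust-2", "amount": 250.0}))  # False
```

The cost of this approach is keeping the replica fresh; its benefit is that the decision path never leaves the engine.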

Secondly, there is the scenario where the dataset that needs to be explored is very large. You could sample it, of course, but if it is considered impractical to do this, or to replicate the data to the event store, then I think it is still (potentially) preferable to do the processing on the event engine and link that to the warehouse for query purposes. What you would need is a fast pipe between the event engine and the warehouse, and a warehouse that is optimised for these specific queries; that, I would argue, will be best served by a data warehouse appliance.
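The second scenario can be sketched in the same toy style: the history is too large to replicate, so the engine issues a narrow, pre-built keyed lookup over the "fast pipe" for each event. Here `warehouse_lookup` merely simulates that remote point query (the names and data are invented); a cache stands in for the fact that such pre-built lookups are easy to optimise on both ends.

```python
from functools import lru_cache

@lru_cache(maxsize=10_000)
def warehouse_lookup(customer):
    """Pre-built point query against the warehouse, simulated locally here.
    In reality this would travel over the fast pipe to the appliance."""
    big_history = {"cust-9": 12_000.0}  # stand-in for appliance-held history
    return big_history.get(customer, 0.0)

def evaluate(event, threshold=10_000.0):
    """Combine the live event with fetched history, as in scenario two."""
    return warehouse_lookup(event["customer"]) + event["amount"] > threshold

print(evaluate({"customer": "cust-9", "amount": 500.0}))  # True: 12500 > 10000
print(evaluate({"customer": "cust-0", "amount": 500.0}))  # False
```

Because the query shape is known in advance, the warehouse side can be tuned for exactly this access pattern, which is the basis of the appliance argument.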

If this is a valid proposition, then two things are likely to happen in the market. First, we may see event processing and appliance suppliers forming partnerships (and developing fast pipes between them); secondly, Teradata will come under even more pressure, as active data warehousing is currently one of its major differentiators.

Copyright © 2006, IT-Analysis.com
