Real-time data warehousing

A change in the air?

Comment Data warehousing has historically been regarded as an environment specifically for analysing historic data, either to understand what has happened or, more recently, to predict what will happen: for example, to identify customers likely to churn.

However, there are also a number of environments in which you need to combine this historic information with real-time transactional data in order to make operational business decisions: to detect fraud, to support anti-money laundering efforts, and so on.

Traditionally (if we can use that term for something that has only come to the fore in the last few years), the way to implement this sort of capability has been through real-time data warehousing. This is what Teradata refers to as active data warehousing.

The way real-time data warehousing is typically implemented is that you use change data capture (CDC) to grab relevant data after it has been committed to the transactional database(s) and then trickle-feed that information to the warehouse via a message queue or similar mechanism. In the warehouse, that new information is combined with historic data and, if relevant thresholds are crossed, alerts can be generated, rules triggered and action taken.
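
To make the moving parts concrete, here is a minimal Python sketch of that pipeline, with queue.Queue standing in for the message queue and a plain dict standing in for the warehouse. All of the names (capture_committed_changes, SPEND_MULTIPLE and so on) are invented for illustration and do not correspond to any vendor's API.

import queue

trickle_feed = queue.Queue()                          # stand-in for the message queue
warehouse = {"C042": {"avg_daily_spend": 120.0}}      # stand-in for historic warehouse data

def capture_committed_changes(committed_rows):
    # CDC step: pick out relevant rows after they are committed and push them onto the queue.
    for row in committed_rows:
        if row["table"] == "payments":
            trickle_feed.put(row)

def apply_to_warehouse_and_check_rules():
    # Warehouse step: merge the new data with history and fire an alert if a threshold is crossed.
    SPEND_MULTIPLE = 10                               # e.g. flag spend 10x the historic daily average
    while not trickle_feed.empty():
        row = trickle_feed.get()
        history = warehouse.setdefault(row["customer"], {"avg_daily_spend": 0.0})
        if history["avg_daily_spend"] and row["amount"] > SPEND_MULTIPLE * history["avg_daily_spend"]:
            print(f"ALERT: unusual spend for {row['customer']}: {row['amount']}")

capture_committed_changes([{"table": "payments", "customer": "C042", "amount": 5000.0}])
apply_to_warehouse_and_check_rules()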

The question I want to pose is whether this is actually the best way to implement this sort of capability. It seems to me that an event processing engine is a more appropriate technology to start from.

Event processing engines do not require you to commit data to a database before you can act on it; many of them can process deltas (changes to data) directly, so you do not need change data capture; and there are no issues with getting data onto a message queue. The only potential problem is with the historic data that you may need from the data warehouse.
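
By way of contrast, the sketch below shows deltas being handed straight to a rule engine in-line with the transaction, with no CDC step and no queue in between. The DeltaEngine class and the withdrawal rule are invented purely to illustrate the shape of this approach; real engines expose far richer rule languages.

class DeltaEngine:
    def __init__(self):
        self._rules = []

    def register_rule(self, rule):
        self._rules.append(rule)

    def on_delta(self, delta):
        # No CDC and no message queue: each delta is evaluated as it happens.
        for rule in self._rules:
            rule(delta)

def large_withdrawal_rule(delta):
    if delta.get("type") == "withdrawal" and delta.get("amount", 0) > 10_000:
        print(f"ALERT: large withdrawal on account {delta['account']}")

engine = DeltaEngine()
engine.register_rule(large_withdrawal_rule)

# The application calls the engine directly as part of processing the transaction.
engine.on_delta({"type": "withdrawal", "account": "A-17", "amount": 25_000})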

Most event processing engines have some sort of associated data store, though this is usually only for storing the events that have been captured. Some vendors offer specific pattern-recognition capabilities over these event stores. However, for real-time data warehousing you may well need access to historic information that would not form part of this database, so vendors provide facilities to extract information from other sources (including data warehouses) where appropriate.

Now, consider the data needed from the warehouse in these sorts of circumstances. Are these large ad hoc or complex queries? No. In order to build the sort of workflow-style decisioning that is central to active and real-time data warehousing you have to know what you are looking for. In other words, queries are pre-built and the data sets you are going to be exploring are known.

This leads to two possible scenarios. In the first, the dataset you need to explore is relatively limited, in which case why not replicate a copy of the data into the event processing database and build the rules on that platform? If you truly want real-time performance, this will be much faster than the "traditional" approach to real-time data warehousing.
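
As a sketch of that first scenario, the snippet below keeps a replicated slice of warehouse history in the engine's own store (here just a dict refreshed by a bulk load) so that rules can be evaluated entirely locally at event time. load_customer_history() and the per-customer baseline are assumptions made for illustration.

local_history = {}                                    # replicated slice of the warehouse

def load_customer_history():
    # In practice this would be a periodic bulk extract from the warehouse;
    # it is hard-coded here purely for illustration.
    return {"C042": 120.0, "C107": 64.5}              # average transaction value per customer

def refresh_replica():
    local_history.clear()
    local_history.update(load_customer_history())

def on_event(event):
    # The rule joins the incoming event against the local replica; no warehouse round-trip.
    baseline = local_history.get(event["customer"])
    if baseline is not None and event["amount"] > 10 * baseline:
        print(f"ALERT: {event['customer']} spent {event['amount']} against a baseline of {baseline}")

refresh_replica()
on_event({"customer": "C042", "amount": 5000.0})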

In the second scenario, the dataset that needs to be explored is very large. You could sample it, of course, but if that is considered impractical, and replicating the data to the event store is too, then I think it is still (potentially) preferable to do the processing on the event engine and link it to the warehouse for query purposes. What you would need is a fast pipe between the event engine and the warehouse, plus a warehouse that is optimised for these specific queries; I would argue that this is best served by a data warehouse appliance.
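
For this second scenario, a sketch of the fast-pipe arrangement might look like the following: the engine fires a pre-built, parameterised query at the warehouse for just the history it needs. The table and column names are invented, and conn is assumed to be any DB-API-style connection to the warehouse appliance.

PREBUILT_QUERY = """
    SELECT avg_txn_value
    FROM customer_profile
    WHERE customer_id = %s
"""

def lookup_history(conn, customer_id):
    # Pre-built query over the fast pipe: the data set and predicate are known in advance.
    with conn.cursor() as cur:
        cur.execute(PREBUILT_QUERY, (customer_id,))
        return cur.fetchone()

def on_event(conn, event):
    history = lookup_history(conn, event["customer"])
    if history and event["amount"] > 10 * history[0]:
        print(f"ALERT: {event['customer']} is spending {event['amount'] / history[0]:.1f}x their average")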

If this is a valid proposition then there are two things that are likely to happen in the market. First, we may see event processing and appliance suppliers forming partnerships (and developing fast pipes between their products). Second, Teradata will come under even more pressure in the market, as active data warehousing is currently one of its major differentiators.

Copyright © 2006, IT-Analysis.com
