Big data hitting the fan? Nyquist-Shannon TOOL SAMPLE can save you

Lessons from information theory brainboxes

Big Data's Big 5

You are working on a big data project that collects data from sensors that can be polled every 0.1 seconds.

But just because you can doesn’t mean you should, so how do we decide how frequently to poll sensors?

The tempting answer is to collect it all, every last ping. That way, not only is your back covered but you can also guarantee to answer, in the future, all the possible questions that could be asked of that data.

When dealing with transactional data I certainly default to the “collect it all” approach because generally the volume of transactional data is much lower. But with sensor data there can be a real trade-off.

Suppose you can capture all of the data (10 times a second) at a cost of £45m per year. But you could also collect data once every 100 seconds (which is 0.1 per cent of the data) for 0.1 per cent of the cost, a measly £45,000. Are all those tiny little details really worth the extra £44,955,000?
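The trade-off above is simple proportional arithmetic; here it is as a minimal sketch, using the article's illustrative figures (the rates and costs are examples, not real pricing):

```python
# Storage-cost trade-off: cost scales with the fraction of samples kept.
# Figures are the article's illustrative ones, not real pricing.
full_rate_hz = 10          # poll 10 times a second
full_cost = 45_000_000     # £45m per year to keep everything

reduced_interval_s = 100   # poll once every 100 seconds instead
fraction_kept = 1 / (reduced_interval_s * full_rate_hz)  # 0.001, i.e. 0.1%
reduced_cost = full_cost * fraction_kept
saving = full_cost - reduced_cost

print(f"fraction kept: {fraction_kept:.1%}")   # 0.1%
print(f"reduced cost:  £{reduced_cost:,.0f}")  # £45,000
print(f"saving:        £{saving:,.0f}")        # £44,955,000
```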

More importantly, how do you make the decision?

Welcome to Nyquist–Shannon sampling, also known as the Nyquist theorem, named after Harry Theodor Nyquist (1889–1976) and Claude Elwood Shannon (1916–2001).

Nyquist was a gifted student and Swedish émigré who worked at AT&T and Bell Laboratories until 1954, earning recognition for his life's work on thermal noise, data transmission and telegraph transmission. Shannon is considered the founding father of the electronic communications age: he devised information theory while working at Bell Labs in the 1940s, and his award-winning work examined the conditions that affect the transmission and processing of data.

Their theorem has since been used by engineers to reproduce analogue (continuous) signals in digital (discrete) form and to reason about bandwidth.

Let’s think about a real example where their theory might help us: polling smart electricity meters at regular intervals. If the sole analytical requirement is to monitor usage on a daily basis it doesn’t require a genius to work out the required sampling rate. But suppose we are given a broader specification.

Say your client tells you: “We know that our customers have appliances that come on and off on a regular cycle (fridges, freezers etc) and we want to analyse the usage of those individual appliances.”

I can’t tell you how to perform the analysis (I’d need more information) but I can instantly put a lower limit on how frequently you need to sample in order to guarantee this ability.

Take the appliance that switches on and off most rapidly and sample slightly more than twice as often as it completes a full on/off cycle. So, if it switches on once every 20 minutes and off about 10 minutes later (so it completes about three cycles an hour), you should sample just over six times an hour.
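That rule of thumb is easy to mechanise. Here is a hypothetical helper (the function name, the `margin` parameter and the appliance cycle times are all my own illustrative choices, not from the article): given each appliance's full on/off cycle time, it returns a sampling interval that just beats the Nyquist rate of the fastest-cycling one.

```python
def max_sampling_interval_minutes(cycle_minutes, margin=0.9):
    """Longest sampling interval that still satisfies Nyquist for every
    appliance cycle given, with a safety margin so we sample 'slightly
    more than twice' per cycle rather than exactly twice."""
    fastest = min(cycle_minutes)     # fastest-cycling appliance dominates
    nyquist_interval = fastest / 2   # at least two samples per full cycle
    return nyquist_interval * margin

# Illustrative cycle times: fridge every 20 min, freezer 30, boiler 60.
interval = max_sampling_interval_minutes([20, 30, 60])
print(interval)  # 9.0 -> sample every 9 minutes, just over 6 times an hour
```

Note that only the fastest cycle matters: sampling often enough for the fridge automatically covers the slower appliances.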

As I said, our answer is Nyquist–Shannon, which states:

If a function x(t) contains no frequencies higher than B hertz, it is completely determined by giving its ordinates at a series of points spaced 1/(2B) seconds apart.
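The flip side of the theorem is aliasing: sample below twice the highest frequency and two different signals become indistinguishable. A minimal standard-library sketch of that failure mode (the frequencies here are arbitrary illustrative values):

```python
import math

def sample(freq_hz, fs_hz, n):
    """Take n samples of a cosine of the given frequency at rate fs_hz."""
    return [math.cos(2 * math.pi * freq_hz * k / fs_hz) for k in range(n)]

# A 3 Hz cosine sampled at 4 Hz (below its 6 Hz Nyquist rate) yields
# exactly the same samples as a 1 Hz cosine: the two have aliased.
under = sample(3, 4, 8)
alias = sample(1, 4, 8)
aliased = all(abs(a - b) < 1e-9 for a, b in zip(under, alias))
print(aliased)  # True
```

Resample both at 8 Hz, comfortably above the Nyquist rate, and the two signals become distinguishable again, which is exactly the guarantee the theorem gives.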

