Big data hitting the fan? Nyquist-Shannon TOOL SAMPLE can save you

Lessons from information theory brainboxes - in pictures

Big Data's Big 5

You are working on a big data project that collects data from sensors that can be polled once every 0.1 seconds.

But just because you can doesn’t mean you should, so how do we decide how frequently to poll sensors?

The tempting answer is to collect it all, every last ping. That way, not only is your back covered, but you can also guarantee to answer any question that might be asked of that data in the future.

When dealing with transactional data I certainly default to the “collect it all” approach because generally the volume of transactional data is much lower. But with sensor data there can be a real trade-off.

Suppose you can capture all of the data (10 times a second) at a cost of £45m per year. But you could also collect data once every 100 seconds (which is 0.1 per cent of the data) for 0.1 per cent of the cost, a measly £45,000. Are all those tiny little details really worth the extra £44.955m?
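To make the trade-off concrete, here is a minimal Python sketch (purely illustrative, assuming the bill scales linearly with the number of samples, as the figures above imply) that works out the annual cost for a given polling interval:

    # Minimal sketch: annual collection cost as a function of polling interval,
    # assuming cost is proportional to the number of samples (hypothetical figures).
    FULL_RATE_INTERVAL = 0.1          # seconds between samples when collecting it all
    FULL_RATE_COST = 45_000_000       # pounds per year at the full rate

    def annual_cost(poll_interval_seconds):
        """Annual cost if we poll once every poll_interval_seconds."""
        fraction_of_full_rate = FULL_RATE_INTERVAL / poll_interval_seconds
        return FULL_RATE_COST * fraction_of_full_rate

    print(annual_cost(0.1))   # 45000000.0 - ten samples a second
    print(annual_cost(100))   # 45000.0    - one sample every 100 seconds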

More importantly, how do you make the decision?

Welcome to Nyquist–Shannon sampling, also known as the Nyquist theorem, named after Harry Theodor Nyquist (1889–1976) and Claude Elwood Shannon (1916–2001).

Nyquist was a gifted student turned Swedish émigré who worked at AT&T and Bell Laboratories until 1954 and who earned recognition for his lifetime's work on thermal noise, data transmission and telegraph transmission. The chiselled Shannon is considered the founding father of the electronic communications age; he devised the concept of information theory while working at Bell Labs in the 1940s. His award-winning work looked at the conditions that affect the transmission and processing of data.

Their theorem has since been used by engineers to reproduce analogue (continuous) signals as digital (discrete) ones and to reason about bandwidth.

Let’s think about a real example where their theory might help us: polling smart electricity meters at regular intervals. If the sole analytical requirement is to monitor usage on a daily basis it doesn’t require a genius to work out the required sampling rate. But suppose we are given a broader specification.

Say your client tells you: “We know that our customers have appliances that come on and off on a regular cycle (fridges, freezers etc) and we want to analyse the usage of those individual appliances.”

I can’t tell you how to perform the analysis (I’d need more information) but I can instantly put a lower limit on how frequently you need to sample in order to guarantee that ability.

Take the appliance that switches on and off most rapidly and sample slightly more than twice as often as it switches. So, if it switches on once every 20 minutes and off about 10 minutes later (so it goes on and off about three times an hour), you should sample just over six times an hour.
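As a rough sketch (the cycle times are the illustrative figures above), that rule of thumb is a couple of lines of Python:

    # Rule of thumb: sample slightly more than twice as often as the
    # fastest-cycling appliance switches on and off.
    fastest_cycle_minutes = 20                     # e.g. a fridge compressor cycling every 20 minutes
    cycles_per_hour = 60 / fastest_cycle_minutes   # 3 on/off cycles per hour
    nyquist_rate_per_hour = 2 * cycles_per_hour    # 6 samples per hour
    print(f"Sample just over {nyquist_rate_per_hour:.0f} times an hour")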

As mentioned, our answer comes from Nyquist–Shannon, whose theorem states:

If a function x(t) contains no frequencies higher than B hertz, it is completely determined by giving its ordinates at a series of points spaced 1/(2B) seconds apart.
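One quick way to see what the theorem buys you is to sample a pure tone both above and below twice its frequency; below that rate the samples become indistinguishable from those of a slower, aliased tone. A minimal NumPy sketch (the 3 Hz signal and the two sampling rates are chosen purely for illustration):

    import numpy as np

    B = 3.0                                  # highest frequency in the signal, in hertz
    def signal(t):
        return np.sin(2 * np.pi * B * t)

    def sample(fs, duration=1.0):
        """Sample the signal fs times per second for `duration` seconds."""
        t = np.arange(0.0, duration, 1.0 / fs)
        return t, signal(t)

    # Above the Nyquist rate (fs > 2B = 6 Hz): the samples pin the signal down uniquely.
    t_ok, x_ok = sample(fs=10.0)

    # Below the Nyquist rate: a 3 Hz tone sampled at 4 Hz aliases to a 1 Hz tone
    # (with a phase flip) - at the sample instants the two are identical.
    t_bad, x_bad = sample(fs=4.0)
    alias = -np.sin(2 * np.pi * 1.0 * t_bad)
    print(np.allclose(x_bad, alias))         # True: the samples cannot tell them apart

Push the sampling rate above 6 Hz and the ambiguity disappears, which is exactly the guarantee the theorem makes.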
