
Big data hitting the fan? Nyquist-Shannon SAMPLING can save you

Lessons from information theory brainboxes


Big Data's Big 5 You are working on a big data project that collects data from sensors that can be polled every tenth of a second.

But just because you can doesn’t mean you should, so how do we decide how frequently to poll sensors?

The tempting answer is to collect it all, every last ping. That way, not only is your back covered but you can also guarantee to answer, in the future, all the possible questions that could be asked of that data.

When dealing with transactional data I certainly default to the “collect it all” approach because generally the volume of transactional data is much lower. But with sensor data there can be a real trade-off.

Suppose you can capture all of the data (10 times a second) at a cost of £45m per year. But you could also collect data once every 100 seconds (which is 0.1 per cent of the data) for 0.1 per cent of the cost, a measly £45,000. Are all those tiny little details really worth the extra £44.955m?
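To make the trade-off concrete, here is a back-of-the-envelope sketch in Python, assuming (purely for illustration) that the annual cost scales linearly with the number of samples taken:

# Hypothetical figures from the example above; cost is assumed to
# scale linearly with sampling rate.
full_rate_hz = 10            # poll ten times a second
reduced_interval_s = 100     # or poll once every 100 seconds
full_cost_gbp = 45_000_000   # £45m per year at the full rate

reduced_rate_hz = 1 / reduced_interval_s        # 0.01 Hz
fraction_kept = reduced_rate_hz / full_rate_hz  # 0.001, i.e. 0.1 per cent
reduced_cost_gbp = full_cost_gbp * fraction_kept

print(f"Data kept: {fraction_kept:.1%}")                    # 0.1%
print(f"Reduced cost: £{reduced_cost_gbp:,.0f}")            # £45,000
print(f"Saving: £{full_cost_gbp - reduced_cost_gbp:,.0f}")  # £44,955,000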

More importantly, how do you make the decision?

Welcome to Nyquist–Shannon sampling, also known as the Nyquist theorem, named after Harry Theodor Nyquist (1889–1976) and Claude Elwood Shannon (1916–2001).

Nyquist was a gifted student turned Swedish émigré who worked at AT&T and Bell Laboratories until 1954, earning recognition for his lifetime's work on thermal noise, data transmission and telegraphy. The chiseled Shannon is considered the founding father of the electronic communications age; he devised the concept of information theory while working at Bell Labs in the 1940s. His award-winning work looked at the conditions that affect the transmission and processing of data.

Their theorem has since been used by engineers to reproduce analogue (continuous) signals in (discrete) digital form and to reason about bandwidth.

Let’s think about a real example where their theory might help us: polling smart electricity meters at regular intervals. If the sole analytical requirement is to monitor usage on a daily basis it doesn’t require a genius to work out the required sampling rate. But suppose we are given a broader specification.

Say your client tells you: “We know that our customers have appliances that come on and off on a regular cycle (fridges, freezers etc) and we want to analyse the usage of those individual appliances.”

I can’t tell you how to perform the analysis (I’d need more information) but I can instantly put a lower limit on how frequently you need to sample in order to be able to guarantee this ability.

Take the appliance that cycles on and off most rapidly and sample at slightly more than twice its cycle rate. So, if it switches on once every 20 minutes and off about 10 minutes later (so it completes about three on-off cycles an hour), you should sample just over six times an hour.
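A minimal sketch of that rule in Python, where the appliance list and the five per cent safety margin are illustrative assumptions:

# Derive a minimum sampling rate from the fastest-cycling appliance,
# sampling at slightly more than twice its cycle rate.
appliances = {
    "fridge": 3.0,    # complete on-off cycles per hour
    "freezer": 2.0,
    "boiler": 1.5,
}

fastest_cycles_per_hour = max(appliances.values())  # 3.0
margin = 1.05                                       # "slightly more than" twice
min_samples_per_hour = 2 * fastest_cycles_per_hour * margin

print(f"Sample at least {min_samples_per_hour:.1f} times an hour")  # 6.3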

As noted, the answer comes from the Nyquist–Shannon sampling theorem, which states:

If a function x(t) contains no frequencies higher than B hertz, it is completely determined by giving its ordinates at a series of points spaced 1/(2B) seconds apart.
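A toy demonstration of the theorem, assuming NumPy is to hand: a 3Hz sine wave sampled slightly faster than 2B = 6Hz can be rebuilt almost exactly by Whittaker–Shannon (sinc) interpolation, while sampling below that rate produces an alias.

import numpy as np

B = 3.0                               # highest frequency in the signal, hertz
signal = lambda t: np.sin(2 * np.pi * B * t)

def reconstruct(t, samples, T):
    # Whittaker-Shannon interpolation from samples taken every T seconds
    n = np.arange(len(samples))
    return np.array([np.sum(samples * np.sinc((ti - n * T) / T)) for ti in t])

fs = 2 * B * 1.1                      # sample slightly faster than 2B hertz
T = 1 / fs
samples = signal(np.arange(200) * T)  # roughly 30 seconds of samples

t = np.linspace(14.0, 16.0, 50)       # test points well inside the window
error = np.max(np.abs(signal(t) - reconstruct(t, samples, T)))
print(f"Max reconstruction error at {fs:.1f} Hz: {error:.4f}")  # small

# Sample at only 4 Hz (below 2B) and the 3 Hz tone is indistinguishable
# from a 1 Hz alias: sin(2*pi*3*n/4) == -sin(2*pi*1*n/4) at every sample.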
