Your mind's '.brain' jpeg-like picture file format probed

Cunning noggin algorithm favours curves over flatness

Top boffins say they have gained insights into one of the most amazing capabilities of the human brain – its ability to store and recognise visual images. They say that the noggin compresses pictures in a process roughly as efficient in terms of storage space as turning photos into jpegs – but one much superior for the purpose of using the data subsequently.

“Computers can beat us at math and chess,” says neuroscientist Ed Connor, “but they can’t match our ability to distinguish, recognize, understand, remember, and manipulate the objects that make up our world. For now, at least, the ‘.brain’ format seems to be the best compression algorithm around.”

Human eyes produce imagery at roughly megapixel resolution, but the brain doesn't have enough storage to cope with much of this, so it needs to turn the info into '.brain' files that it can work with later.

Connor and his colleagues' research involved creating a computer simulation of the cells in the "V4" area of primate brains such as yours and mine. It seems that V4 cells' response to images from the eye is strongly sensitive to the amount of curve or angle in objects perceived. According to a statement accompanying the research:

High-curvature regions are relatively rare in natural objects, compared to flat and shallow curvature. Responding to rare features rather than common features is automatically economical.

According to the braincell sims, compressing out flat, shallow parts of pictures and keeping the curved or pointy bits can yield "an 8-fold decrease in the number of cells responding to each image, comparable to the file size reduction achieved by compressing photographs into the .jpeg format".
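For the curious, the gist of the scheme can be sketched in a few lines of Python. This is a toy illustration of the idea, not the researchers' actual model: walk along a 2-D contour, measure how sharply it bends at each vertex, and keep only the bendy bits while binning the flat runs. The threshold value and the example contour are our own inventions for demonstration.

```python
import numpy as np

def turn_angles(points):
    """Exterior turn angle (radians) at each interior vertex of a 2-D polyline."""
    v = np.diff(points, axis=0)            # edge vectors between successive points
    headings = np.arctan2(v[:, 1], v[:, 0])  # direction of each edge
    return np.abs(np.diff(headings))       # change of direction at each vertex

def sparse_code(points, thresh=0.2):
    """Keep only the vertices where the contour bends sharply (plus endpoints)."""
    angles = turn_angles(points)
    keep = np.concatenate(([True], angles > thresh, [True]))
    return points[keep]

# Toy contour: a long flat edge that turns a sharp 90-degree corner.
t = np.linspace(0, 1, 100)
flat = np.stack([t, np.zeros_like(t)], axis=1)      # straight run along the x-axis
corner = np.stack([np.ones_like(t), t], axis=1)     # straight run up the y-axis
contour = np.concatenate([flat, corner])

coded = sparse_code(contour)
print(f"{len(contour)} points -> {len(coded)} points kept")
```

On this contrived shape nearly everything is flat, so almost all of it compresses away and only the corner and endpoints survive; real natural objects would give a more modest saving, in the ballpark of the 8-fold figure quoted above.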

But such .brain pictures, stripped of much of the information on flat bits, are still good for subsequent recognition of objects, faces etc.

“Psychological experiments have shown that subjects can still recognize line drawings of objects when flat edges are erased. But erasing angles and other regions of high curvature makes recognition difficult,” says Connor.

The research is outlined in a paper titled A Sparse Object Coding Scheme in Area V4 which was published yesterday in the journal Current Biology. ®
