“Hi, I’m your data; can you hear me now?”

FROM PHYSICSTODAY.COM – see article here: http://www.eetimes.com/electronics-blogs/other/4373153/-Hi–I-m-your-data–can-you-hear-me-now-

I just saw a fascinating story in a recent issue of Physics Today on “listening” to your data, literally; see “Shhhh. Listen to the data.” The article showed how making real-world data audible lets the ear and brain sense patterns, extract features, and catch occurrences that conventional data-analysis packages might otherwise miss.
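
As a rough illustration of the idea (a minimal sketch of my own, not code from the Physics Today piece), the plain-Python snippet below maps each point of a data series to a short tone whose pitch tracks the value, then writes the result to a WAV file you can play back. A spike that might be easy to overlook on a plot becomes a sudden, unmistakable jump in pitch.

    import math
    import struct
    import wave

    SAMPLE_RATE = 44100           # audio samples per second
    TONE_SECONDS = 0.05           # length of the tone for each data point
    F_LOW, F_HIGH = 220.0, 880.0  # pitch range the data values are mapped onto

    def sonify(data, filename="data_tones.wav"):
        """Write a WAV file in which each data point becomes a short tone
        whose pitch is proportional to the point's value."""
        lo, hi = min(data), max(data)
        span = (hi - lo) or 1.0
        frames = bytearray()
        for value in data:
            # Linearly map the value onto the F_LOW..F_HIGH pitch range.
            freq = F_LOW + (value - lo) / span * (F_HIGH - F_LOW)
            for i in range(int(SAMPLE_RATE * TONE_SECONDS)):
                sample = 0.5 * math.sin(2.0 * math.pi * freq * i / SAMPLE_RATE)
                frames += struct.pack("<h", int(sample * 32767))
        with wave.open(filename, "wb") as wav:
            wav.setnchannels(1)           # mono
            wav.setsampwidth(2)           # 16-bit samples
            wav.setframerate(SAMPLE_RATE)
            wav.writeframes(bytes(frames))

    if __name__ == "__main__":
        # A smooth sine-like series with one spike; the spike is easy to hear
        # as an abrupt jump in pitch when the file is played back.
        series = [math.sin(i / 8.0) for i in range(200)]
        series[150] += 2.0
        sonify(series)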

This may seem a counterintuitive throwback to quaint, ancient methods in our software-intensive world, but the reality is that the brain can extract things that even our most impressive computers and algorithms cannot, or can do so only with significant computing power. The brain is also good at dealing with the unexpected, while even the best data-analysis package can find only what it has been “programmed” to expect.

A few years ago, I spoke to some people doing software for the DARPA autonomous-vehicle road race, and asked them about the biggest challenges they faced. The answer was pretty quick and unambiguous: having the vehicle “see” where the actual road was, and not be misled by trees, signs, fences, obstacles, distractions, road irregularities, and the almost countless other realities of what the vehicle’s cameras could see. Many lines of code and corresponding MIPS were dedicated to image recognition and feature extraction, they added.
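
To give a flavor of what that image-processing workload looks like (a toy sketch of my own, not anything from the DARPA teams), the snippet below runs a classic Sobel edge filter over a synthetic grayscale scene in which a brighter band stands in for the road. Even this trivial feature-extraction step touches every pixel several times per frame, and a real perception pipeline stacks many such stages on top of it.

    def sobel_magnitude(img):
        """Return the gradient-magnitude image of a 2-D list of pixel intensities."""
        h, w = len(img), len(img[0])
        out = [[0.0] * w for _ in range(h)]
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                # Horizontal and vertical Sobel responses at (x, y).
                gx = (img[y-1][x+1] + 2*img[y][x+1] + img[y+1][x+1]
                      - img[y-1][x-1] - 2*img[y][x-1] - img[y+1][x-1])
                gy = (img[y+1][x-1] + 2*img[y+1][x] + img[y+1][x+1]
                      - img[y-1][x-1] - 2*img[y-1][x] - img[y-1][x+1])
                out[y][x] = (gx * gx + gy * gy) ** 0.5
        return out

    if __name__ == "__main__":
        # Synthetic 64 x 64 "scene": dark background with a brighter vertical
        # band standing in for the road surface.
        width = height = 64
        scene = [[200 if 24 <= x <= 40 else 40 for x in range(width)]
                 for y in range(height)]
        edges = sobel_magnitude(scene)
        # The strongest responses sit on the band's left and right boundaries.
        best_value, best_col = max((v, x) for row in edges for x, v in enumerate(row))
        print("strongest edge response at column", best_col)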

The irony is that seeing and then knowing where the road is turns out to be pretty easy for almost anyone, even those with poor actual driving skills. Yet the brain is not executing millions of lines of code, nor doing MFLOPS of processing to figure it out. Whether using audible, visual, or other senses, the brain is amazingly good at determining patterns and anomalies. And don’t kid yourself: we have almost no idea how the brain does this, despite what the neuroresearchers would like you to think.

Experienced engineers use all their senses when designing, assessing what’s going on, and finding out what’s not going as expected. Good design and debugging combine formal tools with human ones: sight, sound, feel, and yes, smell. The best debugging methods I have seen and used are also the oldest: look, listen, expect the unexpected, and then stop and think before jumping to the next step.

Have you ever used the informal tools of the human senses, individually or in combination, to find the source of your problems or to assess your designs? Did you do this intentionally or accidentally?

 

