Tuesday, March 12, 2013

Finding the visible in the invisible

I don't really intend this to be a general blog, but here is something I found very interesting, and it leaped right out at me as an opportunity for programming.

http://www.nytimes.com/video/2013/02/27/science/100000002087758/finding-the-visible-in-the-invisible.html

So much of what we do is based on subtle facial twitches and heart rates. With this stuff you could codify it visually, with some lag time (i.e., the motion amplification is a post-processing step). But let's say they could do this in ALMOST real time, and you had the visual part in Google Glass, with, say, a 1-2 second lag. You could SEE someone getting heated when you ask them a question, even if they are not "saying" they are angry or mad. Optimistically you could give the hippie answer and say "this will let us be more socially adept," but really people would probably just use it to see whether their responses are landing well when trying to pick up the ladies.
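The "post processing" part above boils down to watching tiny periodic changes in skin color across video frames and pulling out the dominant frequency, which is the pulse. Here is a minimal sketch of that idea in Python, assuming you have already reduced each frame to a single mean skin-pixel brightness value; the function name, sample trace, and band limits are my own illustration, not anything from the video.

```python
import numpy as np

def estimate_heart_rate(signal, fps, lo=0.7, hi=4.0):
    """Estimate heart rate (bpm) from a trace of mean skin-pixel brightness.

    Detrends the signal, takes its FFT, and returns the dominant frequency
    inside a plausible human pulse band (lo..hi Hz), converted to bpm.
    """
    sig = np.asarray(signal, dtype=float)
    sig = sig - sig.mean()                       # drop the DC baseline
    freqs = np.fft.rfftfreq(len(sig), d=1.0 / fps)
    spectrum = np.abs(np.fft.rfft(sig))
    band = (freqs >= lo) & (freqs <= hi)         # ~42-240 bpm is plausible
    peak = freqs[band][np.argmax(spectrum[band])]
    return peak * 60.0                           # Hz -> beats per minute

if __name__ == "__main__":
    # Synthetic stand-in for a 10-second, 30 fps face video: steady skin
    # brightness plus a tiny 1.2 Hz (72 bpm) pulse wobble and sensor noise.
    fps = 30.0
    t = np.arange(0, 10, 1.0 / fps)
    rng = np.random.default_rng(0)
    trace = 128 + 0.5 * np.sin(2 * np.pi * 1.2 * t) + rng.normal(0, 0.1, t.size)
    print(round(estimate_heart_rate(trace, fps)))
```

With a 10-second window the frequency resolution is 0.1 Hz (6 bpm), which is part of why real-time versions of this need a second or two of lag: the buffer has to be long enough for the FFT to resolve the pulse.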

Officers could probably tell IMMEDIATELY if you were drunk. If they used this tech that way (i.e., no outward signs of drunkenness, but flushed skin, irregular breathing, or something), would that be enough to allow them to search you (4th Amendment?)? Would these readings be admissible to jury members in a court situation? Could software be calibrated to watch subtle eye tics and tell if someone is lying in real time? How about the ability to test for attraction (i.e., when a person glances at you, does their heart rate increase, do their eyes widen)? What a cheat sheet. Hell, if I were going to make a social interaction program with it, I would just call the program Cheat Sheet.
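Most of the "cheat sheet" readouts above reduce to the same comparison: did a vital sign jump relative to the person's resting baseline? A toy sketch of that flag, with an assumed 10% threshold I made up purely for illustration:

```python
def pulse_jump(baseline_bpm, glance_bpm, threshold=0.10):
    """Flag a reaction if heart rate rose more than `threshold`
    (as a fraction) over the resting baseline. Illustrative only --
    the 10% cutoff is an arbitrary assumption, not calibrated."""
    return (glance_bpm - baseline_bpm) / baseline_bpm > threshold

# A resting 70 bpm jumping to 80 bpm on a glance is a ~14% rise.
print(pulse_jump(70, 80))
```

The hard part isn't this arithmetic, of course; it's getting a trustworthy baseline and readings from video in the first place, and all the legal questions above about what you'd be allowed to do with the answer.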
