As a graduate student, I took a seminar with “Tom Furness”:http://www.hitl.washington.edu/people/tfurness/. Before he came to Seattle to run the “HITLab”:http://www.hitl.washington.edu/, Furness designed advanced cockpit systems for the Air Force. One of the greatest dangers to pilots was something pretty dumb: forgetting to put the gear down before landing. In a very busy cockpit, it was essential that such an error get (to use a computing analogy) a higher interrupt priority. The warning had to rise above the cacophony and flashing lights of an already crowded informational area. After trying a wide variety of approaches, they found the one that would work.
Engineers recorded a message from a test pilot’s daughter, with the sound coming as a whisper from behind the left ear: “Daddy, put the gear down or you will crash.” When Tom tells the story, it gives you shudders to think of it, and that was exactly the intent.
I was suddenly reminded of this while reading a story on automated “mentors” for soldiers:
Akin to the promptings that newscasters get from their producers via earplugs during a broadcast, mentoring (as the technique is called) has already proved valuable to military teams in achieving mission goals. But instead of a human mentor, the Sandia group is looking to devise a software mentor that could offer advice like “take a deep breath and relax your upper body” or “pay attention to what Private Smith is about to say, his excitement level indicates it could be important.”
The researchers recently used a neural network to learn the signatures of abstract desirable traits like “leadership,” as well as warning signs such as “nervous,” “afraid” or “daydreaming,” by analyzing the pulse, respiration, perspiration, facial expressions, head movements and other biometric data streams coming from sensors attached to group members. Goals could be achieved in a stressful virtual environment only if the group cooperated effectively. By perfecting their approach in virtual reality, the scientists hope to someday enable automatic advice and counsel from a virtual mentor.
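The Sandia researchers use a neural network over many simultaneous sensor streams; as a much simpler stand-in, the basic idea of learning a state label from biometric features can be sketched like this. Everything here is invented for illustration: the features (pulse, respiration rate, skin conductance), their ranges, and the use of plain logistic regression instead of their actual network.

```python
# Hypothetical sketch: classifying a "nervous" vs. "calm" state from
# three made-up biometric features (pulse, respiration rate, skin
# conductance). A minimal logistic-regression stand-in, not the
# Sandia group's actual model or data.
import math
import random

random.seed(0)

def make_sample(nervous):
    # Invented feature means: nervous subjects run higher on all three.
    base = (95.0, 22.0, 8.0) if nervous else (65.0, 14.0, 3.0)
    return [b + random.gauss(0, 2) for b in base], (1 if nervous else 0)

data = [make_sample(i % 2 == 0) for i in range(200)]

# Center the features so gradient descent behaves.
means = [sum(x[j] for x, _ in data) / len(data) for j in range(3)]
def center(x):
    return [xi - m for xi, m in zip(x, means)]

w, b = [0.0, 0.0, 0.0], 0.0
lr = 0.1
for _ in range(100):
    for x, y in data:
        z = sum(wi * xi for wi, xi in zip(w, center(x))) + b
        z = max(-30.0, min(30.0, z))          # clamp to avoid overflow
        p = 1 / (1 + math.exp(-z))            # sigmoid probability
        err = p - y
        w = [wi - lr * err * xi for wi, xi in zip(w, center(x))]
        b -= lr * err

def predict(x):
    z = sum(wi * xi for wi, xi in zip(w, center(x))) + b
    return 1 if z > 0 else 0

accuracy = sum(predict(x) == y for x, y in data) / len(data)
print(f"training accuracy: {accuracy:.2f}")
```

On this cleanly separated synthetic data the classifier is trivially accurate; the hard part of the real system is that the signatures of states like “daydreaming” are subtle, overlapping, and different for every person.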
This is one of those technologies that is disturbing, and for me, disturbing is interesting. It’s disturbing in the same ways that subdermal technologies are, and for some of the same reasons. It is technology that is extraordinarily intimate. It “knows” things that only other people can know, and that sometimes neither they _nor_ you yourself know. It’s creepy for a machine to know when you are _subconsciously_ affecting your body in some way, and it is creepy-squared when that is then reported to you by a simulated social entity.
But it is at the same time, for me, endlessly exciting. People throw around McLuhanian ideas of extending our nervous system, but these guys plan “to add a 128-channel electroencephalogram to correlate brain events with social interactions,” along with an electromyograph, electrocardiogram, and a bunch of other stuff. Where do I sign on?
If this is enough to pique your interest, definitely check out the video (“mpg”:http://www.sandia.gov/ACG/videos/Mentor%20Movie.mpg). It looks like what they are doing is recording biometrics of a team in a kind of virtual reality scenario (looks like “counterstrike”:http://www.counter-strike.net/ to me), and then predicting each person’s state.
Best quote? Dave Warner (who, perhaps not coincidentally, also spent some time at the HITLab):
This is like an exosomatic evolution, where the biology of humanness is actually creating other structures to support human-like activities. The computers are a step in evolution in the sense that biology is putting information together outside of itself that ultimately becomes part of the system in which biology is connected to. It’s a cybernetic loop. So this is the first phase in, probably, what will be a long series of evolutionary steps to… make the world a better place.
Great to know there are people seriously pushing the boundaries here. Exciting (though, of course, in some ways concerning) stuff.