Wednesday, January 25, 2012

Joining the Mozilla BrowserID Initiative

Have you ever had trouble remembering the login to one of your dozen or so online accounts?  Or have you ever had confusion over which user is logged into a given webmail or social network on a shared computer?

Mozilla is rolling out a new browser-based initiative that permits storing all the IDs associated with your online accounts behind one single login identity.  You can think of it as a single key to your online identities instead of a keychain of dozens of unique keys.  BrowserID should also be more secure than the complex conventional manner of site login management, because you will be able to centrally control all the accounts associated with your personal online content.

Over the coming months I will have the privilege of working with Mozilla's engineers to bring this platform to a broader audience through partnerships with leading browser, email and social network providers across the web.

BrowserID will enable you to toggle between professional and personal online accounts with ease, and it will make it much simpler for families to share computers without the confusion that arises when different signed-in users leave different browser cookies across multiple web properties.  Like a BMW that recognizes its driver on approach, Firefox with BrowserID enabled will be able to effortlessly escort you to your frequented websites without the typical tangle of cryptic logins.

From the company that popularized browser-based advertisement controls and tabbed browsing, a new advancement in web technology is about to unfold.  Web surfing is about to get faster and friendlier.

Stay tuned as we roll out more exciting features of this product to a browser or website near you.

Wednesday, January 4, 2012

EEG hats for everyone

NeuroSky Chip Toy
There are a few interesting companies developing "Brain Computer Interfaces" for toys and digital devices.  These devices read electrical fields above your scalp that indicate activity happening inside your skull.  Though the devices can't capture thoughts, they can signal which regions of your brain are active at any given moment.  What this means is that the skin of your head can be used in lieu of hand gestures, replacing a keyboard, mouse or joystick input.

Developers should care about this because two of these companies are seeking our help, inviting us to code applications that leverage their consumer headsets.  My colleagues and I have been testing the different tools to see if they can socket into mobile apps for use in stress management or mobile gaming.  Similar tools are already in use in the medical field for people who lack the ability to use conventional computer interfaces.  The question is whether these tools will ever supersede the hand (keyboard/mouse/gesture) or the tongue (Siri/DragonDictate/GoogleVoiceSearch) for interfacing with computers.

Force Trainer - Uncle Milton
NeuroSky is the price performer in consumer electronics so far ($40-$70).  Its chips have been mass-produced with Mattel and Uncle Milton Toys to bring devices to US households with a very basic single command that comes from the cerebral cortex and (perhaps) the frontal lobe.  The significant benefit of NeuroSky's chips and sensors is that they are dry-contact, not requiring the wet or gel contacts used in medical-grade EEG.  The left forehead contact is the point that is supposed to affect the toy, elevating a ping-pong ball when the mind is still and concentrating.

NeuroSky MindFlex Toy
NeuroSky claims to be coming out with a new headset similar to the toys released by Uncle Milton (above) and Mattel (left), but one that will interface with your mobile applications instead of the hard-wired, hard-coded toys previously released.  They are hosting meetups in Silicon Valley to work with developers on the first round of apps that will interface with these headsets, so stay tuned on that front.  One thing they do tell us is that there will still be only the on/off command structure of the frontal-lobe input, so don't expect right/left control or the complex motor interpretation a spatial game would need.

A very attractive aspect of NeuroSky products is that they are already Bluetooth-based, so the player doesn't need to worry about wires.  This can give the user the illusion of some kind of telepathy, which the presence of wires might diminish.  Also, the dry-contact sensors overcome the consumer adoption hurdle that medical-grade EEG gel contacts would encounter.

Our experience with the NeuroSky chip is that it is very easy to set up but challenging to manipulate.  The process of control is to occupy the mind with a focused thought for a steady period, so that the contacts in the headset sense a continual, steady signal.  High mental activity causes the ping-pong ball to drop and stay put; allegedly, a still but focused mind produces the steady signal necessary to complete the circuit and turn the lights and fan on.
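
To make the control scheme concrete, here is a minimal sketch of the on/off logic described above.  It assumes (my assumption, not NeuroSky's documentation) that the headset streams a noisy 0-100 "focus" value; the function names and threshold are invented for illustration.

```python
# Hypothetical sketch: steadying a noisy 0-100 "focus" stream and
# deriving the single on/off command the toys appear to use.
# All names and values here are illustrative, not NeuroSky's API.

def smooth(readings, window=3):
    """Moving average over the last few readings to steady the signal."""
    out = []
    for i in range(len(readings)):
        chunk = readings[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def fan_on(focus, threshold=60):
    """The one available command: sustained focus above a threshold."""
    return focus >= threshold

# A distracted stretch followed by sustained concentration.
stream = [20, 25, 30, 70, 75, 80, 85]
states = [fan_on(f) for f in smooth(stream)]
```

The smoothing step reflects why the toys reward a *steady* mind: a single spike of focus is averaged away, and only sustained concentration flips the output on.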

Epoc Headset and USB
The more versatile consumer headset is the Epoc from Emotiv ($299).  This one has 16 sensors across the top of the scalp, so it is able to pick up points that can reflect complex motor thoughts such as right/left/forward/backward.  In addition, Emotiv claims to capture some emotive states and even facial expressions.
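
One plausible way a multi-sensor headset can support directional commands, sketched below, is nearest-signature matching: compare a live frame of channel readings against per-command signatures recorded during training.  This is my own illustration, not Emotiv's actual algorithm, and the channel counts and values are invented.

```python
# Hypothetical sketch of mapping a frame of multi-channel readings to a
# trained motor command via nearest-signature matching.  Signatures and
# channel values are invented; this is not Emotiv's implementation.

import math

# Per-command "signatures": one average reading per channel
# (4 channels shown for brevity; the Epoc has many more).
SIGNATURES = {
    "left":    [0.9, 0.1, 0.2, 0.1],
    "right":   [0.1, 0.9, 0.1, 0.2],
    "forward": [0.2, 0.1, 0.9, 0.1],
}

def distance(a, b):
    """Euclidean distance between two channel-reading vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(frame):
    """Return the trained command whose signature is nearest this frame."""
    return min(SIGNATURES, key=lambda cmd: distance(frame, SIGNATURES[cmd]))
```

With more sensors, the signature vectors are longer and the commands become easier to tell apart, which is why a 16-sensor headset can attempt right/left/forward/backward where a single forehead contact cannot.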

The advantage of the Epoc is that it is already open to developers, interfacing to your computer through a USB key.  They are soliciting all developers to start filling out their proprietary app store (iTunes model) for games and tools that other consumers will be able to use with their own Epoc headsets in the future.  So we have the opportunity to start coding for a headset that is already pretty close to medical grade.

Currently, developers need to code their apps in Windows for the PC platform only.  In the future, we may be able to use this tool on Apple and mobile operating systems as well, but there have been no promises on this from Emotiv.

Epoc Touchpoint Scan
The disadvantage of the Epoc is that you do need wet contacts on the scalp in order to pick up the electrical signals.  It's unlikely that consumers will be willing to re-apply the saline solution before each sitting of their EEG games, but this is what we have to work with at this point.  The training process for the Epoc is a very gradual pairing of discoverable contact-point combinations with the specific output commands the user wants to exert on the Epoc-compatible game or tool.
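
The gradual pairing described above can be sketched as averaging repeated samples of the same thought into a stored signature that the game later matches against live readings.  Again, this is an assumption about how such training could work, not Emotiv's documented method; all names and numbers are illustrative.

```python
# Hypothetical sketch of the training loop: several recorded frames of
# the same intended command are averaged into one per-channel signature.
# Values are invented for illustration.

def train_signature(samples):
    """Average recorded channel frames into a single signature vector."""
    n_channels = len(samples[0])
    return [sum(frame[ch] for frame in samples) / len(samples)
            for ch in range(n_channels)]

# Three noisy attempts at the same "push" thought (3 channels shown).
push_samples = [
    [0.8, 0.1, 0.3],
    [0.9, 0.2, 0.1],
    [0.7, 0.3, 0.2],
]
signature = train_signature(push_samples)
```

Averaging over many attempts is why the training feels gradual: each repetition nudges the stored signature toward the user's actual pattern for that thought.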

There isn't much point in consumers purchasing the Epoc yet, because the developer community hasn't produced a broad range of tools or games for exploration.

NIA Headset & CPU
I would like to give an honorable mention to the NIA (Neural Impulse Actuator) headset from OCZ.  It's "honorable" only because it's no longer in the race, as OCZ has discontinued manufacture of the product.  However, they were able to develop quite sophisticated software for the PC interface, produce a small CPU to read and interpret the input signals, and manufacture the headset for under $100. The disadvantage of the NIA was that it read only three point contacts across the forehead, and would also pick up electrical signals generated by the muscular motion of the eyebrows.

I really appreciated the ambitious scope of the NIA software, which captured right/left commands that the user could map to actual USB and keyboard keystrokes used in game control.  If it didn't need to be calibrated in Windows, this would in theory enable EEG control for the Xbox, PlayStation, or any other device that accepts the market-standard, platform-agnostic USB input.
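
The NIA-style mapping idea can be sketched as a simple binding table from detected neural "events" to ordinary keystrokes, so any game just sees standard input.  The event labels and key names below are invented; a real build would inject keys through a platform-appropriate input library.

```python
# Hedged sketch of mapping detected events to keystrokes, in the spirit
# of the NIA's user-configurable bindings.  Labels and keys are
# illustrative; real input injection is platform-specific.

KEY_BINDINGS = {
    "glance_left":  "a",      # strafe left
    "glance_right": "d",      # strafe right
    "tense":        "space",  # jump / fire
}

def events_to_keys(events):
    """Translate a stream of detected events into keystrokes to send,
    dropping anything the user hasn't bound to a key."""
    return [KEY_BINDINGS[e] for e in events if e in KEY_BINDINGS]
```

Because the output is plain keystrokes, the game on the other end needs no knowledge of the headset at all, which is exactly what made the NIA's approach so portable in principle.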

NIA direct USB Input
A common critique of brain-computer interface products is that they are complex to start using.  I'd have to say that the NeuroSky products are the easiest to use out of the box.  (Both the MindFlex and Force Trainer were up and running in less than a minute after battery installation.)  The Epoc and NIA take multiple steps to set up and quite a long process to calibrate to the user.

All these devices require a learning curve as the user gets familiar with the idea of sending signals to a machine from a part of the body that tends to be largely passive.  In the distant future, learning to interface with a new device through the skin of the scalp may take no longer than one's first interaction with a touch screen.  But for now, watching users wince and squint as they try to flex their brains with these devices shows how foreign the concept remains to mainstream consumers.