|NeuroSky Chip Toy|
Developers should care about this because two of these companies are seeking our help: they are inviting us to write applications that leverage their consumer headsets. My colleagues and I have been testing the different tools to see whether they can plug into mobile apps for stress management or mobile gaming. Tools like these are already used in the medical field by people who cannot operate conventional computer interfaces. The question is whether they will ever supersede the hand (keyboard/mouse/gesture) or the tongue (Siri/DragonDictate/GoogleVoiceSearch) for interfacing with computers.
|Force Trainer - Uncle Milton|
|NeuroSky MindFlex Toy|
Our experience with the NeuroSky chip is that it is very easy to set up but challenging to control. Control works by occupying the mind with a single focused thought for a sustained period, so that the contacts in the headset read a continual, steady signal. High, scattered mental activity causes the ping-pong ball to drop and sit still; allegedly, a calm but focused mind produces the steady signal needed to complete the circuit and switch on the lights and fan.
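To make the "steady signal" idea concrete: NeuroSky hardware streams its computed eSense values (attention, meditation, 0-100) over a serial link in small framed packets. The sketch below parses attention values from such a byte stream, assuming the commonly documented ThinkGear-style layout (double 0xAA sync, payload length, code/value pairs, one's-complement checksum); treat the byte layout here as an assumption rather than an official spec.

```python
def parse_attention(stream):
    """Extract eSense 'attention' values (0-100) from a ThinkGear-style
    byte stream. Assumed packet layout: 0xAA 0xAA sync, one payload-length
    byte, payload of code/value pairs, then a one's-complement checksum."""
    values = []
    i = 0
    while i + 3 < len(stream):
        if stream[i] != 0xAA or stream[i + 1] != 0xAA:
            i += 1                       # resync on the double-0xAA header
            continue
        plen = stream[i + 2]
        if i + 3 + plen >= len(stream):
            break                        # incomplete packet at end of stream
        payload = stream[i + 3:i + 3 + plen]
        if (~sum(payload)) & 0xFF == stream[i + 3 + plen]:
            j = 0
            while j < len(payload) - 1:
                code = payload[j]
                if code == 0x04:         # 0x04 = attention, one value byte
                    values.append(payload[j + 1])
                    j += 2
                elif code < 0x80:        # other single-byte codes (poor signal,
                    j += 2               # meditation, ...)
                else:                    # extended codes carry a length byte
                    j += 2 + payload[j + 1]
        i += 3 + plen + 1
    return values
```

A toy that "completes the circuit" would then be a simple threshold on the parsed values, e.g. turning the fan on while attention stays above some cutoff.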
|Epoc Headset and USB|
The advantage of the Epoc is that it is already open to developers, connecting to your computer through a USB dongle. Emotiv is soliciting developers to start filling its proprietary app store (on the iTunes model) with games and tools that other consumers will eventually be able to use with their own Epoc headsets. So we have the opportunity to start coding for a headset that is already fairly close to medical grade.
Currently, developers must write their apps for Windows on the PC platform only. In the future we may be able to target Apple and mobile operating systems as well, but there have been no promises from Emotiv on this.
|Epoc Touchpoint Scan|
There isn't much point in consumers purchasing the Epoc yet, because the developer community hasn't produced a broad range of tools or games to explore.
|NIA Headset & CPU|
I really appreciated the ambitious scope of the NIA software, which captures right/left commands that the user can map to actual USB and keyboard keystrokes used in game control. If it didn't need to be calibrated in Windows, this would in theory enable EEG control of an Xbox, PlayStation, or any other device that accepts market-standard, platform-agnostic USB input.
|NIA direct USB Input|
All these devices impose a learning curve as the user gets used to sending signals to a machine from a part of the body that is normally passive. Someday, interfacing with a new device through the skin of the scalp may come as quickly as one's first interaction with a touch screen. For now, watching users wince and squint as they try to flex their brains at these devices shows how foreign the concept remains to mainstream consumers.