
Tan Le developed a wearable EEG, and just the other week a team of engineering doctoral students at the University of Florida used it to fly drones in a competitive race, a first step toward a working brain/robot interface. One of Tan Le's more elegant contributions to this telepathic headset was the algorithm that lets it unfold the convolutions of the brain, making it much more accurate and a lot easier for anyone to put on and start training with. This has major implications for everyone from the physically disabled, who will gain previously impossible degrees of self-reliance, to the military research teams trying to create their own personal Gundams.

The major breakthroughs that made EEGs wearable were developed in parallel across a number of research projects around the world in the early part of this century. They became affordable (as little as $500 a pop) in the early part of this decade, and every year since has seen major improvements in their functionality.

Another major player in this field is Ariel Garten, so I had to include some commentary from her. Her system seems to be a lot lower-res than Tan Le's, only able to trigger actionable input from the whole brain state (alpha, theta band activity) rather than from specific mapped neural sites. But they are both on the market with a cheap neural computer interface, as are a number of others, and there is no way to know who might come out with the advancement that pushes us into the future.
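To make that "whole brain state" distinction concrete, here is a minimal sketch of how a band-power trigger like the one described might work: estimate the power in the alpha and theta frequency bands and fire an action when their ratio crosses a threshold. Everything here is an illustrative assumption (the sampling rate, the threshold, the band edges) rather than either company's actual pipeline.

```python
# A minimal sketch of whole-brain-state band-power triggering.
# All names and thresholds are illustrative assumptions, not
# Emotiv's or InteraXon's actual algorithms.
import numpy as np
from scipy.signal import welch

FS = 256  # assumed EEG sampling rate in Hz

def band_power(samples, fs, low, high):
    """Average power in the [low, high] Hz band, via Welch's method."""
    freqs, psd = welch(samples, fs=fs, nperseg=fs * 2)
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].mean()

def relaxed_trigger(samples, fs=FS, ratio_threshold=1.5):
    """Fire when alpha power dominates theta power -- a crude proxy
    for a calm 'whole brain state'. The threshold is an assumption."""
    alpha = band_power(samples, fs, 8.0, 12.0)  # alpha band: 8-12 Hz
    theta = band_power(samples, fs, 4.0, 8.0)   # theta band: 4-8 Hz
    return (alpha / theta) > ratio_threshold

# Usage with two seconds of synthetic single-channel EEG:
t = np.arange(2 * FS) / FS
signal = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)  # strong 10 Hz "alpha"
print(relaxed_trigger(signal))  # True: alpha/theta ratio crosses the threshold
```

The point of the sketch is the contrast: a trigger like this reads one aggregate signal from the whole scalp, whereas mapping specific neural sites means classifying spatial patterns across many electrodes, which is a much harder problem.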