Tan Le developed a wearable EEG, and the other week a team of engineering doctoral students at the University of Florida used it to control drones in a competitive race as a first step toward developing a brain/robot interface. One of Tan Le’s more elegant contributions to this telepathic headset was the algorithm that lets it unfold the convolutions of the brain, making it much more accurate and a lot easier for anyone to put on and start training with. This has major implications for everyone from the physically disabled, who will gain previously impossible degrees of self-reliance, to the military research teams trying to create their own personal Gundams. The major breakthroughs that made EEGs wearable were developed in parallel in a number of different research projects around the world during the early part of this century; they became affordable (as little as $500 a pop) in the early part of this decade, and every year since has seen major improvements in their functionality. Another major player in this field is Ariel Garten, so I had to include some commentary from her. It seems like her system is a lot lower resolution than Tan Le’s, only able to trigger actionable input from the whole brain state (Alpha, Theta) rather than from specific mapped neural sites. But they are both on the market with a cheap neural-computer interface, as are a number of others, and there is no way to know who might come out with the advancement that pushes us into the future.
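To make that resolution difference concrete, the whole-brain-state style of input boils down to watching band power. The sketch below is a minimal illustration of the idea, assuming a single raw EEG channel sampled at 256 Hz and a made-up threshold; it is not based on Emotiv’s or Muse’s actual SDKs, just the general technique.

```python
# Illustrative sketch of band-power triggering (the "whole brain state" style
# of input). Assumes one EEG channel sampled at FS Hz; not any vendor's SDK.
import numpy as np
from scipy.signal import welch

FS = 256  # assumed sampling rate in Hz

def band_power(signal, fs, low, high):
    """Relative power in a frequency band: PSD (Welch's method) summed over the band."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].sum()

def trigger(signal, fs=FS, threshold=2.0):
    """Fire when alpha power dominates theta power: a crude 'relaxed focus' gate."""
    alpha = band_power(signal, fs, 8.0, 12.0)
    theta = band_power(signal, fs, 4.0, 8.0)
    return (alpha / theta) > threshold

# Example: two seconds of fake data standing in for a headset stream.
fake_eeg = np.random.randn(FS * 2)
print("trigger fired:", trigger(fake_eeg))
```

A higher-resolution, site-mapped system would instead look at patterns across individual electrodes, which is what allows many distinct trained commands rather than a single on/off state.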

Using a combination of solid and liquid printing, MIT printed the first ever 3D-printed, hydraulically powered robot. No assembly was required beyond popping on a motor and a battery, which means our Evil Robot Overlords will now be able to print up their minions themselves. The advance that made this possible was a technique for printing both solids and liquids in the same printer, and I find it somewhat surprising that they got the best results for the liquid printing using a regular inkjet printer.

One of the most exciting fields of engineering is the application of existing principles in totally new ways to solve long-standing human-centric problems, and there have been recent breakthroughs on two such problems that will extend help to millions of people.

The GyroGlove is just what it sounds like: a glove with one or more small but powerful gyroscopes attached to it that help steady the hands of people with Parkinson’s and other degenerative neural disorders. Being able to shave without slicing your own throat/femoral artery, or to eat soup without splattering half of the bowl across the table, is a given for most folks; but for those who suffer from the trembling such a condition induces, a steadier hand makes all the difference in regaining a life with dignity and control.
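The steadying effect is ordinary gyroscopic physics: the faster the rotor spins, the harder the hand is to tilt. Here is a back-of-the-envelope version, with purely illustrative numbers rather than anything published about the GyroGlove itself.

```latex
% Gyroscopic resistance to a tremor; the numbers are illustrative assumptions,
% not GyroGlove specifications.
\[
  \mathbf{L} = I\,\boldsymbol{\omega}, \qquad
  \boldsymbol{\tau} = \boldsymbol{\Omega} \times \mathbf{L}
\]
% Example: a rotor with I = 10^{-5} kg m^2 spun at omega = 2000 rad/s carries
% angular momentum L = 0.02 kg m^2/s. A tremor tilting the hand at
% Omega = 1 rad/s is then met with a counter-torque of roughly
% |tau| ~ 0.02 N m, so the hand feels steadied rather than shaken.
```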

For the visually impaired, electronic communication has meant a telephone so you could talk to people, or a single line of Braille (tiny motors driving very tiny vertical rods, slaved to your internet connection) that you would have to read and remember until the next line was built up on the interface, and maybe the next, until you finally held the entire sentence in your mind. The advances in text-to-speech have been tremendous in the last decade, and that has helped, but there is finally a cost-effective potential solution for a Braille tablet. Using microfluidics rather than motors opens up a whole new class of electronic Braille outputs, making it possible to offer a complete screen’s worth of text rather than a single line, and at a fraction of the price of previously available outputs.
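Whether the dots are raised by motors or by microfluidic channels, the data the display consumes is the same: a grid of six-dot cells. Here is a minimal sketch of the single-line versus full-page difference, assuming a hypothetical 40-cell by 25-line page and a small excerpt of the uncontracted Braille letter map; it is not any real product’s specification.

```python
# Minimal sketch of the data a refreshable Braille display consumes.
# The letter-to-dot map is a small excerpt of Grade 1 (uncontracted) Braille;
# the 40x25 page geometry is a hypothetical example, not a real device spec.
BRAILLE_DOTS = {
    "a": (1,), "b": (1, 2), "c": (1, 4), "d": (1, 4, 5), "e": (1, 5),
    "h": (1, 2, 5), "l": (1, 2, 3), "o": (1, 3, 5), " ": (),
}

CELLS_PER_LINE = 40   # a single-line display stops here
LINES_PER_PAGE = 25   # a tablet-style display can raise all of these at once

def to_cells(text):
    """Map characters to tuples of raised-dot numbers (1-6)."""
    return [BRAILLE_DOTS.get(ch, ()) for ch in text.lower()]

def page_buffer(text):
    """Split a cell stream into full lines, so the whole page is present at once."""
    cells = to_cells(text)
    return [cells[i:i + CELLS_PER_LINE]
            for i in range(0, len(cells), CELLS_PER_LINE)][:LINES_PER_PAGE]

page = page_buffer("hello " * 30)
print(len(page), "lines raised simultaneously;",
      "a single-line unit would show only", CELLS_PER_LINE, "cells at a time")
```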

These are powerful advances to my mind, offering new help to a lot of people who never had these options before. Even though each solution will only benefit some percentage of its target population, I can’t help but grin at the thought that we continue to push back the barriers that keep us all from advancing.

In Shelf Life Episode 5 we get to learn how astronomers collect baseline data over time and collate it into a meaningful picture of how stellar phenomena change over periods as short as generations. The common belief in scientific circles used to be that stellar events either happened overnight, like supernovas, or took tens of thousands to millions of years to evolve to the next stage. Recently some museums have been compiling images of the night sky taken on photographic plates as far back as the 1890s into a huge database, then processing the results to show how small pieces of sky changed over that 130-year span. What they discovered was that lots of stars fluctuate over a decade or two much more than anyone suspected, rather than remaining unchanged for the lifetimes of civilizations. I can’t wait to find out what new insights we gain with this as a baseline as we process still more collections of data that we were never able to put together before computers made it easy.
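The underlying analysis is straightforward once the plates are digitized: each star becomes a set of (year, magnitude) measurements spanning a century, and you look for variation on decade scales. Here is a minimal sketch of that idea with fabricated numbers; it is not any particular archive’s actual pipeline.

```python
# Toy version of decade-scale variability detection from digitized plate
# photometry. The observations below are fabricated for illustration only.
from collections import defaultdict
from statistics import mean

# (year, apparent magnitude) pairs for one star, as might come off scanned plates
observations = [(1895, 9.1), (1902, 9.3), (1911, 8.7), (1923, 9.6),
                (1934, 9.0), (1948, 8.5), (1961, 9.4), (1977, 9.1),
                (1989, 8.8), (2001, 9.5), (2014, 9.0)]

def decade_means(obs):
    """Average the magnitudes measured within each decade."""
    buckets = defaultdict(list)
    for year, mag in obs:
        buckets[(year // 10) * 10].append(mag)
    return {decade: mean(mags) for decade, mags in sorted(buckets.items())}

means = decade_means(observations)
spread = max(means.values()) - min(means.values())
print("decade-to-decade spread: %.2f magnitudes" % spread)
if spread > 0.3:   # arbitrary threshold for "fluctuates more than expected"
    print("flag as variable on generational timescales")
```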