Acapella Science has put together a great little song giving you the history of exoplanets, from their beginnings back in 1990 to the present, culminating with the TRAPPIST-1 discovery. You have to appreciate him when he says "I'm a harmony addict working on a master's in theoretical physics; what ELSE was I going to make a YouTube channel about?" This is his latest, but far from his only such production; he tries to crank one of these out each week. So I thought I would include a few more, with Entropic Time for the second tune; the final one is CRISPR-Cas9 (Bring Me A Gene). I don't usually include videos with a stinger at the end hyping the person's channel, but this isn't just music, it is science at the same time, and that is a combination worth supporting. The man creating these is Tim Blais, and I hope he keeps making them for a long time to come.

The folks at Blitab have created a tablet with a touch screen to search and select with, but that screen is not the main display. Instead the device translates text from the Web and other digital sources into Braille at about 65 words per display (depending on what you are reading). There have been Braille display devices for a while, but they have been limited to a single line of rigid pins, moved up and down to form each letter, which can only hold about 5 words on average. They also cost thousands of dollars. The Blitab Braille display instead uses layers of fluids and a proprietary membrane they aren't saying much about to form an entire screen's worth of words, and when they finish refining it they plan to put it on sale in the fall for around $500. Reading on it 8 hours a day will give you a battery life of 5 days, a discharge rate I wish my own tablet could match. This has the potential to open up a lot of the online world and its resources to the visually impaired at an affordable price in a way that has never been available to them before. There is a similar project in development at the University of Michigan, but so far it is mostly in the research stage. Thanks to the MIT Technology Review for the heads up on this one.
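To get a feel for what a display like this has to do, here is a minimal sketch of the text-to-Braille step: each character becomes a six-dot cell, which I encode here with the standard Unicode Braille Patterns block (U+2800 onward, where bit N of the offset raises dot N+1). This is just an illustration of the encoding, not anything to do with Blitab's actual (proprietary) firmware, and for brevity it only covers the letters a through j.

```python
# Raised-dot numbers (1-6) for the first ten Braille letters.
# (In full Braille, k-t add dot 3 to these, and u-z build further on them.)
BASE_DOTS = {
    'a': [1], 'b': [1, 2], 'c': [1, 4], 'd': [1, 4, 5], 'e': [1, 5],
    'f': [1, 2, 4], 'g': [1, 2, 4, 5], 'h': [1, 2, 5],
    'i': [2, 4], 'j': [2, 4, 5],
}

def braille_cell(dots):
    """Build one Unicode Braille character from a list of raised dots."""
    offset = 0
    for d in dots:
        offset |= 1 << (d - 1)   # bit 0 is dot 1, bit 1 is dot 2, ...
    return chr(0x2800 + offset)

def to_braille(word):
    """Transliterate a lowercase a-j word into a string of Braille cells."""
    return ''.join(braille_cell(BASE_DOTS[ch]) for ch in word)

print(to_braille("badge"))   # prints the five-cell pattern for "badge"
```

A physical display does the same mapping, but drives pins (or, in Blitab's case, fluid-actuated bumps) instead of printing Unicode glyphs.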

This weekend you can swing by the UK's National Space Centre for an out-of-this-world experience featuring the Science of the Time Lords exhibit. On the 28th and 29th they will be doing a presentation about the science behind the UK's most popular TV franchise, Doctor Who. Each year they set up a fun family weekend where they look at the fact behind the fiction of this iconic program, and this time they are focusing on the core concept of the show: time travel and the ultimate Time Lord vehicle, the TARDIS! The schedule includes workshops, competitions, challenges, talks, exhibitions, and much more. The exhibits I would most like to see in person include the fully realized recreation of the 1978 TARDIS Control Room from the Tom Baker era, and the members of the UK 15th Cyber Legion showing off their costumes and detailing how you can create your own. From my perspective, the only downside to these events is that they will be happening on a different continent from the one I live on. I intend to do my best to attend next year, though!

Picturehouse teamed up with the Science Museum in the UK to give away 10 pairs of tickets to the Robots exhibition. Why is this important enough to mention, even though the odds of my stopping by before it closes are slim to zip? Because I wish I could be there, and if I mention it you might manage to actually make it. This isn't a collection of metal boxes with faces painted on them; it is the 500-year-old story of humanoid robots and what it means to be human. The presentation is set in five different periods and places, with over 100 robots, 15 of which still work today. If you are one of the lucky few who manage to attend this display, I would be grateful if you could take a few pictures and send them my way, so I could post them here. The exhibition will be running from February 8th to September 3rd, 2017, so you have a bit of time to catch it.

The 2016 Nobel Prize for Chemistry went to three researchers who have actually created a range of functioning nanotechnology devices, molecular-scale machines that replicate motors, vehicles, and muscles. Each of the three started out with a single-function tool, added other functions one after another, and ended up with something a thousand times thinner than a human hair that could do real work. I am sure to a lot of people it doesn't seem like something that small could do anything that would make a difference to their lives. What useful thing could you do with a programmable device so small that you would need an electron microscope to see it?

The first thing that comes to my mind is to teach it to recognize malignant cells, load it up with a medicinal payload, and have it deliver that payload only to those cells, leaving their healthy neighbors unharmed. Considering what our current chemotherapy treatment does to the rest of the human body, poisoning the entire thing in the hopes that the cancer cells will die before too many of the healthy cells do, I think this would be a serious improvement. Plus, that is a lot simpler to do than getting it to regrow a missing hand or eye or other body part, so it could be rolled out quickly. I am sure the profit from the cancer cure they could deliver within the next few years would go a long way towards financing the additional research and development needed for the more complex physical repairs to the human body.

Another application would be using them to build things one atom at a time. If you think today's computers are powerful, wait till you see how small a computer can get when constructed using this method. You could build a fully functional Oculus Rift-grade computer, include all the headset functionality, and embed them into your contact lenses. Or you could just use the nanotech to create the much simpler room-temperature superconductors, again depending on the profits of the simpler process to finance the R&D needed to develop the more complex one. Nanotechnology has been one of the Holy Grails of science since Richard Feynman introduced the concept in 1959, and a bit more than half a century later it looks like we are finally getting it to work, at least the early tools. Check out this BBC Story to get the full details.