
When I first saw Aliens, I loved the exoskeleton Ripley wore in her battle with Mama Monster. Some folks in Japan have now built it, and not just as a prototype, but as something you can use in emergencies or on a construction site. It would also form a worthwhile core for a good Giant Mecha suit, getting us one step closer to that reality as well. Thanks to Crunchyroll for the heads up on this one.

This sounds a lot more like a commercial than I would normally share here, but the concept is unique: using a small spherical robot as a real-time marker for your 3D Augmented Reality character to manifest on. This gives you flexibility and mobility for interacting with your environment that weren't previously available. While the usage they are targeting is within a game, the potential applications range far beyond that.
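
Just to make the tracking half of the concept concrete, here is a rough sketch of how a camera feed could pick out a small spherical robot and hand its position to an AR renderer, using OpenCV's Hough circle transform. The camera index, color handling, and detection parameters are my own assumptions for illustration, not details of the actual product.

```python
# Illustrative sketch only: find a small spherical robot in each camera frame
# and report an anchor point and scale an AR renderer could attach a character to.
# Camera index and detection parameters are assumptions, not product details.
import cv2

def find_sphere_anchor(frame):
    """Return (x, y, radius) of a detected circle in the frame, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)  # smooth noise before the Hough transform
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=50,
                               param1=100, param2=40, minRadius=10, maxRadius=120)
    if circles is None:
        return None
    x, y, r = circles[0][0]  # take the first detection
    return int(x), int(y), int(r)

cap = cv2.VideoCapture(0)  # default webcam stands in for the game camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    anchor = find_sphere_anchor(frame)
    if anchor is not None:
        x, y, r = anchor
        # A real AR engine would render the 3D character here; we just mark the spot.
        cv2.circle(frame, (x, y), r, (0, 255, 0), 2)
    cv2.imshow("anchor", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```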

For instance, this could be used as a personal tour guide in a museum, slaved to GPS, a museum map, and an extensive database of facts on each exhibit, along with speech recognition processing; it would be able to answer your every question about any exhibit in great detail. Or, linked to the camera and a library of geometry and trigonometry functions, it could use nearby buildings and moving vehicles to teach various math concepts with literally real-world examples, and again you could query the system to get a full understanding of what you were learning, with your virtual tutor traveling your city or town with you.
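
To give a feel for the math-tutor idea, here is a toy version of the kind of lesson such a system could walk through: estimating a building's height from a known distance and the camera's measured angle of elevation using the tangent function. The helper function and all the numbers are invented purely for illustration.

```python
# Toy version of a "real world" trigonometry lesson the virtual tutor could run:
# estimate a building's height from the distance to it and the camera's measured
# angle of elevation. All names and numbers here are invented for illustration.
import math

def building_height(distance_m, elevation_deg, camera_height_m=1.5):
    """Height of a building from horizontal distance and angle of elevation."""
    rise = distance_m * math.tan(math.radians(elevation_deg))
    return camera_height_m + rise

# Example: standing 40 m from the building, looking up 32 degrees at the roofline.
print(f"Estimated height: {building_height(40, 32):.1f} m")  # roughly 26.5 m
```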

OK, for the outdoor applications you might want to carry a pocketful of the round robots with you, to replace the ones crushed under city bus tires or swept into storm drains by sudden showers as you go along. But those bots are extremely simple, and after another six months of production they ought to become quite cheap as well, making their use in such environments quite cost-effective. Thanks to TechCrunch for the heads up on this one.

The builder of this two-foot-tall (OK, 60 centimeters, but who’s counting?) robot incarnation of Hatsune Miku goes by the name of Rozen Zebet, and he released this video Saturday to show just how good his replicant is. While I personally like the holographic versions they use for the live stage performances, this one is quite tasty, and it definitely shows some of the improvements robotics has made in recent years. Thanks to the Anime News Network for the heads up on this one.

Ready to learn how to run your own brainwave-controlled robot? Yes, I know a real robot would have its own self-contained intelligence system rather than being teleoperated, but this is still pretty cool. This report from DigInfo is about a joint French and Japanese robotics project that could grant freedom undreamed of to paraplegics and other physically challenged folks. Of course, it is also the path leading to the kind of world made popular in the Bruce Willis movie Surrogates, but every advance comes with a potential dark side attached.
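
The control loop in a teleoperation setup like this is conceptually simple once the hard part, classifying the brain signals, is done: each recognized mental state maps to a motion primitive the robot executes. Below is a minimal sketch of that mapping layer; the command labels and the robot interface are stand-ins I made up, since the report doesn't describe the project's actual software.

```python
# Minimal sketch of the command-mapping layer in a brain-controlled teleoperation
# loop. The EEG classifier labels and robot driver are stand-ins; the real
# project's interfaces are not described in the report, so everything here is assumed.
from dataclasses import dataclass

@dataclass
class MotionCommand:
    forward: float  # m/s
    turn: float     # rad/s

# Map whatever labels the brain-signal classifier emits to motion primitives.
COMMAND_MAP = {
    "focus_forward": MotionCommand(0.2, 0.0),
    "focus_left":    MotionCommand(0.0, 0.4),
    "focus_right":   MotionCommand(0.0, -0.4),
    "rest":          MotionCommand(0.0, 0.0),
}

def control_step(classifier_label, robot):
    """Translate one classified brain state into a robot motion command."""
    cmd = COMMAND_MAP.get(classifier_label, COMMAND_MAP["rest"])
    robot.set_velocity(cmd.forward, cmd.turn)  # hypothetical robot API

class FakeRobot:
    def set_velocity(self, forward, turn):
        print(f"robot: forward={forward:+.2f} m/s, turn={turn:+.2f} rad/s")

if __name__ == "__main__":
    robot = FakeRobot()
    for label in ["focus_forward", "focus_left", "rest"]:
        control_step(label, robot)
```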

The DARPA Robotics Challenge is all about building robots robust enough to operate in a human environment and do useful work during emergencies, aiding and supplementing first responders. Whether the emergency is man-made or natural, the robots need to operate human devices, such as doors, stairs, tools, and vehicles, as well as recognize and assess their environments for emergency context. If you have been waiting for your chance to shine as a robotics engineer, this might be your kind of challenge. Tracks A and B have already been selected; Track C, for software control systems, and Track D, for combined hardware and software, are both still open. The initial work should be developed and submitted with the GFE Simulator package, which is the robot simulation software from the Open Source Robotics Foundation. Even if you don’t think your skill set is quite up to entering a competition of this caliber, if you have any interest in developing your own robots you should download this free software suite and try your hand at design and development. If you do enter and are selected to continue past the entry level, at the next stage you may be eligible for some funding to develop your design. Our next Evil Robot Overlord could be one you made yourself! Thanks to the folks at Popular Mechanics for the heads up on this one, and check out their article for lots more detail. And yes, I did just install Ubuntu 12.04 specifically so I could get the best build of GazeboSim installed and running.
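
If you want a taste of what working against a Gazebo-backed simulation feels like before downloading the full GFE package, here is a hedged sketch of driving a simulated robot through ROS, which the simulator builds on. The /cmd_vel topic and Twist message used here are the common convention for simple wheeled robots; the DRC simulator's own topic names and message types differ, so treat this only as a flavor of the workflow.

```python
#!/usr/bin/env python
# Hedged sketch of commanding a simulated robot through ROS. The /cmd_vel topic
# is the common convention for wheeled robots; the DRC simulator's actual topics
# and message types differ, so this is only meant to show the general workflow.
import rospy
from geometry_msgs.msg import Twist

def drive_forward():
    rospy.init_node("sim_test_drive")
    pub = rospy.Publisher("/cmd_vel", Twist, queue_size=10)
    rate = rospy.Rate(10)  # publish at 10 Hz
    cmd = Twist()
    cmd.linear.x = 0.3     # creep forward at 0.3 m/s
    while not rospy.is_shutdown():
        pub.publish(cmd)
        rate.sleep()

if __name__ == "__main__":
    try:
        drive_forward()
    except rospy.ROSInterruptException:
        pass
```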

I love living in the world of the future. Under the banner of Project Green Brain, engineering teams at the Universities of Sheffield and Sussex are writing computer models of the brains of bees, specifically the systems in the brain that interpret a bee’s vision and sense of smell. They intend to link this to robotic sensors designed to perceive the same stimuli and install it in a flying robot. The purpose of the project is to advance understanding of simple non-human brain structures and artificial intelligence, but they already have a number of practical applications in mind, from search and rescue in dangerous environments such as mines or nuclear power plants, to finding the source of gas leaks, to actually pollinating crops in areas where hive collapse has eradicated real bees. As long as they don’t include stingers, I think this will be a project worth following, particularly since this is the first Artificial Intelligence project I know of that is being designed to run on desktop PCs rather than supercomputers.
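
For the curious, here is a generic illustration of the sort of rate-based building block that models like this are often assembled from: a single leaky-integrator unit tracking a simulated odor stimulus, cheap enough that thousands of them can run on an ordinary desktop. This is my own toy example, not code from the Green Brain project.

```python
# Generic illustration of a rate-based building block often used in simple brain
# models: a single leaky-integrator unit tracking a simulated odor concentration.
# This is a toy example, not code from the Green Brain project.
import math

def simulate(time_constant=0.2, dt=0.01, duration=2.0):
    """Leaky integrator: activity decays toward the current input each step."""
    activity = 0.0
    trace = []
    steps = int(duration / dt)
    for i in range(steps):
        t = i * dt
        # Simulated stimulus: an odor plume that switches on at t = 0.5 s.
        stimulus = 1.0 if t >= 0.5 else 0.0
        activity += dt / time_constant * (stimulus - activity)
        trace.append((t, activity))
    return trace

if __name__ == "__main__":
    for t, a in simulate()[::25]:  # print every 0.25 s
        print(f"t={t:4.2f} s  activity={a:5.3f}")
```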