Speaking at the Box developer conference on April 22, 2015, Eric Schmidt revealed that Google has a medium-term strategy to make tangible returns from the robotics technology it has acquired and continues to develop, and that it expects to make these returns on its investment within the next 5 to 10 years.
As noted in this article, he didn’t have much to say, but he did reveal something of Google’s economic view of robot technology. Speaking about Google’s role in bringing the self-driving car to reality, he also suggested that much of the technology used in robots outperforms the comparable human senses, and that robots may be more capable than humans at some tasks.
Maybe we are reading too much into his comments, but they do seem to give some insight into how Google might approach robotic development in the future: focus on tasks that present real social problems (such as drunk driving) or economic problems (such as high labor costs), and develop robots to solve them. Necessity is the mother of invention, and if an invention is necessary for most people, it will be very valuable economically.
Share this article with your friends and keep an eye out as we continue to see how robots will integrate (or not) into society.
Mobileye earlier this year announced the EyeQ4 system on a chip (SoC), its fourth-generation camera-based Advanced Driver Assistance Systems (ADAS) SoC for the automotive industry. The EyeQ4 packs serious computing power: 14 computing cores, 10 of which are specialized vector accelerators with extremely high utilization for visual processing and understanding. All of this is needed to analyze the huge amounts of data pouring in from the cameras and sensors on self-driving cars and to keep these autonomous vehicles driving safely and efficiently.
For all this computing muscle, the EyeQ4 uses only 3 watts of power, less than many mobile phone application processors. The EyeQ4 also supports sensor fusion with radars and scanning-beam lasers if they are present on the car. This adds to Mobileye’s list of products, which major automakers (BMW, Volvo, etc.) have used for years for features such as lane departure warning and speed limit indication.
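Sensor fusion here just means combining noisy estimates of the same quantity from different sensors into one better estimate. A minimal sketch of the core idea (a variance-weighted average of a camera range reading and a radar range reading; all sensor values and noise figures below are made up for illustration, not Mobileye’s actual method):

```python
def fuse(measurements):
    """Fuse independent noisy readings of the same quantity.

    Each measurement is (value, variance). The optimal linear fusion
    weights each reading by the inverse of its variance, so the less
    noisy sensor counts for more.
    """
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, measurements)) / total
    return value, 1.0 / total  # fused estimate and its (smaller) variance

# Hypothetical readings of the distance to the car ahead, in meters:
camera = (25.0, 4.0)   # vision is noisier at range
radar = (24.0, 1.0)    # radar ranging is tighter
fused, variance = fuse([camera, radar])  # -> (24.2, 0.8)
```

Note that the fused variance (0.8) is smaller than either sensor’s alone, which is the whole point of fusing them.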
This kind of technology may be the way to go for general robot design (yes, self-driving cars are robots). We humans rely primarily on our eyes for the data we use for general navigation, so maybe our robots should as well.
For this we do have the Kinect, Microsoft’s motion-sensing platform originally intended for Xbox gaming. The sensor has since been used as a general visual input in various applications, including robotics. Check out the videos below.
What we may need, however, is a genuinely low-cost, open-hardware solution so the tech can become more widespread and accessible to more people. That could help advance its application to robotics by fueling hobbyist and undergraduate use of this advanced technology.
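What makes a Kinect-style sensor so useful for robots is that it returns a per-pixel depth map rather than a flat image, so obstacle detection becomes a simple scan over distances. A hedged sketch of that idea, using a tiny hand-made frame in place of real sensor output (the function name, frame values, and threshold are all illustrative assumptions, not any particular device’s API):

```python
def nearest_obstacle(depth_frame, threshold_mm=600):
    """Scan a depth frame (rows of per-pixel depths in millimeters,
    with 0 meaning 'no reading') and report whether anything sits
    closer than the threshold, plus the nearest valid distance."""
    valid = [d for row in depth_frame for d in row if d > 0]
    if not valid:
        return False, None  # nothing measurable in view
    nearest = min(valid)
    return nearest < threshold_mm, nearest

# A tiny fake 3x4 frame standing in for depth-camera output:
frame = [
    [1200, 1150, 1100, 0],
    [900, 550, 980, 1300],
    [1500, 1400, 1450, 1600],
]
blocked, dist = nearest_obstacle(frame)  # -> (True, 550)
```

A real robot would run this on every frame and steer away whenever `blocked` comes back true.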
We may have a start by using the Raspberry Pi and its NoIR camera module. Let’s see where this goes.
What we definitely need are lower-cost navigation sensors so our hobby robots can have more intelligence and hence be more useful.
Line tracking is probably the most widely used navigation system for robots, especially in student-level robot competitions. This robot, though, will most likely shock all those students with its sheer speed.
Check out this hyper fast line following robot and see if you and your friends can build a robot to match its speed.
Line tracking requires a contrasting line on the floor, so it is not the ideal method for general robot navigation, but it is a good way to learn the basics of robot navigation, and those basics can be applied to more complex technologies. Happy coding.
This video shows how to make a simple BEAM walking robot. It is a very good introduction to the technology; maybe you can advance it and bring it to the mainstream.
Try this if you want to explore BEAM technology. If you are an advanced builder, you could try mixing BEAM technology with regular digital robotics.
Share with your friends using the buttons below and consider building one together using parts from Amazon; follow the links below to Amazon and support us.
This is a one-motor walking robot that really resembles an insect: more nature-inspired stuff. The design reminds me of BEAM (Biology, Electronics, Aesthetics, and Mechanics) robots, which use a kind of analog control system (instead of the common digital systems) modeled on the way our nervous systems work, and which are commonly used to make insect robots like this one.
The BEAM design concept can give a robot relatively advanced behavior with simple components and no programming. I came across the BEAM concept during my first love affair with robots, and now I think I may have to go re-research the tech and see whether it has been advanced or taken up by any of the big players.
I really liked the concept because it seemed like something a futuristic artificial intelligence would do to create little helper drones, at least for the navigation and basic ‘reflex’ behavior of those drones. From what I remember, the BEAM concept lets the ‘nervous system’ of these robots be built from relatively simple components and requires no programming at all. More info to come, so stay tuned.
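To get a feel for how a program-free ‘nervous system’ can sequence a gait: one classic BEAM arrangement wires a few ‘Nv’ neurons (inverters with RC delays) into a ring, and a single pulse circulating around the ring fires each motor phase in turn. Here is a crude discrete-time simulation of that idea; the real circuit is analog, so the neuron count, delay, and function names here are just illustrative assumptions:

```python
def nv_ring(neurons=4, delay=3, steps=24):
    """Toy simulation of a BEAM-style Nv ring.

    A single active pulse circulates around a ring of `neurons` nodes.
    Each node holds the pulse for `delay` ticks (standing in for the
    analog RC time constant) before passing it on. In a walker, each
    node's output drives one motor phase, so the circulating pulse
    sequences the gait with no program at all.
    Returns the index of the active node at each tick.
    """
    active, ticks = 0, 0
    trace = []
    for _ in range(steps):
        trace.append(active)
        ticks += 1
        if ticks == delay:          # 'RC time constant' elapsed
            active = (active + 1) % neurons
            ticks = 0
    return trace

# Four neurons, three ticks each: phases fire 0,0,0,1,1,1,2,... in order,
# wrapping back to neuron 0 after a full cycle of 12 ticks.
gait = nv_ring()
```

Changing `delay` per node is the simulated equivalent of swapping capacitor values in the real circuit, which is how BEAM builders tune a walker’s gait.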
Share this post with your friends using the buttons below to make them aware of this alternate robot technology ;).
This is an ice cream stick robot:
This instructional video will show you how to make it. Very cool; a good weekend project perhaps? It can also serve as a reasonable introduction to electronics and robotics technology.
Buy your parts on Amazon through the links below, share with your friends, and make them together. Maybe even have a race 🙂
Don’t forget to share this with your friends and followers using the buttons below. You can also check out the very cool Lego Mindstorms kit below, and view similar posts for more cool robot stuff.