Mobileye earlier this year announced the EyeQ4 system on a chip (SoC), its 4th-generation camera-based Advanced Driver Assistance Systems (ADAS) SoC for the automotive industry. The EyeQ4 packs enormous computing power: 14 computing cores, of which 10 are specialized vector accelerators with extremely high utilization for visual processing and understanding. All of this is needed to analyze the torrent of data pouring in from the cameras and sensors on self-driving cars and to do what is required to keep these autonomous vehicles driving safely and efficiently.
For all this computing muscle, the EyeQ4 uses only about 3 watts of power, less than many mobile phone application processors. The EyeQ4 also supports sensor fusion with radars and scanning-beam lasers if they are present on the car. This adds capability to Mobileye's list of products, which major automakers (BMW, Volvo, etc.) have used for years for features such as lane departure warning and speed limit indication.
This kind of technology may be the way to go for general robot design (yes, self-driving cars are robots). We humans rely primarily on our eyes for the data we use for general navigation, so maybe our robots should as well.
For this we do have the Kinect, Microsoft's motion-sensing platform originally intended for Xbox gaming. This sensor has since been used as a general visual input for various applications, including robotics. Check out the videos below.
What we may need, however, is a genuinely low-cost, open-hardware solution so that the technology can spread more widely and be accessed by more people. This could help advance its application to robotics by fueling hobbyist and undergraduate use of this advanced technology.
We may have a start by using the Raspberry Pi and the NoIR camera. Let's see where this goes.
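To make the idea concrete, here is a minimal sketch of the kind of first processing step a camera-based hobby robot might run on each frame: Sobel edge detection, a common building block for finding lane lines or obstacle boundaries. This is an illustration in plain Python, not the Raspberry Pi camera API; it assumes frames arrive as 2D lists of 0-255 grayscale values, and all names are hypothetical.

```python
def sobel_edges(frame, threshold=100):
    """Return a binary edge map (1 = edge) for a 2D grayscale frame.

    A hobby robot could feed camera frames through this to find
    strong brightness boundaries (e.g. a line to follow).
    """
    h, w = len(frame), len(frame[0])
    edges = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Horizontal gradient (Sobel x kernel).
            gx = (frame[y-1][x+1] + 2*frame[y][x+1] + frame[y+1][x+1]
                  - frame[y-1][x-1] - 2*frame[y][x-1] - frame[y+1][x-1])
            # Vertical gradient (Sobel y kernel).
            gy = (frame[y+1][x-1] + 2*frame[y+1][x] + frame[y+1][x+1]
                  - frame[y-1][x-1] - 2*frame[y-1][x] - frame[y-1][x+1])
            # Cheap gradient magnitude: |gx| + |gy| instead of sqrt.
            if abs(gx) + abs(gy) > threshold:
                edges[y][x] = 1
    return edges

# Toy 8x8 frame: dark left half, bright right half,
# so we expect a vertical edge down the middle.
frame = [[0] * 4 + [255] * 4 for _ in range(8)]
edges = sobel_edges(frame)
print(sum(sum(row) for row in edges))  # number of edge pixels found
```

On real hardware, the per-pixel Python loops would be far too slow for video rates; in practice you would hand this off to NumPy or OpenCV, but the arithmetic is exactly what those libraries do under the hood.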
What we definitely need are lower-cost navigation sensors, so our hobby robots can have more intelligence and hence be more useful.