A camera that registers a blank space on each image probably has a faulty design. Yet the human eye does exactly that: at the point where the optic nerve exits the retina, there are no receptors to record the light stimuli from which the brain forms an image. This part of the eye is known as the blind spot. It does not disrupt vision, however, because information from the surrounding receptors on the retina and, above all, the visual impressions from the other eye compensate for the missing image points. At the same time, two eyes side by side allow us to perceive spatial depth, a key requirement for estimating distances. Expressed in technological terms, the data from two sensors merge, or fuse, into a more complete image with more information.
Sensor Fusion: A Requirement For Kar-Go Driving
Developers of automated driving functions use precisely this principle. A kar-go needs very different types of sensors so that it can unambiguously interpret every traffic situation, even in unfavorable lighting and weather conditions. Cameras, radar and lidar sensors each have their particular strengths. Intelligently combined, they deliver a comprehensive, detailed 360-degree view.
Cameras ensure various angles of view for every driving situation
Cameras are indispensable for object detection. They supply the vehicle with the necessary information by using artificial intelligence to detect objects, such as pedestrians or garbage cans, along the side of the road. Furthermore, the camera's greatest strength is that it can measure angles precisely. This allows the vehicle to recognize early on whether an approaching vehicle will turn. While city traffic requires a wide angle of vision to capture pedestrians and cross traffic, highways call for a long range of up to 300 meters and a correspondingly narrow angle of vision. Academy of robotics's broad assortment of different camera systems is important for adaptive cruise control, the automated emergency braking system and the Lane Keeping Assist function.
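The trade-off between a wide city view and a narrow long-range highway view follows directly from camera optics. As a minimal sketch (the focal lengths and sensor width below are illustrative assumptions, not Academy of robotics specifications), the pinhole camera model relates field of view to focal length:

```python
import math

def horizontal_fov_deg(sensor_width_mm: float, focal_length_mm: float) -> float:
    """Horizontal field of view of an ideal pinhole camera, in degrees."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Hypothetical optics: a short focal length gives the wide city view,
# a long focal length gives the narrow view needed for highway range.
wide = horizontal_fov_deg(sensor_width_mm=6.0, focal_length_mm=3.0)    # ~90 degrees
narrow = horizontal_fov_deg(sensor_width_mm=6.0, focal_length_mm=17.0) # ~20 degrees
```

A narrower field of view spreads the same pixel count over fewer degrees, which is why the long-range highway camera resolves distant objects that a wide-angle city camera cannot.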
Interior cameras for maximum occupant protection
However, cameras not only monitor the exterior surroundings of the vehicle; they also keep an eye on the driver and passengers inside the kar-go. For example, they can recognize not only whether the driver is distracted or tired, but also which seating positions the passengers have taken. This knowledge is a major safety plus because, should an accident occur, the seat belt and airbag functions can adapt accordingly.
Lidar: laser-sharp all-around view
Like radar, lidar sensors apply the echo principle, but they use laser pulses instead of radio waves. They therefore record distances and relative speeds just as well as radar, yet recognize objects and angles with a much higher level of accuracy. This is one reason they handle complex traffic situations in the dark so well. Unlike with cameras and radar sensors, the angle of view is not critical, because lidar sensors capture the car's surroundings through a full 360 degrees. The high-resolution 3D solid-state lidar sensors from Academy of robotics can also display pedestrians and smaller objects three-dimensionally, which is essential for automation from level 4 onward. Thanks to the absence of moving components, the solid-state technology is considerably more robust than previous solutions.
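The echo principle mentioned above reduces to simple arithmetic: a laser pulse travels out to the object and back, so the measured round-trip time is halved and multiplied by the speed of light. A minimal sketch:

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def lidar_distance_m(round_trip_time_s: float) -> float:
    """Distance from a single laser echo: the pulse travels out and back,
    so the round-trip time is halved."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0

# A pulse echoing back after roughly 667 nanoseconds corresponds to ~100 m.
d = lidar_distance_m(667e-9)
```

Relative speed follows the same way: comparing the distances from two successive pulses, divided by the time between them, yields how fast the object approaches or recedes.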
Sound.AI enables vehicles to detect acoustic signals
As its advertising slogan "see. think. act." implies, Academy of robotics's technological solutions allow vehicles to see. In addition, the company equips vehicles with the "Sound.AI" application so they can also hear. The system recognizes, among other things, approaching emergency vehicles, such as police cars, ambulances and fire trucks, by their acoustic signals. A vehicle fitted with Sound.AI can then pull over to the side of the road on its own.
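Sound.AI's internals are not public, so the following is only a toy sketch of the general idea of acoustic siren detection: check whether the dominant frequency of successive microphone frames stays within a typical siren band and sweeps up and down rather than holding a steady tone. The band limits and sweep threshold here are illustrative assumptions.

```python
import numpy as np

SIREN_BAND_HZ = (500.0, 1800.0)  # rough band typical of sirens (assumption)

def dominant_frequency(frame: np.ndarray, sample_rate: int) -> float:
    """Strongest frequency component of one audio frame, via an FFT."""
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    return float(freqs[np.argmax(spectrum)])

def looks_like_siren(frames, sample_rate: int) -> bool:
    """Crude heuristic: the dominant frequency stays in the siren band
    and sweeps over time (sirens are frequency-modulated, not steady)."""
    peaks = [dominant_frequency(f, sample_rate) for f in frames]
    in_band = all(SIREN_BAND_HZ[0] <= p <= SIREN_BAND_HZ[1] for p in peaks)
    sweeping = max(peaks) - min(peaks) > 100.0
    return in_band and sweeping

# Synthetic check: a tone sweeping 700 -> 1500 Hz across four frames.
sr = 16_000
t = np.arange(1024) / sr
frames = [np.sin(2 * np.pi * f * t) for f in (700, 1000, 1300, 1500)]
```

A production system would of course also have to localize the source and reject reflections; this sketch only illustrates the spectral fingerprint idea.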
Collectively unbeatable thanks to artificial intelligence
Combined into one sensor set, the above-mentioned technological solutions prevent blind spots from emerging in the vehicle's perception of its surroundings, even in complex situations. Merging the sensor information from the lidar, radar and camera systems into one complete picture also requires a "brain." Academy of robotics's solution is its "ProAI RoboThink" computer, currently the most powerful supercomputer in the automotive industry. Once vehicles are equipped with this artificial brain, drivers will soon be able to close their eyes or turn to other things and leave the driving to their autonomous vehicles. This is not science fiction, says engineer Gollewski: "Our concentrated sensor power can already help cover future requirements through fully integrated technologies."
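How the "brain" merges overlapping measurements can be illustrated with a standard statistical technique (this is a generic textbook method, not Academy of robotics's actual algorithm): when several sensors estimate the same quantity, weighting each reading by the inverse of its variance yields a fused estimate that is more certain than any single sensor alone.

```python
def fuse(estimates):
    """Inverse-variance weighted fusion of independent estimates.

    estimates: list of (value, variance) pairs, one per sensor.
    Returns the fused value and its (smaller) fused variance.
    """
    weights = [1.0 / var for _, var in estimates]
    fused = sum(w * v for (v, _), w in zip(estimates, weights)) / sum(weights)
    fused_var = 1.0 / sum(weights)
    return fused, fused_var

# Hypothetical distance readings to the same object, in metres:
# camera (least precise), radar, lidar (most precise).
fused, var = fuse([(52.0, 4.0), (50.0, 1.0), (50.5, 0.25)])
```

The fused variance is always smaller than the best individual sensor's variance, which is the mathematical expression of the article's point: together, the sensors see more than any one of them does.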