AImotive aims to convert regular cars into driverless ones inexpensively

While other autonomous car projects rely on an expensive radar-like system called Lidar, AImotive is trying to do the same using regular cameras and AI

The AImotive office is in a small converted house at the end of a quiet residential street in sunny Mountain View, spitting distance from Google's headquarters. Outside is a branded Toyota Prius covered in cameras, one of three autonomous cars the Hungarian company is testing in the sleepy neighbourhood. It's a popular testing ground: one of Google's driverless cars, now operating under spin-out company Waymo, zips past the office each lunchtime.

While other autonomous car projects, including those from Waymo and Uber, rely on an expensive (but very useful) radar-like system called Lidar for depth perception and obstacle detection (as well as cameras for seeing the colour of traffic lights and signs), AImotive is trying to do the same using regular cameras combined with artificial intelligence. This means the company can convert a regular car into a driverless one for a fraction of the price: around $6,000, as opposed to $70,000-$100,000.

"The whole traffic system is based on the visual system," explained founder and CEO Laszlo Kishonti. "Drivers don't have bat ears and sonars, you just look around and drive."

AImotive's driverless technology relies on regular cameras combined with artificial intelligence. Photograph: Olivia Solon for the Guardian

AImotive, founded in 2015, wants to replicate that human ability. "The only way to do this is with AI."

The car features four fish-eye cameras, covering each side of the vehicle, as well as dual stereo cameras facing forward and backward. The trunk of the car is filled with a chunky, power-intensive PC that stitches feeds from the cameras - the eyes of the car - together in real time to create a three-dimensional model of the environment.

Artificial intelligence is applied to the video feeds to make sense of the surrounding environment, so the system understands what's around the car and at what distance. Is the object a human, a car, a cat, a road or a sidewalk? If it's a human or animal, where are they going? Are they about to stop? Or run across the road? These are relatively simple tasks for humans, but challenging for computers faced with frames of pixels.

The computer that powers AImotive's driverless car. Photograph: Olivia Solon for the Guardian

In a test drive around Mountain View, the Guardian got to see through the many different eyes of the machine, watching it map distances and objects in real time. The car's live view is combined with a GPS-type location engine that places the vehicle on a map. AI also powers the motion engine, which plots the vehicle's driving path on the road, and this is fed into a control engine which informs the steering, braking and acceleration of the car.

Unfortunately, AImotive has yet to get its California driverless car licence, so the Prius was controlled by a human for our 10-minute ride. However, the fully autonomous function has been tested on roads and in underground car parks in Budapest, where AImotive is headquartered.

Lidar became the standard sensing technology for autonomous vehicles after cars fitted with the technology did well in DARPA's driverless car competition a decade ago, Kishonti said.

By launching in 2015, AImotive hit a sweet spot in artificial intelligence and computer vision. "The speed of research in the field has been remarkable," said COO Niko Eiden.

The drawback of a vision-based system rather than a Lidar system is that it's limited by what it can see - just like humans. This means performance is poorer when there's fog or snow. On the flip side, in theory AImotive's approach is more flexible, and the car could be dropped into any city and know what to do without a detailed 3D map of the roads having been created in advance as a reference point.

This should mean that AImotive's cars can handle unpredictable changes to the route, for example when roadworks redirect traffic on to the wrong side of the road.

"Every day there's a temporary traffic sign somewhere. What does the first car that sees that do?" asks Kishonti.

The AImotive team uses a more extreme scenario as a thought experiment: what if an elephant escapes from the circus? It doesn't matter how many times a car drives round Mountain View, it's not going to be able to account for these types of black swan events.

"We would recognise it as a big mass of body, but probably would not be able to forecast what this animal will do," admitted Kishonti.

To teach the system how to handle extreme conditions, AImotive uses a simulator to drive millions and millions of virtual miles, seeing how its car reacts in a wide range of driving conditions and interactions with other cars, people and animals.

"You can go to Scandinavia and see snow, but there's no New York in Scandinavia. How do you test all types of traffic with snow?" asks Kishonti. With simulations, that's how.

AImotive's car travelling around Budapest.

AImotive doesn't have any plans to build its own cars from scratch, but is working with companies like Volvo to provide the self-driving brains of cars and trucks.

Like Otto, the truck automation company that Uber recently acquired (and is currently the target of a major lawsuit over allegedly stealing trade secrets from Waymo), AImotive believes that the trucking industry is ripe for automation.

"Trucking companies are competing for 1-2% price difference, but 60% of the cost is the driver," said Kishonti. However, he recognizes that there are political hurdles over and above the technical ones.

"Trucking companies would be happy if the drivers could be eliminated, but the trade unions will have different ideas," he said.

In the meantime, AImotive can provide truck companies with safety features, such as detecting cyclists in drivers' blind spots.

For now, the company is focused on getting the licence to drive its cars legally in California and then Nevada. Once the paperwork's in place, it will launch a highway pilot within a few months before experimenting in urban settings in early 2018.
