Ambient AI and XAVIER, an AI Car Supercomputer

January 6, 2017


NVIDIA’s background is in gaming, building GPUs and supercomputers for that purpose. Whilst that might not appeal to everyone, it has been the training ground for some incredibly complex computing, enabling NVIDIA to participate in many additional markets and to become the best-performing stock in the S&P 500.

The four main areas of activity, and of this evening’s announcements, are Gaming, VR/AR/MR (Virtual, Augmented and Mixed Reality), Data Centers and Self-Driving Cars.

Huang started by suggesting that we are enjoying the most exciting time the computer industry has ever seen, with machine learning and deep neural networks creating a big bang for AI.

I won’t cover the gaming announcements in this blog, but needless to say they were exciting for the community: they included a partnership with Facebook and the launch of GeForce Now, an on-demand, cloud-supercomputing service for gamers who don’t have the required computing power in their own PC.

The first phrase that really got me interested was “Ambient AI”, alongside the announcement of the NVIDIA SPOT, a small device that plugs in anywhere in the home and provides voice recognition for your Google Assistant.

The bulk of the keynote, however, was really about NVIDIA’s developments in the $10 trillion transportation industry, an industry that Huang thinks can and will be revolutionized by AI.

Jen-Hsun envisaged a world with fewer parked cars, fewer accidents, less wasted fuel and a lower environmental impact from vehicles. He also imagined the car becoming your most personal robot, or assistant.

NVIDIA believes that AI is the solution to self-driving, and the launch of XAVIER, its AI car supercomputer, brings all of that technical experience to bear on the automotive industry. The vision is for self-driving systems that learn rather than ones that are programmed. It is a vision of a car that interacts with the driver using face recognition, head tracking, gaze tracking and lip reading (as well as voice recognition) to bring the driver into the AI’s ecosystem, so that the driver’s behaviour becomes part of the vehicle’s awareness.

A learning vehicle should be able to tell you when it is less confident, ask you to take the controls, and then drop into a co-pilot mode where it supports your driving whilst continuing to learn.
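
To make that handoff idea a little more concrete, here is a minimal illustrative sketch of a confidence-based switch between autonomous and co-pilot modes. This is not NVIDIA’s implementation or API; the mode names, threshold and function are purely hypothetical choices of mine to show the shape of the logic described above.

```python
from enum import Enum

# Hypothetical sketch of a confidence-based handoff; all names and thresholds
# here are assumptions for illustration, not NVIDIA's actual Xavier software.

class DrivingMode(Enum):
    AUTONOMOUS = "autonomous"   # the car drives, the human monitors
    CO_PILOT = "co-pilot"       # the human drives, the car assists and keeps learning

HANDOVER_THRESHOLD = 0.85  # below this confidence, ask the driver to take over

def choose_mode(perception_confidence: float, driver_has_taken_controls: bool) -> DrivingMode:
    """Pick a driving mode from the system's self-reported confidence.

    If the perception stack is less confident than the threshold, the car
    alerts the driver and, once the driver takes the controls, falls back to
    co-pilot mode, where it supports (and learns from) the human's driving.
    """
    if perception_confidence >= HANDOVER_THRESHOLD:
        return DrivingMode.AUTONOMOUS
    if driver_has_taken_controls:
        # Low confidence and the driver has responded: support rather than drive.
        return DrivingMode.CO_PILOT
    # Low confidence but the driver hasn't taken over yet: keep driving and keep alerting.
    return DrivingMode.AUTONOMOUS

# Example: confidence drops in heavy rain and the driver takes the wheel.
print(choose_mode(0.62, driver_has_taken_controls=True))   # DrivingMode.CO_PILOT
print(choose_mode(0.97, driver_has_taken_controls=False))  # DrivingMode.AUTONOMOUS
```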