HERE’S HOW CLOSE TESLA IS TO MAKING TRUE SELF-DRIVING CARS A REALITY

Automotive News

It’s no secret that Elon Musk can’t wait for a true self-driving Tesla car to hit the road. On Tuesday, the Tesla CEO touted an Autopilot software update that will allow drivers to stream Netflix and YouTube videos when a car is stopped—and promised to enable video streaming in moving cars “when full self-driving is approved by regulators.”

That exciting announcement came not long after Musk laid off about 10% of Tesla’s Autopilot team, reportedly over frustration with the department’s slow progress.

So, where exactly does Tesla stand with its self-driving effort? Recently, the electric carmaker’s director of artificial intelligence, Andrej Karpathy, shared an insightful update.

Speaking at a workshop hosted by the International Conference on Machine Learning, Karpathy, who leads the team developing the machine learning system used in Autopilot, introduced some of the most sophisticated tasks a Tesla car can perform without human intervention.

The most notable was fully autonomous driving on highways. Tesla’s “Navigate on Autopilot” feature, which was released a year ago, is capable of taking a person from point A to point B if the entire route is on a highway. Without any human attention, the car knows how to keep a safe distance from other cars, make turns and lane changes, and get off at the right exit.

Karpathy noted that there are various robotic systems on the market that can achieve similar results, but their underlying mechanics differ widely.

For example, some self-driving cars come with a lidar (a radar-like sensor that uses laser light instead of radio waves) mounted on top. This equipment is used to pre-map the entire area where a car is expected to move. A driver then selects a route on that map and directs the car to move itself along it.

By contrast, Tesla’s Autopilot system doesn’t have a pre-mapping function. Instead, it relies primarily on the eight cameras that capture a 360-degree view around the car at any moment. Each camera produces a real-time video stream that is then parsed by neural networks developed by Karpathy’s team to guide the car’s movements.
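
To make that camera-first approach concrete, here is a minimal sketch of how eight camera streams could feed one shared neural network whose fused output guides downstream driving logic. Everything here (the `MultiCameraPerception` class, layer sizes, and the fusion step) is an illustrative assumption, not Tesla’s actual architecture or code.

```python
# Illustrative sketch only: eight camera frames pass through a shared CNN backbone,
# and the per-camera features are fused into a single representation.
import torch
import torch.nn as nn


class CameraBackbone(nn.Module):
    """Small CNN that turns one camera frame into a feature vector."""

    def __init__(self, feature_dim: int = 128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.proj = nn.Linear(32, feature_dim)

    def forward(self, frame: torch.Tensor) -> torch.Tensor:
        x = self.features(frame).flatten(1)
        return self.proj(x)


class MultiCameraPerception(nn.Module):
    """Runs the same backbone over all camera streams and fuses the results."""

    def __init__(self, num_cameras: int = 8, feature_dim: int = 128):
        super().__init__()
        self.backbone = CameraBackbone(feature_dim)  # weights shared across cameras
        self.fuse = nn.Linear(num_cameras * feature_dim, feature_dim)

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (batch, num_cameras, 3, height, width)
        b, n, c, h, w = frames.shape
        per_cam = self.backbone(frames.view(b * n, c, h, w)).view(b, n, -1)
        return self.fuse(per_cam.flatten(1))


# One batch of frames from eight cameras (random tensors standing in for real video).
frames = torch.randn(1, 8, 3, 96, 96)
fused = MultiCameraPerception()(frames)
print(fused.shape)  # torch.Size([1, 128])
```

In this sketch the backbone weights are shared across all eight cameras, so adding cameras increases compute but not model size; the real system would of course use far larger networks and richer fusion than a single linear layer.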

Currently, Tesla is trying to enhance this system and make it work for complicated city roads.

Last month, Tesla released its latest version of Enhanced Summon, a feature that enables a car to move itself from a parking spot to its driver based on GPS location.

“There’s no one in that car. It’ll just come to you,” Karpathy said. “And then, you get in like royalty, and it’s the best… when it works.”

Taking such self-driving functions beyond the parking lot is still a challenge, because “seeing” and analyzing all of the traffic lights, signs, road markings, and surrounding objects on a typical local road demands an enormous amount of computing power from a neural network system.

Karpathy said the neural network his team is training now can process about 50 such tasks simultaneously.
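
One common way to keep that many tasks tractable is to share a single backbone and attach a lightweight head per task, so the expensive feature extraction runs once per image. The sketch below shows that pattern with a handful of made-up example tasks; the task names, class counts, and layer sizes are assumptions for illustration, not Tesla’s actual task list or network.

```python
# Illustrative multi-task sketch: one shared backbone, many small task heads.
import torch
import torch.nn as nn


class MultiTaskPerception(nn.Module):
    def __init__(self, feature_dim: int = 128):
        super().__init__()
        # Example task heads; a production system would have many more.
        task_classes = {
            "traffic_light_state": 4,   # red / yellow / green / none
            "stop_sign_present": 2,
            "lane_line_type": 5,
        }
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, feature_dim), nn.ReLU(),
        )
        self.heads = nn.ModuleDict(
            {name: nn.Linear(feature_dim, n) for name, n in task_classes.items()}
        )

    def forward(self, image: torch.Tensor) -> dict:
        shared = self.backbone(image)  # shared features are computed once
        return {name: head(shared) for name, head in self.heads.items()}


model = MultiTaskPerception()
outputs = model(torch.randn(1, 3, 128, 128))
for task, logits in outputs.items():
    print(task, logits.shape)
```

Scaling this pattern to roughly 50 heads mostly adds small per-task layers rather than duplicating the whole network, which is why a single model can serve many perception tasks at once.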

Original story from OBSERVER
