So you may have heard about Tesla's new Full Self-Driving hardware, sometimes referred to as Hardware Version 3. The new hardware is a direct replacement for the Nvidia-based Autopilot computer that Tesla used in its HW2- and HW2.5-equipped vehicles.
The FSD chip integrates two custom-designed neural processing units. Operating at 2 GHz, each NPU has a peak performance of 36.86 trillion operations per second (TOPS), and with two NPUs on each chip, the FSD chip is capable of up to 73.7 trillion operations per second of combined peak performance.
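The arithmetic behind those peak-performance figures is easy to check. As a rough sketch (assuming, per Tesla's Autonomy Day presentation, that each NPU has a 96 × 96 multiply-accumulate array and that each MAC counts as two operations per cycle):

```python
# Back-of-the-envelope check of Tesla's quoted FSD-chip peak-TOPS figures.
# Assumed parameters (from Tesla's Autonomy Day presentation):
#   - each NPU has a 96 x 96 multiply-accumulate (MAC) array
#   - each MAC performs 2 operations per cycle (one multiply + one add)
#   - the NPUs are clocked at 2 GHz

CLOCK_HZ = 2e9          # 2 GHz clock
MAC_UNITS = 96 * 96     # 9,216 MACs per NPU (assumed array size)
OPS_PER_MAC = 2         # multiply + accumulate

npu_tops = CLOCK_HZ * MAC_UNITS * OPS_PER_MAC / 1e12
chip_tops = 2 * npu_tops  # two NPUs per FSD chip

print(f"Per-NPU peak:  {npu_tops:.2f} TOPS")   # ~36.86 TOPS
print(f"Per-chip peak: {chip_tops:.2f} TOPS")  # ~73.73 TOPS
```

The result matches the quoted 36.86 TOPS per NPU and roughly 73.7 TOPS per chip.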
Now, I’m no computer engineer, but that sounds pretty amazing.
Today, all new Teslas come with the new Full Self-Driving computer, and, as of September 2019, Tesla started retrofitting some of its newer vehicles whose owners had paid for Full Self-Driving with the new Hardware 3 computer. Apparently, my 2019 Model 3 just missed getting the new FSD hardware installed during production, but I recently got the call to come in for the retrofit, and I received my new Full Self-Driving computer upgrade today.
You see, our Highway 404 north of Toronto is often under construction, and it quite often has those concrete barriers running right alongside the leftmost lane. I've always noticed that Tesla's Autopilot isn't the least bit shy about driving alongside these barriers at full speed, but that's not what I took note of today.
What I noticed today was that my car was actually detecting and displaying traffic pylons that were on the other side of those concrete barriers. Not only that, but these pylons were practically on the other side of the highway median strip. Let’s take a good close look at what my car saw. (See video below.)
You may not notice it at first, but there’s a group of four pylons lined up on the other side of the concrete barriers, almost on the other side of the median strip dividing the opposing lanes of traffic.
If you don't catch it in the video, take a close look at the slow-motion view of the four pylons, and then compare it with the four pylons displayed in the car's road visualization.
That means my car's front cameras spotted these pylons, and Tesla's neural network recognized those objects as pylons... at 115 km/h, across a highway median strip, on the other side of a concrete barrier.
That’s pretty impressive.
Bear in mind that Tesla's neural net recognizes a lot more than just cars, trucks, buses, and pylons. For the time being, though, Tesla is holding back on displaying everything its vehicles can actually recognize... lest drivers get the idea that their cars are permitted to respond to everything we humans can see in the visualization. In other words, if Tesla drivers knew that their cars were recognizing red traffic lights, the fear, naturally, is that drivers would abdicate responsibility for stopping to the car's FSD computer. At this point in time, however, Tesla is not allowing its cars to perform at that level of autonomy.