Tesla's 'Dojo' supercomputer will train its vision-centric autonomous tech

It has to teach the systems to respond with the same acuity and speed as a human.


Tesla has gone all-in on vision-only autonomous driving, to the point of phasing out radar sensors in some of its EVs. At a CVPR 2021 workshop, Tesla senior director of AI Andrej Karpathy explained how the company plans to pull this off using an in-house supercomputer called "Dojo," as TechCrunch has reported.

Karpathy explained that with vision-only tech, computers must respond to new environments with the same speed and acuity as a human. Doing that, however, requires training AI on a massive dataset, along with a powerful supercomputer to crunch it. Tesla has one of those in house with "Dojo," a next-gen machine with 1.8 exaflops of performance and 10 petabytes of NVMe storage running at 1.6 terabytes per second.
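To put those storage figures in perspective, here's a quick back-of-envelope calculation using only the numbers quoted above (assuming decimal units, i.e. 1 PB = 1,000 TB):

```python
# Rough arithmetic on Dojo's stated storage specs (figures from the article;
# decimal units assumed: 1 petabyte = 1,000 terabytes).
storage_tb = 10_000       # 10 petabytes of NVMe storage, in terabytes
throughput_tb_s = 1.6     # sustained read speed, terabytes per second

# Time for a single sequential pass over the entire dataset at full speed
seconds_full_pass = storage_tb / throughput_tb_s
print(f"One full pass over the dataset: {seconds_full_pass:,.0f} s "
      f"(~{seconds_full_pass / 3600:.1f} hours)")
```

In other words, even at 1.6 TB/s, streaming the whole 10 PB corpus once would take roughly an hour and three quarters, which hints at why that throughput figure matters for training.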

While the system hasn't been benchmarked, Karpathy figures it would be one of the fastest in the world. "If you take the total number of FLOPS, it would indeed place somewhere around the fifth spot," Karpathy told TechCrunch. "The fifth spot is currently occupied by NVIDIA with their Selene cluster, which has a very comparable architecture and similar number of GPUs."

To train the system, Tesla's supercomputer collects video from the eight cameras on each Tesla vehicle, every one running at 36 frames per second. While that generates a huge amount of data, it's more scalable than building and maintaining high-definition maps around the world. However, it also requires nearly instantaneous processing, and the task is treated as a supervised learning problem.
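The camera and frame-rate figures above give a sense of the data volume per car. A minimal sketch of the arithmetic (the camera count and frame rate come from the article; the compressed frame size is purely an assumption for illustration):

```python
# Back-of-envelope estimate of per-vehicle video data generation.
# Camera count and frame rate are from the article; the per-frame size
# below is a hypothetical figure, not a Tesla spec.
cameras = 8
fps = 36
frames_per_second = cameras * fps   # frames generated per second, per car

frame_kb = 100                      # ASSUMED average compressed frame size
gb_per_hour = frames_per_second * frame_kb * 3600 / 1_000_000
print(f"{frames_per_second} frames/s per vehicle "
      f"(~{gb_per_hour:.0f} GB/hour at an assumed {frame_kb} KB/frame)")
```

Even with a conservative per-frame size, a single car produces hundreds of frames every second, which is why the fleet-wide dataset feeding Dojo is so large.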

So far, the system works well in sparsely populated areas, where cars can drive around with no intervention. However, Tesla has found (like all other autonomous vehicle companies) that navigating densely populated areas is much more difficult. Still, Karpathy said that Tesla's computer has been able to handle new types of traffic warnings, pedestrian collision detections and pedal misapplications, the latter happening when a driver accidentally presses the gas instead of the brakes.

Despite several notorious Tesla accidents where the autonomous driving systems failed to pick up obstacles or correctly track a route, CEO Elon Musk is firmly committed to vision-only. "When radar and vision disagree, which one do you believe? Vision has much more precision, so better to double down on vision than do sensor fusion," he tweeted recently. The company believes a supercomputer will finally help vehicles attain advanced self-driving capability, but it's best to take a wait-and-see attitude as we've heard that tune before.