Tesla driver uses Apple Vision Pro on Autopilot, gets arrested


A video of a Tesla owner wearing the new Apple Vision Pro headset while using the car's assisted driving features, essentially letting the vehicle drive itself, and purportedly getting arrested is going viral on social media, highlighting a potentially new danger on the road.

Published on Friday—the same day the Vision Pro went on sale in the U.S.—the 25-second video appears to show 21-year-old Dante Lentini, who posted the video on his X account, typing and scrolling with the $3,500 headset. For a portion of the video, Lentini's hands are off the wheel, even though keeping hands on the wheel is a requirement for all three of Tesla's assisted driving features: Autopilot, Enhanced Autopilot, and Full Self-Driving. (Despite the name, none of Tesla's assisted driving features make its cars fully autonomous.)

The video goes on to show Lentini in a stopped car with the police next to him. In a comment on X, Lentini confirms that the person in the video is him and that he was arrested. As of Monday morning, Lentini’s video had more than 24 million views on X, the platform owned by Tesla CEO Elon Musk.

Gizmodo reached out to Lentini and Tesla on Monday morning for comment but did not immediately hear back. Tesla famously dissolved its PR department back in 2020.

Some users on social media accused Lentini, who says on LinkedIn that he is a product manager at the software development startup Hyper, of staging the video for clicks and views, although whether he did so is not immediately clear. In response to comments lamenting the stupidity of driving while wearing the headset, Lentini responded: “my bad, had a meeting.”

Apple specifically warns users against using the Vision Pro while driving in its user guide. The company did not reply to a request for comment.

“Always remain aware of your environment and body posture during use. Apple Vision Pro is designed for use in controlled areas that are safe, on a level surface,” the company notes. “Never use Apple Vision Pro while operating a moving vehicle, bicycle, heavy machinery, or in any other situations requiring attention to safety.”

Tesla's driver-assist features, especially Autopilot, have drawn federal scrutiny to the carmaker in recent years. In 2021, the National Highway Traffic Safety Administration opened a formal probe into the company over crashes involving those features—accidents that have left more than a dozen people dead.

And last July, the safety regulator asked Tesla for more information on a software update that allows drivers to use Autopilot for extended periods of time without putting their hands on the wheel.

“The resulting relaxation of controls designed to ensure that the driver remain engaged in the dynamic driving task could lead to greater driver inattention and failure of the driver to properly supervise Autopilot,” the regulator’s chief counsel John Donaldson said in a letter at that time.

This article originally appeared on Gizmodo.

