A video circulating online shows a Tesla driver wearing Apple’s unreleased “Vision Pro” headset while the car operates on Autopilot, apparently culminating in his arrest. The situation turned out to be a staged skit, but it raises concerns about the misuse of assisted driving technology and the ethical implications of wearing augmented reality (AR) devices in moving vehicles.
The Staged Incident:
The video depicts the driver, identified as Marques Lentini, seemingly pulled over by police with the Vision Pro headset on. Lentini later confirmed it was a skit, but the initial confusion sparked discussions about the dangers of distracted driving and the legal gray areas surrounding Autopilot’s capabilities.
“[I] was in the right place at the right time,” he said. “That’s why we filmed the police.” In other words, he filmed officers who were on unrelated duties nearby to create the impression that he was being arrested.
Apple specifically warns users against using the Vision Pro while driving in its user guide. The company did not reply to a request for comment.
“Always remain aware of your environment and body posture during use. Apple Vision Pro is designed for use in controlled areas that are safe, on a level surface,” the company notes. “Never use Apple Vision Pro while operating a moving vehicle, bicycle, heavy machinery, or in any other situations requiring attention to safety.”
Despite its name, Autopilot is not a fully autonomous driving system. It requires constant driver supervision and intervention, and Tesla explicitly warns against relying solely on it. Using any device that further distracts the driver from their primary responsibility of operating the vehicle is dangerous and illegal in most jurisdictions.
Vision Pro’s Unseen Potential:
Apple’s Vision Pro, still under development, is rumored to support a range of AR applications, including overlaying navigational information onto the real world while driving. While such technology could in theory enhance the driving experience, concerns remain about the distractions it could introduce and the robust safety measures it would require.