It’s been a big week for car news, triggering new arguments about whether A.I.-assisted technology can compensate for the fallibility of humans, or whether our faith has been horribly misplaced.
Last week, as America headed into its big Fourth of July weekend, Uber released new data suggesting its ride-sharing app was reducing the number of drunk drivers — presumably saving lives — and also announced plans to update the app so it monitors drivers for unsafe behaviors.
Last week also brought reports of the first fatal crash of a Tesla driver using its Autopilot — with all of its sad ironies. In May, Joshua Brown lost his life in Florida when his car, a Tesla Model S driving under autonomous cruise control, crashed into a tractor-trailer that pulled out in front of him.
Just one month before the crash Brown had posted a YouTube video of the car “saving” him from a near-collision. Tesla’s Elon Musk had even shared that video on Twitter. A neighbor remembered how Brown had gushed that “For something to catch Elon Musk’s eye, I can die and go to heaven now.” Tesla later called him “a friend to Tesla and the broader electric vehicle community, a person who spent his life focused on innovation and the promise of technology and who believed strongly in Tesla’s mission.”
Of course, Tesla’s autopilot feature was never meant to take over for drivers. Just nine weeks earlier, a Volvo engineer had called Tesla’s autopilot an “unsupervised wannabe,” arguing that “It gives you the impression that it’s doing more than it is.” Tesla’s response to the incident shared some interesting statistics. “This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles.”
Maybe the real problem is us humans. After the crash, USA Today discovered YouTube footage of other Tesla owners playing Jenga and Checkers while driving, or even taking a nap. “Even though Tesla tells drivers to ‘keep your hands on the wheel at all times and stay alert,’ the temptation to test a no-hands drive is just too much,” the article concluded.
Meanwhile, Brown’s insurance agent told the Times that “He’d probably fly an F-18 to test-drive it,” and they reported that he’d dismantled bombs for the Navy during the Iraq war. He truly loved cutting-edge technologies and even started a company to bring internet service to rural areas. “His Tesla, in other words, was simply one more extension of his technology-driven life.”
Ironically, Brown had reportedly been returning from a visit to Disney World when tragedy struck — and there was one more interesting wrinkle. When the police found his wrecked vehicle “There was a movie playing,” according to one witness. When Robert VanKavelaar discovered the wrecked Tesla in the yard of his home in Gainesville, Florida, “I could hear it…” he told Inside Edition. “The police officer told me it was a ‘Harry Potter’ movie.”
While Tesla insists its in-dash display doesn’t play movies, later reports suggested the movie was playing on a portable DVD player. Though there are still questions about what was going on at the time of the crash, it may have been the last piece of technology he’d ever use. “It was still playing when he died and snapped a telephone pole a quarter mile down the road,” according to the driver of the truck that killed him.
Back in 2001, Douglas Adams joked darkly about cellphone users speeding up their cars just to find a patch of highway with better reception. So instead of technology serving its users, “It’s actually killing them off.” And an analogous skepticism seems to be a natural first response to the arrival of high-powered A.I. When Elon Musk tweeted Brown’s video of the near-miss collision, the dozens of positive tweets were greeted by a few that were more critical.
“a human would’ve likely slowed down proactively, right? The tech still needs tweaks”
“do you think aggressive human drivers will take advantage of this feature, knowing the car will get out of the way?”
“Now we can play chicken with Teslas and get a viral video.”
This week a second (non-fatal) Tesla crash was also blamed on Autopilot. But an Uber announcement last week still showed an unshaken faith that technology can make us safer.
Uber announced that their software had become the official “designated driving app” of the non-profit Mothers Against Drunk Drivers, writing that “technologies like Uber provide an incredible opportunity to improve road safety in new and innovative ways — before, during and after every ride.” The company noted that last year drunk driving caused 41 percent of the weekend’s fatalities — and in a video cited one study from Temple University which found DUI-related deaths decreased by 10 percent in Seattle since the introduction of Uber.
Uber also published new data from the Atlanta police department for the six years between 2010 and 2016, showing that DUI arrests fell 32 percent over that period — roughly corresponding to an increase in Uber pickups.
More data showed Uber’s busiest period is around midnight on Fridays and Saturdays. And a survey of Uber drivers found a whopping 80 percent of them said they’d personally used Uber to avoid drinking and driving.
Along with MADD, Uber has now teamed up with the Governors Highway Safety Association for another data-based approach to improving driver safety by monitoring their drivers while they’re driving. Or at least, keeping track of what they’re doing with their phone — whether it’s mounted on their dashboard or wiggling around in the driver’s hands — and also providing daily reports on how safely the driver is performing. There are even real-time reminders about when it’s time for a break, and the company will be rolling out the features soon in 11 U.S. cities.
If we don’t end up using an A.I. to drive our cars better, maybe instead we can use it to make us into better drivers.
Feature image: A YouTube video of a Tesla Autopilot test, from Slow News Day.