The National Highway Traffic Safety Administration (NHTSA) said it will investigate the 12th Tesla crash in which Autopilot, the company’s semi-autonomous driver-assist system, may have been involved. This one took place Dec. 7 in Norwalk, Connecticut: a 2018 Tesla Model 3 traveling on Interstate 95 rear-ended a parked police car early Saturday. There were no injuries to the occupants of either car, including a dog in the Tesla’s back seat.
According to police, the 33-year-old driver had Autopilot activated as he drove I-95. Just before the accident, he said, he turned his attention to the back seat to make sure the dog was okay. Police said two Connecticut state troopers were waiting for a tow truck to assist a disabled vehicle when the Tesla (license plate: MODEL3) struck the police car, bounced off, and then hit the stranded vehicle.
A state police spokesperson added that the police cruiser had its flashers on and road flares behind it; she also reiterated the dog was okay. The driver was issued a misdemeanor summons for reckless driving and reckless endangerment.
Tesla Says ‘Autopilot.’ That Doesn’t Mean It Is (Yet)
Here’s the problem NHTSA worries about: if you don’t pay careful attention to the fine print on Tesla.com, or if you listen too closely to CEO Elon Musk’s wide-ranging praise of Autopilot, you might think it’s almost a set-and-forget feature.
Hardly. Tesla Autopilot is Level 2 driving automation on the SAE’s zero-to-five scale. It’s meant for limited-access roads with virtually no roadside hazards, and in technical terms it combines stop-and-go adaptive cruise control with lane-centering assist. The driver has to keep his or her hands on the wheel because once in a while (more so on curves) the car loses its ability to lane-center, or because another car cuts into your lane and then slows down, and ACC often can’t react that quickly.
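In other words, a Level 2 system is two control loops running side by side: one longitudinal (following distance) and one lateral (lane position). Here’s a minimal sketch of that idea; every name, gain, and threshold is a hypothetical stand-in for illustration, not Tesla’s actual control code:

```python
# Illustrative Level 2 control tick: adaptive cruise control (longitudinal)
# plus lane centering (lateral). All gains and thresholds are hypothetical;
# production systems are vastly more complex.

def level2_tick(gap_m, target_gap_m, lane_offset_m, lane_confidence):
    """Return (accel_cmd, steer_cmd, handback) for one control cycle."""
    # Longitudinal: proportional control on the gap to the lead vehicle.
    # "Stop-and-go" means this loop keeps working all the way down to 0 mph.
    accel_cmd = 0.5 * (gap_m - target_gap_m)

    # Lateral: steer toward the lane center while the cameras can still see
    # the lines. On sharp curves or faded markings, confidence drops and the
    # system hands steering back to the driver.
    if lane_confidence < 0.4:
        return accel_cmd, 0.0, True    # handback: driver must steer now
    steer_cmd = -0.8 * lane_offset_m   # offset left -> steer right, and vice versa
    return accel_cmd, steer_cmd, False
```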
Teslas have bells and whistles, such as the ability to automatically change lanes after the driver taps the turn signal. (A few other cars do this, too.) Such a car typically has a driver-involvement sensor: either a steering-wheel sensor that picks up the micro-movements alert drivers always make, or an eye tracker that checks that the driver’s eyes are on the road ahead. A timer, about 10 seconds, sounds an alert if the system senses driver inattention. After 5-10 seconds more, the self-drive system beeps, flashes, and orders the driver to take charge, as sketched below. A few high-end systems, on BMWs, Cadillacs, and the like, will slow the car, turn on the hazard lights, bring it to a stop, and call for help via telematics. Future cars might be able to pull off the road.
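That escalation sequence amounts to a simple state machine keyed to how long the driver has been inattentive. A minimal sketch, where every threshold and state name is an illustrative assumption, not Tesla’s (or any automaker’s) actual parameters:

```python
# Hypothetical driver-attention escalation logic, loosely following the
# timeline described above. Thresholds are assumptions for illustration.

ALERT_AFTER_S = 10       # inattention tolerated before the first warning
TAKEOVER_AFTER_S = 8     # further delay before demanding a takeover
FAILSAFE_AFTER_S = 15    # further delay before a high-end system stops the car

def escalation_state(inattentive_s: float) -> str:
    """Map continuous seconds of driver inattention to a system response."""
    if inattentive_s < ALERT_AFTER_S:
        return "normal"           # wheel torque or gaze was detected recently
    if inattentive_s < ALERT_AFTER_S + TAKEOVER_AFTER_S:
        return "alert"            # chime and dashboard warning
    if inattentive_s < ALERT_AFTER_S + TAKEOVER_AFTER_S + FAILSAFE_AFTER_S:
        return "demand_takeover"  # beeps, flashes, orders the driver to steer
    return "failsafe_stop"        # slow down, hazards on, stop, call for help

# e.g. escalation_state(12) -> "alert"; escalation_state(40) -> "failsafe_stop"
```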
NHTSA’s Special Crash Investigations program will handle this incident. Other fatal Tesla crashes have been examined by the National Transportation Safety Board (NTSB), the agency that also investigates airplane and train crashes.
The concern with Tesla is that some drivers have reported they can continue driving for long stretches, many minutes at a time, without the car alerting them. This sets up bizarre situations, such as the one in May, when porn actress Taylor Jackson and her boyfriend managed to ply her trade in a Tesla and come away with a nine-minute video. Jackson told Business Insider that the couple’s forward progress stalled, temporarily, when she accidentally hit the steering wheel and disabled Autopilot.
Elon Musk, in a perhaps informal competition with President Trump to see who can blast out more memorable tweets, posted at the time, “Turns out there’s more ways to use Autopilot than we imagined … Shoulda seen it coming …” So to speak, Elon.
Naturally, safety activists criticized Musk for having a cavalier attitude. But the bigger issue is that the whole aura around Autopilot, including its name, may imply more capability than the car delivers. So far, no production car has reached Level 3 autonomy, in which the driver can be hands-off and eyes-off but has to be able to assume control after an appropriate interval. And it’s not clear that interval would be enough for a snoozing driver to regain alertness.
Of the dozen incidents in which Autopilot was believed to be engaged, several have involved fatalities. In 2018, Walter Huang died in a California crash that occurred while Autopilot was active.
Some reports have shown that accident rates go down in cars with partial self-driving, so the risks of the current level of automation aren’t clear-cut.
Now read:
- NTSB: Autopilot Design Flaw, Inattentive Driver Led to Tesla-Firetruck Crash
- Driver in Fatal Tesla Autopilot Crash Wasn’t Paying Attention: Feds
- Report: Tesla Model X Accelerated Toward Barrier Before Fatal Crash
from ExtremeTech https://ift.tt/2Eu65Q4