Wednesday, September 22, 2021

NEWS TECHNOLOGY

Concerns about artificial intelligence and its impact on work are not new, but as more companies deploy these solutions, we’re seeing decided snags in the process. One point many of these conversations take for granted is that AI-powered tools work. What happens if they don’t?

The pandemic has fueled an explosion in semiconductor sales and a significant rise in the number of employees who are kept under surveillance by their employers. In some cases, people aren’t just being watched — they’re being graded. This might not be a problem if the AI tools in question were robust enough to do the job, but all available evidence suggests they very much are not.

A new story at Motherboard details the results of Amazon’s latest push to introduce AI technology in the workplace. Last February, Amazon began installing cameras from the fleet camera company Netradyne, with the supposed goal of keeping drivers safe. Netradyne’s website pitches the company’s technology in exactly these terms, emphasizing that it can keep drivers focused on the road. The system tracks whether drivers maintain proper following distance, obey stop signs and street lights, and keep their attention on the road.

It’s hard to argue with the idea that people who drive for a living should be required to do these things. But according to the drivers actually delivering Amazon’s packages, the system is a nightmare. The problem isn’t that people are being forced to follow the law. The problem is that the Netradyne system isn’t very good at deciding when a driver is or isn’t breaking the law, and Amazon offers no method for drivers to contest events.

“I have been ‘dinged’ for following too close when someone cuts me off,” one driver told Motherboard. “If I look into my mirrors to make sure I am safe to change lanes, it dings me for distraction because my face is turned to look into my mirror. I personally did not feel any more safe with a camera watching my every move.”

Another driver indicated the Netradyne AI system has a major problem with false positives around stop signs. Apparently, the system has a bad habit of flagging yield signs as stop signs (and penalizing drivers for failing to stop), while simultaneously penalizing drivers who stop at a stop sign and then pull forward slowly to look around a blind curve. Anyone who has driven for any length of time knows that neighborhoods and businesses do not always maintain proper lines of sight. It can be dangerous to accelerate away from a stop sign without checking around a brush-obscured corner.

“Most false positives we get are stop sign violations,” he said. “Either we stop after the stop sign so we can see around a bush or a tree and it dings us for that, or it catches yield signs as stop signs. A few times, we’ve been in the country on a dirt road, where there’s no stop sign, but the camera flags a stop sign.”

A human driver who observes another human taking an intersection cautiously will reflexively scan the situation for context clues about why this is happening. Netradyne’s AI is incapable of this kind of evaluation. It only “sees” whether the vehicle is operating according to its own inflexible logic.
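To make that contrast concrete, here is a minimal Python sketch of the kind of context-free rule logic the drivers describe. Everything in it is invented for illustration: the field names, thresholds, and event labels are hypothetical, and this is not Netradyne’s actual implementation. Each rule fires on the current observation alone, with no notion of why the driver is behaving that way.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    """One hypothetical observation from the camera system."""
    sign_detected: str   # e.g. "stop", "yield", or "" (classifier output)
    speed_mph: float
    gaze_on_road: bool

def flag_events(frame: Frame) -> list[str]:
    """Context-free checks: each rule looks only at the frame itself."""
    events = []
    # A yield sign misclassified as "stop" produces a false violation here,
    # as does creeping forward past a stop line to see around a blind corner.
    if frame.sign_detected == "stop" and frame.speed_mph > 0:
        events.append("stop_sign_violation")
    # A mirror check before a lane change still registers as "distraction".
    if not frame.gaze_on_road:
        events.append("distracted_driving")
    return events

# A driver inching past a stop sign to check a brush-obscured corner,
# while glancing at a mirror, gets dinged twice:
print(flag_events(Frame(sign_detected="stop", speed_mph=2.0, gaze_on_road=False)))
# ['stop_sign_violation', 'distracted_driving']
```

A human reviewer would ask why the vehicle is creeping forward or why the driver’s head is turned; rules like these cannot.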

Amazon spokespeople insist that the Netradyne system has yielded positive results, with accidents down 48 percent, stop sign and signal violations down 77 percent, driving without a seatbelt reduced by 60 percent, following distance violations down 50 percent, and distracted driving decreased by 75 percent. These are impressive figures, to be sure. But they don’t actually tell us much, and Amazon isn’t known for its honesty when dealing with the press.

For starters, we don’t know how this information was being gathered prior to the Netradyne system’s installation, so we don’t know how to compare the before-and-after figures. The 77 percent reduction in stop sign and signal violations may reflect the fact that Amazon’s delivery drivers are being more diligent, or it could indicate that drivers are avoiding false positives at stop signs by behaving in a less-safe manner that’s also less likely to cause a ding on their driving record.
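The ambiguity is easy to demonstrate with a few hypothetical numbers. In the sketch below, every figure is invented for illustration; the point is only that two very different realities can produce the same recorded 77 percent drop.

```python
def apparent_reduction(recorded_before: float, recorded_after: float) -> float:
    """Headline-style percentage drop between two recorded counts."""
    return 1 - recorded_after / recorded_before

# Scenario A: drivers genuinely commit fewer violations.
print(f"{apparent_reduction(recorded_before=100, recorded_after=23):.0%}")  # 77%

# Scenario B: real behavior is unchanged, but drivers learn which maneuvers
# trip the camera and avoid only those specific triggers (safely or not).
true_violations = 100                    # unchanged real behavior
still_recorded = true_violations * 0.23  # 77% of triggers now avoided
print(f"{apparent_reduction(true_violations, still_recorded):.0%}")  # 77%
```

The recorded metric cannot distinguish between the two scenarios; only the detection pipeline changed in the second one.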

Part of the problem is that these metrics are used to determine how much Amazon’s delivery partners get paid. Too many Netradyne events can ruin a company’s score, reducing how much it earns from Amazon that month. There’s probably some validity to the idea that this creates an incentive for companies to hire good drivers, but such metrics can only achieve their goals if they’re measuring correctly in the first place.
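A back-of-the-envelope sketch shows why measurement accuracy is load-bearing here. Every rate, volume, and the scoring formula itself below are assumptions invented for this example; the reporting does not disclose how Amazon actually computes partner scores. The takeaway is only that at fleet scale, a modest false-positive rate generates events whether or not anyone drives badly.

```python
# All numbers below are hypothetical, for illustration only.
stops_per_driver_per_week = 200   # assumed delivery volume
false_positive_rate = 0.02        # assumed: 2% of stops misflagged
drivers = 25                      # assumed fleet size

# Even a fleet with zero real violations logs misflagged events every week.
false_events = stops_per_driver_per_week * false_positive_rate * drivers
print(f"Misflagged events per week: {false_events:.0f}")  # 100

# Against an invented weekly event budget, noise alone can zero the score.
event_budget = 50.0
score = max(0.0, 1 - false_events / event_budget)
print(f"Score: {score:.2f}")  # Score: 0.00
```

Under assumptions like these, a company’s pay depends as much on the detector’s error rate as on its drivers.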

Amazon is squeezing companies to make pro-safety changes while simultaneously pushing them to adopt delivery schedules so punishing that some drivers carry plastic bottles rather than attempt to visit a restroom. Earlier this year, two Oregon companies effectively shut themselves down rather than continue hauling packages for Amazon. Investigative reports have repeatedly found that Amazon’s warehouses are brutally difficult work environments, so it’s not surprising to see the company push the same model outward to its delivery partners.

Various delivery companies believe Amazon has instituted these practices so it can avoid paying them. Amazon insists it’s only trying to protect safety. According to Motherboard, various companies are telling drivers how to circumvent these tracking systems because turning them on means handing Amazon an excuse not to pay. Why aren’t drivers wearing seatbelts? Because Amazon insists on delivery schedules so demanding that drivers say they don’t have time to put them on. Regardless of what one believes about Amazon’s intentions, it’s obvious that its methods are having unintended consequences that work against the goal of improving driver safety records.

Perhaps the most maddening aspect of the entire situation is that there is no meaningful appeal process. While delivery companies can apparently submit feedback tickets for specific events, the Netradyne system logs hundreds of events per week, and Amazon almost never overturns a previous decision. Companies mostly don’t try to contest these decisions because contesting them almost never works.

This kind of problem should ring more alarm bells than it probably will. For all the good AI can do in the right circumstances, tools like Netradyne are not ready for deployment if they generate false positives at this kind of rate. If Netradyne can’t offer a product that properly detects driver behavior in all circumstances, it has no business claiming that it does.

It’s possible that Amazon really has seen the kind of safety improvements it claims, but increased safety is not the only variable that matters here. A safety system whose faulty detections push drivers to act in an unsafe manner is, by definition, less safe than one that doesn’t. It’s all well and good to claim that the benefits represent a net improvement, but that’s no comfort to the individuals who are unfairly penalized, or even rendered unemployed, because a piece of software, with zero human oversight or review, decided they were a problematic driver.

from ExtremeTech https://ift.tt/3lLJcwx
