
Friday, November 1, 2024

Self-Driving Tesla Hits Deer Without Braking—Is Pedestrian Safety at Risk?

Self-driving cars might seem futuristic, but car companies worldwide are working hard to make them a normal part of everyday life. It’s easy to see how a self-driving car could make your life easier. But this sort of autonomous technology also raises serious questions about ethics and safety: namely, whose safety does the car prioritize in a dangerous situation? And can we really trust a software program to make a life-or-death decision?

Recently, a Tesla driver shared an experience that definitely raises some of those serious safety questions. X user @TheSeekerOf42 posted photos and videos after crashing into a deer on the road while his Tesla was in “Full Self Driving” mode. The vehicle suffered some minor cosmetic damage. More concerning was the fact that the video he shared revealed the self-driving vehicle did not even slow down when approaching the deer, nor did it stop after the collision.


“FSD didnt stopped [sic], even after hitting the deer on full speed,” wrote @TheSeekerOf42 on X.com. “Huge surprise after getting a dozen of false stops every day!”

Not slowing down before a collision and not stopping afterward seems like a pretty major safety issue for self-driving cars. But was this an example of a mistake in the software, or is this how Tesla’s self-driving software is intended to work? Let’s take a look.

How Does Self-Driving Technology Work?

Most self-driving cars use an array of sensors, cameras and radar technology to navigate the road and detect things outside the car, such as obstacles and road signs. The software takes in all of the inputs from its cameras and sensors and uses that information to make real-time decisions on the road.

Tesla’s self-driving tech is slightly different because its cars only use cameras to navigate, opting to go without the lidar and radar scanners other systems employ. Experts say this can cause major problems, especially in cases where a Tesla’s self-driving software encounters something unfamiliar.

“The kind of things that tend to go wrong with these systems are things like it was not trained on the pictures of an over-turned double trailer – it just didn’t know what it was,” Phil Koopman, associate professor of electrical and computer engineering at Carnegie Mellon University, told the WSJ. “A person would have clearly said something big is in the middle of the road, but the way machine learning works is it trains on a bunch of examples. If it encounters something it doesn’t have a bunch of examples for, it may have no idea what’s going on.”
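The unfamiliar-object failure mode Koopman describes can be made concrete with a short sketch. The Python below is purely illustrative: every class, threshold, and rule here is invented for this example, and real systems rely on trained neural networks rather than hand-written logic. The point is that when the perception stage assigns low confidence to an object it was never trained on, a naive decision stage may simply never trigger the brake.

```python
# Toy sketch of a perception-then-decision loop. All names and numbers
# are hypothetical and chosen for illustration only; this is NOT how any
# real self-driving stack, Tesla's included, is actually implemented.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # what the perception model thinks it sees, e.g. "deer"
    distance_m: float  # estimated distance ahead of the car, in meters
    confidence: float  # model confidence, 0.0 to 1.0

def decide(detections: list[Detection], speed_mps: float) -> str:
    """Pick a driving command from fused detections (toy logic)."""
    # Crude stopping distance based on a two-second rule.
    stopping_distance = speed_mps * 2.0
    for d in detections:
        # Brake only for obstacles the model is reasonably sure about.
        # An unfamiliar object with low confidence never trips this rule.
        if d.confidence > 0.5 and d.distance_m < stopping_distance:
            return "brake"
    return "continue"
```

Here a confidently detected deer inside the stopping distance triggers braking, while an object the model barely recognizes falls through to "continue", mirroring the behavior Koopman describes: the system does not react to what it cannot classify.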

How Do Self-Driving Cars Prioritize Safety?

Self-driving cars have the potential to completely eliminate human mistakes from driving. A computer program doesn’t get tired, won’t drive more emotionally after a long day at work, and can’t get distracted by a text or choosing a new playlist. But another thing it can’t do is make ethical decisions.

So what happens when the software faces an unavoidable accident and has to decide between the safety of two or more people? Unfortunately, the answer to this question is still quite murky. In the case of @TheSeekerOf42, it does not seem like Tesla’s software made any effort to swerve or slow down, prioritizing the safety of its passengers over avoiding an external obstacle.

But what if the obstacle had been a human, not a deer? Would the software have responded differently? These are some of the difficult questions posed by self-driving cars, presenting safety and ethics dilemmas that may be hard to answer until automated driving technology advances further. Whatever the case may be, auto companies will have to be exceedingly transparent and honest about safety if they want the masses to put their lives in the hands of a self-driving computer program.

Sources:

  • Wall Street Journal: “The Hidden Autopilot Data That Reveals Why Teslas Crash” (2024)
  • Forbes: “What Are Self-Driving Cars? The Technology Explained” (2024)

The post Self-Driving Tesla Hits Deer Without Braking—Is Pedestrian Safety at Risk? appeared first on Family Handyman.



