

Analyze, evaluate, and apply principles to real-world, complex situations of the assigned article below.



  • Open a new Microsoft® Word document or similar and use the Critical Thinking Questions at the end of the article as a guide. Do not include the questions in your summary.

  • Follow the writing rules you learned in ENC1101 (a title page that includes your name and course, double spacing, spell-check, a minimum of 500 words, etc.).

  • Save the file with your last name in the file name.

  • Upload the document to Canvas.







How Safe Are Self-Driving Cars?


According to the National Safety Council, an estimated 38,300 people were killed and another 4.4 million were injured as a result of accidents on U.S. roads in 2015. The vast majority of fatal accidents are due to human error, so self-driving vehicles have the potential to save a lot of lives. Indeed, one study estimated that widespread adoption of self-driving vehicles by the year 2030 could eliminate 90 percent of all auto accidents in the United States, saving close to $190 billion annually in auto repair and health care–related costs and, even more importantly, thousands of lives.
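
To put the 90 percent projection in perspective, the short calculation below applies it directly to the 2015 totals. This is only a back-of-envelope sketch; it assumes fatalities and injuries would fall in proportion to crashes, which is a simplification, not a claim made by the cited study.

```python
# Rough back-of-envelope check using the figures quoted in the article.
fatalities_2015 = 38_300        # NSC estimate of U.S. road deaths in 2015
injuries_2015 = 4_400_000       # NSC estimate of U.S. road injuries in 2015
reduction = 0.90                # projected crash reduction with widespread adoption

lives_saved = fatalities_2015 * reduction      # roughly 34,500 per year
injuries_avoided = injuries_2015 * reduction   # roughly 4.0 million per year

print(f"Lives saved per year:      {lives_saved:,.0f}")
print(f"Injuries avoided per year: {injuries_avoided:,.0f}")
```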


The NHTSA recently adopted the Society of Automotive Engineers' levels for automated driving systems. The six levels range from complete driver control to full autonomy, as summarized below (a short code sketch of the taxonomy follows the list):

  • Level 0 (no automation): A human driver controls it all: steering, brakes, acceleration, and the like, although the car may include some warning or intervention systems.



  • Level 1 (driver assistance): Most functions are controlled by the human driver, but some specific functions (such as steering or accelerating) can be done automatically by the car.

  • Level 2 (partial automation): The car has one or more driver assistance systems that automate "both steering and acceleration/deceleration using information about the driving environment" (for example, adaptive cruise control working together with lane-centering). The driver must still be ready to take control of the vehicle at any time, however, to handle the remaining "dynamic driving tasks."

  • Level 3 (conditional automation): Drivers are able to completely shift "safety-critical functions" to the vehicle under certain traffic or environmental conditions. The driver is still present and is expected to "respond appropriately" if asked to intervene.

  • Level 4 (high automation): At this level, cars are fully autonomous and are designed to handle all aspects of the dynamic driving task—even if a human driver does not respond appropriately to a request to intervene—including performing all safety-critical driving functions and monitoring roadway conditions for an entire trip. However, it's important to note that this is limited to the "operational design domain" of the vehicle—meaning it does not cover every driving scenario.

  • Level 5 (full automation): Cars at this level have a fully autonomous system that is designed to handle all aspects of the dynamic driving task under all the roadway and environmental conditions that could be managed by a human driver—including extreme environments, such as dirt roads (which are unlikely to be navigated by driverless vehicles in the near future).
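
For readers who find code easier to scan than prose, the taxonomy above can be restated as a simple enumeration. This is a hypothetical sketch for illustration only, not the SAE's or any manufacturer's actual software.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """The six SAE driving-automation levels summarized in the list above."""
    NO_AUTOMATION = 0           # human controls steering, brakes, and acceleration
    DRIVER_ASSISTANCE = 1       # car automates one function, e.g. steering OR speed
    PARTIAL_AUTOMATION = 2      # car automates steering AND speed; driver supervises
    CONDITIONAL_AUTOMATION = 3  # car drives in limited conditions; driver must respond when asked
    HIGH_AUTOMATION = 4         # car handles everything within its operational design domain
    FULL_AUTOMATION = 5         # car handles everything a human driver could

def driver_must_monitor(level: SAELevel) -> bool:
    """At Levels 0-2 the human driver remains responsible for watching the road."""
    return level <= SAELevel.PARTIAL_AUTOMATION

print(driver_must_monitor(SAELevel.PARTIAL_AUTOMATION))      # True
print(driver_must_monitor(SAELevel.CONDITIONAL_AUTOMATION))  # False
```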


Autonomous vehicles are chock-full of sensors and cameras that observe, monitor, and record the surrounding environment, including other vehicles in the vicinity. All of these data are fed into an artificial intelligence algorithm that decides which movements are right, wrong, safe, or unsafe for the car to perform given the conditions it is experiencing. Self-driving cars can even share their driving experiences and recorded data with other cars so that each car's computer can adapt its algorithm to the environments faced by other vehicles. The goal of this information sharing is to improve the ability of all self-driving vehicles to react to situations on the road without having to experience those situations firsthand.
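
In highly simplified form, that information-sharing idea might look like the sketch below: a shared pool of recorded experiences that every car in the fleet can query. All class and method names are invented for illustration; no automaker's actual data-sharing system is implied.

```python
from dataclasses import dataclass, field

@dataclass
class DrivingExperience:
    """One recorded situation: what a car's sensors saw and what the safe response was."""
    scenario: str     # e.g. "large vehicle approaching in adjacent lane during merge"
    safe_action: str  # e.g. "wait for a clear gap rather than assuming it will yield"

@dataclass
class FleetKnowledge:
    """A shared pool of experiences that every car in the fleet can query."""
    experiences: list = field(default_factory=list)

    def report(self, experience: DrivingExperience) -> None:
        # A car that encounters a new situation uploads what it learned.
        self.experiences.append(experience)

    def lookup(self, scenario: str) -> list:
        # Another car facing the same scenario reuses the learned response
        # without having experienced the situation firsthand.
        return [e.safe_action for e in self.experiences if e.scenario == scenario]

# Usage: one car learns from a near-miss, and the rest of the fleet benefits.
fleet = FleetKnowledge()
fleet.report(DrivingExperience(
    scenario="large vehicle approaching in adjacent lane during merge",
    safe_action="wait for a clear gap rather than assuming it will yield"))
print(fleet.lookup("large vehicle approaching in adjacent lane during merge"))
```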


Tesla CEO Elon Musk, perhaps optimistically, expects the first fully autonomous Tesla to be ready by 2018, though he anticipates that regulatory approval may require an additional one to three years. Audi, BMW, Fiat Chrysler, Ford, General Motors, Mercedes, Nissan, Toyota, Volvo, and Waymo (the new name of Google's self-driving division) all offer some level of vehicle autonomy today, and all plan to deliver a fully autonomous vehicle by 2025 or sooner.


The testing of autonomous vehicles has not been without incident, however. In 2016, one of Google's self-driving cars hit a bus during a test drive in California because the car made an incorrect assumption about how the bus would react in a particular situation. The vehicle had identified an obstruction in the road ahead, so it decided to stop, wait for the lane next to it to clear, and then merge into that lane. Although the vehicle detected a city bus approaching in that lane, it incorrectly assumed that the bus driver would slow down. The bus driver, however, assumed the car would stay put, so he kept moving forward. The car pulled out, hitting the side of the bus while traveling at about 2 mph. This was the first time in several years of testing on public roads that a Google self-driving car caused a crash. Understanding why a crash involving a self-driving car occurred is important in order to avoid repeating that accident scenario. In this case, Google made the necessary changes to its software so that it would "more deeply understand that buses and other large vehicles are less likely to yield" than other types of vehicles.
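
The kind of adjustment Google described can be pictured as a change to the planner's assumptions about other road users, along the lines of the hypothetical sketch below. The values, names, and decision rule are invented for illustration; this is not Google's code.

```python
# Illustration only: a merge decision that, after the incident, assigns large
# vehicles a lower assumed probability of yielding.
ASSUMED_YIELD_PROBABILITY = {
    "passenger_car": 0.9,
    "bus": 0.4,    # buses and other large vehicles are "less likely to yield"
    "truck": 0.4,
}

def safe_to_merge(other_vehicle: str, required_confidence: float = 0.8) -> bool:
    """Pull into the lane only if we are confident the other vehicle will slow down."""
    return ASSUMED_YIELD_PROBABILITY.get(other_vehicle, 0.5) >= required_confidence

print(safe_to_merge("passenger_car"))  # True: merge
print(safe_to_merge("bus"))            # False: wait instead of assuming the bus yields
```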


A Tesla Model S with its autopilot system activated was involved in a fatal crash in 2016, the first known fatality in a Tesla in which autopilot was active. The accident occurred when a tractor-trailer drove across the highway perpendicular to the Model S. Neither the driver nor the car noticed the big rig or the trailer "against a brightly lit sky," and the brakes were not applied. The vehicle's radar didn't help in this case because, according to Tesla, it "tunes out what looks like an overhead road sign to avoid false braking events." The NHTSA is investigating the accident to determine whether the autopilot system was working properly; if it was not, the agency could consider ordering a recall to repair the problem.
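
Tesla's explanation describes a filtering trade-off that can be sketched, under heavy assumptions, roughly as follows. The clearance threshold and logic below are invented for illustration and are not Tesla's actual radar processing.

```python
# Hypothetical illustration of the trade-off described above: radar returns that
# look like overhead structures are ignored to prevent false braking, which can
# also suppress braking for a high-riding obstacle such as a crossing trailer.
OVERHEAD_CLEARANCE_M = 1.2   # invented cutoff height above the roadway

def should_brake(return_height_m: float, object_in_path: bool) -> bool:
    if return_height_m >= OVERHEAD_CLEARANCE_M:
        return False  # classified as an overhead sign or bridge: no braking
    return object_in_path

print(should_brake(return_height_m=5.0, object_in_path=True))  # False: genuine overhead sign
print(should_brake(return_height_m=1.5, object_in_path=True))  # False: trailer side misread as a sign
```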



Critical Thinking Questions


1. When self-driving cars are involved in accidents, where does liability reside? Is it the driver's fault? Is the car manufacturer or the software manufacturer liable? How might the deployment of self-driving cars affect the insurance industry?


2. Some industry experts believe that the future of autonomous cars depends on the standardization of artificial intelligence algorithms across all vehicles. Such standardization would allow vehicles from different automobile manufacturers to share driving experience data and artificial intelligence algorithm updates. So, an adjustment like the one that Google made so its software would "more deeply understand that buses and other large vehicles are less likely to yield" could be shared with other automakers. What are the pros and cons of implementing a standard artificial intelligence algorithm across all manufacturers? Do you think the vehicle manufacturers would accept this mandate? Why or why not?


3. Automated driving systems range from complete driver control (level 0) to full autonomy (level 5). Should the degree of care exercised in developing vehicle software increase as the level of autonomy increases, or should all vehicle software be treated with the same level of care? Explain your answer.

Azra S answered on Apr 06 2021
CRITICAL THINKING: SELF-DRIVING CARS AND SAFETY
Alex
Course-
Self-driving cars come in different levels, so I believe that liability for an accident would depend on the level of autonomy used by the self-driving car. In cars that are fully autonomous, it is natural to consider the software manufacturer liable (Coeckelbergh, 2016). However, I personally believe that responsibility for an accident lies largely with the driver. While on the road, a driver, whether driving or not, should pay close attention to the road. That is the only reason he is designated as the driver. If a driver wishes to relax or engage in...