Self-driving cars

Started by The Ghostbuster, May 12, 2015, 03:39:48 PM


tradephoric

Automation doesn't work well when you have bad data.  A faulty angle-of-attack sensor on Lion Air Flight 610 caused the Maneuvering Characteristics Augmentation System (MCAS) to activate.  MCAS was designed to pitch the nose down under a very specific set of circumstances (mainly during takeoffs), but with the faulty angle-of-attack sensor, MCAS activated throughout the flight, which led to loss of control and the deaths of all 189 people on board.  A few months later the same MCAS issue occurred on Ethiopian Airlines Flight 302, killing all 157 people on board and ultimately leading to the worldwide grounding of the 737 MAX fleet.



Roadgeekteen

Quote from: WillWeaverRVA on May 25, 2021, 10:19:31 AM
Quote from: Roadgeekteen on May 24, 2021, 09:38:25 PM
Quote from: kalvado on May 24, 2021, 09:04:44 PM
Quote from: Roadgeekteen on May 24, 2021, 08:30:05 PM
I hope we get them soon. Humans are bad drivers.
Once we get self driving cars, you may realize how good human drivers are...
Eh a car can't get drunk.

A Tesla recently crashed and burned completely to a crisp with the occupants inside because they trusted the car's autopilot to make its own decisions. We're not there yet, but the fact people think we are is a good indication that this is not the right time.
The driver could still probably take emergency control. Planes are mostly autopilot but they still have pilots.
God-emperor of Alanland, king of all the goats and goat-like creatures

Current Interstate map I am making:

https://www.google.com/maps/d/u/0/edit?hl=en&mid=1PEDVyNb1skhnkPkgXi8JMaaudM2zI-Y&ll=29.05778059819179%2C-82.48856825&z=5

Henry

I hear GM is working on something like this right now...
Go Cubs Go! Go Cubs Go! Hey Chicago, what do you say? The Cubs are gonna win today!

jeffandnicole

Quote from: kkt on May 25, 2021, 12:57:57 AM
Quote from: jeffandnicole on May 24, 2021, 11:37:15 PM
Quote from: kalvado on May 24, 2021, 09:04:44 PM
Quote from: Roadgeekteen on May 24, 2021, 08:30:05 PM
I hope we get them soon. Humans are bad drivers.
Once we get self driving cars, you may realize how good human drivers are...

6 million accidents occur annually. Over 30,000 people are killed annually...and that's half what it was just a few decades ago.

It's one of the most dangerous activities most people do on a daily basis.

Commercial flights are nearly fully automated by computers. The fact that there are also very few airline crashes is no coincidence.

The key word being "nearly".  If the FAA were that confident that computers could do it all, they wouldn't have human pilots, let alone both a pilot and copilot.  The autopilot is great when things are going well, but has no ability to recover when things aren't going well.


I'm not sure how "nearly" could be misinterpreted here. The FAA continues to mandate a pilot and copilot for many reasons, including retaining ultimate control over the airplane and its occupants and the ability to deviate when necessary.

Autopilot generally prevents issues from occurring in the first place, resulting in there not being a need to recover.

Quote from: tradephoric on May 25, 2021, 10:24:56 AM
Automation doesn't work well when you have bad data.  A faulty angle-of-attack sensor on Lion Air Flight 610 caused the Maneuvering Characteristics Augmentation System (MCAS) to activate.  MCAS was designed to pitch the nose down under a very specific set of circumstances (mainly during takeoffs), but with the faulty angle-of-attack sensor, MCAS activated throughout the flight, which led to loss of control and the deaths of all 189 people on board.  A few months later the same MCAS issue occurred on Ethiopian Airlines Flight 302, killing all 157 people on board and ultimately leading to the worldwide grounding of the 737 MAX fleet.

It's important to note a significant issue here: The pilots were still in control but couldn't resolve the issue in these incidents. Other pilots encountered similar issues and were able to recover.

The FAA and other administrations were able to ground the entire fleet worldwide. When there's a significant vehicle recall, it's up to the vehicle owner to bring the car in. There's no massive ban on driving the vehicle.

It's also important to note that many like to engage in the sport of extreme nitpicking. No matter what happens, someone will find the rare instances of an issue.

kalvado

Quote from: tradephoric on May 25, 2021, 10:24:56 AM
Automation doesn't work well when you have bad data.  A faulty angle-of-attack sensor on Lion Air Flight 610 caused the Maneuvering Characteristics Augmentation System (MCAS) to activate.  MCAS was designed to pitch the nose down under a very specific set of circumstances (mainly during takeoffs), but with the faulty angle-of-attack sensor, MCAS activated throughout the flight, which led to loss of control and the deaths of all 189 people on board.  A few months later the same MCAS issue occurred on Ethiopian Airlines Flight 302, killing all 157 people on board and ultimately leading to the worldwide grounding of the 737 MAX fleet.
That's why system design should include redundancy and cross-checks. And that applies to humans in exactly the same way.
How many human-driver crashes occurred because the driver didn't see something, such as a motorcycle in an adjacent lane, or swerved around something that wasn't there?
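The redundancy-and-cross-check idea can be sketched in a few lines: with three redundant sensors, take the median so one faulty reading is outvoted, and refuse to act on the data at all if the sensors disagree too much. This is a hypothetical illustration of the general principle, not the actual MCAS (or any avionics) logic; the function name and threshold are invented.

```python
# Hypothetical sketch of triple-redundant sensor voting. One bad
# angle-of-attack reading is outvoted by the median, and large
# disagreement between sensors is flagged instead of acted upon.

def vote_aoa(readings, max_spread=5.0):
    """Return (median_value, trustworthy) for three AoA readings in degrees."""
    if len(readings) != 3:
        raise ValueError("expected three redundant readings")
    s = sorted(readings)
    median = s[1]
    # If the sensors disagree by more than max_spread degrees,
    # don't let automation act on the data.
    trustworthy = (s[2] - s[0]) <= max_spread
    return median, trustworthy

# One faulty sensor (74.5 degrees) is ignored by the median,
# and the disagreement is flagged.
value, ok = vote_aoa([2.1, 2.3, 74.5])
```

A single-sensor trigger, by contrast, has no way to tell a stall from a broken vane, which is exactly the failure mode described above.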

WillWeaverRVA

Quote from: Roadgeekteen on May 25, 2021, 10:55:30 AM
Quote from: WillWeaverRVA on May 25, 2021, 10:19:31 AM
Quote from: Roadgeekteen on May 24, 2021, 09:38:25 PM
Quote from: kalvado on May 24, 2021, 09:04:44 PM
Quote from: Roadgeekteen on May 24, 2021, 08:30:05 PM
I hope we get them soon. Humans are bad drivers.
Once we get self driving cars, you may realize how good human drivers are...
Eh a car can't get drunk.

A Tesla recently crashed and burned completely to a crisp with the occupants inside because they trusted the car's autopilot to make its own decisions. We're not there yet, but the fact people think we are is a good indication that this is not the right time.
The driver could still probably take emergency control. Planes are mostly autopilot but they still have pilots.

They could, but that requires a level of education that is clearly not provided. The occupants of the Tesla were in the back seat and one of them was trying to crawl to the front seat to take control when it crashed.
Will Weaver
WillWeaverRVA Photography | Twitter

"But how will the oxen know where to drown if we renumber the Oregon Trail?" - NE2

kalvado

Quote from: jeffandnicole on May 25, 2021, 10:59:04 AM

Quote from: tradephoric on May 25, 2021, 10:24:56 AM
Automation doesn't work well when you have bad data.  A faulty angle-of-attack sensor on Lion Air Flight 610 caused the Maneuvering Characteristics Augmentation System (MCAS) to activate.  MCAS was designed to pitch the nose down under a very specific set of circumstances (mainly during takeoffs), but with the faulty angle-of-attack sensor, MCAS activated throughout the flight, which led to loss of control and the deaths of all 189 people on board.  A few months later the same MCAS issue occurred on Ethiopian Airlines Flight 302, killing all 157 people on board and ultimately leading to the worldwide grounding of the 737 MAX fleet.

It's important to note a significant issue here: The pilots were still in control but couldn't resolve the issue in these incidents. Other pilots encountered similar issues and were able to recover.

The FAA and other administrations were able to ground the entire fleet worldwide. When there's a significant vehicle recall, it's up to the vehicle owner to bring the car in. There's no massive ban on driving the vehicle.

It's also important to note that many like to engage in the sport of extreme nitpicking. No matter what happens, someone will find the rare instances of an issue.
MCAS was much more than that - it was actively moving the controls. Two of the three crews that encountered it in flight crashed, and about half of US pilots who faced it in a simulator did as well.
I can see something similar happening with cars that get out of control.
As for recalls - the VW recall was reinforced by a "no title transfer" rule in some states, so the car could not be sold.
Car certification lacks enforcement power, but some critical recalls do carry a "do not drive" message. E.g. https://www.nhtsa.gov/press-releases/low-completion-rates-do-not-drive-warning

J N Winkler

There is a tendency to see driving as an activity that includes so many repetitive tasks a monkey could do it, but Tom Vanderbilt's Traffic quotes research to the effect that it is actually one of the most complex things we do in terms of cognition and muscle movement.  The New York Times article does definitely suggest that Alphabet and the ridesharing companies have delved far enough into the nitty-gritty of making self-driving vehicles safe that they have had to pare back the elements of their business plans that assume they will be widely available in the short term.

In comparison, autopilot for aircraft is much simpler technology--so much so that implementations pre-date modern silicon-based computer hardware by over 30 years (rudimentary forms of pilot assist by 1912, first-generation auto-landing capability by 1959).

Moreover, supervising something that is running with automation most of the time, but requires brief human intervention at unpredictable intervals, requires a somewhat different skillset than providing human input 100% of the time.  It is a generalization of the problem of supervising a car running with old-school cruise control engaged on an uncrowded freeway:  it's not the same as keeping the car moving forward with foot input on the accelerator pedal, since staying on task and watching for upcoming situations that require manual control inputs become more salient issues.
"It is necessary to spend a hundred lire now to save a thousand lire later."--Piero Puricelli, explaining the need for a first-class road system to Benito Mussolini

GaryV

And Tesla's typical response when a vehicle crashes is that we told you it doesn't drive itself.  This despite them calling the feature "Autopilot". 

jeffandnicole

Quote from: GaryV on May 25, 2021, 01:12:50 PM
And Tesla's typical response when a vehicle crashes is that we told you it doesn't drive itself.  This despite them calling the feature "Autopilot". 

You're taking it too literally, just like people who believe cruise control means they don't have to hold the steering wheel.  And a vehicle owner will be given much more instruction than a John Doe who reads a summarized news story.

tradephoric

Quote from: jeffandnicole on May 25, 2021, 10:59:04 AM
It's important to note a significant issue here: The pilot was still in control but couldn't resolve the issue in these incidents. Other pilots felt similar issues and were able to recover.

Regulators assumed it would take pilots a mere 3 seconds to respond to an unexpected MCAS activation.  On the flight before the doomed Lion Air flight, the pilots were able to regain control of the plane, but it took them minutes to resolve the unexpected MCAS activation - not 3 seconds.  What potentially saved that flight was an off-duty pilot sitting in the jump seat behind the pilots, who noticed the runaway trim wheel and correctly diagnosed the malfunctioning MCAS system.  Keep in mind the faulty angle-of-attack sensor was also producing erroneous airspeed/altitude readings that the pilots would have needed to sort out while dealing with the automatic pitch-down issue they were experiencing.

Scott5114

Pilots have to go through far lengthier training and far stricter licensing requirements than drivers do. These dipshits trying to sit in the back seat of the car while it pilots itself wouldn't qualify for a pilot's license.

Quote from: jeffandnicole on May 25, 2021, 01:24:56 PM
Quote from: GaryV on May 25, 2021, 01:12:50 PM
And Tesla's typical response when a vehicle crashes is that we told you it doesn't drive itself.  This despite them calling the feature "Autopilot". 

You're taking it too literally, just like people who believe cruise control means they don't have to hold the steering wheel.  And a vehicle owner will be given much more instruction than a John Doe who reads a summarized news story.

Have you met the general public? Naming a feature "Autopilot" while insisting it's for marketing reasons and you shouldn't actually try to let the car drive itself has predictable results.

When presented with any piece of printed material, always expect the general public to pick out the three words out of the whole thing that get them what they wish it meant, and pretend that the rest doesn't exist.
uncontrollable freak sardine salad chef

kalvado

Quote from: Scott5114 on May 25, 2021, 02:16:01 PM
Pilots have to go through far lengthier training and far stricter licensing requirements than drivers do. These dipshits trying to sit in the back seat of the car while it pilots itself wouldn't qualify for a pilot's license.

Quote from: jeffandnicole on May 25, 2021, 01:24:56 PM
Quote from: GaryV on May 25, 2021, 01:12:50 PM
And Tesla's typical response when a vehicle crashes is that we told you it doesn't drive itself.  This despite them calling the feature "Autopilot". 

You're taking it too literally, just like people who believe cruise control means they don't have to hold the steering wheel.  And a vehicle owner will be given much more instruction than a John Doe who reads a summarized news story.

Have you met the general public? Naming a feature "Autopilot" while insisting it's for marketing reasons and you shouldn't actually try to let the car drive itself has predictable results.

When presented with any piece of printed material, always expect the general public to pick out the three words out of the whole thing that get them what they wish it meant, and pretend that the rest doesn't exist.
Three words? That is pessimistic. I would say 5 is a more reasonable expectation for a take-home message.

vdeane

Yeah, that's the thing - Tesla justified the name as saying "it's like an aircraft autopilot, aircraft still have pilots ready to take over if something happens, so why wouldn't you pay attention in case you need to take over from the car?".  Never mind that most people do not think of real-world aircraft systems when they hear the term, they think of the inflatable one from the MOVIE Airplane!

To answer the original question, while self-driving systems today do expect the driver to take over if needed (even though real-world drivers often don't), the ultimate concept is a pod with no steering wheel and the front seats turned toward the rear so that all occupants of the car can talk to each other, with in-car advertising displayed on the windows.
Please note: All comments here represent my own personal opinion and do not reflect the official position of NYSDOT or its affiliates.

kkt

It's poor design to expect a human to be sitting in the driver's seat doing nothing but still paying attention to the road for a long time.  If the driver is supposed to be paying attention, they need to have something to do, at least steering.  Having them do nothing is an invitation to sit in the back or watch a movie on their phone.

kalvado

Quote from: kkt on May 25, 2021, 03:47:41 PM
It's poor design to expect a human to be sitting in the driver's seat doing nothing but still paying attention to the road for a long time.  If the driver is supposed to be paying attention, they need to have something to do, at least steering.  Having them do nothing is an invitation to sit in the back or watch a movie on their phone.
That's why I don't use cruise. The last time I actually used cruise for about a mile was to take my shoes off on a highway without stopping.

CtrlAltDel

Quote from: kalvado on May 25, 2021, 04:36:18 PM
Quote from: kkt on May 25, 2021, 03:47:41 PM
It's poor design to expect a human to be sitting in the driver's seat doing nothing but still paying attention to the road for a long time.  If the driver is supposed to be paying attention, they need to have something to do, at least steering.  Having them do nothing is an invitation to sit in the back or watch a movie on their phone.

That's why I don't use cruise. The last time I actually used cruise for about a mile was to take my shoes off on a highway without stopping.

I don't know if that's better.
Interstates clinched: 4, 57, 275 (IN-KY-OH), 465 (IN), 640 (TN), 985
State Interstates clinched: I-26 (TN), I-75 (GA), I-75 (KY), I-75 (TN), I-81 (WV), I-95 (NH)

formulanone

Quote from: I-55 on May 24, 2021, 11:04:15 PM
But a human can't get hacked.

Television and radio were doing it since before computers were commonplace.

Computers just make it easier for humans to hack themselves, or at least assist with defragmentation.

jeffandnicole

https://www.inquirer.com/business/septa-bus-driver-schools-jobs-philly-atlantic-city-20210612.html

The article itself is about wages, salaries, and the need for bus drivers. However, there are a few paragraphs about self-driving buses. While many dismiss the possibility that they'll catch on, the fact is they already exist, probably in greater numbers than people realize, and there is significant research and development into more buses that run by themselves.

kalvado

Quote from: jeffandnicole on June 12, 2021, 07:51:34 AM
https://www.inquirer.com/business/septa-bus-driver-schools-jobs-philly-atlantic-city-20210612.html

The article itself is about wages, salaries, and the need for bus drivers. However, there are a few paragraphs about self-driving buses. While many dismiss the possibility that they'll catch on, the fact is they already exist, probably in greater numbers than people realize, and there is significant research and development into more buses that run by themselves.
The way things are described, it sounds like driverless technology is right around the corner - and it is going to stay there for decades to come.

MCRoads

I build roads on Minecraft. Like, really good roads.
Interstates traveled:
4/5/10*/11**/12**/15/25*/29*/35(E/W[TX])/40*/44**/49(LA**)/55*/64**/65/66*/70°/71*76(PA*,CO*)/78*°/80*/95°/99(PA**,NY**)

*/** indicates a terminus/termini being traveled
° Indicates a gap (I.E Breezwood, PA.)

more room plz

empirestate

Quote from: kalvado on June 12, 2021, 11:23:58 AM
The way things are described, it sounds like driverless technology is right around the corner - and it is going to stay there for decades to come.

I mean, driverless technology is relatively easy and has been here for some time. The trick has always been the mixture of driverless and driven technology.

Quote from: MCRoads on June 12, 2021, 04:33:32 PM
Anyone seen this?

https://amp.reddit.com/r/teslamotors/comments/nrs8kf/you_think_ice_cream_truck_stop_signs_are_a_problem/?utm_source=share&utm_medium=ios_app&utm_name=iossmf

I think self-driving cars are a little further off than Tesla would like you to think...

And this is not a bad example. In a fully driverless system this wouldn't come up, because signals either wouldn't need to exist, or else would identify themselves to all vehicles and thus obviate the need for the vehicles to guess.
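The "signals identify themselves" idea can be sketched as a toy vehicle-to-infrastructure broadcast, loosely inspired by SAE J2735 SPaT (signal phase and timing) messages: the signal publishes its own state, so a vehicle never has to guess from a camera image. The message fields and function names here are invented for illustration, not a real protocol.

```python
# Toy V2I sketch: a traffic signal broadcasts a self-describing
# state message; the vehicle applies a trivial rule to it.
import json
import time

def make_spat_message(intersection_id, phase, seconds_remaining):
    """Build a self-describing signal-state message (made-up format)."""
    return json.dumps({
        "intersection": intersection_id,
        "phase": phase,                      # "red" | "yellow" | "green"
        "seconds_remaining": seconds_remaining,
        "timestamp": time.time(),
    })

def should_stop(message_json):
    """Vehicle-side rule: stop unless the broadcast phase is green."""
    msg = json.loads(message_json)
    return msg["phase"] != "green"

msg = make_spat_message("Main-and-5th", "red", 12)
```

With an explicit broadcast like this, a painted stop sign on an ice cream truck is simply not in the message stream, so the confusion in the linked video never arises.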

Dirt Roads

The number one rule for driverless vehicles is complete segregation of the guideway.  No humans, animals, tools and equipment, or trees, rocks and other debris can be permitted to enter the guideway while vehicles are in motion.  (Sometimes, there are reasons to allow maintenance personnel to enter the guideway area).  This also includes parts that can fall off of other vehicles, such that most driverless vehicles have redundant shield covers beneath steering and suspension elements.

This all becomes an insurance issue in the world of autonomous cars.  If you encounter road debris (an occasional risk element), the control system cannot be responsible for avoiding contact/impact under many circumstances.

empirestate

Quote from: Dirt Roads on June 12, 2021, 06:45:58 PM
The number one rule for driverless vehicles is complete segregation of the guideway.  No humans, animals, tools and equipment, or trees, rocks and other debris can be permitted to enter the guideway while vehicles are in motion.  (Sometimes, there are reasons to allow maintenance personnel to enter the guideway area).  This also includes parts that can fall off of other vehicles, such that most driverless vehicles have redundant shield covers beneath steering and suspension elements.

Yup, and there are many such systems in place for public transit. So the challenge of translating that to private transit is a social one, not a technological one.

Quote from: Dirt Roads on June 12, 2021, 06:45:58 PM
This all becomes an insurance issue in the world of autonomous cars.  If you encounter road debris (an occasional risk element), the control system cannot be responsible for avoiding contact/impact under many circumstances.

Sure, and that's doubtless a far simpler insurance issue than that of the risks of human-driven vehicles. For one thing, the question of individual fault becomes moot, at least as far as the owner/occupants of the vehicle are concerned.

jeffandnicole

Quote from: Dirt Roads on June 12, 2021, 06:45:58 PM
The number one rule for driverless vehicles is complete segregation of the guideway.  No humans, animals, tools and equipment, or trees, rocks and other debris can be permitted to enter the guideway while vehicles are in motion.  (Sometimes, there are reasons to allow maintenance personnel to enter the guideway area).  This also includes parts that can fall off of other vehicles, such that most driverless vehicles have redundant shield covers beneath steering and suspension elements.

This all becomes an insurance issue in the world of autonomous cars.  If you encounter road debris (an occasional risk element), the control system cannot be responsible for avoiding contact/impact under many circumstances.

Given that driverless vehicles are on the road now, Rule Number 1 has long been broken. These vehicles could have been right next to you and you never even noticed.
