Driverless cars

Started by cpzilliacus, July 24, 2013, 08:45:51 AM


cpzilliacus

New York Times: Driving Sideways

Quote
The driverless car, like other utopian pursuits, seems always to be just out of reach. It's captured the imagination of many for at least a century: in 1918, the Oakland Tribune reported (in a section I wish all newspapers would bring back called "New and Interesting Facts from Science and Life") that "the new car will be all glass-enclosed and controlled entirely by a set of push buttons. It will have no clutch, gears or transmission, will sit low, have small clearance and punctureless tires."

Quote
It's striking that 1918's Motor Car of the Future doesn't look – or operate – all that different in concept from the 2013 one (or, for that matter, from the 1957 one, which was projected to run on electricity). Our collective visions of the future seem almost always to draw from a limited visual tool kit.

Quote
The driverless car doesn't look any more futuristic today (in fact, it pretty much looks like ... a car), but what is radically different now is that the means to make that car drive autonomously have been figured out. For example, Google's driverless cars – the ones you hear the most about – have completed over 300,000 autonomous-driving miles accident-free. Many experts, from architects to automobile executives, predict the ascendancy of the autonomous vehicle within three generations. Allstate is preparing actuarial tables; Ford, BMW, Audi, Mercedes-Benz and Nissan, among other car manufacturers, in an unusual shift toward long-term planning, are seeing the writing on the wall and have developed working prototypes. (This may be a smart strategy, given how the United States is trending; in China, meanwhile, car ownership is growing by more than 10 percent annually.)
Opinions expressed here on AAROADS are strictly personal and mine alone, and do not reflect policies or positions of MWCOG, NCRTPB or their member federal, state, county and municipal governments or any other agency.


realjd

Can't happen soon enough, IMO. I'm excited for the day I can tell my car to drive me to Key West, turn on my iPad, and open a beer. Or tell it to drive me to DC after work and then sleep all night, arriving in the morning.

Brandon

Just what we need, a million accidents as soon as the computers in them crash and show the blue screen of death.

I'll pass on driverless cars as the people are bad enough as it is.
"If you think this has a happy ending, you haven't been paying attention." - Ramsay Bolton, "Game of Thrones"

"Symbolic of his struggle against reality." - Reg, "Monty Python's Life of Brian"

PHLBOS

Quote from: Brandon on July 24, 2013, 01:42:13 PM
Just what we need, a million accidents as soon as the computers in them crash and show the blue screen of death.

I'll pass on driverless cars as the people are bad enough as it is.
Amen to that. :thumbsup:

I don't have time to look for any YouTube clips (to see if it's there), but this topic reminds me of an old episode of The Jetsons (the one involving Astro's (Tralfaz's) original owner coming to reclaim him).  In one scene, the original owner's attorney (Withers), falling from the dogwalk, radios his remote-controlled car to catch him before he hits the ground.  While the car indeed automatically comes and successfully catches him, it then takes off and crashes into a building wall.

Immediately following the crash, an injured Withers grumblingly states (to his banged-up car), "Of all the dumb cars.  Where were you when the electronic brains were being passed out?"

That scene still makes me laugh to this day.
GPS does NOT equal GOD

realjd

Quote from: Brandon on July 24, 2013, 01:42:13 PM
Just what we need, a million accidents as soon as the computers in them crash and show the blue screen of death.

I'll pass on driverless cars as the people are bad enough as it is.

How often do modern aircraft, which are heavily computerized and automated, crash from automation failure as opposed to pilot error or mechanical failure? Almost never. How often do you hear about hospital patients dying because of software failures in medical equipment? Almost never. Software for safety- and life-critical applications goes through a much more stringent analysis and certification process than consumer PC software. And closed, embedded systems like autopilots and car automation are significantly more reliable than a home PC simply because they run in a closed environment with known hardware. Driver conflicts, configuration errors, and the other issues that usually crash PCs simply don't apply.

If you're honestly curious about the stringent software standards for life-critical applications, the FAA's DO-178B is a good place to start.

Besides, computers can't drive drunk, text and drive, fall asleep behind the wheel, change lanes without checking the blind spot, run a red light, or do any of the other dangerous things humans do every day behind the wheel. I trust a well-written computer algorithm much more than I trust the majority of American drivers out there.

wxfree

I enjoy driving, the act of controlling a vehicle, feeling its nuances, using it to understand the road below and get a demonstration of the laws of physics.  In spite of that appreciation, I think this is a necessary step.  People have already shown that driving is a low priority and that they will put other activities first, even when in the driver's seat.  People drive while distracted, drive while fatigued, go around curves too fast, follow too closely, overcorrect during a skid, and otherwise drive unsafely and inefficiently.

Imagine how well traffic would flow if the movements of all of the vehicles were coordinated.  Cars approaching the desired exit would be to the right, cars passing through town would be to the left, all moving at appropriate and coordinated speeds.  No car would slow down because of a minor curve or accident on the other side of the highway, eliminating many of the waves that form in traffic.

I can even see higher speeds being allowed.  A computer is much more reliably "reasonable and prudent" than a person is.  Free from emotion, and capable of analyzing much more data, the computer could judge a particular road and set of conditions, both on the road and in the car, and select an objectively safe speed, possibly very fast.  At night the car could have "thermal vision" and see potential wildlife hazards much better than a human, and respond in an immediate and well-calculated way.

The driving world would not only be safer, but more efficient with fewer traffic problems.  We're still a ways off from this goal, but it's approaching.
I'd like to buy a vowel, Alex.  What is E?

All roads lead away from Rome.

Brandon

Quote from: wxfree on July 24, 2013, 10:41:42 PM
People ... go around curves too fast...

I've rarely seen such a thing.  In my experience, they go around curves far too slow.
"If you think this has a happy ending, you haven't been paying attention." - Ramsay Bolton, "Game of Thrones"

"Symbolic of his struggle against reality." - Reg, "Monty Python's Life of Brian"

jeffandnicole

Quote from: Brandon on July 24, 2013, 01:42:13 PM
Just what we need, a million accidents as soon as the computers in them crash and show the blue screen of death.

How often does the blue screen of death occur on modern computers?  And how often is it because people have installed software or hardware on a computer that eventually caused the blue screen of death?

PHLBOS

Quote from: realjd on July 24, 2013, 06:01:10 PM
How often do modern aircraft, which are heavily computerized and automated, crash from automation failure as opposed to pilot error or mechanical failure? Almost never. How often do you hear about hospital patients dying because of software failures in medical equipment? Almost never. Software for safety- and life-critical applications goes through a much more stringent analysis and certification process than consumer PC software. And closed, embedded systems like autopilots and car automation are significantly more reliable than a home PC simply because they run in a closed environment with known hardware. Driver conflicts, configuration errors, and the other issues that usually crash PCs simply don't apply.

If you're honestly curious about the stringent software standards for life-critical applications, the FAA's DO-178B is a good place to start.

Besides, computers can't drive drunk, text and drive, fall asleep behind the wheel, change lanes without checking the blind spot, run a red light, or do any of the other dangerous things humans do every day behind the wheel. I trust a well-written computer algorithm much more than I trust the majority of American drivers out there.
A couple of things to consider (and yes, I realize this is off-topic):

1.  Even while autopilot is engaged, a pilot still needs to be present in the cockpit.  I believe that's an FAA requirement.

2.  As we recently learned from the Asiana Flight 214 crash landing at SFO several weeks ago, a pilot's or co-pilot's over-reliance on automated controls can have disastrous results.  Granted, that one was more a case of user error; but it's a perfect example of over-reliance on automation giving a false sense of security.
GPS does NOT equal GOD

SteveG1988

Quote from: realjd on July 24, 2013, 06:01:10 PM
Quote from: Brandon on July 24, 2013, 01:42:13 PM
Just what we need, a million accidents as soon as the computers in them crash and show the blue screen of death.

I'll pass on driverless cars as the people are bad enough as it is.

How often do modern aircraft, which are heavily computerized and automated, crash from automation failure as opposed to pilot error or mechanical failure? Almost never. How often do you hear about hospital patients dying because of software failures in medical equipment? Almost never. Software for safety- and life-critical applications goes through a much more stringent analysis and certification process than consumer PC software. And closed, embedded systems like autopilots and car automation are significantly more reliable than a home PC simply because they run in a closed environment with known hardware. Driver conflicts, configuration errors, and the other issues that usually crash PCs simply don't apply.

If you're honestly curious about the stringent software standards for life-critical applications, the FAA's DO-178B is a good place to start.

Besides, computers can't drive drunk, text and drive, fall asleep behind the wheel, change lanes without checking the blind spot, run a red light, or do any of the other dangerous things humans do every day behind the wheel. I trust a well-written computer algorithm much more than I trust the majority of American drivers out there.

Mission-critical hardware is intentionally kept out of date.  How often does your car's engine control module fail?  How often does your car's computer crash?

For example, the 1980s Ford EEC-IV, used all the way up to 1995, was built around a 1970s Intel processor; several autopilots made in the 2000s used the 386 and 486; the Space Shuttle flew with 1970s systems; and the International Space Station uses similarly dated hardware for day-to-day operations, while the real science is done on ThinkPads.

From Wikipedia on the Space Shuttle computers:

A concern with digital fly-by-wire systems is reliability. Considerable research went into the Shuttle computer system. The Shuttle used five identical redundant IBM 32-bit general purpose computers (GPCs), model AP-101, constituting a type of embedded system. Four computers ran specialized software called the Primary Avionics Software System (PASS). A fifth backup computer ran separate software called the Backup Flight System (BFS). Collectively they were called the Data Processing System (DPS).

The design goal of the Shuttle's DPS was fail-operational/fail-safe reliability. After a single failure, the Shuttle could still continue the mission. After two failures, it could still land safely.

The four general-purpose computers operated essentially in lockstep, checking each other. If one computer failed, the three functioning computers "voted" it out of the system. This isolated it from vehicle control. If a second computer of the three remaining failed, the two functioning computers voted it out. In the unlikely case that two out of four computers simultaneously failed (a two-two split), one group was to be picked at random.

The Backup Flight System (BFS) was separately developed software running on the fifth computer, used only if the entire four-computer primary system failed. The BFS was created because although the four primary computers were hardware redundant, they all ran the same software, so a generic software problem could crash all of them. Embedded system avionic software was developed under totally different conditions from public commercial software: the number of code lines was tiny compared to a public commercial software, changes were only made infrequently and with extensive testing, and many programming and test personnel worked on the small amount of computer code. However, in theory it could have still failed, and the BFS existed for that contingency. While the BFS could run in parallel with PASS, the BFS never engaged to take over control from PASS during any Shuttle mission.

The software for the Shuttle computers was written in a high-level language called HAL/S, somewhat similar to PL/I. It is specifically designed for a real time embedded system environment.

The IBM AP-101 computers originally had about 424 kilobytes of magnetic core memory each. The CPU could process about 400,000 instructions per second. They had no hard disk drive, and loaded software from magnetic tape cartridges.

In 1990, the original computers were replaced with an upgraded model AP-101S, which had about 2.5 times the memory capacity (about 1 megabyte) and three times the processor speed (about 1.2 million instructions per second). The memory was changed from magnetic core to semiconductor with battery backup.

http://en.wikipedia.org/wiki/Space_Shuttle#Flight_systems
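
To make the "voting" part concrete, here's a minimal sketch of majority voting among redundant computers. This is purely a toy Python illustration of the general idea, not actual Shuttle code (the real PASS was written in HAL/S and ran in hardware-synchronized lockstep); the function and the example values are hypothetical.

# Toy majority-vote illustration (hypothetical; not Shuttle software).
# Each redundant "computer" reports a result; units that disagree with
# the majority get voted out, mirroring the fail-operational idea.

def vote(outputs):
    """outputs: dict mapping computer id -> reported value (None if already failed).
    Returns the majority value and the ids voted out this cycle."""
    active = {cid: val for cid, val in outputs.items() if val is not None}
    tallies = {}
    for cid, val in active.items():
        tallies.setdefault(val, []).append(cid)
    # Largest agreeing group wins; a 2-2 split here just takes whichever
    # group max() returns, whereas the real system picked one at random.
    majority_value, members = max(tallies.items(), key=lambda kv: len(kv[1]))
    voted_out = [cid for cid in active if cid not in members]
    return majority_value, voted_out

# Example: computer 3 returns a bad value and is isolated by the other three.
results = {1: 42.0, 2: 42.0, 3: 17.5, 4: 42.0}
value, failed = vote(results)
print(value, failed)  # 42.0 [3]

After one such failure the remaining three computers can still out-vote a second bad unit, which is roughly what "fail-operational/fail-safe" means in the quoted passage.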

Roads Clinched

I55,I82,I84(E&W)I88(W),I87(N),I81,I64,I74(W),I72,I57,I24,I65,I59,I12,I71,I77,I76(E&W),I70,I79,I85,I86(W),I27,I16,I97,I96,I43,I41,

agentsteel53

Quote from: jeffandnicole on July 25, 2013, 08:31:39 AM
And how often is it because people have installed software or hardware on a computer that eventually caused the blue screen of death?

I can just imagine people downloading some virus onto their car when it shows them a "click here to win an iPod!" epileptic ad.

in general, I can just imagine my driverless car coming with ads. 

shudder.
live from sunny San Diego.

http://shields.aaroads.com

jake@aaroads.com

realjd

Quote from: PHLBOS on July 25, 2013, 08:40:39 AM
Quote from: realjd on July 24, 2013, 06:01:10 PM
How often do modern aircraft, which are heavily computerized and automated, crash from automation failure as opposed to pilot error or mechanical failure? Almost never. How often do you hear about hospital patients dying because of software failures in medical equipment? Almost never. Software for safety- and life-critical applications goes through a much more stringent analysis and certification process than consumer PC software. And closed, embedded systems like autopilots and car automation are significantly more reliable than a home PC simply because they run in a closed environment with known hardware. Driver conflicts, configuration errors, and the other issues that usually crash PCs simply don't apply.

If you're honestly curious about the stringent software standards for life-critical applications, the FAA's DO-178B is a good place to start.

Besides, computers can't drive drunk, text and drive, fall asleep behind the wheel, change lanes without checking the blind spot, run a red light, or do any of the other dangerous things humans do every day behind the wheel. I trust a well-written computer algorithm much more than I trust the majority of American drivers out there.
A couple of things to consider (and yes, I realize this is off-topic):

1.  Even while autopilot is engaged, a pilot still needs to be present in the cockpit.  I believe that's an FAA requirement.

2.  As we recently learned from the Asiana Flight 214 crash landing at SFO several weeks ago, a pilot's or co-pilot's over-reliance on automated controls can have disastrous results.  Granted, that one was more a case of user error; but it's a perfect example of over-reliance on automation giving a false sense of security.

I disagree on #2. How was it related to automation? The pilot was not using the plane's autoland feature and manually flew it into the ground.

PHLBOS

Quote from: realjd on July 25, 2013, 11:04:34 AM
How was it related to automation? The pilot was not using the plane's autoland feature and manually flew it into the ground.
Autopilot may not have been the correct choice of words on my part.  Auto-throttle failure seems to be one culprit here.

FWIW, from the latest Wikipedia account (Bold emphasis added):

http://en.wikipedia.org/wiki/Asiana_Airlines_Flight_214

Excerpt:

All three pilots told NTSB investigators that they were relying on the 777's automated devices for speed control during final descent. The relief first officer also stated to NTSB investigators that he had called out "sink rate" to call attention to the rate at which the plane was descending during the final approach. The South Korean transport ministry confirmed that this "sink rate" warning was repeated several times during the last minute of the descent.
GPS does NOT equal GOD

cpzilliacus

Quote from: Brandon on July 24, 2013, 01:42:13 PM
Just what we need, a million accidents as soon as the computers in them crash and show the blue screen of death.

I'll pass on driverless cars as the people are bad enough as it is.

I do not think that the Google driverless car software relies on Microsoft Windows (I could be wrong about that, but since I am talking about Google here, it makes sense that they would avoid products from Redmond).
Opinions expressed here on AAROADS are strictly personal and mine alone, and do not reflect policies or positions of MWCOG, NCRTPB or their member federal, state, county and municipal governments or any other agency.

cpzilliacus

Quote from: SteveG1988 on July 25, 2013, 09:16:51 AM

Mission-critical hardware is intentionally kept out of date.  How often does your car's engine control module fail?  How often does your car's computer crash?

Software too.  I believe at least some of the ATC software used by the FAA dates back to the 1960s.  I worked on the FAA's Advanced Automation System (AAS), which was to replace all of the old code with new code written entirely in Ada(tm).

AAS crashed and burned in the early 1990s after more than ten years of mismanagement by the FAA.
Opinions expressed here on AAROADS are strictly personal and mine alone, and do not reflect policies or positions of MWCOG, NCRTPB or their member federal, state, county and municipal governments or any other agency.

Duke87

Having the computer drive for you takes the fun out of it.

Also, if the computer not only drives for you but selects a route for you, no sale.
If you always take the same road, you will never see anything new.

vdeane

Quote from: Duke87 on July 25, 2013, 07:53:56 PM
Also, if the computer not only drives for you but selects a route for you, no sale.
Agreed.  I expect that driverless cars will be the end of roadgeeking.  Even if we can select a route, given all the routing difficulties on Google (I particularly can't get it to do loops right), clinching will be MUCH harder.  You'll need, god forbid, an actual reason for going on the road.
Please note: All comments here represent my own personal opinion and do not reflect the official position of NYSDOT or its affiliates.

realjd

Quote from: PHLBOS on July 25, 2013, 02:44:06 PM
Quote from: realjd on July 25, 2013, 11:04:34 AM
How was it related to automation? The pilot was not using the plane's autoland feature and manually flew it into the ground.
Autopilot may not have been the correct choice of words on my part.  Auto-throttle failure seems to be one culprit here.

FWIW, from the latest Wikipedia account (Bold emphasis added):

http://en.wikipedia.org/wiki/Asiana_Airlines_Flight_214

Excerpt:

All three pilots told NTSB investigators that they were relying on the 777's automated devices for speed control during final descent. The relief first officer also stated to NTSB investigators that he had called out "sink rate" to call attention to the rate at which the plane was descending during the final approach. The South Korean transport ministry confirmed that this "sink rate" warning was repeated several times during the last minute of the descent.


I would argue that it wasn't a software failure but rather the pilot improperly setting the software (the auto-throttle in this case). An autopilot will be more than happy to fly a plane into a mountain, but that isn't the fault of the autopilot...

If the pilot had performed an autoland (which the 777 is capable of, but which is not commonly used, especially in visual conditions), the plane most likely would not have crashed.

The old term "Garbage in, garbage out" applies to cases like this.

I'll also point out that fly-by-wire is common in modern aircraft, civilian and military, and has been for quite some time. Drive-by-wire is starting to become more common in cars. There are few if any reports of that software failing. Even when a pilot is hand-flying an aircraft, that doesn't mean he or she is directly manipulating the control surfaces.

But implementation details aside, I like the concept of self-driving cars. As much as I enjoy the act of driving and roadgeeking, society as a whole would be better off, and my personal quality of life would be higher, if we had self-driving cars, IMO.

Alps

Quote from: vdeane on July 25, 2013, 08:33:14 PM
Quote from: Duke87 on July 25, 2013, 07:53:56 PM
Also, if the computer not only drives for you but selects a route for you, no sale.
Agreed.  I expect that driverless cars will be the end of roadgeeking.  Even if we can select a route, given all the routing difficulties on Google (I particularly can't get it to do loops right), clinching will be MUCH harder.  You'll need, god forbid, an actual reason for going on the road.
I can't see driverless cars ever being a requirement.

NE2

It might become a requirement for Interstates, at least where no alternate exists.
pre-1945 Florida route log

I accept and respect your identity as long as it's not dumb shit like "identifying as a vaccinated attack helicopter".

Duke87

And even then, a car with auto-pilot capability should still have the option to turn it off.
If you always take the same road, you will never see anything new.

vdeane

Quote from: Steve on July 26, 2013, 07:14:19 PM
Quote from: vdeane on July 25, 2013, 08:33:14 PM
Quote from: Duke87 on July 25, 2013, 07:53:56 PM
Also, if the computer not only drives for you but selects a route for you, no sale.
Agreed.  I expect that driverless cars will be the end of roadgeeking.  Even if we can select a route, given all the routing difficulties on Google (I particularly can't get it to do loops right), clinching will be MUCH harder.  You'll need, god forbid, an actual reason for going on the road.
I can't see driverless cars ever being a requirement.
Doesn't a lot of the coordination between cars (e.g. all going the same speed to avoid jams) require it?
Please note: All comments here represent my own personal opinion and do not reflect the official position of NYSDOT or its affiliates.

wxfree

I certainly agree that a driver should have a way to disengage auto-control.  We'd still need skilled drivers to be able to take control in case of a system failure, and this would require ongoing practice.  But there may be difficulties with this.  If we're looking for the improved safety and efficiency brought by unemotional and well-calculated driving, how much would that be impaired by having a few cars driven by people doing stupid things and making bad decisions?

I can imagine someone being impatient and wanting to go faster or pass someone, taking manual control, and making inefficient and unsafe maneuvers that all the computer driven cars have to respond to.  After a few people do this, and others do the same in response, traffic flow starts to degenerate and everyone starts losing the benefit of this amazing technology.

My basic worldview is libertarian, but rationally, not radically, libertarian.  I wouldn't want to see rules mandating auto-control, but I question how well the system would work without such a mandate.  Maybe traffic will mostly be able to adapt and it won't be a problem, or maybe auto-control would be required on certain roads while manual driving is allowed on others.  That will have to be figured out, but it'll be a good problem to have.
I'd like to buy a vowel, Alex.  What is E?

All roads lead away from Rome.

Joe The Dragon

And maintenance work, both on the road and on roadside equipment, will still need manual control.

Crazy Volvo Guy

Quote from: realjd on July 24, 2013, 12:08:04 PM
Can't happen soon enough, IMO. I'm excited for the day I can tell my car to drive me to Key West, turn on my iPad, and open a beer. Or tell it to drive me to DC after work and then sleep all night, arriving in the morning.

I am not.  Being a passenger does not excite me.  I love cars, namely OLD cars (mine are 23 and 29 years old), and I love DRIVING.  In order for driverless cars to work as intended (improve traffic flow, reduce accidents, blah blah), they will ALL have to be driverless, constantly communicating with each other.  Meaning I won't be allowed to drive anymore.  I'd quite frankly rather off myself.  I can't say the "OMFG DRIVERLESS!!!!1!1!111" attitude surprises me, though, given the nearly society-wide mindless obsession with media consumption these days.

But I don't know that it will ever happen.  Technological advancement suggests it will, but seeing how it would sign the death warrant for the traffic-enforcement business and industry, I suspect those folks will fight it tooth and nail.  And some of them - namely the insurance companies - have the cash behind them to get what they want.

Personally, I just see it as another bit of mindless progress (i.e., "progress for the sake of progress"), and that alone is enough for me to find it utterly repulsive.  The way I see it?  If you want to be a passenger so badly, so you can consume your media/alcohol/etc., lobby for more transit services, then get your tail on those transit services and leave the roads for those of us who like driving, who know how to drive like we mean it, or who would at least like to be able to drive down any road we like.  Don't lobby for driverless cars that will ultimately take away something the rest of us enjoy.
I hate Clearview, because it looks like a cheap Chinese ripoff.

I'm for the Red Sox and whoever's playing against the Yankees.


