
Google upset at California proposal regarding self-driving cars

Started by vdeane, December 17, 2015, 08:20:54 PM


vdeane

Please note: All comments here represent my own personal opinion and do not reflect the official position of NYSDOT or its affiliates.


Brandon

"If you think this has a happy ending, you haven't been paying attention." - Ramsay Bolton, "Game of Thrones"

"Symbolic of his struggle against reality." - Reg, "Monty Python's Life of Brian"

kj3400

Well I guess they're going to actually have to hire someone to drive the camera car around California for Street View now.
Call me Kenny/Kenneth. No, seriously.

ET21

I'm waiting for them to say "But it's safer than humans", then see a pic of one of the cars up a street light or through a window of a shop.
The local weatherman, trust me I can be 99.9% right!
"Show where you're going, without forgetting where you're from"

Clinched:
IL: I-88, I-180, I-190, I-290, I-294, I-355, IL-390
IN: I-80, I-94
SD: I-190
WI: I-90
MI: I-94, I-196
MN: I-90

jakeroot

Google does have a point. Self-driving cars provide mobility options for those not able to drive themselves. But on the other hand, I'm not sure that more cars on the road (chiefly from those who could not drive before but now own a self-driving car) is really going to help things. Then again, I'm not certain that the way to regulate that is by banning non-licensed drivers from operating a self-driving car.

corco

This doesn't have to be the law forever. Once we're at the point where self-driving cars are a proven, safe technology that can realistically be used by anyone, the law can and should certainly be revisited to allow non-licensed drivers to pilot the car.

In the meantime, while this technology still is very much in development, it's absolutely reasonable to require a licensed driver to be behind the wheel of a car with the ability to take control.

SSOWorld

Self-driving cars will never be a proven safe technology. Any vehicle of that type - especially one with no way for a human to rescue it, like the one Google is proposing in said article - is a recipe for disaster.
Scott O.

Not all who wander are lost...
Ah, the open skies, wind at my back, warm sun on my... wait, where the hell am I?!
As a matter of fact, I do own the road.
Raise your what?

Wisconsin - out-multiplexing your state since 1918.

empirestate

Quote from: SSOWorld on December 18, 2015, 06:30:09 AM
Self-driving cars will never be a proven safe technology. Any vehicle of that type - especially one with no way for a human to rescue it, like the one Google is proposing in said article - is a recipe for disaster.

Yes, but hopefully it serves up a smaller portion of disaster than the current recipe.

I know it seems counter-intuitive not to have a human failover in the driver's seat–my initial reaction was wary as well–but once you think about it, taking over for the car in the event of an emergency is when a human would be the most likely to screw it up and possibly make a situation much worse.

Because autonomous cars will be able to respond much more quickly than humans to avoid accidents, a huge percentage of incidents will have already been predicted and avoided by the car's systems before any human occupant has even noticed them happening. Still, every time an autonomous car does have an accident, there will be speculation that a human driver could have avoided it, and of course such speculation will overlook the vast abatement of accidents overall that autonomous cars will have allowed.
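To put rough numbers on that reaction-time edge, the sketch below compares how far a car travels before braking even begins. The 1.5 s human figure is a commonly cited perception-reaction estimate, and the 0.1 s machine figure is purely an assumption for illustration; neither is a measured figure for any real system.

# Distance covered during the reaction delay, before braking begins.
def reaction_distance_ft(speed_mph, reaction_s):
    feet_per_second = speed_mph * 5280 / 3600
    return feet_per_second * reaction_s

for label, reaction_s in [("human, ~1.5 s", 1.5), ("computer, ~0.1 s", 0.1)]:
    d = reaction_distance_ft(65, reaction_s)
    print(f"{label}: {d:.0f} ft at 65 mph before braking starts")

At 65 mph that's roughly 143 ft for the human and about 10 ft for the machine, which is the whole ballgame in a lot of rear-end scenarios.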

Now, I'm speaking of the technology in its maturity, which is still in the future. As the cars are tested and developed, yes, certainly there will need to be human supervision. And even when the product is firmly established, we'll still need ways to override the car when its systems have genuinely failed–the trick will be allowing the override in such cases but not others, so that drivers don't override the vehicle in a panic and thus contribute to an accident that the car would otherwise have avoided–and for incidental or recreational use of the vehicle. (It sounds like Google may be thinking that these uses would be reserved for a different product.)

I certainly do understand people's skepticism about the technology, and I can't fault them for it, but as the technology gradually develops and becomes more visible, opposition to it will naturally dwindle as the generations progress, just as it did with motorized vehicles in general.

jeffandnicole

I think it's most aggravating when people bring up "The Blue Screen of Death".  If you are still getting that on your home computer, it's time to upgrade from Windows 3.1.

And that really leads into a bigger problem: many people compare all sorts of technology with what they are used to, which is typically a cheap, mass-produced home computer that they try to stuff way too much data onto. Even something we encounter every day - a traffic light - is operated by computers, and rarely do they have issues. And when they do have issues, they have backup programs in place within the computer system. For instance, you're not going to find the light green in all directions. Even most modern cars are highly computerized. There's a lot beyond those dials one sees on the dash.
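As a toy illustration of that kind of fail-safe, the sketch below refuses any signal state that would show green to conflicting approaches, falling back to all-red instead. Real controllers use dedicated conflict-monitor hardware; the approach names and rules here are invented.

# Toy conflict monitor: never allow conflicting approaches to both show green.
CONFLICTS = [("north_south", "east_west")]  # pairs that must never both be green

def is_safe(state):
    # state maps approach name -> "red" | "yellow" | "green"
    for a, b in CONFLICTS:
        if state.get(a) == "green" and state.get(b) == "green":
            return False
    return True

def apply_state(state):
    # Fall back to all-red rather than ever display a conflict.
    return state if is_safe(state) else {approach: "red" for approach in state}

print(apply_state({"north_south": "green", "east_west": "green"}))  # all red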

There are some transportation systems in place that are entirely computerized. Monorails, for example. Some airports have automated monorail systems running between terminals without a human operator.

It's almost amazing that many people complain or are surprised we don't have flying cars yet, but are then afraid of the thought of a computerized car on the road.

empirestate

Quote from: ET21 on December 18, 2015, 12:50:02 AM
I'm waiting for them to say "But it's safer than humans", then see a pic of one of the cars up a street light or through a window of a shop.

Yeah, me too. People will use one such picture to support their belief that these cars are more dangerous, even though the far more numerous pictures of human-driven cars up street lights or through shop windows will simply have stopped existing. And it makes sense; it's hard to illustrate something by the lack of illustrations available; it's the old climate-change snowball in a different form.

triplemultiplex

All I know is that human drivers are proven to be an unsafe technology.
My god, if the US responded proportionally to the number of traffic fatalities the way they do to a handful of deaths by terrorist attack, it'd be the fucking Jetsons up in here.

They'd throw trillions of dollars into self-driving cars and grade-separations and moving trucks and passenger cars to different roadways....
"That's just like... your opinion, man."

jakeroot

Quote from: SSOWorld on December 18, 2015, 06:30:09 AM
Self-driving cars will never be a proven safe technology. Any vehicle of that type - especially one with no way for a human to rescue it, like the one Google is proposing in said article - is a recipe for disaster.

Jesus man. Put your tin hat on. No need to go full technophobe. There is great potential that has yet to be exploited.

Did you know that many planes land themselves? That both pilots often sleep in the cockpit? Shit is automated these days. And there's zero reason to believe that humans are any better. We wreck stuff all the time. We should be giving algorithms a chance to see if they are any better.

Quote from: corco on December 18, 2015, 01:28:42 AM
This doesn't have to be the law forever. Once we're at the point where self-driving cars are a proven, safe technology that can realistically be used by anyone, the law can and should certainly be revisited to allow non-licensed drivers to pilot the car.

Is there some sort of red ribbon that we get to cut? When is it determined that they are a proven technology? Like all technology, it's something that is developed over a very long period of time. My fear is that it's going to be exceedingly difficult to pass any legislation in the future that will allow for self-driving cars sans licensed drivers. It's easy to ban something (people are easily influenced by fear) but not as easy to legalize something.

GaryV

Quote from: SSOWorld on December 18, 2015, 06:30:09 AM
Self-driving cars will never be a proven safe technology

I don't know about that.  But until it is "proven safe technology", no manufacturer will release it for public sale.  Can you imagine the field day lawyers would have if one causes a serious or fatal accident?

US 41

Quote from: GaryV on December 18, 2015, 10:23:11 PM
Quote from: SSOWorld on December 18, 2015, 06:30:09 AM
Self-driving cars will never be a proven safe technology

I don't know about that.  But until it is "proven safe technology", no manufacturer will release it for public sale.  Can you imagine the field day lawyers would have if one causes a serious or fatal accident?

If one thing goes wrong technologically with a self-driving car, it could lead to a disaster, especially at busy city intersections.
Visited States and Provinces:
USA (48)= All of Lower 48
Canada (9)= AB, BC, MB, NB, NS, ON, PEI, QC, SK
Mexico (9)= BCN, BCS, CHIH, COAH, DGO, NL, SON, SIN, TAM

cl94

And then you get the Google car that was pulled over for going too slow. If every self-driving car is going to drive like an 80-year-old woman, it will be more dangerous, because it will not only create road rage but also a speed differential on highways. A large speed differential is often more dangerous than every driver going well above the speed limit.
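The arithmetic behind that worry is simple: what matters for a rear-end conflict is closing speed, not absolute speed. A quick sketch, with made-up speeds and a made-up following gap:

# Time available to react to a slow car ahead, given a closing speed.
def time_to_collision_s(gap_ft, faster_mph, slower_mph):
    closing_fps = (faster_mph - slower_mph) * 5280 / 3600
    return float("inf") if closing_fps <= 0 else gap_ft / closing_fps

# 300 ft behind a 25 mph robo-car while doing 65: about 5 seconds to react.
print(f"{time_to_collision_s(300, 65, 25):.1f} s")
# Same gap when everyone is doing 75: no closure at all.
print(time_to_collision_s(300, 75, 75))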
Please note: All posts represent my personal opinions and do not represent those of my employer or any of its partner agencies.

jeffandnicole

Yep...Google cars have gone 1.3 million miles...and got pulled over 1 time.

Most people never drive a million miles in their lives.  And the chances of anyone never getting pulled over or in an accident are very small.
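For a rough sense of scale, the comparison can be run directly. The Google figure (one stop in 1.3 million miles) is from the post above; the human baseline is a purely hypothetical placeholder, not a real statistic:

# Pull-over rates per mile. The human numbers are invented for comparison.
google_stops_per_mile = 1 / 1_300_000
human_stops_per_mile = 2 / 500_000   # hypothetical: 2 stops over 500k lifetime miles

print(f"Google: {google_stops_per_mile:.2e} stops/mile")
print(f"Hypothetical human: {human_stops_per_mile:.2e} stops/mile")
print(f"Ratio: {human_stops_per_mile / google_stops_per_mile:.1f}x")

Under those made-up human numbers, the Google fleet gets stopped about five times less often per mile.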

Duke87

Quote from: empirestate on December 18, 2015, 09:07:05 AM
I know it seems counter-intuitive not to have a human failover in the driver's seat–my initial reaction was wary as well–but once you think about it, taking over for the car in the event of an emergency is when a human would be the most likely to screw it up and possibly make a situation much worse.

I get the sense that this requirement is the lawyers talking. They say "we need a licensed human driver to be able to take over in the event of an emergency"; what they actually mean is "we need a licensed human driver who can be held liable in the event of an accident".

If a car with no human driver injures a pedestrian, who does the pedestrian get to sue? Until the answer to this kind of question can be figured out, it is necessary to maintain a human driver as an available scapegoat.

Quote
even when the product is firmly established, we'll still need ways to override the car when its systems have genuinely failed–the trick will be allowing the override in such cases but not others, so that drivers don't override the vehicle in a panic and thus contribute to an accident that the car would otherwise have avoided–and for incidental or recreational use of the vehicle. (It sounds like Google may be thinking that these uses would be reserved for a different product.)

Manual override is not necessarily required for situations where the auto-pilot fails and the car becomes unable to move itself. Most of these situations could probably be handled the same way any other critical equipment failure already is - call a tow truck. Although if someone in the car is capable of driving it the "old fashioned" way, a manual override would certainly be useful for that.

Recreational use of the vehicle is a much more reasonable argument for having the manual override. That said, I could easily see this ultimately working the same way that current manumatic transmissions do - where you can take "manual" control of steering and whatnot, but the car will reject your inputs if it deems them unsafe.
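In spirit, that kind of input filtering might look like the sketch below. The safety rule is a toy stand-in (steering authority shrinking with speed), not any real vehicle's logic:

# Accept manual steering input only within a (toy) speed-dependent envelope.
def safe_steering(requested_angle_deg, speed_mph):
    max_angle = max(5.0, 45.0 - 0.5 * speed_mph)  # toy rule: less authority at speed
    return max(-max_angle, min(max_angle, requested_angle_deg))

print(safe_steering(30.0, 70.0))  # 10.0 - sharp input clamped at highway speed
print(safe_steering(30.0, 10.0))  # 30.0 - accepted at parking-lot speed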

If you always take the same road, you will never see anything new.

jakeroot

Quote from: Duke87 on December 19, 2015, 12:27:42 AM
Quote from: empirestate on December 18, 2015, 09:07:05 AM
I know it seems counter-intuitive not to have a human failover in the driver's seat–my initial reaction was wary as well–but once you think about it, taking over for the car in the event of an emergency is when a human would be the most likely to screw it up and possibly make a situation much worse.

I get the sense that this requirement is the lawyers talking. They say "we need a licensed human driver to be able to take over in the event of an emergency"; what they actually mean is "we need a licensed human driver who can be held liable in the event of an accident".

If a car with no human driver injures a pedestrian, who does the pedestrian get to sue? Until the answer to this kind of question can be figured out, it is necessary to maintain a human driver as an available scapegoat.

This seems perfectly reasonable to me, but then wouldn't Google be supporting this legislation? That way, if one of their self-driving cars hits someone, they won't get sued (directly)?

Then again, any self-driving car will likely contain some sort of black box to determine who was driving at the time (human or machine). Even if there was a human in the car, there's no certainty that he was operating the vehicle at the time of the collision; thus the manufacturer could still be held liable.
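A box like that only has to log who held control at each moment. A minimal sketch of such a record, with the structure and field names invented for illustration:

# Minimal control-mode log for settling "who was driving?" after a crash.
from dataclasses import dataclass

@dataclass
class ControlEvent:
    timestamp_s: float   # seconds since trip start
    mode: str            # "machine" or "human"

def controller_at(log, t):
    # Return whoever held control at time t (last handoff at or before t).
    current = "machine"  # assume the car starts the trip in autonomous mode
    for event in sorted(log, key=lambda e: e.timestamp_s):
        if event.timestamp_s <= t:
            current = event.mode
    return current

log = [ControlEvent(0.0, "machine"), ControlEvent(412.3, "human")]
print(controller_at(log, 400.0))  # machine
print(controller_at(log, 413.0))  # human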

english si

Quote from: jeffandnicole on December 19, 2015, 12:26:09 AM
Yep...Google cars have gone 1.3 million miles...and got pulled over 1 time.
Which tells us nothing without knowing how pull-overy the police in the Bay Area are, or how active a presence the cops are.

If it was England, you could do the same with a death trap waiting to happen, provided it hadn't yet crashed.

SSOWorld

Quote from: jakeroot on December 18, 2015, 08:41:13 PM
Quote from: SSOWorld on December 18, 2015, 06:30:09 AM
Self-driving cars will never be a proven safe technology. Any vehicle of that type - especially one with no way for a human to rescue it, like the one Google is proposing in said article - is a recipe for disaster.

Jesus man. Put your tin hat on. No need to go full technophobe. There is great potential that has yet to be exploited.
Did you know that many planes land themselves? That both pilots often sleep in the cockpit? Shit is automated these days. And there's zero reason to believe that humans are any better. We wreck stuff all the time. We should be giving algorithms a chance to see if they are any better.
Well I'll stick to my view on the subject. The way to make a car safe is to have a driver who knows how to drive. Planes can run on autopilot - but landing? - good thing I prefer to avoid flying...

Technology development is more about giving someone money-earning work anyway.
Scott O.

Not all who wander are lost...
Ah, the open skies, wind at my back, warm sun on my... wait, where the hell am I?!
As a matter of fact, I do own the road.
Raise your what?

Wisconsin - out-multiplexing your state since 1918.

empirestate


Quote from: SSOWorld on December 19, 2015, 05:38:51 AM
Well I'll stick to my view on the subject. The way to make a car safe is to have a driver who knows how to drive. Planes can run on autopilot - but landing? - good thing I prefer to avoid flying...

Yes, landing too; much of that process is automated. In fact, human intervention in the automated landing sequence has been known to crash planes; it happened at Buffalo in '09.

Of course, you're welcome to your own view; we're just pointing out that it may not align with the facts. We don't have the facts yet on automated cars, but automation has already increased safety in a lot of other areas.



vdeane

Quote from: jakeroot on December 19, 2015, 01:24:29 AM
This seems perfectly reasonable to me, but then wouldn't Google be supporting this legislation? That way, if one of their self-driving cars hits someone, they won't get sued (directly)?
Google's strategy is to go whole hog all at once, from nothing to fully self-driving cars with no manual option (indeed, these cars are already on the road in the area Google has mapped extensively), with their primary customer being people who hate driving and want to just play with their smartphone the entire ride.  In fact, I would bet that's why Google is developing these cars in the first place: to tap into the time people spend commuting to get them to look at advertising in their search results, Gmail, etc.  This legislation essentially kills that plan and favors the automakers, who have been taking the strategy of adding more and more automated features that could eventually add up to a self-driving car.

Personally, one thing I'm worried about is hacking.  Cars can already be hacked to do things like disable the brakes.  A self-driving car will be able to navigate too, exponentially increasing the capability of government and hackers to interfere with your trip.  Imagine someone hacking into your car, using the sensors to determine if there's stuff in the trunk, and, if there is, driving the car to their house in the middle of nowhere to shoot you and take your stuff.  Then they have the car drive itself into a lake to get rid of the evidence.  Or imagine the government thinking your driving pattern on a route clinching trip looks "suspicious" and stopping the car and locking you in until the police can arrive.
Please note: All comments here represent my own personal opinion and do not reflect the official position of NYSDOT or its affiliates.

jakeroot

Quote from: vdeane on December 19, 2015, 05:36:14 PM
Quote from: jakeroot on December 19, 2015, 01:24:29 AM
This seems perfectly reasonable to me, but then wouldn't Google be supporting this legislation? That way, if one of their self-driving cars hits someone, they won't get sued (directly)?

Google's strategy is to go whole hog all at once, from nothing to fully self-driving cars with no manual option (indeed, these cars are already on the road in the area Google has mapped extensively), with their primary customer being people who hate driving and want to just play with their smartphone the entire ride.  In fact, I would bet that's why Google is developing these cars in the first place: to tap into the time people spend commuting to get them to look at advertising in their search results, Gmail, etc.  This legislation essentially kills that plan and favors the automakers, who have been taking the strategy of adding more and more automated features that could eventually add up to a self-driving car.

I'm not entirely sure that's true...in fact, I'd say that couldn't be farther from Google's goal. There might be an underlying benefit for Google in terms of increased smartphone usage without having to operate a vehicle, but I think the primary motive is to remove the variable from the equation (that is, the skill of the driver) and replace it with something less prone to errors (that is, a computer). A bunch of these computers, talking to each other, offers a huge safety benefit. As well, there is the additional benefit of providing mobility to those otherwise unable to move freely about their area, who have no choice but to be chauffeured.

Is there a hacking risk? Absolutely. But most cars can already be hacked to be driven without driver intervention. This is not a new issue. There are plenty of examples on YouTube. Ever wonder why we both drive a stick? Can't hack those! I do know, for a fact, that hacking is a huge problem right now without much of a solution, so I'm not sure it's a legitimate worry in terms of allowing non-licensed drivers to operate a self-driving car. Is your comment on hacking related to the OP? Not necessarily, but I have pretended that it is.

Duke87

Quote from: jakeroot on December 19, 2015, 01:24:29 AM
Then again, any self-driving car will likely contain some sort of black box to determine who was driving at the time (human or machine). Even if there was a human in the car, there's no certainty that he was operating the vehicle at the time of the collision; thus the manufacturer could still be held liable.

Not necessarily. If the rules say the driver is supposed to take manual control to avoid a crash, you could hold the driver liable for failing to take manual control when they're supposed to.

Of course it is ridiculous to expect that a human who's been sitting there for an extended period of time, uninvolved in the process, is going to be reasonably able to take control at the drop of a hat if need be. The "driver" is not going to be paying close attention to their surroundings at that point, and even if the car yells at them that they need to take control, their control is going to be poor in quality if they've just been jolted from focusing on their phone. The human brain does not shift gears that quickly.

Quote from: vdeane on December 19, 2015, 05:36:14 PM
Personally, one thing I'm worried about is hacking.  Cars can already be hacked to do things like disable the brakes.  A self-driving car will be able to navigate too, exponentially increasing the capability of government and hackers to interfere with your trip.  Imagine someone hacking into your car, using the sensors to determine if there's stuff in the trunk, and, if there is, driving the car to their house in the middle of nowhere to shoot you and take your stuff.  Then they have the car drive itself into a lake to get rid of the evidence.

This seems like an awful lot of bother to go through just to steal "stuff in the trunk". I'd be more concerned about criminals using it as a means of kidnapping people. Of course, if criminals can hack cars, so can law enforcement. If you thought LoJack was cool, wait till you see police hack a getaway car to disable it or make it drive to where they are. I could easily see this being a double-edged sword.

And of course, if it became a problem of any significance, the automakers would have to beef up their cybersecurity in order to stop the bad press. But it might take something nasty happening first, like it took a massive theft of credit card numbers from a retailer's system to convince companies in the US that it was necessary to put chips in their cards.

Quote
Or imagine the government thinking your driving pattern on a route clinching trip looks "suspicious" and stopping the car and locking you in until the police can arrive.

This particular privacy concern doesn't even need a car to be self driving, it just needs a car's route history to be remotely traceable. They can always catch up with you later.

It does get me thinking, though: if I were to concoct an algorithm that sorted through route histories to try to find "suspicious" driving patterns, what would I base it on? Not an easy question to answer, but I imagine it'd be less about where someone drives and more about where they stop. If you stop in a sensitive location without a reason to be there, that'd be the thing to look for.

After all, while driving in circles may be weird, someone who does this and does not stop anywhere of interest cannot, logically, be up to anything particularly nefarious. If anything, having the tracking data showing "yeah, this person really is just driving in circles" would help dissuade suspicion. There's nothing suspicious about actually driving in circles; what raises suspicion is that, because it's an odd thing to do, a cop might not believe you if you tell them that's what you're doing.
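Made concrete, that stop-based heuristic might look like the sketch below; the coordinates, radius, and duration threshold are all invented:

# Flag a trip only if it includes a long stop near a "sensitive" site.
# Route shape is ignored entirely, so driving in circles never triggers it.
from math import hypot

SENSITIVE_SITES = [(100.0, 200.0)]  # hypothetical site coordinates, meters
NEAR_M = 150.0                      # "near" radius
LONG_STOP_S = 300.0                 # ignore stops shorter than 5 minutes

def is_suspicious(stops):
    # stops: list of (x_m, y_m, duration_s)
    for x, y, duration in stops:
        if duration < LONG_STOP_S:
            continue
        if any(hypot(x - sx, y - sy) <= NEAR_M for sx, sy in SENSITIVE_SITES):
            return True
    return False

print(is_suspicious([]))                       # False: circles, no stops
print(is_suspicious([(120.0, 180.0, 600.0)]))  # True: long stop near a site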

Of course, that's assuming some logic remains in the process. I wouldn't put it past the paranoia machine to deem driving in circles cause for sticking someone on a watchlist in and of itself, purely because it's out-of-the-ordinary behavior.
If you always take the same road, you will never see anything new.

AlexandriaVA

I'll leap for joy the day that my life and property are not in the hands of distracted, impaired, and unskilled drivers. I can then use my time for something productive, such as sleeping, reading, or merely being alone with my thoughts, rather than spending it on the attention needed to properly and safely operate an automobile.