Autonomous cars? I think not

NASHIE
Posts: 1193
Joined: Tue Jun 04, 2013 9:16 pm
Location: Perth, WA

Re: Autonomous cars? I think not

Postby NASHIE » Wed Oct 17, 2018 12:05 am

Thoglette wrote:
NASHIE wrote:I wasn't very clear, but I was commenting on our backwater mining town Perth :wink:
But we have an autonomous bus. Or so the RAC WA claim!
:lol: Last time I rode past, it had two traffic warden vehicles in escort.

User avatar
Cheesewheel
Posts: 1209
Joined: Wed Nov 16, 2011 9:22 pm

Re: Autonomous cars? I think not

Postby Cheesewheel » Wed Oct 24, 2018 6:26 am

https://tech.co/mapping-driverless-car- ... ia-2018-10

An overview of autonomous car crashes in California (the state's mandatory reporting of any and all autonomous car crashes offers a good body of incidents to examine).

Takeaways are that most crashes occur at low speeds (1-10 mph) and that a big cause of accidents is the cars being overly cautious ... they tend to freak out a bit at intersections and such, pottering around like learner drivers, which commonly leads to rear-ending by the humans. The article also points out specific infrastructure that exacerbates this hyper-cautiousness of autonomous cars (such as stop signs). So road design issues need to be looked at in order to make large numbers of such cars on our roads feasible.
Go!Run!GAH!

User avatar
Thoglette
Posts: 6605
Joined: Thu Feb 19, 2009 1:01 pm

ABC Chart of the day

Postby Thoglette » Thu Oct 25, 2018 11:49 am

Chart of the day: Who do we want self-driving cars to spare on the road?
Researchers from the Massachusetts Institute of Technology built an online game called the Moral Machine to test how people around the world would answer those questions.

Players were shown unavoidable accidents with two possible outcomes depending on whether the car swerved or stayed on course, and then asked to choose the outcome they preferred. The research gathered 40 million responses from people in 233 countries.

The results, published today in the journal Nature, are just one step towards finding a social consensus around how we expect driverless cars to act, given it will be humans who write the code.
Stop handing them the stick! - Dave Moulton
"People are worthy of respect, ideas are not." Peter Ellerton, UQ

human909
Posts: 9810
Joined: Tue Dec 01, 2009 11:48 am

Re: Autonomous cars? I think not

Postby human909 » Thu Oct 25, 2018 12:47 pm

find_bruce wrote:I thought the whole point of autonomous cars is that they would be better at spotting unpredictable hazards than humans, who aren't very good at that
The emphasis is on unpredictable. Computers aren't good at predicting human behaviour.

Good sensors could mean that autonomous cars are better at spotting hazards. But humans are still better at predicting the behaviour of UNPREDICTABLE hazards. An unpredictable hazard would often be another person, but could equally be wildlife.

A pedestrian walking up to an intersection is a hazard. But if they are looking towards you, you can assess that they will likely stop. If it were a toddler or a distracted pedestrian, the appropriate reaction might be different.

AdelaidePeter
Posts: 1230
Joined: Wed Jun 07, 2017 11:13 am

Re: ABC Chart of the day

Postby AdelaidePeter » Thu Oct 25, 2018 1:19 pm

Thoglette wrote:Chart of the day: Who do we want self-driving cars to spare on the road?
Researchers from the Massachusetts Institute of Technology built an online game called the Moral Machine to test how people around the world would answer those questions.

Players were shown unavoidable accidents with two possible outcomes depending on whether the car swerved or stayed on course, and then asked to choose the outcome they preferred. The research gathered 40 million responses from people in 233 countries.

The results, published today in the journal Nature, are just one step towards finding a social consensus around how we expect driverless cars to act, given it will be humans who write the code.
"Players were shown unavoidable accidents with two possible outcomes depending on whether the car swerved or stayed on course,"

There is a third option: brake hard. Or better still, slow down pre-emptively. That is what a defensive driver does, so it should be possible to build that into self-driving cars.

So I think this talk of moral dilemmas for self-driving cars is very much exaggerated.

human909
Posts: 9810
Joined: Tue Dec 01, 2009 11:48 am

Re: ABC Chart of the day

Postby human909 » Thu Oct 25, 2018 1:48 pm

AdelaidePeter wrote:So I think this talk of moral dilemmas for self-driving cars is very much exaggerated.
It isn't exaggerated; it is a very real issue. It doesn't matter if the likelihood of the scenario is one in ten million; it doesn't change the programming needs.

The cars need to be programmed to make a choice. No choice is not an option. Thus, should the scenario arise, the program needs to be able to choose the 'least worst option'.
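For illustration, that 'least worst option' requirement might be sketched in a few lines. Everything here is hypothetical - the manoeuvre names and harm scores are invented, not any real system's values - but it shows the basic point that the planner must always return some choice:

```python
# Toy sketch of the 'least worst option' idea: every candidate
# manoeuvre gets an expected-harm score, and the planner must always
# return one of them - "no choice" is simply not a possible output.

def least_worst_option(options):
    """Return the manoeuvre with the lowest expected harm.

    options maps manoeuvre name -> expected harm (arbitrary units).
    """
    if not options:
        raise ValueError("the planner must always have a manoeuvre")
    return min(options, key=options.get)

# One contrived scenario with invented scores:
scenario = {
    "stay_course": 0.9,  # high risk of striking the pedestrian
    "brake_hard": 0.2,   # some risk of being rear-ended
    "swerve_left": 0.6,  # risk to oncoming traffic
}
print(least_worst_option(scenario))  # brake_hard
```

Even in this trivial form the ethics live in the harm scores: whoever assigns those numbers has already decided whose risk counts for how much.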

AdelaidePeter
Posts: 1230
Joined: Wed Jun 07, 2017 11:13 am

Re: ABC Chart of the day

Postby AdelaidePeter » Thu Oct 25, 2018 2:19 pm

human909 wrote:
AdelaidePeter wrote:So I think this talk of moral dilemmas for self-driving cars is very much exaggerated.
It isn't exaggerated; it is a very real issue. It doesn't matter if the likelihood of the scenario is one in ten million; it doesn't change the programming needs.

The cars need to be programmed to make a choice. No choice is not an option. Thus, should the scenario arise, the program needs to be able to choose the 'least worst option'.
How often does a human have to make the choice? Can you offer a real example where a human driver had to make the choice? Especially one in which the human driver was driving defensively in the first place.

P.S. I'm not saying it never happens. I'm just saying it is so rare that it is almost not worth worrying about. Yes, driverless cars have problems that need solving, but I think this one is a long way down the list and gets far too much airtime. (Far more important, to list one relevant to us, is reliably detecting bicycles.)

human909
Posts: 9810
Joined: Tue Dec 01, 2009 11:48 am

Re: ABC Chart of the day

Postby human909 » Thu Oct 25, 2018 2:53 pm

AdelaidePeter wrote:How often does a human have to make the choice? Can you offer a real example where a human driver had to make the choice?
Have a look at our road toll. Some of them would include bad choices, e.g.:
https://www.thecourier.com.au/story/547 ... -of-night/
AdelaidePeter wrote:I'm just saying it is so rare that it is almost not worth worrying about.
It seems you are still failing to understand the issue. It doesn't matter how rare it is; the AI still needs to be programmed to make a decision. THAT is the dilemma, back at the programming stage. The AI needs to have a decision-making process programmed into it.

You say it isn't worth worrying about yet there are already programmers programming cars to MAKE THESE DECISIONS. This isn't an imaginary issue. This is here and now.

Oops, somebody died because the programming put driver expediency ahead of the safety of a pedestrian.
Note it has been recognised that the software DID detect the pedestrian as an intersecting object but chose not to react to her.

The software detected Elaine Herzberg, a 47-year-old woman who was hit by a semi-autonomous Volvo operated by Uber, as she was crossing the street but decided not to stop right away. That’s in part because the technology was adjusted to be less reactive or slower to react to objects in its path that may be “false positives” — such as a plastic bag.
Those rides can be clumsy and filled with hard brakes as the car stops for everything that may be in its path. According to The Information, Uber decided to adjust the system so it didn’t stop for potential false positives but because of that was unable to react immediately to Herzberg in spite of detecting her in its path.
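The tuning trade-off described in that account can be sketched in a few lines. This is a toy illustration only - the function name, confidence values and thresholds are all invented, not Uber's actual code - but it shows how a single 'false positive' threshold couples ride smoothness to reaction time:

```python
# Toy illustration of threshold tuning for 'false positives'.
# All confidence values and thresholds are invented for the example.

def should_brake(confidence, threshold):
    """Brake only when detection confidence clears the tuned threshold."""
    return confidence >= threshold

PLASTIC_BAG = 0.35  # low-confidence detection, probably harmless
PEDESTRIAN = 0.55   # genuine hazard the classifier is less sure about

# Cautious tuning: the car brakes for everything, so the ride is
# clumsy and full of hard stops - but it does brake for the pedestrian.
assert should_brake(PLASTIC_BAG, threshold=0.3)
assert should_brake(PEDESTRIAN, threshold=0.3)

# Smoother tuning: raising the threshold suppresses the plastic-bag
# false positive, but the same setting now ignores the pedestrian too.
assert not should_brake(PLASTIC_BAG, threshold=0.6)
assert not should_brake(PEDESTRIAN, threshold=0.6)
```

The point of the sketch is that the threshold is not a neutral engineering constant: moving it trades occupant comfort against reaction time to real hazards.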

AdelaidePeter
Posts: 1230
Joined: Wed Jun 07, 2017 11:13 am

Re: ABC Chart of the day

Postby AdelaidePeter » Thu Oct 25, 2018 3:40 pm

human909 wrote:
AdelaidePeter wrote:How often does a human have to make the choice? Can you offer a real example where a human driver had to make the choice?
Have a look at our road toll. Some of them would include bad choices. EG:
https://www.thecourier.com.au/story/547 ... -of-night/
The article is talking about the ethical dilemma when the driver (or AI software) must choose which person to kill. I'm still looking for an example when a human, who was driving defensively, had to make that decision. Neither of your examples fits that.

human909
Posts: 9810
Joined: Tue Dec 01, 2009 11:48 am

Re: ABC Chart of the day

Postby human909 » Thu Oct 25, 2018 4:00 pm

Sorry, I don't have a database of all the world's collisions and all ethical decisions made by people. But like I keep saying, it doesn't matter how likely the scenario is; the AI still must be programmed to make choices.

These ethical choices have already resulted in a death. This situation involved an even poorer ethical choice: the AI chose the comfort and expediency of the occupant over the risk to another person's life.

User avatar
Thoglette
Posts: 6605
Joined: Thu Feb 19, 2009 1:01 pm

It's not the cars that are wrong, it's the roads. And the laws

Postby Thoglette » Sat Oct 27, 2018 10:59 pm

Driverless cars promise a future without fatal crashes, if Victoria's roads are ready

The usual wild promises of Nirvana, again with the public paying to enable private and/or foreign companies to make a buck.
Stop handing them the stick! - Dave Moulton
"People are worthy of respect, ideas are not." Peter Ellerton, UQ

human909
Posts: 9810
Joined: Tue Dec 01, 2009 11:48 am

Re: It's not the cars that are wrong, it's the roads. And the laws

Postby human909 » Sun Oct 28, 2018 9:05 am

Thoglette wrote:Driverless cars promise a future without fatal crashes, if Victoria's roads are ready

The usual wild promises of Nirvana, again with the public paying to enable private and/or foreign companies to make a buck.
I don't think that Nirvana is at all close. But one possibly promising aspect is the real and present concern about autonomous vehicles increasing congestion. Given the automation and GPS are already there, it wouldn't surprise me to see a move toward proper road use and congestion charging, beginning with driverless vehicles.

Road use charging would be fantastic for cyclists and great for people who really need to drive. Congestion is the enemy of motorists.

I'm being optimistic, but road authorities have already been publicly discussing the idea. It's hard politically to get it happening, but driverless vehicles are the foot in the door.

User avatar
Thoglette
Posts: 6605
Joined: Thu Feb 19, 2009 1:01 pm

Seminar Software on Wheels: Driver Awareness and CAN drive Trial

Postby Thoglette » Tue Nov 13, 2018 7:26 pm

https://engineersaustralia.org.au/Event ... rive-trial

*Webinar available*

Date 14/11/2018 - 12:00 pm to 01:30 pm ( AEST )
Registration Closes 14/11/2018 11:45 am
Venue: Seeing Machines Offices, 80 Mildura St, Fyshwick, Webinar Available
EA Blurb wrote: The ACT Government, together with Seeing Machines, has established an AV study - the CAN drive trial - which will, through observing driver behaviour in an automated vehicle setting, help us better understand when and why, from both a safety and a regulatory perspective, a driver should be in control rather than the automated vehicle, and help to manage the transition from one to the other with reduced risk.

The CAN drive trial supports a growing appetite internationally to understand issues such as when and how drivers will use automated driving functions and how this might impact their awareness of the environment around them, as well as their ability to take control of steering and speed functions from the vehicle when required, and at short notice.

A panel discussion will be held at Seeing Machines offices, Fyshwick, and broadcast live via a National Webinar.

Speaker 1: Mr Andrew McCredie, ACT Government AV Trial Governance Committee: Why CAN drive trial is being conducted.

Speaker 2: Mr Ken Kroeger, Chairman, Seeing Machines Ltd: What Seeing Machines are doing, and what they have found so far.
Speaker 3: Mr James Goodwin, Chief Executive Officer, Australasian New Car Assessment Program: How this work impacts ANCAP's safety rating system.
Stop handing them the stick! - Dave Moulton
"People are worthy of respect, ideas are not." Peter Ellerton, UQ

opik_bidin
Posts: 968
Joined: Sun Jun 24, 2018 5:45 pm

Re: Autonomous cars? I think not

Postby opik_bidin » Wed Nov 14, 2018 4:34 pm

https://twitter.com/wef/status/1058675216027660288

https://www.weforum.org/agenda/2018/10/ ... programmed

A self-driving car has a choice about who dies in a fatal crash. Here are the ethical considerations

Image

User avatar
Comedian
Posts: 9166
Joined: Mon Aug 09, 2010 7:35 pm
Location: Brisbane

Re: Autonomous cars? I think not

Postby Comedian » Wed Nov 14, 2018 4:39 pm

opik_bidin wrote:https://twitter.com/wef/status/1058675216027660288

https://www.weforum.org/agenda/2018/10/ ... programmed

A self-driving car has a choice about who dies in a fatal crash. Here are the ethical considerations

Image
Dogs above cats. :lol:

AdelaidePeter
Posts: 1230
Joined: Wed Jun 07, 2017 11:13 am

Re: Autonomous cars? I think not

Postby AdelaidePeter » Wed Nov 14, 2018 5:26 pm

opik_bidin wrote:https://twitter.com/wef/status/1058675216027660288

https://www.weforum.org/agenda/2018/10/ ... programmed

A self-driving car has a choice about who dies in a fatal crash
Does it?

I'm still waiting for someone to produce a real life example of when a human has had to make this sort of decision.

human909
Posts: 9810
Joined: Tue Dec 01, 2009 11:48 am

Re: Autonomous cars? I think not

Postby human909 » Wed Nov 14, 2018 6:23 pm

AdelaidePeter wrote:Does it?
Yes it does.
I'll post it again.
https://www.theguardian.com/technology/ ... zona-tempe
AdelaidePeter wrote:I'm still waiting for someone to produce a real life example of when a human has had to make this sort of decision.
You conveniently ignored all the REAL LIFE examples given.

Here are a few more:
https://www.thecourier.com.au/story/547 ... -of-night/
https://www.inverelltimes.com.au/story/ ... -accident/


But like I keep saying, and you seem to ignore it: it doesn't matter if it has NEVER EVER happened. It is likely that no female doctor pushing a stroller has ever been killed because a driver had to swerve to avoid a clown running onto a street.

But regardless, it needs to be in the AI's programming to make decisions in such circumstances.

The AI doesn't need to know about clowns or doctors or strollers, but it DOES need to know some things. You do need to program it to recognise dangers and prioritise damage outcomes. Because these choices are being made EVERY SINGLE DAY.

If you didn't have such priorities you might get AI cars swerving for a chunk of polystyrene on the road and killing a child. Or, inversely, not swerving to avoid a child because of the damage it could do to the chunk of polystyrene on the footpath.
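That priority argument can be made concrete with a toy cost table. All names and numbers below are invented for illustration, not any real system's values, but some table like this - explicit or implicit - has to exist for a planner to prefer hitting polystyrene over hitting a child:

```python
# Toy priority table: the cost numbers are invented, but a planner
# needs some such ranking (explicit or implicit) to choose between
# striking what is in its path and what is on the swerve path.

HARM_COST = {
    "child": 1000.0,
    "polystyrene_block": 0.1,
    "empty_road": 0.0,
}

def should_swerve(in_path, on_swerve_path):
    """Swerve only if the swerve path's obstacle costs less to hit."""
    return HARM_COST[on_swerve_path] < HARM_COST[in_path]

# Sane priorities: swerve away from a child onto the polystyrene, but
# never swerve away from a harmless block into a child.
assert should_swerve("child", "polystyrene_block")
assert not should_swerve("polystyrene_block", "child")
```

Leaving the table out doesn't avoid the choice; it just means the priorities are set accidentally by whatever the code happens to do.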

AdelaidePeter
Posts: 1230
Joined: Wed Jun 07, 2017 11:13 am

Re: Autonomous cars? I think not

Postby AdelaidePeter » Wed Nov 14, 2018 6:48 pm

human909 wrote:
AdelaidePeter wrote:Does it?
Yes it does.
I'll post it again.
https://www.theguardian.com/technology/ ... zona-tempe
AdelaidePeter wrote:I'm still waiting for someone to produce a real life example of when a human has had to make this sort of decision.
You conveniently ignored all the REAL LIFE examples given.

Here are a few more:
https://www.thecourier.com.au/story/547 ... -of-night/
https://www.inverelltimes.com.au/story/ ... -accident/
Again, none of them were a choice between deciding which person to kill ("who dies in a fatal crash").

human909
Posts: 9810
Joined: Tue Dec 01, 2009 11:48 am

Re: Autonomous cars? I think not

Postby human909 » Wed Nov 14, 2018 6:58 pm

AdelaidePeter wrote:Again, none of them were a choice between deciding which person to kill ("who dies in a fatal crash").
Again. The Uber car DID choose not to take action on the detected object. AKA it chose to kill. Sure, the AI didn't know it was choosing to kill, but that's the thing about AI: it does exactly what it is programmed to do.

And again, regardless: autonomous cars still need to be programmed to make decisions and choices in ANY and ALL situations.

Here is another google result. But really I'm getting sick of this.
https://thewest.com.au/news/south-west/ ... -ya-394858

It doesn't matter if it is death or a scratch of paint. The point is that autonomous vehicles need to be able to assess priorities.

AdelaidePeter
Posts: 1230
Joined: Wed Jun 07, 2017 11:13 am

Re: Autonomous cars? I think not

Postby AdelaidePeter » Wed Nov 14, 2018 7:43 pm

human909 wrote:
AdelaidePeter wrote:Again, none of them were a choice between deciding which person to kill ("who dies in a fatal crash").
And again, regardless: autonomous cars still need to be programmed to make decisions and choices in ANY and ALL situations.
Of course they have to make decisions. I never said they didn't (I hope!). But not all decisions are ethical dilemmas. The study - at least in my mind - is about decisions which are ethical dilemmas. It's not an ethical dilemma to decide whether to stop for a pedestrian or run her down. If a car can't get that right (as in the Arizona case), that's just a shoddy implementation, it's nothing to do with dilemmas.
human909 wrote: Here is another google result. But really I'm getting sick of this.
https://thewest.com.au/news/south-west/ ... -ya-394858
OK, that's a true real life example of an ethical dilemma. Very sad. I still think (as I think I said the other week) that such dilemmas are incredibly rare, and there are plenty of higher priorities for autonomous car makers. But at least you've given one sort of situation they will have to consider.

human909
Posts: 9810
Joined: Tue Dec 01, 2009 11:48 am

Re: Autonomous cars? I think not

Postby human909 » Wed Nov 14, 2018 8:02 pm

AdelaidePeter wrote: If a car can't get that right (as in the Arizona case), that's just a shoddy implementation, it's nothing to do with dilemmas.
Not true. It has EVERYTHING to do with dilemmas.

As already posted, they prioritised the owner's comfort and expediency over caution. That is EXACTLY an ethical dilemma. The original programming was overly cautious at the expense of comfort and expediency. This was changed, and a person died. The balance of those priorities is absolutely a dilemma. It is the very dilemma that EVERY motorist faces every day.

We can drastically reduce the risk of death by ensuring all vehicles, autonomous or not, travel no faster than 1kph. But priorities need to be balanced. These ethical dilemmas are here and now and have been since before AI and before cars. AI just makes the ethical choices more explicit, because they are there in black and white (1s and 0s).
AdelaidePeter wrote:
OK, that's a true real life example of an ethical dilemma. Very sad. I still think (as I think I said the other week) that such dilemmas are incredibly rare, and there are plenty of higher priorities for autonomous car makers. But at least you've given one sort of situation they will have to consider.
No, I haven't given one situation they have to consider.
You continue to miss the point and clearly aren't trying to understand even the basics of decision making or programming.

They don't need to consider ONE type of situation. They need to consider EVERY and ALL potential types of situations. Hence the ethical choices need to be programmed in from the very beginning. If not explicitly, then they are implicitly programmed in such as the case above.

User avatar
Thoglette
Posts: 6605
Joined: Thu Feb 19, 2009 1:01 pm

Re: Autonomous cars? I think not

Postby Thoglette » Wed Nov 14, 2018 8:27 pm

human909 wrote:We can drastically reduce the risk of death by ensuring all vehicles, autonomous or not, travel no faster than 30kph on urban roads without hard barriers
Just saying.
Stop handing them the stick! - Dave Moulton
"People are worthy of respect, ideas are not." Peter Ellerton, UQ

human909
Posts: 9810
Joined: Tue Dec 01, 2009 11:48 am

Re: Autonomous cars? I think not

Postby human909 » Wed Nov 14, 2018 8:56 pm

Thoglette wrote:
human909 wrote:We can drastically reduce the risk of death by ensuring all vehicles, autonomous or not, travel no faster than 30kph on urban roads without hard barriers
Just saying.
Yep. A less extreme and maybe more understandable dilemma, currently being faced more explicitly by our road authorities.

The extreme ones that seem to confound AdelaidePeter so much are just as relevant in philosophy, ethics and AI decision making. The extreme cases are just less likely, but they highlight more strongly the choices being made.

User avatar
Howzat
Posts: 850
Joined: Wed Aug 15, 2012 7:08 pm

Re: Autonomous cars? I think not

Postby Howzat » Wed Nov 14, 2018 8:59 pm

human909 wrote:If not explicitly, then they are implicitly programmed in such as the case above.
These chin-stroking "thought" pieces are just warmed-over variations on the old "trolley problem" that has been boring high school philosophy students for years.

Unavoidable accidents are not going to be accompanied by artificial choices of who to run over, even if the runaway autonomous car can tell the difference between a) a group of girl guides with puppies and b) a group of boy scouts with kittens. :roll:

If you'd like to ponder a moral dilemma - how about this: imagine you're a tech billionaire and you've been telling your investors that you're going to "disrupt" the world's transport markets with driverless technology, and you plan to launch an autonomous taxi service in, um, six months! You want to run trials on public streets, and you can persuade public officials with the promise of tech jobs for the city and board jobs for the officials. But your whiny nerd programmers express modest concerns over readiness of the technology.

So do you put the cars on the public roads or not? It's random lives vs a boatload of money.... hmmm nerds...anonymous strangers, probably cyclists too...loads of money... oooh dilemma...

human909
Posts: 9810
Joined: Tue Dec 01, 2009 11:48 am

Re: Autonomous cars? I think not

Postby human909 » Wed Nov 14, 2018 9:03 pm

Howzat wrote:These chin-stroking "thought" pieces are just warmed-over variations on the old "trolley problem" that has been boring high school philosophy students for years.
Well, yeah that is the point.
Howzat wrote:Unavoidable accidents are not going to be accompanied by artificial choices of who to run over, even if the runaway autonomous car can tell the difference between a) a group of girl guides with puppies and b) a group of boy scouts with kittens. :roll:
No. They will be accompanied by REAL choices.

But well before that, those REAL choices will have already been programmed into the AI. That is the crux of the matter. And no amount of 'look over there' distraction avoids that CURRENT reality.
Howzat wrote:So do you put the cars on the public roads or not? It's random lives vs a boatload of money.... hmmm nerds...anonymous strangers, probably cyclists too...loads of money... oooh dilemma...
Well, yeah. Congratulations, you have identified another ethical dilemma that humans face every day.


Like I said, it gets a whole lot more stark when the ethical choices are right there, explicitly written in code. It is no longer chin-stroking 'trolley problems' when your AI would make a choice in that scenario based on what is written in code.
