Autonomous cars? I think not

NASHIE
Posts: 666
Joined: Tue Jun 04, 2013 9:16 pm
Location: Perth, WA

Re: Autonomous cars? I think not

Postby NASHIE » Wed Oct 17, 2018 12:05 am

Thoglette wrote:
NASHIE wrote:I wasn't very clear, but I was commenting on our backwater mining town Perth :wink:

But we have an autonomous bus. Or so the RAC WA claim!


:lol: Last time I rode past, it had two traffic warden vehicles in escort.

Cheesewheel
Posts: 1108
Joined: Wed Nov 16, 2011 9:22 pm

Re: Autonomous cars? I think not

Postby Cheesewheel » Wed Oct 24, 2018 6:26 am

https://tech.co/mapping-driverless-car- ... ia-2018-10

An overview of autonomous car crashes in California (the state's mandatory reporting of any and all autonomous car crashes offers a good body of incidents to examine).

Takeaways are that most crashes occur at low speeds (1-10 mph) and that a big cause of accidents is the cars being overly cautious ... they tend to freak out a bit at intersections and such, pottering around like learner drivers, which commonly leads to rear-ending by the humans. The article also points out specific infrastructure that exacerbates this hyper-cautiousness (such as stop signs). So there are road design issues that need to be looked at before large numbers of such cars on our roads become feasible.
Go!Run!GAH!

Thoglette
Posts: 4292
Joined: Thu Feb 19, 2009 1:01 pm

ABC Chart of the day

Postby Thoglette » Thu Oct 25, 2018 11:49 am

Chart of the day: Who do we want self-driving cars to spare on the road?
Researchers from the Massachusetts Institute of Technology built an online game called the Moral Machine to test how people around the world would answer those questions.

Players were shown unavoidable accidents with two possible outcomes depending on whether the car swerved or stayed on course, and then asked to choose the outcome they preferred. The research gathered 40 million responses from people in 233 countries and territories.

The results, published today in the journal Nature, are just one step towards finding a social consensus around how we expect driverless cars to act, given it will be humans who write the code.
Stop handing them the stick! - Dave Moulton
"People are worthy of respect, ideas are not." Peter Ellerton, UQ

human909
Posts: 9047
Joined: Tue Dec 01, 2009 11:48 am

Re: Autonomous cars? I think not

Postby human909 » Thu Oct 25, 2018 12:47 pm

find_bruce wrote:I thought the whole point of autonomous cars is that they would be better at spotting unpredictable hazards than humans, who aren't very good at that

The emphasis is on unpredictable. Computers aren't good at predicting human behaviour.

Good sensors could mean that autonomous cars are better at spotting hazards. But humans are still better at predicting the behaviour of UNPREDICTABLE hazards. An unpredictable hazard would often be another person, but could equally be wildlife.

A pedestrian walking up to an intersection is a hazard. But if they are looking towards you then you can assess that the pedestrian will likely stop. If it was a toddler or a distracted pedestrian then the appropriate reaction might be different.

AdelaidePeter
Posts: 465
Joined: Wed Jun 07, 2017 11:13 am

Re: ABC Chart of the day

Postby AdelaidePeter » Thu Oct 25, 2018 1:19 pm

Thoglette wrote:Chart of the day: Who do we want self-driving cars to spare on the road?
Researchers from the Massachusetts Institute of Technology built an online game called the Moral Machine to test how people around the world would answer those questions.

Players were shown unavoidable accidents with two possible outcomes depending on whether the car swerved or stayed on course, and then asked to choose the outcome they preferred. The research gathered 40 million responses from people in 233 countries and territories.

The results, published today in the journal Nature, are just one step towards finding a social consensus around how we expect driverless cars to act, given it will be humans who write the code.


"Players were shown unavoidable accidents with two possible outcomes depending on whether the car swerved or stayed on course,"

There is a third option: brake hard. Or better still, slow down pre-emptively. That is what a defensive driver does, so it should be possible to build that into self-driving cars.

So I think this talk of moral dilemmas for self-driving cars is very much exaggerated.

human909
Posts: 9047
Joined: Tue Dec 01, 2009 11:48 am

Re: ABC Chart of the day

Postby human909 » Thu Oct 25, 2018 1:48 pm

AdelaidePeter wrote:So I think this talk of moral dilemmas for self-driving cars is very much exaggerated.

It isn't exaggerated; it is a very real issue. It doesn't matter if the likelihood of the scenario is one in ten million; it doesn't change the programming needs.

The cars need to be programmed to make a choice. No choice is not an option. Thus should the scenario arise the program needs to be able to choose the 'least worst option'.

AdelaidePeter
Posts: 465
Joined: Wed Jun 07, 2017 11:13 am

Re: ABC Chart of the day

Postby AdelaidePeter » Thu Oct 25, 2018 2:19 pm

human909 wrote:
AdelaidePeter wrote:So I think this talk of moral dilemmas for self-driving cars is very much exaggerated.

It isn't exaggerated; it is a very real issue. It doesn't matter if the likelihood of the scenario is one in ten million; it doesn't change the programming needs.

The cars need to be programmed to make a choice. No choice is not an option. Thus should the scenario arise the program needs to be able to choose the 'least worst option'.


How often does a human have to make the choice? Can you offer a real example where a human driver had to make the choice? Especially one in which the human driver was driving defensively in the first place.

p.s. I'm not saying it never happens. I'm just saying it is so rare that it is almost not worth worrying about. Yes driverless cars have problems that need solving, but I think this one is a long way down the list and gets far too much airtime. (Far more important, to list one relevant to us, is reliably detecting bicycles).

human909
Posts: 9047
Joined: Tue Dec 01, 2009 11:48 am

Re: ABC Chart of the day

Postby human909 » Thu Oct 25, 2018 2:53 pm

AdelaidePeter wrote:How often does a human have to make the choice? Can you offer a real example where a human driver had to make the choice?

Have a look at our road toll. Some of those deaths would involve bad choices, e.g.:
https://www.thecourier.com.au/story/547 ... -of-night/

AdelaidePeter wrote:I'm just saying it is so rare that it is almost not worth worrying about.

It seems you are still failing to understand the issue. It doesn't matter how rare it is, the AI still needs to be programmed to make a decision. THAT is the dilemma, back at the programming stage. The AI needs to have a decision-making process programmed into it.

You say it isn't worth worrying about yet there are already programmers programming cars to MAKE THESE DECISIONS. This isn't an imaginary issue. This is here and now.

Oops, somebody died because the programming put driver expediency ahead of the safety of a pedestrian.
Note it has been recognised that the software DID detect the pedestrian as an intersecting object but chose not to react to her.


The software detected Elaine Herzberg, a 47-year-old woman who was hit by a semi-autonomous Volvo operated by Uber, as she was crossing the street but decided not to stop right away. That’s in part because the technology was adjusted to be less reactive or slower to react to objects in its path that may be “false positives” — such as a plastic bag.


Those rides can be clumsy and filled with hard brakes as the car stops for everything that may be in its path. According to The Information, Uber decided to adjust the system so it didn’t stop for potential false positives but because of that was unable to react immediately to Herzberg in spite of detecting her in its path.
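The trade-off the article describes boils down to a confidence threshold on object detections. A rough sketch of the idea (all names and numbers here are invented for illustration; this is not Uber's actual logic):

```python
# Illustrative sketch of a detection-confidence threshold trade-off.
# Names and numbers are hypothetical, not any vendor's real system.

def should_brake(detection_confidence, threshold):
    """Brake only when the perception system is confident the object
    in the path is real, rather than a 'false positive' like a bag."""
    return detection_confidence >= threshold

# Detections at low, middling, and high confidence:
detections = (0.25, 0.6, 0.9)

# A cautious threshold brakes for almost everything -> a clumsy ride
# full of hard stops for plastic bags:
cautious = [should_brake(c, 0.2) for c in detections]  # [True, True, True]

# Raising the threshold smooths the ride, but also suppresses the
# reaction to a genuine pedestrian detected with middling confidence:
relaxed = [should_brake(c, 0.7) for c in detections]  # [False, False, True]
```

The point is that "less reactive to false positives" and "slower to react to real pedestrians" are the same knob turned the same way.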

AdelaidePeter
Posts: 465
Joined: Wed Jun 07, 2017 11:13 am

Re: ABC Chart of the day

Postby AdelaidePeter » Thu Oct 25, 2018 3:40 pm

human909 wrote:
AdelaidePeter wrote:How often does a human have to make the choice? Can you offer a real example where a human driver had to make the choice?

Have a look at our road toll. Some of them would include bad choices. EG:
https://www.thecourier.com.au/story/547 ... -of-night/


The article is talking about the ethical dilemma when the driver (or AI software) must choose which person to kill. I'm still looking for an example when a human, who was driving defensively, had to make that decision. Neither of your examples fits that.

human909
Posts: 9047
Joined: Tue Dec 01, 2009 11:48 am

Re: ABC Chart of the day

Postby human909 » Thu Oct 25, 2018 4:00 pm

Sorry, I don't have a database of all the world's collisions and all ethical decisions made by people. But as I keep saying: it doesn't matter how likely the scenario is, the AI still must be programmed to make choices.

These ethical choices have already resulted in a death. In that situation an even poorer ethical choice was made: the AI chose the comfort and expediency of the occupant over the risk to another person's life.

Thoglette
Posts: 4292
Joined: Thu Feb 19, 2009 1:01 pm

It's not the cars that are wrong, it's the roads. And the laws

Postby Thoglette » Sat Oct 27, 2018 10:59 pm

Driverless cars promise a future without fatal crashes, if Victoria's roads are ready

The usual wild promises of Nirvana, again with the public paying to enable private and/or foreign companies to make a buck.
Stop handing them the stick! - Dave Moulton
"People are worthy of respect, ideas are not." Peter Ellerton, UQ

human909
Posts: 9047
Joined: Tue Dec 01, 2009 11:48 am

Re: It's not the cars that are wrong, it's the roads. And the laws

Postby human909 » Sun Oct 28, 2018 9:05 am

Thoglette wrote:Driverless cars promise a future without fatal crashes, if Victoria's roads are ready

The usual wild promises of Nirvana, again with the public paying to enable private and/or foreign companies to make a buck.


I don't think that Nirvana is at all close. But one possibly promising aspect is the real and present concern about automated vehicles increasing congestion. Given that the automation and GPS are already there, it wouldn't surprise me to see a move toward proper road use and congestion charging beginning with driverless vehicles.

Road use charging would be fantastic for cyclists and great for people who really need to drive. Congestion is the enemy of motorists.

I'm being optimistic, but road authorities have already been publicly discussing the idea. It's hard politically to get it happening, but driverless vehicles are the foot in the door.

Thoglette
Posts: 4292
Joined: Thu Feb 19, 2009 1:01 pm

Seminar Software on Wheels: Driver Awareness and CAN drive Trial

Postby Thoglette » Tue Nov 13, 2018 7:26 pm

https://engineersaustralia.org.au/Event ... rive-trial

*Webinar available*

Date 14/11/2018 - 12:00 pm to 01:30 pm (AEST)
Registration Closes 14/11/2018 11:45 am
Venue: Seeing Machines Offices, 80 Mildura St, Fyshwick, Webinar Available

EA Blurb wrote:The ACT Government, together with Seeing Machines, has established an AV study - the CAN drive trial - which will, through observing driver behaviour in an automated vehicle setting, help us better understand when and why, from both a safety and a regulatory perspective, a driver should be in control rather than the automated vehicle, and help to manage the transition from one to the other with reduced risk.

The CAN drive trial supports a growing appetite internationally to understand issues such as when and how drivers will use automated driving functions, how those functions might impact their awareness of the environment around them, and their ability to take control of steering and speed from the vehicle when required, at short notice.

A panel discussion will be held at Seeing Machines offices, Fyshwick, and broadcast live via a National Webinar.

Speaker 1: Mr Andrew McCredie, ACT Government AV Trial Governance Committee: Why CAN drive trial is being conducted.

Speaker 2: Mr Ken Kroeger, Chairman, Seeing Machines Ltd: What Seeing Machines are doing, and what they have found so far.
Speaker 3: Mr James Goodwin, Chief Executive Officer, Australasian New Car Assessment Program: How this work impacts ANCAP's safety rating system.
Stop handing them the stick! - Dave Moulton
"People are worthy of respect, ideas are not." Peter Ellerton, UQ

opik_bidin
Posts: 97
Joined: Sun Jun 24, 2018 5:45 pm

Re: Autonomous cars? I think not

Postby opik_bidin » Wed Nov 14, 2018 4:34 pm

https://twitter.com/wef/status/1058675216027660288

https://www.weforum.org/agenda/2018/10/ ... programmed

A self-driving car has a choice about who dies in a fatal crash. Here are the ethical considerations


Comedian
Posts: 5522
Joined: Mon Aug 09, 2010 7:35 pm
Location: Brisbane

Re: Autonomous cars? I think not

Postby Comedian » Wed Nov 14, 2018 4:39 pm

opik_bidin wrote:https://twitter.com/wef/status/1058675216027660288

https://www.weforum.org/agenda/2018/10/ ... programmed

A self-driving car has a choice about who dies in a fatal crash. Here are the ethical considerations


Dogs above cats. :lol:

AdelaidePeter
Posts: 465
Joined: Wed Jun 07, 2017 11:13 am

Re: Autonomous cars? I think not

Postby AdelaidePeter » Wed Nov 14, 2018 5:26 pm

opik_bidin wrote:https://twitter.com/wef/status/1058675216027660288

https://www.weforum.org/agenda/2018/10/ ... programmed

A self-driving car has a choice about who dies in a fatal crash


Does it?

I'm still waiting for someone to produce a real life example of when a human has had to make this sort of decision.

human909
Posts: 9047
Joined: Tue Dec 01, 2009 11:48 am

Re: Autonomous cars? I think not

Postby human909 » Wed Nov 14, 2018 6:23 pm

AdelaidePeter wrote:Does it?

Yes it does.
I'll post it again.
https://www.theguardian.com/technology/ ... zona-tempe

AdelaidePeter wrote:I'm still waiting for someone to produce a real life example of when a human has had to make this sort of decision.

You conveniently ignored all the REAL LIFE examples given.

Here are a few more:
https://www.thecourier.com.au/story/547 ... -of-night/
https://www.inverelltimes.com.au/story/ ... -accident/


But as I keep saying (and you keep ignoring): it doesn't matter if it has NEVER EVER happened. It is likely that no female doctor pushing a stroller has ever been killed because a driver had to swerve to avoid a clown running onto a street.

But regardless, it needs to be in the AI's programming to make decisions in such circumstances.

The AI doesn't need to know about clowns or doctors or strollers, but it DOES need to know some things. You do need to program it to recognise dangers and prioritise damage outcomes, because these choices are being made EVERY SINGLE DAY.

If you didn't have such priorities you might get AI cars swerving for a chunk of polystyrene on the road and killing a child. Or, inversely, not swerving for a child because of the damage it could do to a chunk of polystyrene on the footpath.
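To make that concrete, here's a minimal sketch of what a "least worst option" chooser could look like. The object categories and harm weights are entirely invented for illustration; any real system would be vastly more complicated, but it still has to encode priorities like these somewhere:

```python
# Hypothetical 'least worst option' chooser: score each available
# manoeuvre by the harm it risks and pick the minimum. Categories
# and weights are invented for illustration only.

HARM = {"person": 1000, "animal": 50, "debris": 1, "nothing": 0}

def least_worst(options):
    """options: dict mapping manoeuvre name -> object it would hit.
    Returns the manoeuvre with the lowest harm score."""
    return min(options, key=lambda m: HARM[options[m]])

# Swerving around polystyrene into a child must score catastrophically
# worse than just driving through the polystyrene:
choice = least_worst({"brake": "debris", "swerve": "person"})
# choice is "brake": hitting debris scores 1, hitting a person 1000
```

The dilemma isn't in the code itself; it's in who decides the numbers in that HARM table, and on what basis.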
