


Why Mercedes’ decision to let its self-driving cars kill pedestrians is probably the right thing to do (says Bloomberg)

In an interview with Car and Driver, Christoph von Hugo, Mercedes-Benz's manager of driver assistance systems, revealed that the company's future autonomous vehicles will always put the driver first. In other words, faced with a choice between its occupants and a child in the road, the car would choose to run the child over every time.

Although it feels uncomfortable that someone has to make this choice, it would be more dangerous if no one did: unless a self-driving vehicle is told what to do when a child runs into the road, it won't do anything.

Manufacturers had previously been quiet about what would happen under these circumstances, until Mercedes-Benz's announcement at the Paris Auto Show this month. According to von Hugo, all of the company's future Level 4 and Level 5 self-driving cars will be programmed to save the people they carry over anything else.

"If you know you can save at least one person, at least save that one. Save the one in the car," von Hugo said in the interview. "If all you know for sure is that one death can be prevented, then that's your first priority."

via Instapaper


Carmageddon is Coming – Future Crunch – great read via Medium

“An overlapping confluence of three different technological waves — the smartphone, the electric battery and artificial intelligence — have created the conditions for a technological disruption so profound it’s going to change almost everything about the way we move in modern society.”

via Instapaper


The long, winding road for driverless cars – must read via The Economist

“Level 3 autonomous driving is even more controversial. The main difference is that, while the driver must still remain vigilant and ready to intervene in an emergency, responsibility for all the critical safety functions is shifted from the driver to the car. This has a lot of engineers worried. Experience has not been good with control systems that relegate the operator to a managerial role outside the feedback loop, with the sole function of interceding in the case of an emergency.

It was this sort of thinking that allowed an accident at a nuclear power plant at Three Mile Island, in 1979, to escalate into a full-blown meltdown. Plant operators failed to react correctly when a valve stuck open and caused the reactor to lose cooling water. They then made matters worse by overriding the automatic emergency cooling system, thinking there was too much water in the reactor rather than too little. The accident report blamed inadequate operator training and a poorly designed computer interface.

Similar human failings have led to countless airline accidents—most recently, the Asiana Airlines crash at San Francisco in 2013. Over-reliance on automation and lack of systems understanding by the pilots when they needed to intervene were cited as major factors contributing to the Asiana crash. Some carmakers fear that—even more than reactor operators or professional pilots—untrained motorists may only compound the problem when suddenly required to take control of an otherwise fully automated system. Ford believes it is better to skip Level 3 altogether, and go straight to Level 4, even if it takes longer.”

via Instapaper


