Dan O’Dowd, CEO of a software company that posted the video earlier this month, thinks the National Highway Traffic Safety Administration should ban “full self-driving” until Tesla CEO Elon Musk can “prove it won’t kill children.”
That’s when Cupani, who runs an auto shop focused on imports and Teslas, got involved and recruited his son. While he describes himself as a “BMW guy,” Cupani says BMW’s software doesn’t compare to what Tesla offers.
“Some people look at it and say ‘oh this crazy dad, what is he doing?'” Cupani told CNN Business.
“Well, I do a lot of things like that, but I’ll make sure my son doesn’t get hit.”
Cupani revved up the Tesla from across the lot and turned on “full self-driving,” reaching 35 mph (56 km/h). The Tesla braked steadily and came to a complete stop, well ahead of his son.
“This guy Dan says he’s an expert at this, an expert at that,” Cupani said.
“Well, I’m an automotive expert, future technology, professional driving instructor.”
The passionate defenses and criticisms of “full self-driving” highlight how the technology has become a flash point in the industry.
Ralph Nader, whose criticism of the auto industry in the 1960s helped fuel the creation of the National Highway Traffic Safety Administration, joined a chorus of critics of “full self-driving” this month.
But it’s also yet another example of the unintended consequences of deploying unfinished, disruptive technology in the wild, and it shows how far some Tesla believers will go to defend the technology and the company.
Enough people seemed to be running their own experiments that a government agency took the extraordinary step of warning people not to use children to test car technology.
“Consumers should never attempt to create their own test scenarios or use real people, and especially children, to test the performance of vehicle technology,” the NHTSA said in a statement Wednesday. The agency called this approach “highly dangerous.”
Earlier this month, California resident Tad Park saw that another Tesla enthusiast wanted to test “full self-driving” with a child and offered two of his own children.
Park told CNN Business that it was “a little difficult” to get his wife to agree. She came around after he promised that he would be the one driving the vehicle.
“I’m never going to push the limits because my kids are so much more valuable to me than anything else,” Park said.
“I am not going to risk their lives in any way.”
Park said he didn’t feel comfortable doing a 40-mph speed test, like the one O’Dowd did with the dummies, with his children.
Toronto resident Franklin Cadamuro created a “box boy,” a boy’s form crafted from old Amazon cardboard boxes.
His Tesla slowed as it approached the “box boy,” then sped up again and hit the cardboard mannequin. Cadamuro speculated that the cameras could no longer see the short boxes once they were immediately in front of the bumper, and so the car forgot they were there.
Cadamuro said his video started out as entertainment. But he wanted people to see that “full self-driving” isn’t perfect.
“I think a lot of people have two extreme thoughts about the ‘full self-driving’ beta,” Cadamuro said.
“People like Dan think it’s the worst thing in the world. I know some friends who think it’s almost perfect.”
Cadamuro said he also ran other tests in which his Tesla, traveling at higher speeds, effectively drove around the “box boy.”
According to Raj Rajkumar, a Carnegie Mellon University professor who researches autonomous vehicles, a computer vision system like the one Tesla’s vehicles use will generally have a harder time detecting smaller objects, such as young children, quickly and accurately than it does detecting larger, adult-sized objects.
The more pixels an object occupies in a camera image, the more information the system has to detect features and identify the object. The system will also be affected by the data it is trained on, such as the number of images of young children it is exposed to.
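Rajkumar’s point about pixel counts can be illustrated with a simple pinhole-camera approximation. This is a hypothetical sketch, not Tesla’s actual vision pipeline; the focal length and heights below are illustrative values:

```python
# Illustrative sketch: under a pinhole-camera model, an object's
# on-image height in pixels is proportional to its real height
# divided by its distance from the camera.

def apparent_height_px(real_height_m, distance_m, focal_px=1000):
    """Approximate on-image height of an object, in pixels."""
    return focal_px * real_height_m / distance_m

adult = apparent_height_px(1.75, 20)   # adult at 20 m -> 87.5 px
child = apparent_height_px(0.90, 20)   # young child at 20 m -> 45.0 px
```

At the same distance, the child covers roughly half as many pixels as the adult, giving a detector far less image information to work with.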
“Computer vision with machine learning is not 100 percent foolproof,” Rajkumar said.
“Just like diagnosing a disease, there are always false positives and negatives.”
Tesla did not respond to a request for comment and does not generally engage with the professional media.
Some Tesla supporters criticized O’Dowd’s use of cones as lane markings in his original test, which may have limited the sedan’s ability to maneuver around the dummy.
Others claimed that O’Dowd’s test driver had forced the Tesla to hit the dummy by pressing on the accelerator, which was not seen in the videos O’Dowd posted.
Some Tesla enthusiasts also pointed to blurred messages on the Tesla vehicle’s screen as an indication that O’Dowd’s test driver was pressing the accelerator to tamper with the tests.
O’Dowd told CNN Business that the blurry messages referred to the supercharging not being available and uneven tire wear.
CNN Business was unable to independently verify what the message said, as O’Dowd did not provide sharper video of what happened in the car during testing.
O’Dowd is the founder of the Dawn Project, an effort to make computers safe for humanity. He ran unsuccessfully for the US Senate this year in a campaign focused solely on his criticism of “full self-driving.”
NHTSA is currently investigating Tesla’s driver assistance technology, so changes are possible.
“The software that controls the lives of billions of people in self-driving cars should be the best software ever written,” O’Dowd said.
“We’re operating under total Wild West chaos, and we’ve come up with something that’s terrible.”