A Tesla fan tested the child-detection capabilities of the automaker’s Full Self-Driving software with two children.
He did it after another test showed the self-driving software colliding with a child-sized mannequin.
The test shows that FSD stops for children at low speeds, but it was not subject to US testing standards.
A Tesla fan tested the automaker’s Full Self-Driving (FSD) software with a real child after a video went viral of the electric car crashing into a toddler-sized mannequin.
Tesla critic Dan O’Dowd’s marketing campaign, which showed Tesla’s FSD colliding with a child-sized mannequin, quickly sparked outrage in the Tesla community — and angered one Tesla fan in particular, @WholeMarsBlog.
Last week, Omar Qazi, the person who runs Tesla fan account @WholeMarsBlog, made a call on Twitter for a San Francisco parent willing to use their child for a test of the software. Qazi said he was eager to refute the results of O’Dowd’s test and show that he was “lying about life-saving technology.”
The Twitter post quickly garnered a response online, and The Verge’s transportation editor even published an open letter urging the Tesla fan to cancel the test.
On Sunday, Qazi, who is known to interact with Elon Musk on Twitter and has more than 130,000 followers, posted a YouTube video in which he tested FSD using real children.
The Tesla fan did not respond to an Insider request for comment ahead of publication, but said on Twitter that the test would be safe because a human would take over if the Tesla didn’t brake.
Volte Equity CEO Tad Park took two of his children for the test with Qazi on a residential street.
“I’m confident I can trust FSD with my kids and I’m also in control of the wheel so I can brake at any time,” Park said in the video.
In one test, Park’s daughter was standing in the middle of the road, and Tesla FSD appeared to recognize the child from a standstill about three car lengths away. In the video, the car drove forward at five miles per hour and stopped, refusing to move until the child was out of its path.
Park did a similar test with his five-year-old son. The boy was walking across the street as the Tesla approached him from the same distance at less than 10 miles per hour. The Tesla appeared to slow down and did not proceed until the child had crossed the street.
“It was a little unnerving at first, but I knew it would detect and stop,” Park said in the video. “I think it’s very important that this is there. I think it will save a lot of children’s lives.”
The video contains multiple tests showing the software recognizing both a mannequin and an adult man in the middle of the road. However, none of the tests were run at speeds above about 20 miles per hour. By comparison, the test Qazi criticized was conducted at 40 miles per hour from a distance of 120 meters.
Qazi isn’t the first Tesla fan to try to replicate the test. After the video was released, several FSD drivers took to the streets to see if the system could recognize a child-sized dummy, with mixed results.
It is important to note that none of these tests were conducted under the supervision of a US regulator, but rather independently — meaning they were not subject to standardized testing protocols.
Despite its name, FSD does not make a Tesla fully self-driving. In reality, it is an optional add-on that allows Teslas to automatically change lanes, enter and exit highways, recognize stop signs and traffic lights, and park. Tesla has told drivers that the system does not replace a licensed driver, and instructs them to keep their hands on the wheel and be prepared to take over while the system is engaged.
Read the original article on Business Insider