
Musk is angry! Tesla's legal department orders the child-dummy crash video deleted across the internet

  • joy
  • 2022-08-26 14:48:42
  • 337 reads

  Musk is genuinely angry this time, and has directed Tesla's legal department to pursue the matter to the end.

  All because of a video of less than one minute that, in Tesla's words, "distorts the level of Tesla's autonomous driving technology."

  It also "harmed Tesla's business interests and spread defamatory information to the public."

  The warning letter states in black and white that if the video is not taken down, further legal action will follow.

  The seriousness of the response also has to do with the fallout after the video was released.

  Not only did many people voice disappointment and doubt about Tesla's autonomous driving, but some Tesla owners even went to the extreme of recreating the test with their own children.

  What's in the video?

  A child dummy stands motionless at a zebra crossing. What will Tesla FSD do?

  The test result: the car behaved as if it saw nothing and drove straight into the dummy, without even braking to slow down.

  To ensure fairness, the test was repeated immediately.

  FSD Beta 10.12.2 was enabled, and the driver was not pressing the accelerator pedal.

  The display on the Tesla's central control screen did not identify the dummy.

  The Tesla knocked the child dummy straight over at 38.62 km/h.

  Immediately after the collision, FSD automatically disengaged.

  The above is one of the videos that Tesla's warning letter demands its creator, the Dawn Project, take down.

  It should be added that the test took place at a zebra crossing outside a school in Santa Barbara, California.

  The child dummy, wearing a yellow safety vest, is clearly visible to the naked eye. There are no traffic cones around it, and there is ample room to swerve.

  In addition, the Dawn Project has run ads on American television to publicize the potential dangers of Tesla FSD.

  The organization was founded mainly to oppose unsafe software systems. Its official website carries a striking slogan:

  "The first danger we have to address is Musk's reckless drive to deploy unsafe self-driving cars on the road."

  In response to the video, Tesla legal counsel Dinna Eskin said:

The so-called test misuses and misrepresents the true capabilities of Tesla's (autonomous driving) technology, and ignores the findings of many independent third-party testers as well as the real-world experience of consumers.

  Tesla's warning letter sternly stated that the video "harmed Tesla's business interests and spread defamatory information to the public," and that "your actions actually put consumers at risk."

  Dawn Project founder Dan O'Dowd is also the CEO of the American software company Green Hills Software.

  After receiving the warning letter, he reacted strongly, tweeting that Musk had threatened to sue him.

  What effect did the video have?

  After its release, the video made waves online.

  First, many people pointed the finger at Tesla's autonomous driving, raising a range of criticisms and doubts.

  For example, "Can Tesla detect an adult-sized dummy? ... If it can't handle complex scenarios, it's not really ready."

  For example, "Is Tesla going to stop with an ice cream cone instead of a child dummy?"

  Other netizens are not optimistic about Tesla's vision-only approach:

Relying on computer vision alone creates a road-safety hazard; these self-driving cars should be required to carry radar or lidar.

  But some netizens rushed to Tesla's defense, questioning the validity of the video:

1. Tesla does not claim its cars can drive fully autonomously; 2. with FSD engaged, the driver must stay attentive the whole time and be ready to take over; 3. the AI may not recognize a mannequin as a pedestrian; 4. the driver deliberately drove the car too fast.

  The video also had unexpected repercussions: a number of Tesla owners went to extremes and recreated the test with their own children.

  Videos of such tests kept appearing, drawing the attention of YouTube.

  In the end, the video posted by Tesla fan account @Whole Mars Catalog was taken down by YouTube.

  YouTube spokeswoman Elena Hernandez also responded publicly:

YouTube does not allow content that shows minors engaging in dangerous activities or that encourages minors to do so.

  Even before that, media outlets including Wired had urged that even the most die-hard Tesla fans should not test with real children...

  Too extreme.

