Tesla’s Full Self-Driving Beta Test Is A Scary Experiment No Other Drivers Have Consented To
The public is being exposed to unreasonable risks by Tesla’s full self-driving beta test. And those risks could be deadly. For those who don’t know, the full self-driving beta test is a software rollout that lets select users engage Autopilot on non-highway streets.
FSD beta rollout happening tonight. Will be extremely slow & cautious, as it should.
— Elon Musk (@elonmusk) October 20, 2020
Real People Will Now Test Tesla’s Self-Driving Beta Update
Customers in Tesla’s Early Access Program will receive the update, which gives them access to the autonomous Autopilot system on city streets. As The Verge stated, “the early access program is used as a testing platform to help iron out software bugs.”
Iron out software bugs. On city streets. With real people who never consented to this science experiment. And the people running the science experiment? Normal, untrained customers of Tesla. If that doesn’t sound like a science project gone mad, watch this video.
Take 41 seconds and watch this video.
cc @Tweetermeyer @PAVECampaign @AlexRoy144 $TSLAQ pic.twitter.com/4neqQxJPwr
— TC (@TESLAcharts) October 25, 2020
And if you still don’t think this is crazy, check out this video, where a beta test car stops in the middle of an intersection, causing the car behind it to honk its horn.
“Full Self Driving” beta test car stops in the middle of an intersection, causing the car behind it to honk. Then the #Tesla cuts across a solid white line to make the turn.
(Video of Kristen Yamamoto) cc: @Tweetermeyer $TSLA $TSLAQ pic.twitter.com/jOB9O26nrC
— Greta Musk (@GretaMusk) October 25, 2020
Look, we are not haters of technology. And to be frank, the self-driving car is a wonderful technology and a complete game changer. But rolling a software beta test out among non-consenting drivers, and using random, untrained customers to run the experiment, is just reckless.
As Zero Hedge reported in March in its article “Attention NHTSA: Second Tesla In A Week Has Plowed Through Storefront In Coachella Valley,” it is only a matter of time before a Tesla in Autopilot mode crashes and kills someone. Tesla’s Autopilot does not prevent accidents. Show me high-tech software and I will show you a bug lurking inside it.
Brian W. Kernighan put this perfectly: “Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it.”
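Kernighan’s point is easy to demonstrate: even a three-line function can hide an edge-case bug that only surfaces in the field. The sketch below is purely illustrative and has nothing to do with Tesla’s actual codebase; the function names and the sensor-dropout scenario are invented for the example.

```python
# Illustrative only: a tiny, plausible-looking helper with a hidden edge case,
# the kind of bug Kernighan's maxim warns about.

def mean_speed(samples):
    """Average a list of speed readings (km/h). Looks correct at a glance."""
    # Hidden bug: if the sensor drops out and `samples` is empty,
    # this raises ZeroDivisionError instead of degrading gracefully.
    return sum(samples) / len(samples)

def mean_speed_safe(samples):
    """Same average, but handles the sensor-dropout case explicitly."""
    if not samples:
        return None  # no readings: report "unknown" rather than crash
    return sum(samples) / len(samples)
```

The buggy version passes every test that someone remembered to write; the failure mode only appears when the input no one anticipated shows up, which on a public road is exactly when you can least afford it.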
Tesla’s full self-driving beta test? It’s a mad science experiment gone wrong.
This article first appeared on The Stonk Market