Tesla's Full Self-Driving (FSD) beta may soon roll out to more drivers. Here's the lowdown:
Elon Musk, Tesla's chief executive, hinted on Twitter at the release of the FSD beta to a select group of testers. This group, estimated at under 2,000 individuals, has been testing the FSD software for almost a year, and their experiences have stirred both admiration and criticism. Tesla has not revealed the criteria for selecting these testers or how many people will be invited to the first public test.
Critics argue that Tesla hasn't consulted pedestrians, cyclists, or other road users regarding autonomous driving on public roads. Despite these criticisms, Tesla has remained silent and rarely interacts with mainstream media.
If you want to be among the first to test FSD, you may have to compromise on data privacy. You must agree to let Tesla collect data on your driving style and make judgments based on that data.
Question Time: What's Full Self-Driving technology all about?
In 2016, Tesla claimed that all its new cars had hardware capable of self-driving, and promised that software updates would eventually enable the cars to drive autonomously. Musk even suggested that passengers could nap in self-driving Tesla cars. However, the technology is far from achieving those ambitious expectations and still requires driver attention to ensure safety.
Tesla is introducing FSD due to customer frustration and dissatisfaction with the long wait for the technology. Some are growing increasingly skeptical of Musk's claims.
Even a Tesla driver admitted, "FSD Beta does not make my car self-driving."
Most experts agree that true self-driving means the person behind the wheel could nap, with no active human driver required. Regulatory agencies have repeatedly criticized Tesla's use of the term "full self-driving."
The National Highway Traffic Safety Administration has opened an investigation into crashes involving Tesla vehicles using Autopilot and has demanded extensive data from automakers about their driver assistance systems. The investigation is ongoing.
U.S. Senator Richard Blumenthal (D-Conn.) stated on the 9th that "Tesla seems to be concocting a recipe for disaster by sending inexperienced drivers to test their misleading and unproven system on public roads. Serious safety concerns should negate this reckless plan. It's Russian roulette for naive drivers and the public."
Blumenthal called for the Federal Trade Commission to investigate Tesla's Autopilot function and welcomed the NHTSA's investigation.
How does Tesla pick drivers for FSD?
Tesla announced that it would release a "Safety Score," supposedly assessing a driver's likelihood of an accident. Tesla states that this score considers hard braking, aggressive turns, close following, front collision warnings, and disabling Autopilot.
Musk stated that drivers achieving good scores within seven days would receive FSD beta access.
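Tesla has not published the exact formula behind the Safety Score, but the factors it lists suggest a weighted-penalty model. The sketch below is a toy illustration only: the factor names, weights, normalization, and 0-100 scaling are all assumptions for the example, not Tesla's proprietary calculation.

```python
# Hypothetical sketch of a Safety Score-style calculation.
# Tesla's real formula and weights are proprietary; every factor name,
# weight, and scaling choice below is an illustrative assumption.

FACTOR_WEIGHTS = {
    "hard_braking_rate": 30.0,           # hard-braking events, normalized
    "aggressive_turn_rate": 20.0,        # sharp turns, normalized
    "close_following_ratio": 25.0,       # share of time following too closely
    "collision_warning_rate": 15.0,      # forward collision warnings
    "autopilot_disengagement_rate": 10.0,  # forced Autopilot disengagements
}

def safety_score(factors: dict[str, float]) -> float:
    """Return a 0-100 score; higher means "safer" in this toy model.

    Each factor value is expected in [0, 1], where 0 is best
    (no risky events observed) and 1 is worst.
    """
    penalty = sum(
        FACTOR_WEIGHTS[name] * min(max(value, 0.0), 1.0)
        for name, value in factors.items()
    )
    return round(100.0 - penalty, 1)

# In a model like this, a driver with a handful of risky events
# can still land a high score.
example = {
    "hard_braking_rate": 0.05,
    "aggressive_turn_rate": 0.02,
    "close_following_ratio": 0.10,
    "collision_warning_rate": 0.0,
    "autopilot_disengagement_rate": 0.0,
}
print(safety_score(example))  # 95.6
```

A weighted sum like this would explain the pattern owners describe: because each factor only subtracts a capped penalty, isolated risky maneuvers barely dent the score, which is consistent with reports of high scores despite unsafe driving.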
Opinions about Tesla's safety scoring system have been mixed on social media. Some people expressed gratitude and acceptance, while others were surprised their scores were so high or felt their scores were too low. Still others described gaming the system to improve their scores, which is hardly the behavior of a safe driver.
A Tesla owner claimed to have received 95 out of 100 points after running a red light, not braking for a cyclist, and driving over a stop sign.
Not everyone will get access right away
Drivers whose vehicles have older Tesla touchscreen computers have reported, on social media and to CNN Business, difficulty signing up for the FSD beta.
Owners of cars built before the 2016 FSD hardware update have also had limited access, and overseas Tesla owners reported on social media that they cannot apply for "full self-driving" at all. It's unclear how many people will have the opportunity to request FSD beta access, or when they will be able to start supervising the system on the road. Tesla has not disclosed how many drivers have purchased the FSD option.