On Wednesday, Israeli firm Regulus Cyber issued a press release stating that "spoofing attacks against the Tesla GNSS (GPS) receiver could easily be carried out wirelessly and remotely." In the Model 3 demonstration, the car reacted as though its exit were only 500 feet away: it suddenly slowed down, activated the right turn signal, and made a sharp turn off the main road, according to Regulus. "The driver immediately took manual control but could not stop the car from leaving the road."
Tesla's official response can best be described as dismissive.
"These marketing claims are simply a for-profit company's attempt to use Tesla's name to mislead the public into believing there is a problem that requires the purchase of that company's product. That is simply not the case. Safety is our top priority, and we do not have any safety concerns about these claims."
Tesla official spokesperson
So a company most of us have never heard of tells us it has demonstrated disturbing vulnerabilities in a Tesla. Tesla, for its part, says the company is just chasing a dollar and that there is no real problem, but offers no details of its own. Where does the truth lie? Answering that question requires looking at the merits of this specific Regulus-Tesla dispute.
A closer look at the Regulus demo
If you read the first paragraph of this article and came away thinking that evil hackers can take remote control of a car and send it careening off the road with no strings attached, don't feel bad; you were almost certainly supposed to. But the reality is very different. The first, most obvious objection is that Regulus physically attached an antenna to the roof of the Model 3 and wired it into its own equipment before the demonstration. This is not actually the smoking gun it might seem, though; it would be possible to achieve the same effect without the antenna or cables, it would just be extremely irresponsible (and most likely illegal).
We'll get into some of the finer technical details later, but GNSS spoofing is a simple broadcast transmission that can be expected to affect a wide area. Taping the antenna to the Model 3's roof allowed Regulus to use far less power than would otherwise be required, so the company needed to worry much less about accidentally affecting other, unrelated GPS devices nearby. That said, I don't mind giving them a pass here; genuinely malicious actors would presumably have fewer scruples and therefore wouldn't need to physically attach an antenna and wiring to attack a car. The real problem is a bit less obvious, and you would be unlikely to notice it unless you found Regulus Cyber's own blog post about the experiment, which is considerably more detailed and forthcoming than the press release.
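To get a rough sense of why proximity matters, here is a back-of-the-envelope sketch of my own (not from Regulus) using the standard free-space path loss formula. The roughly -128.5 dBm figure for a genuine GPS L1 signal at the receiver is a commonly cited approximation, and the 3 dB margin is an arbitrary assumption; real spoofing hardware and propagation are messier than this.

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB (Friis formula)."""
    c = 299_792_458.0  # speed of light, m/s
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / c))

GPS_L1_HZ = 1_575_420_000      # GPS L1 carrier frequency
REAL_SIGNAL_DBM = -128.5       # approx. genuine GPS signal power at the receiver

def required_tx_power_dbm(distance_m: float, margin_db: float = 3.0) -> float:
    """Transmit power needed for a spoofed signal to beat the real one."""
    return REAL_SIGNAL_DBM + margin_db + fspl_db(distance_m, GPS_L1_HZ)

# Power scales with distance squared: an antenna taped to the roof
# needs orders of magnitude less power than one across the street.
for d in (0.3, 10, 100, 1000):
    print(f"{d:7.1f} m -> {required_tx_power_dbm(d):6.1f} dBm")
```

The point of the sketch: at 30 cm the spoofer can run at a vanishingly small power level, while covering a whole block takes orders of magnitude more, which is exactly when nearby, unrelated GPS receivers start getting hit.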
This video from an earlier test is an excellent example of the "Pied Piper" attack that Regulus successfully performed against the Model 3. It is entirely possible, even fairly trivial if one doesn't mind committing a felony, to use GNSS spoofing to convince an autonomous or semi-autonomous car that it is not where it thinks it is and cause it to take a wrong turn.
But this attack is like handing Mom or Dad the wrong map on a family vacation: sure, you might get lost, but the wrong map won't plow the car into a tree. Just like the human driver in our example, an autonomous or semi-autonomous car only uses GPS to decide which route to take; what is or is not drivable road is decided entirely by local sensors. For a human driver, "local sensors" mostly means a pair of good old-fashioned Mk I Eyeballs; in a Tesla, it's radar, ultrasonics, and a set of eight cameras providing full 360-degree visual coverage. I have spoken with representatives from Tesla, Uber, and Cruise, and all of them made the same point. In essence, these companies say that GPS helps the car decide which route to take, but it has nothing to do with the car's determination of what is or is not road in the first place.
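That separation of concerns can be sketched in a few lines. This is a hypothetical toy, not Tesla's actual control logic; the state fields and thresholds are invented for illustration. The key property is that GPS only *suggests* a route, while local perception has veto power over any maneuver:

```python
from dataclasses import dataclass

@dataclass
class NavState:
    gps_route_hint: str    # routing suggestion derived from GPS (may be spoofed)
    exit_lane_seen: bool   # do the cameras actually see an exit lane?
    clear_ahead_m: float   # clear distance reported by radar/ultrasonics

def next_action(state: NavState) -> str:
    """Toy control loop: GPS proposes, local sensors dispose."""
    # Local sensing overrides routing unconditionally.
    if state.clear_ahead_m < 5.0:
        return "brake"       # obstacle ahead: routing is irrelevant
    if state.gps_route_hint == "take_next_exit":
        if state.exit_lane_seen:
            return "take_exit"   # spoofed GPS can only steer onto real road
        return "continue"        # no exit lane perceived: ignore the hint
    return "continue"
```

Under this model, a spoofed `gps_route_hint` can make the car take a wrong but physically real turn (the Pied Piper attack), yet it can never make the car drive into something its sensors see, which is exactly the distinction the article draws.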
Listing image by Regulus Cyber