News

Police Controlling Your Autonomous Car Could Add Security Risks, Experts Say

Hackers can hijack your ride

  • Police officers were recently spotted pulling over an autonomous Cruise taxi because it allegedly didn’t have its headlights on.
  • Cruise is testing computer vision and sound detection AI to help its vehicles respond to emergency vehicles.
  • Security experts say hackers can take advantage of mechanisms used by police to control self-driving cars.

Close-up view of the back of a General Motors Cruise driverless car

Smith Collection / Gado / Getty Images

Experts say self-driving cars that police can remotely control in an emergency could create security risks.

In a recent incident posted on Instagram, officers were spotted pulling over an autonomous Cruise taxi because it allegedly didn’t have its headlights on. The video shows the Cruise car coming to a stop, though it’s unclear whether any automated systems were activated. Observers say the incident shows that policies governing interactions with police will need to be established as self-driving vehicles become more common.

“Not only should law enforcement not have remote control [capabilities] because [it] will ultimately fall into the wrong hands, but technology that allows remote control should not be installed on production automobiles, even if it is a disabled feature,” Brian Contos, the chief security officer of Phosphorus Cybersecurity, told Lifewire in an email interview.

“Just having these technical capabilities, even if not activated, could lend themself to future exploitation by a nefarious actor, such as redirecting the vehicle to a different destination, causing the vehicle to operate unsafely, or disabling door locks.”

Self-Policing?

In the video, a Cruise car pulled over to the side of the road when signaled by an officer ahead of an intersection. The officer tries to open the driver-side door, but the Cruise vehicle begins to drive down the road before stopping a second time.

Cruise wrote on Twitter about the incident, saying, “Our AV yielded to the police vehicle, then pulled over to the nearest safe location for the traffic stop, as intended. An officer contacted Cruise personnel, and no citation was issued.”

But in the future, Contos suggested, law enforcement could force autonomous car makers to install ways for police to control their cars. He cited the case in which the FBI sought backdoor access to the iPhone to bypass Apple’s robust encryption, but noted that the problem with this approach is that a backdoor cannot be limited to a single entity, such as law enforcement.

“It’s basically a vulnerability which you are deliberately adding into your code,” Contos said. “So once you build that backdoor into your software, you have created a big gaping hole in your security that other actors could potentially exploit. A backdoor is a backdoor, period. The same is true with a car, it’s just a much bigger system.”

Contos speculated that attackers could trigger vehicle malfunctions on the road, which could lead to vehicles being held hostage until the owner or manufacturer paid a ransom.

Know Your Rights

Unfortunately, you may not have a legal leg to stand on if the police want to control your self-driving car, civil rights attorney Christopher Collins told Lifewire via email.

“From a legal standpoint, police already have the right to pull over vehicles under a very low standard called reasonable suspicion,” Collins said. “They can almost always point to some objective criteria to justify why they suspected a particular vehicle needed to be stopped.”

Driving in a car on the highway at sunset

Lu ShaoJi / Getty Images

The FBI is already looking into how self-driving cars will affect policing. The bureau wrote on its website that police leaders need to plan for the growing number of robot cars that will impact their operations.

“In the transition from human-operated to driverless vehicles, [autonomous vehicles] most likely will be programmed to obey traffic laws and control devices, such as stoplights,” the FBI writes. “It also appears probable that [level] 4 and 5 [self-driving] systems will adhere to those constraints more precisely than human operators, lessening the priority of traffic enforcement within a law enforcement agency.”

“Just having these technical capabilities, even if not activated, could lend themself to future exploitation…”

But in the case of autonomous vehicles without passengers, such as a street sweeper, waste disposal truck, or similar municipal government vehicle, Contos said police should have remote control capability. “That use case makes perfect sense,” he added.

Contos also suggested that if an autonomous vehicle is out of control, police could use the same analog measures they use today, such as flattening the tires with spike strips or boxing the vehicle in with police cars.

“If a passenger is suffering a medical emergency, they can pull the car over, and if the doors are locked, gain access to the passenger with a Slim Jim or break the window,” Contos said.



