• A driver in North Carolina, US, was watching a movie while driving his Tesla Model S when it careened into a police car, authorities said. 
  • The man was charged with two crimes: not moving over for police, and watching TV while driving. 
  • There have been many past incidents of careless drivers putting too much faith in the driver-assistance software, Autopilot, which has been criticized for its name.

A Tesla slammed into a North Carolina state trooper's cruiser as its driver, a doctor, watched a movie on his phone, authorities said.

The crash occurred around midnight on a rural stretch of a divided four-lane highway, where the officer was assisting a local sheriff's deputy in responding to a separate crash on the side of the roadway. The Tesla plowed into the cars, totaling both.

The driver, identified by police as Devainder Goli, a doctor from Raleigh, was charged with violating a state law requiring motorists to move over for stopped emergency responders, and with watching television while operating a vehicle. He could not be reached for comment.

Luckily, no one was injured, the highway patrol said.

"We were doing a simple lane closure and then death's suddenly there at our footsteps," Sheriff Kieth Stone told CBS 17. "It shows automation is never going to take the place of our motoring public paying attention, not texting, not being on the phone, but focusing on what you were doing: driving."

It's far from the first time a Tesla has careened off the road or into another vehicle while operating on Autopilot, the company's name for its driver-assistance software, which is not fully self-driving. The rash of incidents comes despite clear warnings and instructions, including alerts if a driver's hands leave the wheel for more than a few seconds.

That's led to criticism from public safety officials, industry experts, and consumer groups who say the name — and CEO Elon Musk's rhetoric about its functionality — is misleading and could be exacerbating unsafe practices.

"The people who misuse Autopilot, it's not because they're new to it and don't understand it," Musk told Automotive News earlier this month. "The people who first use Autopilot are extremely paranoid about it. It's not like, 'If you just introduced a different name, I would have really treated it differently.' If something goes wrong with Autopilot, it's because someone is misusing it and using it directly contrary to how we've said it should be used."
