A 26-year-old Oregon teacher was killed when a Tesla crossed the center line and collided with his car.

An out-of-control Tesla crossed the center line on Oregon’s Sunset Highway and collided with a Hyundai driven by a young teacher, killing both drivers.

Mortgage consultant Fredrick Scheffler II, 49, lost control of his Tesla Model Y and veered into oncoming traffic on Saturday along the Sunset Highway near Necanicum, close to the Pacific Coast.

Kyle Riegler, a 26-year-old music teacher at nearby Seaside Middle and High Schools, was driving a Hyundai Santa Fe when Scheffler’s 2020 Tesla collided with his vehicle.

Scheffler, a married Portland father of one, died immediately after the crash.

Riegler was airlifted from the crash site on the heavily wooded stretch of the Sunset Highway to Oregon Health & Science University in Portland, where he died that night.

Saturday’s horror smash saw Tesla driver Fredrick Scheffler II lose control of his Model Y and crash into teacher Kyle Riegler, killing both men

Riegler was a popular teacher at Seaside Middle and High Schools, with staff and students there distraught by his sudden death

It remains unclear what caused the smash, and cops are investigating.

An initial investigation by state troopers determined Scheffler’s Tesla ‘lost control and crossed into the eastbound lane and collided with an eastbound grey Hyundai Tucson van’ driven by Riegler.

They’ve yet to say whether it might be linked to Tesla’s Autopilot driver-assistance feature, which has been involved in multiple US crashes since 2019, at least one of them fatal.

In several of those crashes, parked emergency vehicles are suspected of confusing the Teslas’ sensors, although it’s unclear whether any such vehicles were in the vicinity of Saturday’s collision.

Riegler’s family, who flew in from his native St. Louis upon hearing of the crash, were able ‘to meet with the medical team and spend time with Kyle’ before he passed, Seaside school principal Jeff Roberts wrote in a letter to students and staff.

In the letter, the principal conceded that Riegler’s injuries ‘were too much to overcome.’

‘Despite heroic efforts from the medical team at OHSU, the injuries sustained by Mr. Riegler were too much to overcome and I am sorry to have to share that he passed away,’ Roberts wrote.

‘When things like this happen people have all different kinds of reactions which may include shock, sadness, fear or even anger.

‘Grief is a very personal process and there is no “right” way to experience loss,’ he added.

‘Although Kyle was here for a short time, he had already had a great impact on students and colleagues alike and had a bright future ahead of him.’

Scheffler reportedly worked as a branch manager at Portland firm Premier Mortgage Resources.

The accident occurred near milepost 10 on the historic highway, which has segments following the famed Oregon Trail and leads from Portland to the Oregon Coast.

A black 2020 Tesla Model Y, the model involved in Saturday's crash. Cops did not say if the collision stemmed from the Tesla's Autopilot or Full Self-Driving features

The incident comes less than a week after it was revealed that a doctor who burned to death in his crashed Tesla in 2019 allegedly died because the vehicle’s electronic door handles didn’t pop out, leaving rescuers unable to save him from the flames.

The family of Dr. Omar Awan, the father of five who died in the February 2019 crash, is now suing the Elon Musk-run EV maker, saying the company is to blame for his burns and smoke inhalation due to the car’s allegedly faulty door handles.

The car firm’s vehicles have special door handles which sit flush with the car’s body. They retract into the vehicle while it is moving, but pop out when it stops to let people get inside.

After the crash, initial rescuers on the scene were confused by the car’s door handles, which hadn’t popped out, and were unable to gain entry, NBC6 reported.

First responders then arrived on the scene, and were also left stumped. Teslas are meant to have two back-up safety mechanisms.

Saturday's smash saw the Tesla lose control and veer into oncoming traffic along a stretch of the state's Sunset Highway, close to the Pacific Coast near Necanicum. The highway was closed for more than three hours after the accident

The first sees the door handles pop out automatically in the event of an accident, unless power to the car is cut off very abruptly.

The second allows first responders to break the car’s window and manually pop the door handles out using a special mechanism inside.

However, neither happened, and Awan’s family insist it is because his 2016 Model S was faulty.

According to the police report, Awan was speeding at the time of the crash, with his car going around 79mph.

He was also found to be over the drink-drive limit. Tesla has denied the claims made in the lawsuit, says the car was functioning perfectly, and alleges that Awan was to blame for his own death.

But his family are standing firm, insisting he could have been saved were it not for the car’s door handles, which they say did not function properly.

After the crash, first responders claimed they were unable to open the door to the burning car because they couldn’t find a door handle.

Dr. Omar Awan, 48, died in February 2019 after crashing his Model S on South Flamingo Road in Broward County, Florida

The futuristic luxury carmaker said the handles should have automatically popped out unless power was ‘abruptly’ cut off from the vehicle, NBC South Florida reported. Tesla also said that there is a ‘mechanical back-up in place that can only be accessed from the inside of the vehicle.’

First responders are reportedly trained to ‘break the window and open the vehicle from the inside,’ Tesla said.

‘They couldn’t find a door knob to try to go in that way and unfortunately were not able to extract the driver,’ a police spokesperson told NBC South Florida at the time. ‘They attempted to break the window in order to get the subject out, but were not successful. The flames were too strong, too big at that point.’

Firefighters were unable to reach the doctor because the retractable door handles didn't pop out. Tesla claims they should have unless the power to the car was 'abruptly' cut off

The family is now saying that Awan ‘could have been saved’ if the Model S’ design wasn’t ‘defective.’

Besides the ‘failed’ door handles, the family claims in the lawsuit that the luxury car poses an ‘unreasonably dangerous fire risk.’

In February, Tesla was forced to recall nearly 54,000 vehicles equipped with its ‘Full Self-Driving’ software after it allowed vehicles to roll through stop signs at low speeds without coming to a complete halt.

The company also had to recall over 800,000 vehicles after seat belt reminder chimes failed to sound when the vehicles were started and the driver wasn’t buckled up.

Safety advocates and automated vehicle experts say Tesla is pushing the boundaries of safety to see what it can get away with, but now the National Highway Traffic Safety Administration (NHTSA) is pushing back.

In November, NHTSA said it was looking into a complaint from a California Tesla driver that the ‘Full Self-Driving’ software caused a crash.

The driver complained to the agency that a Model Y went into the wrong lane and was hit by another vehicle.

The SUV gave the driver an alert halfway through the turn, and the driver tried to turn the wheel to avoid other traffic, according to the complaint. But the car took control and ‘forced itself into the incorrect lane,’ the driver reported.

NHTSA also is investigating why Teslas using the company’s less-sophisticated ‘Autopilot’ partially automated driver-assist system have repeatedly crashed into emergency vehicles parked on roadways.

The agency opened the investigation in August 2021, citing 12 crashes in which Teslas on Autopilot hit parked police and fire vehicles. In the crashes under investigation, at least 17 people were hurt and one was killed.

Tesla has since said in its earnings release that ‘Full Self-Driving’ software is now being tested by owners in nearly 60,000 vehicles in the US, up from only about 2,000 in the third quarter. The software, which costs $12,000, will accelerate Tesla’s profitability, the company said.

How does Tesla’s Autopilot work?

Autopilot uses cameras, ultrasonic sensors and radar to see and sense the environment around the car.

The sensor and camera suite provides drivers with an awareness of their surroundings that a driver alone would not otherwise have.

A powerful onboard computer processes these inputs in a matter of milliseconds to make driving, the company says, ‘safer and less stressful.’

Autopilot is a hands-on driver assistance system that is intended to be used only with a fully attentive driver.

It does not turn a Tesla into a self-driving car nor does it make a car autonomous.

Before enabling Autopilot, drivers must agree to ‘keep your hands on the steering wheel at all times’ and to always ‘maintain control and responsibility for your car.’

Once engaged, if insufficient torque is detected at the steering wheel, Autopilot will deliver an escalating series of visual and audio warnings, reminding drivers to place their hands on the wheel.

If drivers repeatedly ignore the warnings, they are locked out from using Autopilot during that trip.

Any of Autopilot’s features can be overridden at any time by steering or applying the brakes.

Autopilot also does not function well in poor visibility.
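
For illustration only, the hands-on-wheel escalation described above can be sketched as a short program. This is not Tesla’s actual software; the torque threshold, warning count and messages below are hypothetical assumptions used purely to show the pattern of escalating warnings followed by a lockout.

# Hypothetical sketch of the escalating hands-on-wheel warning behaviour
# described above. Not Tesla code: all thresholds, counts and messages
# here are invented assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class DriverMonitor:
    torque_threshold: float = 0.5   # assumed minimum steering torque (Nm)
    max_warnings: int = 2           # assumed warnings allowed before lockout
    warnings_issued: int = 0
    locked_out: bool = False        # locked out of Autopilot for this trip

    def update(self, steering_torque: float) -> str:
        """Return the action the assistance system takes for one reading."""
        if self.locked_out:
            return "Autopilot unavailable for the remainder of the trip"
        if steering_torque >= self.torque_threshold:
            self.warnings_issued = 0      # hands detected: reset escalation
            return "Autopilot engaged"
        self.warnings_issued += 1
        if self.warnings_issued == 1:
            return "Visual warning: place hands on the wheel"
        if self.warnings_issued <= self.max_warnings:
            return "Visual and audio warning"
        self.locked_out = True            # warnings repeatedly ignored
        return "Disengaging; Autopilot locked out for this trip"

monitor = DriverMonitor()
for torque in [0.8, 0.1, 0.0, 0.0]:      # simulated steering-torque readings
    print(monitor.update(torque))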

 

The recalls keep coming: Tesla has issued 15 since January 2021, according to NHTSA records, with almost all of the more than one million vehicles it has sold in the US requiring an update due to a recall.

The less-advanced Autopilot feature, meanwhile, is found on all of Tesla’s cars and allows the vehicles to perform basic driving functions such as steering, accelerating and braking automatically.

Late last year, Musk tasked engineers with building a car that relied only on cameras for its Autopilot feature, ditching industry-standard radar and other sensors, sources told The Times.

At the time, Schuyler Cullen, who headed a team that explored autonomous-driving possibilities at Samsung, told The Times that Musk’s cameras-only approach was destined to fail and not based on science.

‘Cameras are not eyes! Pixels are not retinal ganglia! The F.S.D. computer is nothing like the visual cortex!’ Cullen declared to the outlet.

Meanwhile, Amnon Shashua, CEO of Mobileye, a former Tesla supplier that has been testing technology similar to the carmaker’s, said Musk’s camera-only plan could work, but asserted that other sensors will likely be needed in the meantime.

He also stated that Musk is known to exaggerate the capabilities of his company’s proprietary technologies, and that his statements should be taken with a grain of salt.

‘One should not be hung up on what Tesla says,’ Shashua told The Times. ‘Truth is not necessarily their end goal. The end goal is to build a business.’

Musk unveiled Autopilot 2.0 in October 2016, as well as the commercial in which the crash took place.

The CEO announced at a news conference that all new Tesla vehicles would include the cameras, computers, and other technological features to accomplish ‘Full Self Driving’ – a term that suggests that the cars could operate safely on their own.

His statements, however, took the carmaker’s staffers by surprise, with many believing that Musk was making a promise that was impossible to keep, two people who worked on the project told The Times.

Sterling Anderson, who headed the project and has since started his own autonomous car company, called Aurora, reportedly told Tesla’s marketing and sales teams that it would be irresponsible to refer to the company’s Autopilot technology as ‘autonomous’ or ‘self-driving,’ saying it would mislead the public.

Despite this warning from the senior staffer, however, Tesla was soon using the term ‘Full Self Driving’ as a standard way of describing its Autopilot feature.

By 2017, Tesla had begun to roll out a more advanced version of its Autopilot, which was dubiously dubbed ‘Full Self-Driving.’

The feature, which cost consumers $10,000 at the time, is still in its beta stage as of May, even after more than four years of extensive updates.

NHTSA is currently investigating 11 accidents involving Teslas.