In June 2017, I joined Starsky Robotics to transform and manage their teleoperations program. Working with a talented group of engineers and drivers, we turned a prototype into a fully operational system. In September 2017, our remote driver completed an end-to-end unmanned run on the Florida Turnpike.
First remote-driven / autonomous tractor operated in live traffic.
Sr. UX Designer
Starsky Robotics is working on an artificial intelligence system for big-rig trucks that drive themselves on highways and are driven from a remote operations center from the gate to the highway and back.
An end-to-end run means that a remote driver (teleoperator) sitting at a control station would remotely drive a truck off the lot, through streets and traffic, and onto a highway.
Next, the teleoperator would switch on the autonomous system and supervise the journey until the truck reached the exit.
At that point, the teleoperator would resume control of the vehicle, exit the highway, and navigate through more street traffic until arriving at the final destination.
The founders had given the team a challenge: demonstrate an unmanned end-to-end run... in less than 100 days.
In June, Bloomberg published an article on Starsky, including images of a working remote control station prototype. It was a racing harness for video games with three small display screens and a large truck steering wheel mounted at the center of the workspace.
My initial focus was on improving the hardware and software design for the teleoperation station, but quickly my responsibility extended to managing teleoperations.
I needed to transform the prototype into something reliable enough that drivers would not only feel comfortable using the system but be confident operating it in live traffic. A safety driver would always be present in the truck's driver seat, but we needed the system to work without any interventions.
I took an iterative approach to the project. Nothing on this scale had been done before, so we needed to experiment in small steps.
Due to the company’s confidentiality requirements, the specifics of my work can’t be disclosed. I’ve limited this case study to obvious solutions and publicly available information and media.
I moved the prototype to an isolated room, experimented with different off-the-shelf hardware controllers, and expanded the view area to be "life size". Everything was modular and adjustable, so drivers of any size could adjust their seating, wheel, and view to match their needs.
Over the weeks, I worked with the drivers and software engineers to make multiple updates to the control interfaces, providing drivers with greater feedback and control.
One interesting tweak I think I can talk about was when we adjusted the position of the view cameras in the truck cab.
Initially, the cameras were placed high above the safety driver, so they looked down on the road. This meant the remote driver saw more of the road in front of the truck than the distant horizon.
Think about it. When driving your car, if you look at the road directly in front of you (or out the side windows) you'll see that the landscape flashes by quickly. You become overly aware of your velocity because it seems like you're going really fast.
Our remote drivers were having a similar experience — driving 15-25 mph felt really fast, even though they were actually going slower than the traffic around them.
By moving the cameras over and down next to the safety driver's head, the remote drivers' eyes were directed back to the horizon and the slower-progressing road ahead. As a result of this small change, their perception was more natural. They had greater confidence, and remote driving speeds increased to match in-seat driving speeds without any extra training.
Driving a truck remotely is a very strange experience at first. To develop their confidence, I focused on three areas: training, evaluation, and communication.
Direct communication between the remote driver and the safety driver is critical. We made sure they had a direct communication line to each other, independent of all other systems.
Over time, we also established specific communication protocols. Since I was new to this, I started them with a system I was already familiar with from rock climbing. So they'd say "on belay", "belay on", "driving", "drive" for each hand-off. Soon this evolved to a simpler exchange: "you have control", "I have control". This made it clear to both drivers who was controlling the vehicle at all times.
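The key property of that exchange is that control only transfers once the receiving driver explicitly acknowledges it, so both drivers always agree on who is driving. As a purely illustrative sketch (not Starsky's actual implementation — the class and method names here are hypothetical), the hand-off can be modeled as a tiny state machine:

```python
# Hypothetical sketch of a "you have control" / "I have control" hand-off.
# Control transfers only after the receiving driver acknowledges the offer,
# so there is never a moment where the two drivers disagree on who is driving.

class HandOff:
    def __init__(self):
        self.in_control = "safety"  # safety driver starts in control
        self.pending = None         # driver who has been offered control

    def offer(self, to_driver):
        """Current controller says: 'You have control.'"""
        if to_driver != self.in_control:
            self.pending = to_driver

    def acknowledge(self, driver):
        """Receiving driver confirms: 'I have control.'"""
        if driver == self.pending:
            self.in_control = driver
            self.pending = None


h = HandOff()
h.offer("remote")        # safety driver: "You have control."
h.acknowledge("remote")  # remote driver: "I have control."
print(h.in_control)      # remote
```

Until the acknowledgment arrives, the original driver keeps control — the same guarantee the climbing-style call-and-response gives in spoken form.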
I can't get into further specifics about driver training, but clearly we went from small, controlled environments into greater and greater challenges, taking small steps to maintain driver confidence.
As for evaluation, there are many types of drivers and personalities. Over time I determined who could learn quickest and communicate most efficiently.
During all of this, I was still a designer. I oversaw the redesign of Starsky's website, designed the large skins for the trucks with one of the engineers, and finalized the company logo and colors.
Obviously we managed to succeed with our demo, and we continued making improvements. Testing the teleop system could be scary at times. Like most systems, there were bugs. "Move fast and break things" is fine for app startups, but not necessarily applicable to driving big trucks with software.
When I joined Starsky, I was looking for new challenges and I certainly found them. Not only did I get to work on cutting-edge hardware and software, I got to work with different groups of very talented drivers, engineers, and founders. At the very least, the experience reminded me how much I missed working directly with people.
I think there's great potential for teleoperations for all autonomous vehicles (not just trucks) with companies like Phantom Auto. The primary purpose would be to allow remote drivers to step in when the autonomous system encounters an obstacle such as road construction or emergency diversions. But a more interesting purpose might be to take over when a vehicle's driver is alone and impaired. Feeling tired? Too much to drink? Let a remote driver take you home.
Remote driving is something we can do today, years before autonomous systems actually mature.