Starsky Robotics is an autonomous trucking company working to make trucks autonomous on the highway and remote-controlled by drivers for the first and last mile.
Truck driving is a demanding job with long hours of isolation far away from home, resulting in high turnover and a shortage of drivers. Starsky Robotics is trying to change that by producing an autonomous + remote system so that drivers would work side by side in control stations, operating dozens of trucks remotely, and be back at home with their families at the end of the day.
The system has two modes. Teleoperation (remote driving) is used for complex tasks such as driving from the gate through busy streets and getting on the highway. The system is then switched over to autonomous mode for long highway stretches.
Before I joined in June 2017, Starsky Robotics had raised a large seed round and was preparing to raise a much larger Series A. The founders wanted to demonstrate a working MVP by the end of summer.
Our challenge was to make our truck complete an end-to-end run in live traffic, controlled exclusively by a remote driver and the autonomous system, with zero interventions.
While Starsky's team was mostly focused on autonomous software and hardware, I was primarily responsible for delivering the remote driving portion of the system.
I managed all facets of teleoperations including hardware/software research & design, system design, driver training & evaluation, and resource allocation. I also conducted user research through observation and feedback, interviews, and team strategy sessions.
Starsky was a skunkworks operation, so I had a lot of flexibility. I applied a heuristic, lean UX approach, breaking the project down to its most basic functions and iterating on each until it “just worked.” I’d then move onto the next branch of the technology tree, compounding results together over time into greater functionality.
When I started at Starsky Robotics, the existing prototype was a modified video game racing harness. A camera placed over the driver's seat in the truck streamed live video to the control station, and the station returned user input telemetry for steering, gas, and brakes.
The visual interface stretched across three 20-inch monitors. The video was a 180° panoramic feed from the truck cab interior. This view was small and distorted (e.g., the windshield was only seven inches high), and the video signal exhibited significant artifacts and signal loss while the truck was in motion.
The control station sat in a busy work area so drivers had to contend with surrounding noise and activity. Communication between the remote driver and safety driver was difficult because it was relayed through engineers on cell phones, making response time slow and potentially dangerous.
I started by moving the control station to an isolated room with space for my workstation and observers (e.g., for tech demos). Drivers could concentrate on developing their skills while I made observations, got direct feedback, and made changes on the fly.
It was important that the tools and environment provided an experience similar to driving a truck, so I upgraded all of the hardware to match the size and ergonomics of a driver seat, controls, and view.
Additionally, microphones and speakers were installed in the control room and truck so everyone at each end could talk directly.
Video signal was the hardest challenge. We experimented with various resolutions, data rates, formats, and hardware.
To fix the distorted view, we used a matrix algorithm to warp the video so the windshield and side windows matched the screens proportionally.
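A minimal sketch of that kind of correction, assuming a standard four-point homography (the source approach was a matrix warp; the exact method and the pixel coordinates below are hypothetical, and the math here uses only NumPy):

```python
import numpy as np

def homography(src, dst):
    """Solve for the 3x3 matrix H that maps four src points to four dst points."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    A = np.asarray(rows)
    # The exact solution is the null-space vector of A (last right-singular vector).
    _, _, vt = np.linalg.svd(A)
    return vt[-1].reshape(3, 3)

def warp_point(H, x, y):
    """Apply the projective warp to a single pixel coordinate."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w

# Hypothetical example: stretch the small, distorted windshield quad in the
# panoramic feed to fill a full 1920x1080 monitor, restoring its proportions.
src = np.array([[310, 420], [970, 415], [990, 505], [300, 512]], float)
dst = np.array([[0, 0], [1920, 0], [1920, 1080], [0, 1080]], float)
H = homography(src, dst)
```

In practice the warp would be applied to every frame (e.g., with a GPU shader or an image-warping routine), but the per-point math is the same.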
We also changed the position of the video camera in the truck to reduce perceived angular velocity (the illusion that you're driving faster than you actually are).
The most critical — yet missing — feature in the user interface was state indicators. I immediately added prominent warning alerts and status mode indicators to the control station and safety driver’s HUDs.
Next, I established verbal communication protocols. For control hand-offs between the safety driver in the truck and the remote driver in the office, both drivers would verbally confirm who was driving: "you have control", "I have control".
For signal failures, the remote driver practiced announcing "no signal!" so the safety driver could respond. We'd run drills at the start of each training session to reinforce and normalize these responses.
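The hand-off rule can be sketched as a tiny state machine: control transfers only after the current holder offers it and the receiving driver confirms, mirroring the "you have control" / "I have control" exchange (the class and role names below are hypothetical illustrations, not the actual system):

```python
class HandOff:
    """Two-phase control hand-off: offer, then explicit acceptance."""

    def __init__(self, holder="safety_driver"):
        self.holder = holder      # who currently has control
        self.offered_to = None    # pending transfer, if any

    def offer(self, to):
        """Current holder announces 'you have control'."""
        self.offered_to = to

    def accept(self, who):
        """Receiving driver confirms 'I have control'; only then does control move."""
        if who == self.offered_to:
            self.holder = who
            self.offered_to = None
        return self.holder
```

The point of the two-phase shape is that a dropped or unheard offer leaves control exactly where it was, which is the same property the verbal drill trains for.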
To provide a better understanding of the controls, we added new gauges and telemetry for brakes, gas, gear, and RPMs to the control station.
Towards the end, I was also working on an audio system to send engine sounds to a Subpac backpack subwoofer. The vibrations were meant to provide a more natural feel for acceleration and deceleration.
To make the control station more familiar, I installed a popular truck driving simulation game on the station. Drivers would come in early mornings and play for hours.
Once they were comfortable with the interface, they’d move on to controlling an actual truck on a closed lot.
We experimented early on, trying to figure out which basic skills carried over naturally (steering, stopping distance), which skills would require extensive retraining (turn velocity, braking), and which courses and obstacles could develop those skills.
Once drivers were confident controlling a vehicle, we'd allow them to drive on the surrounding side streets. They could practice lane-keeping, intersections, turns, and get used to low traffic.
In time, they moved on to more advanced tasks, eventually driving with a loaded trailer through busy city streets and entering & exiting highways.
The test run would take place in Florida, so we set up a second teleoperation center close by. This required redesigning and expanding our control system into a multi-facility network that allowed any station in any center to control any truck.
New indicators and safety features had to be implemented to prevent conflicts between the distant stations. We didn't want someone in San Francisco to be able to press a button and accidentally hijack a truck being teleoperated in Florida. But we also needed to be able to remotely override a connection when someone in Florida forgot to turn it off.
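One way to capture both requirements is per-truck exclusive control with an explicit override flag: a station can never silently take a truck another station holds, but a forgotten session can still be forced off deliberately. This is a hypothetical sketch under those assumptions, not the actual implementation:

```python
class TruckControlRegistry:
    """Tracks which station holds exclusive control of each truck."""

    def __init__(self):
        self._sessions = {}  # truck_id -> controlling station_id

    def acquire(self, truck_id, station_id, override=False):
        """Claim a truck. Fails loudly if another station holds it,
        unless the caller explicitly overrides the stale session."""
        current = self._sessions.get(truck_id)
        if current is not None and current != station_id and not override:
            raise PermissionError(
                f"{truck_id} is controlled by {current}; explicit override required")
        self._sessions[truck_id] = station_id
        return True

    def release(self, truck_id, station_id):
        """Only the holding station can release its own session."""
        if self._sessions.get(truck_id) == station_id:
            del self._sessions[truck_id]
```

Making the override a separate, deliberate parameter is what keeps the accidental-hijack case and the forgotten-session case from sharing one button.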
The final complexity was implementing the switch between teleoperation and autonomous modes.
Until this point, we tested those two systems independently: teleoperation was done on closed lots and streets while autonomous was only on highways. Needless to say, it was a major challenge bridging these features together.
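The bridge between the two modes reduces to a guarded transition: autonomous mode should only engage from active teleoperation over a healthy link, and the system should always be able to fall back to the remote driver. A hypothetical guard function illustrating that shape (names and conditions are assumptions, not the shipped logic):

```python
def next_mode(current, request, link_ok):
    """Resolve a requested mode change against simple safety guards.

    current: "teleop" or "autonomous"
    request: the mode being requested
    link_ok: whether the teleoperation video/telemetry link is healthy
    """
    # Engage autonomy only from supervised teleoperation with a good link.
    if request == "autonomous" and current == "teleop" and link_ok:
        return "autonomous"
    # Always allow falling back to the remote driver.
    if request == "teleop" and current == "autonomous":
        return "teleop"
    # Anything else is refused; the truck stays in its current mode.
    return current
```

Keeping the transition logic this explicit makes the hand-over testable on its own, which matters when the two modes were previously only exercised in isolation.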
The entire team traveled to Florida and spent a week doing practice runs and finalizing the controls. It was long hours and little sleep, but we pulled it off.
In September 2017, Starsky Robotics completed the first unmanned end-to-end run. A film crew documented the experience in The Long Haul:
I did a truck wrap!