Starsky Robotics is working to make trucks autonomous on the highway and remote-controlled by drivers for the first and last mile.
Truck driving is a demanding job with long hours of isolation far away from home, resulting in high turnover and a shortage of drivers. Starsky Robotics is trying to change that by creating a remote + autonomous system that allows drivers to operate a fleet of trucks from a control center. Drivers could work together, operate a dozen trucks each, and be home with their families after a shift.
The system has two modes. Teleoperation (remote driving) is used for complex tasks such as driving from the gate through busy streets and getting on the highway. The system is then switched over to autonomous mode for long highway stretches.
Before I joined in June 2017, Starsky Robotics had raised a large seed round and was looking forward to raising a larger Series A.
By the end of summer, they wanted to demonstrate an MVP that could complete an end-to-end run in live traffic and be controlled exclusively by a remote driver and the autonomous system.
Although my title was Senior UX Designer, I was primarily responsible for delivering the remote driving portion of the run.
I managed all facets of teleoperations, including hardware and software research & design, system design, driver training & evaluation, and resource allocation. I also conducted user research through observation, feedback, interviews, and strategy sessions.
Starsky was a skunkworks operation, so I was free to operate without the overhead of documenting and seeking approval. (This is why we were able to achieve so much in so little time.)
I tend to follow a lean UX approach based on heuristics and trial-and-error. I broke the project down to its most basic function and iterated until it “just worked.” I’d then move on to the next branch of the technology tree, compounding results over time into greater functionality.
When I started working at Starsky Robotics, they had a rough prototype built on a modified video game racing harness. A camera placed over the driver’s seat in the truck streamed live video to the control station, and the station returned user input telemetry for steering, gas, and brakes.
The visual interface stretched across three 20-inch monitors, displaying a 180° panoramic feed from the truck cab interior. The image was small and distorted (the windshield was only seven inches high) and was often corrupted by artifacts and signal loss while the truck was in motion.
Remote drivers operated in a busy work area with surrounding noise and activity. Direct communication between the remote driver and safety driver was hampered because it was relayed through engineers on cell phones. This made response time slow and potentially dangerous.
I started by moving the control station to an isolated room with space for my workstation and observers (e.g., tech demos). Drivers could concentrate on developing their skills while I made observations, got direct feedback, and made changes on the fly.
It was important that the tools and environment provided an experience similar to driving a real truck, so I followed through by upgrading all of the hardware to match the size and ergonomics of a driver’s seat, controls, and view. The final rig was an off-the-shelf, modular setup, chosen for cost-effectiveness and flexibility.
Microphones and speakers were installed in the control room and truck so everyone could talk directly.
Video signal was the biggest complication. We tried different resolutions, data rates, formats, and transmission hardware. To fix the size distortion, we used a matrix algorithm to warp the video so the windshield and side windows matched the screens proportionally. We also changed the position of the video camera in the truck to reduce angular velocity (the illusion that you’re driving faster than you actually are).
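The warp step amounts to a standard perspective (homography) transform: pick four corners of the windshield region in the raw feed and solve for the 3×3 matrix that maps them onto the monitor. The sketch below shows the idea; the corner coordinates and helper names are illustrative, not the production calibration.

```python
import numpy as np

# Illustrative calibration points: corners of the windshield region in the
# raw 180° feed (src) and the monitor region it should fill (dst). The real
# points were measured per camera mount; these numbers are made up.
src = [(420, 300), (860, 300), (880, 410), (400, 410)]
dst = [(0, 0), (1280, 0), (1280, 720), (0, 720)]

def fit_homography(src, dst):
    """Solve for the 3x3 perspective matrix mapping src corners to dst."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, dtype=float), np.array(b, dtype=float))
    return np.append(h, 1.0).reshape(3, 3)

def warp_point(H, x, y):
    """Apply the perspective warp to a single pixel coordinate."""
    px, py, pw = H @ np.array([x, y, 1.0])
    return px / pw, py / pw

H = fit_homography(src, dst)
```

In practice the same matrix is applied to every pixel of each frame (e.g., with a GPU or an image-processing library), but the math per point is exactly what `warp_point` shows.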
The most critical, yet missing, feature in the user interface was state indication. Who was controlling the vehicle? The safety driver? The remote driver? The computer? What about signal loss or a system crash? We immediately added prominent warning alerts and mode status indicators to the control station and safety driver’s HUDs.
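Conceptually, the indicators track a small state machine: exactly one party has control at any moment, and a signal-loss alert overrides the display. A minimal sketch of that model follows; the class and mode names are assumptions for illustration, not the actual system.

```python
from enum import Enum

class ControlMode(Enum):
    SAFETY_DRIVER = "safety driver"
    REMOTE_DRIVER = "remote driver"
    AUTONOMOUS = "autonomous"

class StatusPanel:
    """Tracks who has control and raises a prominent alert on signal loss."""

    def __init__(self):
        # Default to the safety driver until a hand-off is confirmed.
        self.mode = ControlMode.SAFETY_DRIVER
        self.alert = None

    def hand_off(self, new_mode):
        """Record a confirmed hand-off and clear any stale alert."""
        self.mode = new_mode
        self.alert = None
        return f"{new_mode.value} has control"

    def signal_lost(self):
        """Surface the most urgent warning state on every display."""
        self.alert = "NO SIGNAL"
        return self.alert
```

The key design point is that the mode and the alert are shown on both ends (control station and safety driver), so there is never ambiguity about who is driving.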
We established verbal communication protocols. For hand-offs, both drivers would verbally confirm who was driving: “You have control,” “I have control.” The remote driver would announce “No signal!” when the alert appeared. We’d run drills at the start of each training session to reinforce and normalize responses.
Based on feedback from the drivers, we also added control gauges and telemetry for brakes, gas, gear, and RPMs to the control station.
Towards the end, I was also working on an audio system that could relay engine sound and vibrations using a Subpac to give the remote driver a more natural feel for driving.
The control station was very intimidating for new drivers. To make it more accommodating (i.e., fun), I installed a popular truck driving simulation game on the station. Drivers would come in early most mornings and play for hours.
Once they were comfortable with the interface, they’d move on to controlling an actual truck on a closed lot.
We experimented early on, trying to figure out which basic skills carried over naturally (steering, stop distance), which skills would require extensive retraining (turn velocity, braking), and the potential course maps for developing those skills.
Once drivers were confident controlling a vehicle, we'd allow them to drive on the surrounding side streets. They could practice lane-keeping, intersections, turns, and get used to low traffic.
In time, they moved on to more advanced tasks like driving with a loaded trailer through busy city streets and entering & exiting highways.
The big run would take place in Florida, so we needed to set up a second teleoperation center nearby. This required redesigning and expanding our control system into a multi-facility network that allowed any station in any center to control any truck.
New indicators and safety features had to be implemented to prevent conflicts between the distant stations. We didn’t want someone in San Francisco to be able to press a button and accidentally hijack a truck being teleoperated in Florida. But we also needed to be able to remotely override that connection in case someone in Florida forgot to turn it off after a session.
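The rule we needed amounts to an exclusive lock per truck, with an explicit override path for stale sessions. Here is a minimal sketch of that policy, assuming hypothetical truck and station identifiers; the real system's safeguards were more involved.

```python
class TruckControlRegistry:
    """At most one station controls a truck; taking over requires an
    explicit override rather than a single accidental button press."""

    def __init__(self):
        self._owners = {}  # truck_id -> station_id currently in control

    def claim(self, truck_id, station_id, override=False):
        """Take control of a truck; refuse if another station holds it,
        unless the caller explicitly overrides (e.g., a stale session)."""
        owner = self._owners.get(truck_id)
        if owner is not None and owner != station_id and not override:
            raise PermissionError(f"{truck_id} is controlled by {owner}")
        self._owners[truck_id] = station_id

    def release(self, truck_id, station_id):
        """End a session; only the controlling station can release."""
        if self._owners.get(truck_id) == station_id:
            del self._owners[truck_id]
```

Requiring `override=True` makes a takeover a deliberate, logged action instead of something one stray click in San Francisco could trigger.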
The final complexity was implementing the switch between teleoperation and autonomous modes. Until this point, we tested those two systems independently: teleoperation was done on closed lots and streets while autonomous was only on highways. Needless to say, it was a major challenge bridging these features together.
The entire team traveled to Florida and spent a week doing practice runs and finalizing the controls. It was long hours and little sleep, but we pulled it off.
In September 2017, Starsky Robotics completed the first unmanned end-to-end run. A film crew documented the experience in “The Long Haul”.
I did a truck wrap...