Starsky Robotics was working on an autonomous vehicle solution with a clever twist: a remote driver (i.e., teleoperation) would handle anything the AI system couldn't.
I was responsible for making the teleoperation program succeed. I started with a rough prototype that could drive 10 mph on a straight road. By the time I finished, we had two teleoperation centers driving trucks in live traffic.
We needed to demonstrate a “gate-to-gate” trucking run using only the teleoperation and autonomous systems.
I was Senior UX Designer and the only employee working specifically on the teleoperation program.
I designed the system hardware & software, developed the training program, and coordinated with the founders and autonomous team for access to resources.
The HUD displayed video streamed from a 180° camera mounted in the truck. At the bottom of the screen were indicators for speed and steering-wheel rotation.
Driving tests took place in a busy open-office environment. The remote driver and safety driver in the truck could not talk directly. All communication was relayed by engineers on cell phones.
Driving figure eights in the new teleoperation station
I relocated the control station to a private room. I upgraded all the equipment to make the driving experience more true to life. Microphones and speakers were installed so the drivers could talk directly.
Poor video visibility was an ongoing challenge; my solutions included iterating on the HUD designs for both the remote driver and the safety driver.
UX Research & Design
Teleoperation control station HUD
Safety driver HUD
The system was prone to problems like disconnections and signal loss. I immediately added error alerts and system status indicators. They were prominently displayed in the teleoperation control station and safety driver HUD.
We also established verbal communication during handoffs. Drivers were required to confirm "I have control / you have control." They practiced various safety drills at the start of every training session.
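The handoff rule above can be sketched as a small state machine: control only transfers once both the giving driver ("you have control") and the receiving driver ("I have control") have confirmed. This is a hypothetical illustration with assumed class and method names, not Starsky's actual code.

```python
from enum import Enum, auto


class Controller(Enum):
    SAFETY_DRIVER = auto()
    REMOTE_DRIVER = auto()


class Handoff:
    """Tracks who has control; a transfer completes only after both
    verbal confirmations have been heard."""

    def __init__(self):
        self.in_control = Controller.SAFETY_DRIVER
        self._pending = None   # controller the handoff targets
        self._given = False    # "you have control" heard
        self._taken = False    # "I have control" heard

    def request(self, target: Controller):
        if target == self.in_control:
            raise ValueError("target already has control")
        self._pending = target
        self._given = self._taken = False

    def confirm_given(self):   # current controller: "you have control"
        self._given = True
        self._complete()

    def confirm_taken(self):   # new controller: "I have control"
        self._taken = True
        self._complete()

    def _complete(self):
        # Transfer only when both confirmations are in.
        if self._pending and self._given and self._taken:
            self.in_control = self._pending
            self._pending = None
```

With only one confirmation, control stays put; the second confirmation completes the transfer.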
Early training exercise diagrams
I installed American Truck Simulator in the control station so new remote-drivers could get used to the interface before handling a real truck.
Hands-on training was incremental. We started with slow figure-eights, handoffs, and safety drills. Drivers gradually learned how to apply the brakes and make turns without tossing everyone around in the truck.
Over time, we were able to move from a closed lot to low-traffic areas. After that came driving with a trailer, driving on the highway, on-ramps, off-ramps, and eventually street traffic.
Waiting to make a left turn in Florida
The initial prototype was a one-to-one system, with one truck linked to the station. A developer had to update code every time we switched between trucks.
I separated the admin settings from the driver station so the station only contained video, indicators, and controls. The admin station slowly expanded into managing multiple trucks and tracking their location, speed, fuel levels, and cell strength.
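The admin station's fleet view could be modeled as a simple per-truck status registry keyed on the fields mentioned above: location, speed, fuel, and cell strength. This is a minimal sketch with hypothetical names, not the system's real data model.

```python
from dataclasses import dataclass


@dataclass
class TruckStatus:
    """Latest telemetry report from one truck."""
    truck_id: str
    lat: float
    lon: float
    speed_mph: float
    fuel_pct: float
    cell_bars: int  # 0-5 cellular signal strength


class Fleet:
    """Admin-side registry keeping the most recent status per truck."""

    def __init__(self):
        self._trucks: dict[str, TruckStatus] = {}

    def report(self, status: TruckStatus):
        # Newer reports overwrite older ones for the same truck.
        self._trucks[status.truck_id] = status

    def low_fuel(self, threshold: float = 20.0) -> list[str]:
        return [t.truck_id for t in self._trucks.values()
                if t.fuel_pct < threshold]
```

An operator-facing dashboard would render these records; the registry itself stays independent of the driving station, mirroring the admin/driver split described above.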
A month before the demo, I opened a new teleoperation center in Florida. This required us to develop new safety checks to prevent conflicts between operation centers.
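One of the safety checks implied above, preventing two operation centers from controlling the same truck, could look like a claim-and-release lock. This is a hypothetical sketch; the identifiers and the actual check Starsky used are assumptions.

```python
class Dispatcher:
    """Safety-check sketch: each truck may be claimed by at most
    one teleoperation center at a time."""

    def __init__(self):
        self._claims: dict[str, str] = {}  # truck_id -> center_id

    def claim(self, truck_id: str, center_id: str) -> bool:
        holder = self._claims.get(truck_id)
        if holder is not None and holder != center_id:
            return False  # another center already controls this truck
        self._claims[truck_id] = center_id
        return True

    def release(self, truck_id: str, center_id: str):
        # Only the holding center may release its claim.
        if self._claims.get(truck_id) == center_id:
            del self._claims[truck_id]
```

A second center's claim is rejected until the first center releases the truck, which rules out two stations sending conflicting commands.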