Dashcam Maker Motive Touts AI but Relies on Humans
Company sells dashcams for trucking companies to monitor drivers but has 400 Pakistani workers vetting its AI results.
Last summer, a manager at startup Motive Technologies sent an urgent Slack message to 400 employees based in Pakistan. “WE HAVE A PROBLEM,” it said in all caps.
Motive sells AI-powered dashcams that allow trucking companies to monitor drivers and send alerts about crashes and other safety issues. In August, the dashcams recorded a string of collisions, but customers never received the alerts.
The Takeaway
- Dashcam maker Motive employs 400 Pakistani workers to vet AI output.
- Human annotators missed AI-flagged collisions, failing to alert customers.
- Motive’s IPO pitch centers on AI, despite significant human intervention.
The AI system had detected the crashes and flagged them to the roughly 400 Pakistani workers the company employs to vet AI output. But those employees didn’t spot the crashes in the stream of video feeds they were supposed to review.
When Motive learned what had happened, it dug deeper and found more problems. “I want to transparently share—the radius of the problem is large,” a manager told employees in a Slack message viewed by The Information. “We reviewed a targeted 2 days of data and found bigger issues. A lot of clear misses.”
Motive is set to begin pitching its IPO to investors this week, with its AI front and center. That puts the work of its staff, known as data annotators, close to the center of one of the tech industry’s first IPOs of 2026.
Motive says its combination of AI and human workers makes its system the most accurate on the market, helping it win more business from large customers, including trucking company Western Express and delivery companies such as FedEx. The “human-AI feedback loop,” Motive stated in its prospectus, “ensures our AI system rapidly validates predictions, corrects errors, and adapts to edge-cases quickly.”
The failure of the human part of the loop, and the company’s belief that its AI models need a backstop, raise questions about whether Motive can deliver consistently accurate results.
Motive declined to comment, citing the blackout period ahead of the upcoming offering.
Although artificial intelligence has leaped forward since the launch of ChatGPT, there are many areas of the industry that require humans to work behind the scenes to support high-profile AI models. Trainers employed by firms such as Scale AI and Surge AI work to make large language models more accurate and more humanlike. Waymo cars still sometimes get help from human operators in tough situations, “much like phone-a-friend,” Waymo has said. Some companies, such as Builder.AI, have gotten caught claiming the work of human employees was really AI.
The work of Motive’s Pakistan-based data annotators appears to be even more central: They review every snippet of footage the AI models flag, which can total as many as 1,000 clips a day per employee, according to interviews with Motive’s data annotators. The workers are paid roughly $125 a month.
Human-in-the-Loop Video
If Motive’s AI dashcam system detects a potentially dangerous situation, it alerts truck drivers immediately, without human intervention. The data annotators’ job is to ensure that only verified alerts about safety incidents get sent to customers’ fleet managers back in the office.
Motive’s marketing materials describe its human-in-the-loop video review as unique to the company, and executives have said it is helping Motive win business. Its CEO and co-founder—Shoaib Makani, a Pakistani American former venture capitalist—said in an online interview last August that using “human validation in real time” prevents false positives or other incorrect information from going back to customers. “It has to be bulletproof,” he said. “Over time we can actually remove the human validation step.”
Motive will be under pressure from investors who expect it to show profits in the coming years. The company burned $74 million of cash in the first nine months of last year, mostly due to marketing expenses, and had more than $240 million of debt at the end of the year. The company didn’t disclose data annotation costs in its IPO prospectus, but a person close to the company said those costs were immaterial. Assuming $125 a month pay per worker, annotator salaries for 400 staff members would be less than $1 million a year.
Motive’s competitors say they don’t use humans for validation as frequently. Publicly traded Samsara, which is larger than Motive and profitable, said it employs only 50 data annotators for an add-on service it offers in rare cases when customers ask for human review of all videos. It also uses them in the development of features such as flagging collisions with pedestrians. “It’s not scalable with humans,” said Johan Land, a Samsara engineering executive.
Ariel Seidman, CEO and co-founder of Hivemapper, a smaller competitor that also sells an AI-powered dashcam, said his startup stopped using human reviewers about 18 months ago and now uses different AI models to check the work. “Humans were making mistakes because they were bored with the task we gave them, and they just clicked through,” he said. “We found the AI models had gotten better than the humans. That doesn’t mean they’re perfect, but humans are also not perfect.”
The dashcam industry was born out of regulations put in place in the last decade that required companies to monitor commercial drivers’ working hours. That evolved into dual-facing cameras for managers to monitor drivers, which provide records of events that may be used in litigation against drivers or companies. That market has grown because trucking companies “fundamentally don’t trust their drivers,” said Seidman. “Everything stems from a lack of trust.”
Motive jumped into the dashcam market in 2021, taking on several competitors that got there first. Executives decided to hire full-time data annotators in Pakistan, where the company already had engineers, so that it could more quickly train its systems on unusual or confusing situations its cameras would capture.
Trained in U.S. Traffic Laws
Today, Motive’s annotators are full-time employees who make up about 9% of the company’s workforce. They work in shifts around the clock in two of Pakistan’s major cities, Islamabad and Lahore, and have received training on U.S. traffic laws. A former Motive executive credited the strategy with helping the company catch up in commercial dashcam sales.
According to people who have worked at Motive and internal company documents, the AI models still often make mistakes, and the workers are under pressure to quickly and accurately verify videos that show serious incidents. For instance, the system sometimes flags red signs, such as Dairy Queen logos, as stop signs. It has flagged drivers who put objects near their mouths as smoking. The workers mark these flagged videos as invalid.
The flow of data often overwhelms annotators because the AI, on average, flags more than 100 potential crashes for every one real incident, according to Motive’s website and data annotators who spoke to The Information.
Assessing the accuracy of such videos requires annotators to rely on their own judgment. A Motive product manager based in Pakistan told annotators in an August Slack message that if they were confused about whether what they had seen was a collision or a close call, they should “try enabling video sound” or looking at the “driver reaction.” “Driver yelling profanities can relate to ‘something have happened’,” the manager said in a Slack message viewed by The Information.
A Motive manager warned staff in August that annotators had missed several truck collisions that AI had flagged. That meant Motive’s customers didn’t get alerted. “Those were not edge cases,” the manager wrote in the Slack message. “Those were clear misses.”
Still, Motive’s approach has won fans among customers. Jon Somerville, fleet safety manager at waste and recycling company Veit, credits Motive with improving the behavior of his drivers. He said the technology has saved time and money over the three years he has used it in his fleet, and that Motive’s dashcams are more accurate than those he tried from competitors. “In-cab events, like cellphones or seat belts, it’s relatively accurate,” Somerville said.
The instances where the dashcam has inaccurately alerted Somerville to events have been minor. It has mistaken a dangling radio microphone for a cellphone, for instance, or has failed to see an orange seat belt that was camouflaged by a trucker’s similarly colored shirt, he said.
‘Feels Like AI’
Motive’s march to public markets comes at a time when many tech companies are boasting about their AI systems. Motive mentions “AI” more than 200 times in its IPO prospectus, more than recently public software and internet companies such as Figma and Reddit, which also pitched themselves with an AI bent.
The annotators’ work at Motive raises a question debated by entrepreneurs and researchers: Is it really AI if you need a human to check everything?
Some executives don’t think so. Manu Sharma, CEO of Labelbox, a data-labeling startup, called Motive’s system archaic. “It’s using humans to tag things [in] real time so it feels like AI for [the] end user,” he said.
Others say the system, known as human in the loop, is a pragmatic way to address a more awkward reality that gets less attention in optimistic Silicon Valley—that AI models still make too many mistakes to work unassisted. “The models, as good as they are today, aren’t quite there to solve long-tail problems,” said Ulrik Stig Hansen, president and co-founder of Encord, which builds software that helps companies like Motive manage these kinds of human annotators. “There’s a long way to go,” he added.
Motive’s customers generally sign three-year contracts to keep tabs on whether their drivers are safe on the roads. The company is focusing in particular on winning more large customers, which represent a bigger prize for Wall Street investors. Sales and marketing are also central to Motive’s growth. It spends more than 47 cents of every $1 of revenue it generates on bringing in new business, the IPO prospectus shows.
The competitive pressure has seeped through to Motive’s data annotators. Managers at Motive have posted digital banners alerting data annotators to all incoming videos from important prospective customers. “CRITICAL—this is an important trial, please be careful,” the banner reads, according to a manager’s Slack message to staff and to employees who have seen the banner.
“Our biggest competitors forward the collision to customers in small amount of time (under 2 min),” a Motive manager wrote in the Slack message. “If we don’t beat them, clients will shift which is a loss for us.
“No confusion about it,” he added.
Cory Weinberg is deputy bureau chief responsible for finance coverage at The Information. He covers the business of AI, defense and space, and is based in Los Angeles. He has an MBA from Columbia Business School. He can be found on X @coryweinberg. You can reach him on Signal at +1 (561) 818 3915.
Michael Roddan is a reporter at The Information based in San Francisco covering banking and financial services. He can be reached at [email protected] or on Twitter at @michaelroddan. You can also message him on any encrypted app at +1 347 864 0601 or on Signal @michaelroddan.99