Most vehicle concepts unveiled at tech conferences are exercises in branding. Trinity, the three-wheeled electric microcar from Will.i.am’s startup of the same name, is trying to be something different: a vehicle designed around an AI agent first, with the hardware built to support it — not the other way around.
Shown as a prototype at CES in January and demonstrated again at Nvidia’s GTC conference in March, Trinity is a single-passenger, self-balancing electric vehicle built for urban commuting. It goes from zero to 60 miles per hour in under two seconds, tops out at 120 mph, and offers a 150-mile range. But the performance specs are almost beside the point. The core product proposition is the conversational AI embedded in it — an Nvidia-powered agent that can see the road, understand the trip, and handle the cognitive overhead of getting somewhere while the driver focuses on actually driving.
Trinity’s design philosophy inverts the standard automotive approach. Most vehicles are designed from the mechanical platform outward, with software added as an afterthought. Trinity started from the agent and built the hardware around giving it rich context and clear ways to act.
An agent that works while you drive — and after you park
The AI in Trinity isn’t for driving. The vehicle is fully human-controlled, and that’s deliberate. The agent is for everything else: sending messages, finding parking, answering questions about landmarks, logging trip mileage for expenses, adjusting the playlist based on speed. A single voice command like “find parking near this meeting, text them my ETA and log the mileage for expenses” is designed to trigger a multi-step agentic workflow executed without further input from the driver.
What makes the integration different from a phone-based assistant, according to Will.i.am, is sensor context. Trinity’s onboard cameras and vehicle sensors give the AI direct awareness of speed, direction, cabin conditions, and the surrounding environment — pedestrians, traffic lights, storefronts, other vehicles. A phone mounted in a cupholder can hear you; Trinity’s agent can see the road and reason about what it sees. He describes the phone as a “guest” in the driving experience, while the Trinity agent is “the host that sees, reasons and acts continuously from door to door.”
He also extended this to what happens after arrival: the vehicle, parked in a lot, can continue executing tasks while its owner is in a meeting. The phrase he used at GTC — “a super-employee” working on your behalf — positions Trinity less as a car and more as a productivity platform that happens to have wheels.
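The “parking, ETA, mileage” command described above amounts to a plan of discrete steps executed in sequence against shared trip context. As a rough illustration only — every name here (`Step`, `run_workflow`, the task functions, the context keys) is a hypothetical sketch, not Trinity’s actual software — such a workflow might be decomposed like this:

```python
# Hypothetical sketch of a multi-step agentic workflow of the kind described
# above. All names and data are illustrative assumptions, not Trinity's API.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Step:
    name: str
    run: Callable[[dict], dict]  # takes shared trip context, returns updates

def find_parking(ctx: dict) -> dict:
    # A real agent would query a live parking service near the destination.
    return {"parking": f"Lot near {ctx['destination']}"}

def text_eta(ctx: dict) -> dict:
    # A real agent would send a message through the owner's contacts.
    return {"message_sent": f"ETA {ctx['eta']} sent to {ctx['contact']}"}

def log_mileage(ctx: dict) -> dict:
    # A real agent would write to an expense-tracking backend.
    return {"expense_logged": ctx["trip_miles"]}

def run_workflow(steps: list[Step], ctx: dict) -> dict:
    """Execute steps in order, threading one shared context dict through."""
    for step in steps:
        ctx.update(step.run(ctx))
    return ctx

plan = [
    Step("find_parking", find_parking),
    Step("text_eta", text_eta),
    Step("log_mileage", log_mileage),
]

result = run_workflow(plan, {
    "destination": "10th & Main",
    "eta": "9:45",
    "contact": "Dana",
    "trip_miles": 12.4,
})
```

The point of the pattern is that one utterance fans out into several tool calls with no further driver input — the same shape most current agent frameworks use, whatever Trinity’s internal implementation looks like.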
The hardware behind it
Trinity was built with a deliberately American supply chain. The AI runs on Nvidia silicon. Design and fabrication came from West Coast Customs. Self-balancing and robotics expertise was provided by DEKA Research & Development, the company behind Segway. The three-wheeled form factor — slim body, arched roofline, motorcycle silhouette — is weatherproof and climate-controlled, with a studio-grade audio system that reflects Will.i.am’s background.
Manufacturing will happen in Boyle Heights, the working-class Los Angeles neighborhood where William Adams grew up. The factory is also intended to double as a school teaching robotics and agentic AI systems — a community investment component that’s been a recurring theme in Adams’ tech ventures through his i.am Angel Foundation.
Trinity is betting that the most valuable improvements in a vehicle will come from software and agent upgrades rather than hardware revisions — a model closer to a smartphone than a traditional car. If that’s right, the platform becomes more useful over time without requiring owners to buy a new vehicle.
500 units, $30,000, August 2027
The initial production run is capped at 500 vehicles, with first deliveries targeted for August 2027 and pricing aimed at under $30,000. That’s a deliberate positioning — accessible enough to be taken seriously as a product, limited enough to manage execution risk on a first production run.
Will.i.am has been direct about his larger ambition: fewer two-ton SUVs carrying one person a few miles, more compact electric vehicles that are easier to park, faster to charge, and networked with each other and urban infrastructure. It’s a micromobility argument he’s making partly through product and partly through the Boyle Heights manufacturing choice — demonstrating that this kind of vehicle can be built in American communities that have traditionally been excluded from the technology economy.
He’s also on record supporting AI regulation, having been among the 700-plus signatories in 2024 calling for a pause on superintelligence development until adequate safety frameworks exist. The combination — building agentic AI products while advocating for guardrails on the technology’s outer limits — reflects a consistent position: AI as a tool for human augmentation, not replacement. Trinity, a vehicle that requires a human to drive it while an AI handles everything around it, is a fairly literal expression of that philosophy.
Sources
Dezeen / Rima Sabina Aouf, “Will.i.am unveils three-wheeled EV with AI assistant” (January 2026)
BFM Business / P.F. with AFP, “Will.i.am dévoile Trinity, un cerveau sur roue” [“Will.i.am unveils Trinity, a brain on wheels”] (March 2026)