What is Hark? Explore Brett Adcock’s new AI startup building a full-stack of hardware and foundation models to replace the smartphone with proactive agentic intelligence.
As of late March 2026, the AI hardware graveyard is already crowded with the “vibe-heavy” but “utility-light” relics of 2024. However, a new player has emerged that is forcing Silicon Valley to rethink the entire “agentic” architecture.
Hark, the latest venture from billionaire serial founder Brett Adcock (the force behind Figure AI and Archer Aviation), has officially exited stealth. With $100 million in personal funding and a design team led by the man who shaped the iPhone, Abidur Chowdhury, Hark isn’t just building another gadget—it is building a “full-stack” ecosystem designed to offload the human mental workload.
Here is the “why” behind the hype and the technical grit of the Hark architecture.
Most AI startups fail because they build on “rented land”—layering an interface on top of a third-party model (like GPT-4) and shoving it into generic hardware. Hark is taking the Vertical Integration approach.
Adcock’s thesis is that for an AI agent to be truly “proactive,” it cannot suffer from the latency or “context blindness” of a standard API call. To avoid both, Hark is building its own foundation models and the hardware they run on.
By building the hardware and the model together, Hark can optimize for inference at the edge, letting the agent “see” and “hear” with minimal lag and maximum privacy.
The biggest technical hurdle in 2026 is moving AI from a Reactive Tool (it only acts when you ask) to an Anticipatory Agent.
Hark’s engineering team, recruited from the likes of Tesla and Meta, is focusing on Highly Personalized Memory Systems. Unlike a standard chatbot that “forgets” you after a session, Hark’s architecture is designed to build a long-term “User Context Graph.”
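Hark has not published the internals of its memory system, but the general shape of a persistent “User Context Graph” is straightforward to sketch: entities (people, places, habits) become nodes, and repeated observations strengthen weighted edges that survive across sessions instead of being discarded. The class and method names below are illustrative assumptions, not Hark’s actual API.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a long-term user-context graph.
# Nodes are entities; edges are (source, relation, destination)
# triples whose weights accumulate across sessions.

@dataclass
class ContextGraph:
    nodes: set = field(default_factory=set)
    edges: dict = field(default_factory=dict)  # (src, relation, dst) -> weight

    def observe(self, src, relation, dst, weight=1.0):
        """Record an observation; repeated observations strengthen the edge."""
        self.nodes.update((src, dst))
        key = (src, relation, dst)
        self.edges[key] = self.edges.get(key, 0.0) + weight

    def recall(self, src, relation):
        """Return destinations for (src, relation), strongest first."""
        matches = [(dst, w) for (s, r, dst), w in self.edges.items()
                   if s == src and r == relation]
        return sorted(matches, key=lambda m: -m[1])

g = ContextGraph()
g.observe("user", "drinks", "espresso")
g.observe("user", "drinks", "espresso")
g.observe("user", "drinks", "tea")
print(g.recall("user", "drinks"))  # espresso ranked above tea
```

The point of the structure is the contrast with a stateless chatbot: nothing here resets between sessions, so preferences observed twice outrank those observed once, which is what lets an agent act proactively rather than re-asking.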
The recruitment of Abidur Chowdhury (ex-Apple) is a strategic move to solve the “Hardware Friction” that killed previous AI pins. The industry has learned that users don’t want to look like “cyborgs.”
Hark’s hardware is rumored to be “invisible”—designed to blend into the home or the body as a seamless interface. The goal is to move the Cognitive Load away from a screen and into the environment. If the interface is too clunky, the “grit” of the AI doesn’t matter; Hark is betting that world-class industrial design is the only way to make 24/7 agentic assistance socially acceptable.

To power these foundation models, Hark is not playing small. The company has secured thousands of Nvidia HGX B200 GPUs, set to come online in April 2026. This massive compute power is being used to train multimodal models that don’t just “read” text, but understand the “physics of intent”—interpreting a user’s tone of voice, environmental noise, and visual cues to provide high-fidelity responses.
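Hark’s model internals are not public, but one common way to combine tone of voice, environmental noise, and visual cues is late fusion: each modality scores the same set of candidate intents, the scores are combined with per-modality weights, and a softmax yields a single distribution. The weights and intent names below are invented for illustration.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def fuse(modality_logits, weights):
    """Weighted sum of per-modality logits over the same candidate
    intents, then softmax into one fused distribution."""
    n = len(next(iter(modality_logits.values())))
    fused = [0.0] * n
    for name, logits in modality_logits.items():
        w = weights.get(name, 1.0)
        for i, logit in enumerate(logits):
            fused[i] += w * logit
    return softmax(fused)

# Candidate intents: ["set_reminder", "play_music"] (illustrative).
probs = fuse(
    {"speech": [2.0, 0.5], "tone": [1.0, 0.2], "vision": [0.5, 1.5]},
    {"speech": 1.0, "tone": 0.5, "vision": 0.5},
)
print(probs)  # speech dominates, so "set_reminder" wins
```

The design choice late fusion captures is the one the article gestures at: no single channel decides alone, so a sarcastic tone or a contradicting visual cue can shift the final intent estimate even when the transcript is ambiguous.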
Hark is the most ambitious attempt yet to move AI out of the cloud and into the “physical layer.” While the smartphone remains the king of the 2020s, Hark is building the “Agentic Mesh” that could finally make the screen obsolete. It isn’t about “chatting” with an AI; it’s about employing a system that thinks like you—and sometimes, ahead of you.