A little‑known startup just raised $25 million to tackle one of the industry’s worst‑kept secrets: most enterprise AI projects never make it past the pilot stage. With a novel “chain‑of‑work” approach and human‑augmented processing, Maisa AI says it can turn failures into successes.
A new enterprise AI failure rate report dropped quietly last week, and the numbers shocked even seasoned technologists. Researchers at MIT’s NANDA initiative found that 95 percent of generative AI pilots in large organisations flop before reaching production. On X, the hashtag #AIFailureRate trended as frustrated engineers shared stories of stalled proofs‑of‑concept and mounting technical debt. Into this chaos stepped Maisa AI, a year‑old startup from Spain and the Bay Area, which announced a $25 million seed round on Wednesday. Its mission: build accountable AI agents that actually work in the real world.
Why AI Pilots Fail
The MIT report attributes the high failure rate to several factors: opaque “black box” models that businesses can’t audit, lack of human oversight, hallucination‑prone outputs and a failure to integrate AI into existing workflows. Most companies experiment with generative AI but abandon projects when models generate unreliable results or raise compliance issues. In response, some organisations are experimenting with agentic systems that can be supervised and adapted over time.
Enter Maisa AI
Maisa AI was founded by David Villalón and Manuel Romero, alumni of Spanish AI startup Clibrain. Frustrated by hallucinations and the black‑box nature of existing tools, the duo created a platform that focuses on accountability and user control. Their product, Maisa Studio, is a model‑agnostic platform where enterprises can build digital workers by describing tasks in natural language. The key innovation is the “chain‑of‑work,” a process in which the AI outlines each step it will take and asks the user for approval before proceeding. This approach aims to replace vibe‑based prompting with a more transparent, auditable workflow.
Maisa also introduced the Knowledge Processing Unit (KPU), a deterministic module that limits hallucinations by cross‑checking information against verified sources and ensures tasks proceed logically. Together, the chain‑of‑work and KPU form HALP (Human‑Augmented LLM Processing), which treats users like teachers guiding students at a blackboard. The system asks clarifying questions, outlines its reasoning and waits for human sign‑off before executing tasks.
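Maisa has not published its internals, but the chain‑of‑work loop described above — outline steps, ask for human sign‑off, only then execute — can be sketched in a few lines. Everything here (`chain_of_work`, `plan_fn`, `approve_fn`, `execute_fn`) is a hypothetical illustration of the pattern, not Maisa's actual API.

```python
from dataclasses import dataclass

@dataclass
class Step:
    """One proposed action in the plan, with its approval status."""
    description: str
    approved: bool = False

def chain_of_work(task, plan_fn, approve_fn, execute_fn):
    """Hypothetical sketch of a chain-of-work loop.

    1. plan_fn turns the task into an explicit list of step descriptions.
    2. approve_fn (the human teacher) signs off on each step up front.
    3. Only a fully approved plan is executed; one rejection halts everything.
    """
    steps = [Step(d) for d in plan_fn(task)]
    for step in steps:
        step.approved = approve_fn(step.description)
        if not step.approved:
            return None  # human rejected a step: nothing runs
    return [execute_fn(step.description) for step in steps]

# Toy usage: a two-step plan, a rubber-stamp approver, a trivial executor.
result = chain_of_work(
    "summarise report",
    plan_fn=lambda task: ["read source", "draft summary"],
    approve_fn=lambda desc: True,
    execute_fn=str.upper,
)
```

The key design point the article attributes to Maisa is that approval happens before any execution, so the audit trail (the list of `Step` objects) exists even when the human says no.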
A Different Kind of Automation
Unlike vibe‑coding platforms such as Cursor or Lovable, which emphasise quick results and low friction, Maisa prioritises trust and accountability. As Villalón told TechCrunch, “Instead of using AI to build the responses, we use AI to build the process that needs to be executed to get to the response.” That distinction resonates in regulated industries, where mistakes can be costly. Maisa’s early clients include a major bank and companies in car manufacturing and energy.
Maisa’s system can deploy on the company’s secure cloud or on‑premise for clients with stricter requirements. The platform integrates with existing systems and aims to act more like a next‑generation robotic process automation (RPA) tool than a black‑box AI. Villalón believes this transparency will appeal to enterprises burned by previous AI failures.
Funding and Growth Plans
The $25 million seed round was led by European VC firm Creandum, with participation from NFX, Village Global and U.S. firm Forgepoint Capital through its joint venture with Banco Santander. Maisa plans to use the funds to grow from 35 to 65 employees and expand its customer base across Europe and the United States. The company sees the new Maisa Studio as a way to turn its bespoke approach into a self‑serve product that can scale.
The Social Media Angle
On LinkedIn and developer communities, Maisa’s announcement prompted heated discussions. Enterprise engineers were eager to try a platform that might finally get projects out of the pilot stage. Others were sceptical, arguing that no tool can fix organisational dysfunction or poor data quality. On Reddit, a thread on r/ArtificialIntelligence dissected Maisa’s HALP framework, comparing it favourably to vibe coding tools but questioning whether its deterministic KPU would limit creativity.
Critics also pointed out that Maisa’s approach shifts the burden back to humans; by requiring sign‑offs at each step, it may slow projects down. Supporters responded that speed isn’t everything when building systems for banking or healthcare. As one commenter wrote, “I’d rather have a slower AI I can audit than a fast one that hallucinates my company into a compliance violation.”
Broader Implications
Maisa’s rise reflects a growing demand for trustworthy AI. Regulators in Europe and the U.S. are drafting rules that could force companies to prove their models are fair and auditable before deployment. Large enterprises, wary of reputational damage, are seeking vendors that prioritise compliance. Analysts say Maisa’s funding signals investor interest in AI solutions that go beyond flashy demos and focus on operational realities. As recent events like the Character AI privacy backlash show, trust and transparency are critical to user adoption; ignoring them can drive users away en masse and damage a company’s reputation.
FAQs
Why do most enterprise AI projects fail?
According to MIT’s NANDA initiative, 95 percent of generative AI pilots fail because models are opaque, hallucinate and lack integration with human workflows.
What is Maisa AI’s chain‑of‑work?
It’s a process where the AI outlines each step it will take to complete a task and seeks human approval before executing. This aims to provide transparency and control.
What does HALP stand for?
HALP means Human‑Augmented LLM Processing. It combines the chain‑of‑work with the Knowledge Processing Unit to reduce hallucinations and involve users in decision‑making.
Who backs Maisa AI?
The startup’s $25 million seed round was led by Creandum, with NFX, Village Global and Forgepoint Capital’s European joint venture with Banco Santander participating.
How is Maisa AI different from vibe‑coding tools?
Vibe‑coding platforms prioritise speed and low friction. Maisa focuses on accountability, transparency and compliance, offering deterministic processes and on‑premise options.