The AI Trust Gap Has a Number Now: 8 Percent. What Two Weeks of Data Tell Us About the Future of Consumer Platforms
Richard Lee
April 15, 2026 · 9 min read
For two weeks, we have been tracking a pattern across every consumer vertical Mubboo covers. Dune7 and Flesh & Bone surveyed 1,000 US travelers: 71 percent want an AI booking assistant, but only 2 percent want that assistant to act with full autonomy. IBM and the National Retail Federation surveyed 18,000 shoppers across 23 countries: 45 percent use AI for product research, only 12 percent trust AI to make the purchase. The Retail Technology Show polled 1,000 UK consumers: 53 percent distrust AI-generated social content, with Gen Z — the most AI-fluent generation — reaching 62 percent distrust.
On April 14, Expedia Group gave the pattern a definitive number. Working with YouGov, they surveyed 5,700 adults across the US, UK, and India between March 10 and 25 — the largest study of AI trust in travel to date. Fifty-three percent of travelers are comfortable letting AI suggest options. Only 8 percent trust AI to handle the booking itself.
Eight percent.
Expedia's Chief AI and Data Officer, Xavi Amatriain, did not mince words: "Travelers don't have a technology problem with AI. They have a trust problem." He is right. And the problem extends far beyond travel.
This article is about what that 8 percent means — for travel, for shopping, for local services, and for every platform, including Mubboo, that sits between AI's recommendation and the consumer's wallet.
Eight percent of travelers trust AI to book. The rest want help deciding — then want to commit the money themselves.
Six studies, one pattern: the complete trust gap dataset
I built this table over two weeks of daily reporting. Each row represents a separate study, a different sample, a different research firm, and a different way of asking the same question. The consistency is what makes the finding so hard to dismiss.
| Study | Sample | What AI Helps With | Trust AI to Transact |
|-------|--------|--------------------|----------------------|
| Expedia/YouGov (April 14) | 5,700 — US, UK, India | 53% accept suggestions | 8% trust to book |
| Dune7/Flesh & Bone (April 8) | 1,000 — US | 71% want AI assistant | 2% want full autonomy |
| IBM-NRF (January 7) | 18,000 — 23 countries | 45% use AI for research | 12% trust to purchase |
| Travel & Tour World (April 12) | Global trends | 33% use AI for discovery | 13% trust to transact |
| Acosta Group (December 2025) | US shoppers | 70% used AI tools | 12% trust AI to buy |
| Global Rescue (January 2026) | Experienced travelers | 22% willing to use AI | 79% uncomfortable with autonomous AI |
The numbers vary by study, sample size, and phrasing. The gap does not. In every case, consumer enthusiasm for AI-assisted discovery runs 20 to 69 percentage points ahead of willingness to let AI complete a transaction. Booking.com reports that 89 percent of travelers want AI involved in trip planning. Global Rescue finds 79 percent uncomfortable with AI making autonomous travel decisions. The gap between "help me explore" and "spend my money" is not noise. It is the defining structural feature of consumer AI in 2026.
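For readers who want to check the spread themselves: each study's gap is simply its discovery figure minus its transaction figure. A minimal sketch in Python, using the numbers from the table above (the Global Rescue row is omitted because its two figures measure different questions, so a direct subtraction would not be meaningful):

```python
# Discovery vs. transaction figures (percent), transcribed from the table above.
studies = {
    "Expedia/YouGov": (53, 8),
    "Dune7/Flesh & Bone": (71, 2),
    "IBM-NRF": (45, 12),
    "Travel & Tour World": (33, 13),
    "Acosta Group": (70, 12),
}

# The trust gap for each study: discovery enthusiasm minus transaction trust.
gaps = {name: discover - transact for name, (discover, transact) in studies.items()}

# Print largest gap first.
for name, gap in sorted(gaps.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {gap} percentage points")
```

Five different research firms, five different samples, and every subtraction lands in double digits.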
Skyscanner founder Gareth Williams summarized consumer sentiment in a March interview with Skift: "I've been really struck by how negative the public is." He was talking about travel. The data says he could have been talking about any category where AI approaches a consumer's wallet.
Why the gap exists: three structural reasons
The trust gap is not irrational caution. Consumers are responding to three real, measurable problems.
The accountability void. When an AI agent books the wrong hotel room or charges the wrong fare, who is responsible? No clear framework exists. Forty percent of Expedia's respondents specifically cited worry about customer service after an AI-assisted purchase. The concern is practical, not philosophical: if something goes wrong, who fixes it? The FTC reports that consumer fraud losses rose 25 percent in the most recent reporting year. McAfee estimates $13 billion in AI-enabled travel fraud. Consumers are not being paranoid. They are reading the environment correctly.
The transparency collapse. Stanford's 2026 AI Index, released the day before Expedia's study, found that the Foundation Model Transparency Index dropped from 58 to 40 points in a single year. Eighty of 95 notable models shipped without training code. Google, Anthropic, and OpenAI all reduced disclosure of basic model information. The most capable models disclose the least. When consumers cannot evaluate how AI generates its recommendations, trusting that AI to spend their money is an act of faith — and 92 percent of travelers are not willing to make that leap.
Incentive misalignment. Walmart's Sparky AI assistant increases order values by 35 percent. Macy's AI chatbot drives 4.75x spending among engaged users. Meta launched Muse Spark with a dedicated Shopping mode built on social behavior data. Google, OpenAI, and Stripe are competing to control the AI checkout protocol — Google's Universal Checkout Protocol, OpenAI's Agent Commerce Protocol, and Visa's integration with Stripe's merchant connection toolkit. Every AI shopping assistant is funded by an entity that profits when consumers spend more. Fifty-three percent of consumers distrust AI-generated social content. They may not know the technical details, but they sense the incentive structure.
Discovery is where consumers welcome AI. Transaction is where they draw the line. The infrastructure being built assumes that line will move faster than the data supports.
What 8 percent means across four verticals
Travel. Google AI Mode already books restaurant tables in nine countries. Flights and hotels are coming, with Booking.com, Expedia, and Marriott among the confirmed partners. Eighty percent of travel executives plan to deploy AI at scale, according to Skift. The infrastructure for AI-mediated booking is built and expanding. But 8 percent trust means 92 percent of consumers will use AI to discover options, compare prices, and read reviews — then book through a channel they already trust. The discovery-transaction split is the defining architecture of travel in 2026, and every platform needs to decide which side of the split it serves.
Shopping. Shopify reports that AI-driven purchases grew 11x year-over-year and AI traffic increased 7x. Those are real growth rates — from a small base. When 45 percent of shoppers use AI for research and 12 percent trust it to buy, the checkout protocol war between Google, OpenAI, and traditional payment networks is building infrastructure for a market that is currently 12 percent of the addressable population. The protocols assume consumers want AI to complete purchases. The data says consumers want AI to help them decide, then complete the purchase themselves.
Local services. Grab built a 13-feature AI life assistant for 700 million people across Southeast Asia — covering restaurants, groceries, rides, hotels, and microloans in a single app. Google AI Mode handles restaurant reservations. These services work because the stakes are manageable: a $50 dinner, a $15 ride, a $30 grocery order. As AI agents move into higher-value territory — booking hotels, arranging flights, managing insurance claims — the trust requirement scales with the dollar amount. The 8 percent figure came from travel, where the average transaction value dwarfs a restaurant booking.
Government. NASCIO reports that AI overtook cybersecurity as the number one CIO priority for state governments after cybersecurity held that position for 12 consecutive years. Eighty-two percent of state IT employees use generative AI daily. But only 25 percent of states have dedicated AI funding, and states are legislating to fill the gap that voluntary disclosure left open — Nebraska requires chatbot identification, Maryland regulates AI-driven pricing, Maine banned unlicensed AI therapy services. Government AI is the one category where consumers cannot opt out, which makes the trust deficit even more consequential.
What platforms must build in an 8 percent world
The 8 percent will grow. Expedia is investing in AI checkout. Google is wiring AI Mode into the full travel booking stack. The protocol builders will improve accountability frameworks, and consumer trust will follow — gradually, unevenly, and over years rather than quarters. The question for independent platforms is not whether the trust gap will close. The question is what value we provide while it exists, and what value persists after it narrows.
At Mubboo, our answer has been the same since we started building: editorial independence applied to consumer decisions. We update prices every 24 hours via retailer APIs across our US site and our Australian site. We shift seasonal recommendations monthly. We include anti-recommendations — products and services we evaluated and chose not to feature. Every comparison table carries a "What to know" column with our editorial assessment, not the retailer's marketing copy. We publish under named authors. We disclose our AI usage in content production.
These are not features we added because the data told us to. We built them because the logic was obvious: when AI can generate any recommendation instantly, the scarce resource is not information — it is judgment. Judgment about whether this hotel suits this trip with these constraints. Judgment about whether this price reflects genuine value or algorithmic inflation. Judgment that no AI agent funded by the entity being evaluated can credibly provide.
Even if half of consumers eventually trust AI to book, as the trajectory of every prior consumer technology suggests they will, the question "is this the right choice for my specific situation?" will still require editorial layers that operate independently of the transaction. That is what we build. That is what the 8 percent validates.
The number that defines the era
The AI Trust Gap has a number now: 8 percent. It comes from 5,700 respondents across three countries, conducted by YouGov for the largest online travel company in the world. It confirms what Dune7 found with 1,000 travelers, what IBM-NRF found with 18,000 shoppers, what the Retail Technology Show found with 1,000 UK consumers, and what Stanford documented across 400 pages of global AI research.
Consumers want AI to help them explore. They do not trust AI to commit their money. The gap between those two states ranges from 20 to 69 percentage points depending on the study, the vertical, and the transaction value.
That gap is not a problem to solve. It is a market to serve — with independent editorial judgment, transparent methodology, and content designed to be the layer between AI's recommendation and the consumer's decision. We have been building there for months. The data caught up this week.

Richard Lee
Founder
Richard is the founder of Mubboo, building an AI-powered platform that helps everyday consumers navigate shopping, travel, finance, and local life across multiple countries.