What AI Vendors Won't Tell You

Vendor demos are optimised for one thing: getting you to sign. That's not cynical—it's just business. But it means the demo shows strengths and hides weaknesses.
After helping dozens of organisations evaluate AI vendors, I've compiled the questions vendors hope you don't ask—and the realities they hope you don't discover until after the contract is signed.
The Demo Data Trick
What you see: A demo running on a carefully curated dataset that makes the AI look brilliant.

What you don't see: The months of data preparation that created that demo dataset. The edge cases that were excluded. The failure modes that only appear with real-world data.

What to ask: "Can we test with our data during the evaluation?" If the answer involves weeks of "data onboarding" before you can test, be cautious.

The reality: Most organisations find performance drops 20-40% when they move from demo data to real data. This isn't fraud—it's the difference between laboratory conditions and the real world.

The Pricing Cliff
What you see: Attractive per-user or per-transaction pricing that seems affordable.

What you don't see: The volume tiers that kick in at scale. The add-on modules you'll need. The professional services that are "recommended but not required." The training costs. The integration costs.

What to ask: "Can you provide a detailed quote for our anticipated usage over 3 years, including all likely add-ons and services?"

The reality: Total cost of ownership is typically 2-4x the base licence cost. Some vendors are better than others at being transparent about this.

The Integration Minimisation
What you see: "Pre-built connectors" to your existing systems. Claims of "quick implementation."

What you don't see: Those connectors work with the standard configuration of your systems. Your systems aren't standard. The connectors need customisation. Some things don't connect at all.

What to ask: "Can you provide reference customers with similar tech stacks? What was their integration timeline?" Then actually call those references and ask about the integration experience.

The reality: Integration is usually the longest and most expensive part of implementation. "8 weeks to production" becomes "6 months to production" when you add real integration requirements.

The Training Data Mystery
What you see: Claims about model accuracy and capability.

What you don't see: What data trained that model. Whether that data is representative of your use case. Whether there are biases in the training data that could affect your outcomes.

What to ask: "Can you provide documentation on your model's training data, known limitations, and bias testing results?"

The reality: Many vendors can't or won't answer this question in detail. That's a red flag for any high-stakes application.

The Support Reality
What you see: Promises of "dedicated support" and "customer success."

What you don't see: What "dedicated" means (shared across how many customers?). What happens when you have a critical issue. How responsive support actually is post-implementation.

What to ask: "What are your SLAs for critical issues? Can we speak to customers about their support experience?"

The reality: Sales teams disappear after signing. Support varies wildly. Get specific SLAs in the contract and talk to existing customers about their real experience.

The Roadmap Fiction
What you see: An impressive roadmap of coming features that addresses your concerns.

What you don't see: Which roadmap items have actual engineering resources. Which are placeholders to handle objections. Which will be ready "Q3" for the fifth year running.

What to ask: "Which of these features have committed engineering resources? Can we talk to other customers who are waiting for specific features?"

The reality: Only evaluate what exists today. Roadmap features are promises, not products. If you need a feature that doesn't exist, either get contractual commitments or assume you won't get it.

The "AI" Definition
What you see: AI-powered everything. Machine learning. Deep learning. Cutting edge.

What you don't see: What's actually AI versus rule-based automation marketed as AI. Whether the AI is genuinely better than simpler approaches. Whether you need AI for this use case at all.

What to ask: "Can you explain specifically what AI/ML techniques you use and why they're better than simpler approaches for our use case?"

The reality: Some "AI" products are predominantly rule-based systems with a thin AI layer. Sometimes that's fine. But you should know what you're buying.

The Lock-In Trap
What you see: Easy onboarding and smooth implementation.

What you don't see: How hard it is to leave. Whether your data is exportable. Whether your workflows are portable. What happens to your historical data if you switch.

What to ask: "If we decided to switch vendors in 2 years, what would that process look like? What data would we retain? What format would it be in?"

The reality: Some vendors make leaving very difficult. Understand exit terms before you sign entrance terms.

The Case Study Selection
What you see: Impressive case studies with big names and big numbers.

What you don't see: Whether those customers are still using the product. Whether those results are typical or exceptional. What the customer would say if you asked them directly.

What to ask: "Can I speak directly with these case study customers? Can you also connect me with customers whose implementations were less successful?"

The reality: Ask for both success stories and "challenging implementations." How a vendor handles difficult situations tells you more than their wins.

The Expertise Gap
What you see: Sales and demo teams who know the product deeply.

What you don't see: Whether the implementation team has the same expertise. Whether post-sales resources are as capable as pre-sales.

What to ask: "Can we meet the actual implementation team before signing? What's their experience with similar implementations?"

The reality: Pre-sales and post-sales often have very different skill levels. Meet your actual team, not just the demo team.

Protecting Yourself
Based on all this, here's how to run a better vendor evaluation:
Before the Demo
1. Define your requirements clearly—what you need vs. what would be nice
2. Prepare your test data and test scenarios
3. Research the vendor independently (reviews, forums, ex-employees)
During Evaluation
4. Insist on testing with your data
5. Meet the implementation team, not just sales
6. Talk to references—both successful and struggling
7. Get detailed pricing for realistic scenarios
8. Document all claims in writing
Before Signing
9. Get specific SLAs and commitments in the contract
10. Understand exit terms and data portability
11. Define success criteria and what happens if they're not met
12. Have your legal team review carefully
Throughout
13. Maintain healthy scepticism
14. Trust your technical team's instincts
15. Don't let urgency override due diligence
The Vendor Perspective
I'm not anti-vendor. Good vendors provide real value. They've solved problems you shouldn't have to solve from scratch. They have expertise you don't have internally.
But their incentives aren't perfectly aligned with yours. They want to close deals. You want solutions that work.
The best vendors welcome hard questions. They're honest about limitations. They proactively share references who had difficulties. If your vendor gets defensive about legitimate questions, that tells you something.
The Bottom Line
Vendor evaluation is a skill. Like any skill, it improves with practice and pattern recognition.
The questions vendors hope you don't ask are exactly the questions you should ask. The scenarios they avoid demonstrating are exactly the scenarios you should insist on testing.
Trust, but verify. Extensively.
Related Reading
- Enterprise AI Vendor Comparison 2026 — Compare major AI platforms objectively
- How to Run an AI Pilot That Actually Scales — Design pilots that prove real value
- AI Governance Framework for UK Enterprises — Evaluate vendors against governance requirements
