AI Vendor Due Diligence: Red Flags Before You Sign
AI vendors promise speed and efficiency, but rushing into contracts without proper due diligence can expose your business to data breaches, compliance violations, and liability gaps. Here's what to check before you sign.
The vendor evaluation trap
Most AI vendor evaluations focus on features and pricing. But the real risks hide in data handling, liability terms, and audit rights. A vendor that can't answer basic security questions probably isn't ready for your business data.
Unlike traditional software, AI vendors often:
- Train models on customer data (unless explicitly prohibited)
- Use third-party AI services as subprocessors
- Store data across multiple jurisdictions
- Limit liability to subscription fees (not actual damages)
Critical due diligence questions
Data handling and privacy
- Where is our data stored and processed? Multi-region storage creates compliance complexity.
- Do you use our data to train or improve models? Get explicit opt-out language in writing.
- What subprocessors handle our data? Each subprocessor adds risk and compliance requirements.
- How do you handle data deletion requests? GDPR and state privacy laws require verifiable deletion.
- Can you provide data processing agreements (DPAs)? Required for GDPR compliance and good practice in general.
Security and access controls
- What certifications do you maintain? Look for SOC 2 Type II, ISO 27001, or industry-specific standards.
- How do you handle access logging and monitoring? You need audit trails for compliance and incident response.
- What's your incident response process? How quickly will you notify us of breaches or issues?
- Do you support single sign-on (SSO) and role-based access? Critical for enterprise security policies.
Liability and insurance
- What's your liability cap? Many vendors limit liability to monthly fees — inadequate for data breach costs.
- Do you carry cyber liability insurance? Ask for a certificate of insurance with adequate coverage limits.
- Who's responsible for AI output accuracy? Clarify liability for hallucinations, bias, or incorrect recommendations.
- What happens if your AI violates regulations? HIPAA, GDPR, and employment law violations can be costly.
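To compare candidate vendors consistently, some teams capture these questions in a structured checklist and flag any answer that is missing or known to be a red flag. Here is a minimal sketch in Python; the question IDs, red-flag answers, and vendor responses are hypothetical, and the output is a starting point for follow-up, not a substitute for legal or security review.

```python
from dataclasses import dataclass, field

# Hypothetical due-diligence checklist: each item records the question,
# its category, and answers that should trigger follow-up.
@dataclass
class ChecklistItem:
    id: str
    category: str          # "data", "security", or "liability"
    question: str
    red_flag_answers: set = field(default_factory=set)

CHECKLIST = [
    ChecklistItem("data-training", "data",
                  "Do you use our data to train or improve models?",
                  red_flag_answers={"yes", "unclear"}),
    ChecklistItem("data-deletion", "data",
                  "How do you handle data deletion requests?",
                  red_flag_answers={"unclear"}),
    ChecklistItem("certs", "security",
                  "What certifications do you maintain?",
                  red_flag_answers={"none"}),
    ChecklistItem("cyber-insurance", "liability",
                  "Do you carry cyber liability insurance?",
                  red_flag_answers={"no", "unclear"}),
]

def review(vendor: str, answers: dict[str, str]) -> list[str]:
    """Return findings: unanswered questions and red-flag answers."""
    findings = []
    for item in CHECKLIST:
        answer = answers.get(item.id)
        if answer is None:
            findings.append(f"[{vendor}] UNANSWERED ({item.category}): {item.question}")
        elif answer.lower() in item.red_flag_answers:
            findings.append(f"[{vendor}] RED FLAG ({item.category}): {item.question} -> {answer}")
    return findings

if __name__ == "__main__":
    # Hypothetical responses gathered during one vendor evaluation.
    answers = {"data-training": "yes", "certs": "SOC 2 Type II", "cyber-insurance": "unclear"}
    for finding in review("ExampleVendor", answers):
        print(finding)
```

Running the same checklist against every candidate keeps evaluations comparable and surfaces gaps before contract negotiation rather than after.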
Contract red flags
Watch for these problematic clauses:
- "Best efforts" security language → Demand specific security standards and SLAs.
- Broad data usage rights → Limit use to providing services only.
- Unlimited subprocessor rights → Require approval for new subprocessors, especially AI model providers.
- Liability caps below actual risk → Negotiate higher caps or carve-outs for data breaches.
- Automatic renewal without notice → Require 60-90 day termination notice.
- No data portability guarantees → Ensure you can export your data in standard formats.
Industry-specific considerations
Healthcare and HIPAA
Require Business Associate Agreements (BAAs) for any PHI processing. Verify the vendor understands HIPAA requirements and has experience with healthcare compliance. See our HIPAA and AI guide for details.
Financial services
Check for SOX, PCI DSS, and Gramm-Leach-Bliley compliance. Verify data residency requirements and audit rights. Review our financial AI compliance guide.
Employment and HR
Ensure compliance with EEOC guidelines and state employment laws. Verify bias testing and audit capabilities. Check our AI hiring discrimination guide.
The procurement checklist
Before signing any AI vendor contract:
- Security assessment → Review certifications, penetration testing, and incident history.
- Data flow mapping → Document exactly how your data moves through their systems (see the sketch after this list).
- Compliance verification → Confirm they meet your industry's regulatory requirements.
- Insurance review → Verify adequate coverage and get certificates. See our insurance questions guide.
- Reference checks → Talk to similar customers about their experience, especially around incidents.
- Pilot testing → Start small with non-sensitive data to evaluate performance and security.
- Legal review → Have counsel review liability, indemnification, and termination clauses.
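The data flow mapping step is easier to enforce if each disclosed flow is recorded as a structured entry and checked for gaps automatically. Below is a minimal sketch under that assumption; the flow names, subprocessors, and approved-subprocessor list are hypothetical placeholders you would replace with the vendor's actual disclosures.

```python
# Hypothetical data-flow inventory: one record per path your data takes
# through the vendor's systems, as disclosed during due diligence.
FLOWS = [
    {"name": "prompt ingestion", "subprocessor": "CloudHost Inc.",
     "region": "us-east-1", "data": "customer PII", "retention_days": 30},
    {"name": "model inference", "subprocessor": "ModelAPI Co.",
     "region": "eu-west-1", "data": "customer PII", "retention_days": None},
    {"name": "analytics export", "subprocessor": None,
     "region": None, "data": "usage metrics", "retention_days": 365},
]

# Subprocessors your contract explicitly approves (hypothetical).
APPROVED_SUBPROCESSORS = {"CloudHost Inc."}
REQUIRED_FIELDS = ("subprocessor", "region", "data", "retention_days")

def audit_flows(flows):
    """Flag flows with missing documentation or unapproved subprocessors."""
    issues = []
    for flow in flows:
        missing = [f for f in REQUIRED_FIELDS if flow.get(f) is None]
        if missing:
            issues.append(f"{flow['name']}: missing {', '.join(missing)}")
        sub = flow.get("subprocessor")
        if sub and sub not in APPROVED_SUBPROCESSORS:
            issues.append(f"{flow['name']}: unapproved subprocessor {sub}")
    return issues

if __name__ == "__main__":
    for issue in audit_flows(FLOWS):
        print("FLAG:", issue)
```

Any flagged entry becomes a question for the vendor or a clause for the contract before you sign.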
When to walk away
Some red flags should end negotiations immediately:
- Vendor won't provide security certifications or audit reports
- Unclear or evasive answers about data usage and training
- No cyber liability insurance or inadequate coverage
- Unwillingness to negotiate liability caps or indemnification
- No clear data deletion or portability process
- History of security incidents without transparent disclosure
Questions to ask yourself
- Do we understand exactly what data this vendor will access?
- Have we mapped all compliance requirements for this use case?
- Does our insurance cover risks from this vendor relationship? Review our cyber vs. AI liability guide.
- Do we have internal processes to monitor vendor performance and compliance?
- What's our exit strategy if this vendor relationship doesn't work out?
Master vendor risk with the complete toolkit
Start with our free 10-minute AI preflight check to assess your current vendor risks, then get the complete AI Risk Playbook for comprehensive vendor evaluation frameworks and contract templates. No email required — direct download available.