Published 26 April 2026
GDPR and AI chatbots: What you need to know before going live
AI chatbots process personal data. Here are the seven questions you need to be able to answer before putting an AI chatbot on your European website.
GDPR and AI are two terms that furrow the brows of many European business owners. Rightly so: it's hard to keep track of which rules apply, which vendors you can use, and what you actually need to do before putting an AI chatbot live on your website.
This is a pragmatic guide. It covers the seven questions you, as data controller, need to be able to answer. It is not legal advice; if you work with sensitive data, operate in a heavily regulated industry, or serve the public sector, check with a data protection officer.
Question 1: Does the chatbot process personal data?
Almost always yes. An AI chatbot typically collects:
- The chat messages themselves, which often contain personal data (name, email, problem descriptions, account numbers in the worst case)
- Technical data: IP address, browser type, device information
- Cookies or localStorage to track the conversation
- Timestamps, session data
That means you, as the company behind the website, are the data controller for the data collected via the chatbot. You carry the full GDPR responsibility.
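To make the list above concrete, here is a minimal sketch of what a single chat event might look like server-side, with obvious identifiers (emails) masked before storage. The field names and the `redact` helper are illustrative assumptions, not any specific vendor's schema:

```python
import re

# Illustrative pattern: mask email addresses in a chat message before it
# is logged, so raw identifiers never reach long-term storage.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def redact(message: str) -> str:
    """Replace email addresses with a placeholder."""
    return EMAIL_RE.sub("[email redacted]", message)

# A typical chat event: every field here is (or can be) personal data.
event = {
    "session_id": "abc123",                 # pseudonymous session key (cookie/localStorage)
    "timestamp": "2026-04-26T10:00:00Z",    # session metadata
    "ip": "203.0.113.7",                    # personal data under GDPR
    "message": redact("Hi, my email is jane@example.com"),
}
print(event["message"])  # Hi, my email is [email redacted]
```

Masking at ingestion does not remove your controller obligations, but it reduces what you hold and therefore what you must protect, export, and delete.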
Question 2: Who is the data processor?
The chatbot vendor is usually the data processor. That means they process personal data on your behalf. GDPR Article 28 requires a written Data Processing Agreement (DPA) to be in place; without one, the processing is unlawful.
Check before signing:
- That the vendor offers a DPA (many do automatically at account creation)
- That the DPA specifies which data is processed, for what purposes, and for how long
- That it includes provisions for deletion at termination
- That it covers any sub-processors (e.g., the cloud provider, the language model provider)
Question 3: Where is data stored, and is it transferred outside the EU?
This is the most complicated question, particularly after the Schrems II ruling. If data is transferred outside the EU/EEA, there must be a valid transfer basis. The common options are:
- Adequacy decision: Countries the EU has assessed as having an adequate level of data protection (UK, Switzerland, Japan, and others). The US is currently covered via the EU-US Data Privacy Framework, but only for certified companies.
- Standard Contractual Clauses (SCC): Standard clauses approved by the EU. After Schrems II they must be paired with a case-by-case transfer assessment and, where needed, supplementary measures.
- Binding Corporate Rules (BCR): Used by large international groups.
Ask the vendor specifically:
- Where is data hosted (server location)?
- Which sub-processors are used, and where are they?
- Is a US-hosted language model used (OpenAI, Anthropic via their own APIs) or an EU-based one?
- Which transfer basis is used?
Many European vendors host in the EU and use language model APIs that are either EU-based or have clear transfer bases. That simplifies your compliance task.
Question 4: What is the legal basis for processing?
You need to be able to point to one of the six legal bases in GDPR Article 6:
- Consent: The visitor explicitly accepts. Must be freely given, specific, informed, and unambiguous. A generic cookie banner is not enough.
- Performance of contract: If the chat is necessary to deliver an agreed service.
- Legitimate interest: The most common basis for customer service. Requires a written legitimate interest assessment.
- Legal obligation: Rarely relevant for chatbots.
- Vital interests: Almost never relevant.
- Public interest: Relevant for municipalities and public institutions.
For an ordinary business website, legitimate interest is often the right basis for the chat itself, while consent may be necessary if you use conversation data for marketing or analytics beyond operations.
Question 5: How long is data retained?
GDPR requires that you only keep data for as long as necessary. For chat conversations, typical retention periods are:
- 30 days: sufficient for most support purposes
- 90 days: if there is a need for follow-up or pattern analysis
- Up to 5 years: if the conversation is relevant for bookkeeping (e.g., a specific agreement)
Set a concrete retention period, document it, and make sure the vendor actually deletes automatically.
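Automatic deletion is usually a scheduled job that purges conversations older than the documented retention period. A minimal sketch, assuming conversations carry a timezone-aware `created_at` timestamp (names are illustrative):

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

RETENTION_DAYS = 30  # the documented retention period; pick one and stick to it

def is_expired(created_at: datetime, now: Optional[datetime] = None) -> bool:
    """True if a conversation is past the retention window and must be deleted."""
    now = now or datetime.now(timezone.utc)
    return now - created_at > timedelta(days=RETENTION_DAYS)

def purge(conversations: list) -> list:
    """Keep only conversations still inside the retention window."""
    return [c for c in conversations if not is_expired(c["created_at"])]
```

Run this daily (cron, scheduled function), and log how many records were deleted so you can demonstrate that the retention policy is actually enforced.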
Question 6: How do you handle data subject rights?
Under GDPR, individuals have the right to:
- Access: To see what data you hold about them
- Rectification: To have inaccurate information corrected
- Erasure: To have their data deleted (the right to be forgotten)
- Restriction: To limit how their data is processed
- Data portability: To receive their data in a structured, machine-readable format
- Objection: To object to the processing
You need to be able to honor these rights for chat data too. That means your vendor must be able to look up a specific user and either export or delete their conversations. Ask whether this functionality is available in the dashboard.
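In practice, honoring these rights comes down to being able to look up everything stored under one user and either export it or delete it. A minimal sketch, assuming conversations are keyed by a user identifier (the storage shape and function names are assumptions for illustration):

```python
import json

def export_user_data(store: dict, user_id: str) -> str:
    """Access / portability: return the user's conversations as structured JSON."""
    return json.dumps(store.get(user_id, []), indent=2)

def erase_user_data(store: dict, user_id: str) -> int:
    """Erasure: remove all conversations for the user; return how many were deleted."""
    return len(store.pop(user_id, []))

# Usage:
store = {"user-1": [{"msg": "hello"}], "user-2": [{"msg": "hi"}]}
print(export_user_data(store, "user-1"))   # structured JSON for an access request
print(erase_user_data(store, "user-1"))    # 1 conversation deleted
```

If your vendor cannot do the equivalent of these two operations from their dashboard or API, you cannot fulfill access and erasure requests for chat data.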
Question 7: How do you inform users?
You need to inform users about the processing before they begin. Typically through:
- Privacy policy: Should describe which data is collected via the chatbot, purpose, legal basis, retention period, vendor
- Cookie banner: If the chatbot uses cookies, that needs to be covered there
- Notice at chat start: Many choose to display a short text in the widget on first opening, linking to the privacy policy
Don't forget to update your privacy policy when you go live with a new chatbot. It's the most commonly missed detail.
What does Clarifier do on GDPR?
Clarifier is built in Denmark and complies with GDPR. We offer a Data Processing Agreement at account creation, retain chat conversations for up to 30 days for operations and improvement, and do not resell data to third parties. You can see the full processing description in our privacy policy.
For public institutions we tailor the Data Processing Agreement to your specific requirements. Contact us to walk through it together with your data protection officer.