By Anne T. T. · Published April 12, 2026 · 9 min read
Dictating session notes, summarising an assessment, drafting a letter to the referring physician in thirty seconds: generative AI tools promise considerable time savings for therapists. And the promise is real — provided you don't sacrifice your patients' confidentiality in the process.
Because behind every query sent to ChatGPT, Claude or Gemini lies a fundamental question: where does your patients' data go? In Switzerland, the answer is not just an ethical matter. It is a legal obligation, governed by the nLPD, the Criminal Code and your profession's ethical rules.
When you enter a session note into ChatGPT, the text is transmitted to OpenAI's servers, hosted primarily in the United States. Yet the nLPD (Art. 16 ff.) imposes strict conditions for any transfer of personal data to a country that does not offer an adequate level of protection.
The United States is not on the Federal Council's list of countries with adequate protection (Annex 1 of the OPDo). A transfer requires additional safeguards — standard contractual clauses, explicit patient consent, or another recognised mechanism. In practice, a therapist typing notes into ChatGPT meets none of these conditions.
The nLPD (Art. 9) requires a Data Processing Agreement (DPA) with any third party processing personal data on your behalf. OpenAI offers a DPA, but only for Enterprise accounts and API subscriptions with billing. An individual ChatGPT Plus account has no DPA. You are entrusting sensitive data to a processor without a compliant contractual framework.
The nLPD (Art. 5 let. c) explicitly includes health data in its definition of sensitive data. Therapist session notes — diagnoses, discussed content, clinical observations — fall squarely into this category. Processing sensitive data is subject to enhanced requirements: a clear legal basis, a defined purpose, and strict proportionality.
Criminal risk: Art. 321 of the Swiss Criminal Code punishes violation of professional secrecy with up to three years imprisonment or a fine. Transmitting information covered by professional secrecy to a foreign cloud service, without informed patient consent, constitutes a real criminal risk.
Since its entry into force on 1 September 2023, the new Federal Data Protection Act (nLPD/nDPA) applies to every therapist practising in Switzerland. Here are the principles directly relevant to AI tool usage.
Only data necessary for the processing purpose should be collected and processed. Sending an entire session note to an LLM to correct three typos violates this principle.
Data may only be processed for the purpose stated at the time of collection. Your patients entrust their information to receive therapeutic treatment, not to feed the training of a language model.
Processing sensitive data requires explicit consent (Art. 6 para. 7 nLPD). This consent must be informed: the patient must understand that their data will be sent to a third-party service, in which country, and for what purpose. A generic consent for therapeutic treatment does not cover the use of external AI tools.
Transfer of personal data to a country without adequate protection is only authorised if appropriate safeguards are in place (Art. 16 para. 2 nLPD). Copy-pasting a note into ChatGPT satisfies none of these requirements.
AI is not incompatible with confidentiality. Several approaches allow you to benefit from its advantages without compromising legal compliance.
Some language models can run directly on your computer, without any connection to an external server. Tools like Ollama or LM Studio allow running open-source models (Llama, Mistral) locally. Data never leaves your machine.
Advantage: maximum confidentiality. Limitations: it requires a reasonably powerful computer and some technical setup, and performance lags behind the most advanced cloud models.
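As a concrete illustration of the local approach, Ollama exposes a small HTTP API on the machine it runs on (port 11434 by default). The sketch below — the helper name `build_ollama_request` is ours, and it assumes Ollama is installed with the `mistral` model already pulled — shows a request that never leaves localhost:

```python
import json
import urllib.request

def build_ollama_request(prompt: str, model: str = "mistral") -> urllib.request.Request:
    """Build a POST request to the local Ollama API (default port 11434).

    The request targets localhost only: the prompt is processed on the
    therapist's own machine, never on an external server.
    """
    payload = json.dumps({
        "model": model,        # local model to use (must be pulled beforehand)
        "prompt": prompt,      # the text to process
        "stream": False,       # return one complete response instead of chunks
    }).encode("utf-8")
    return urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )

# Usage (requires a running Ollama instance):
# with urllib.request.urlopen(build_ollama_request("Summarise: ...")) as r:
#     print(json.loads(r.read())["response"])
```

The point of the sketch is the URL: everything stays on `localhost`, which is precisely what makes this approach compatible with professional secrecy.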
A more accessible alternative is using AI services hosted in Swiss data centres, subject to Swiss law. This is the case for Infomaniak AI Tools, which offers open-source language models (Mistral, Qwen) hosted exclusively in its data centres in Geneva and Winterthur.
This is the approach chosen by Therago for its voice dictation and writing assistance features. AI processing uses models hosted by Infomaniak, in Geneva. Voice and text data are encrypted (AES-256-GCM) and are never transmitted to OpenAI, Google or Amazon.
Learn more about voice dictation: Voice dictation for therapists: how to save 1h per day.
If you still wish to use a tool like ChatGPT for specific tasks (rephrasing, bibliographic research), systematically anonymise your data before any input. Replace names, dates of birth, locations and any identifying element with codes or generic terms.
This approach remains imperfect: in a therapeutic context, even seemingly innocuous details can allow re-identification (rare pathology, particular family situation). Caution is advised.
Before adopting an AI tool in your practice, ask yourself these questions:
1. Where is the data processed?
Processing must take place in Switzerland or in a country recognised as offering adequate protection. If servers are in the US, additional contractual safeguards are essential (Art. 16 nLPD).
2. Is there a Data Processing Agreement (DPA)?
The nLPD (Art. 9) requires it. Without a DPA, you have no contractual guarantee on how your data is processed, stored or deleted.
3. Can I delete the data?
The right to erasure is part of the data subject's rights (Art. 32 nLPD). Verify that the provider allows effective deletion — not just archiving.
4. Is my data used to train the model?
Most consumer LLMs use user conversations to improve their models. OpenAI does so by default. This retraining constitutes additional processing not covered by your patients' initial consent.
5. Is encryption in place?
Data must be encrypted in transit (TLS) and at rest. Per-user encryption (where each therapist has their own key) offers additional protection.
6. Is it compatible with professional secrecy (Art. 321 CP)?
Beyond data protection, ask whether transmitting information to this service constitutes disclosure under Art. 321 CP. Swiss hosting with a DPA significantly reduces this risk.
Rule of thumb: if you answer "no" or "I don't know" to any of these questions, do not enter patient data into that tool.
For a complete view of your data protection obligations: nLPD/LPD 2026 checklist for therapists.
Disabling history prevents OpenAI from using your conversations to train future models. However, data is still transmitted to US servers and retained for 30 days for abuse monitoring. This resolves neither the cross-border transfer issue (Art. 16 nLPD) nor professional secrecy (Art. 321 CP).
In theory, explicit and informed consent could cover the cross-border transfer. But this consent would need to be truly free, specific, and informed. In practice, this avenue is legally fragile and discouraged by legal doctrine.
Enterprise API accounts offer a DPA, a commitment not to use data for training, and enhanced security controls. But data is still processed in the US. The cross-border transfer issue and Art. 321 CP remain. A Swiss-hosted solution is preferable for health data.
The nLPD provides for fines up to CHF 250,000 for responsible individuals (Art. 60 ff. nLPD). Violation of professional secrecy (Art. 321 CP) carries up to three years imprisonment. A data breach can also lead to disciplinary proceedings with your professional association (FSP, ASP) and loss of patient trust.
Artificial intelligence has its place in therapeutic practice. Voice dictation, writing assistance, note summarisation: these features save precious time. But the choice of tool matters as much as the feature itself.
A Swiss therapist who enters session notes into ChatGPT takes a concrete legal risk — nLPD violation, unauthorised cross-border transfer, potential breach of professional secrecy. This is not paranoia; it is compliance with laws that exist to protect the most vulnerable people: your patients.
The criterion is simple: if you cannot explain to your patient exactly where their data goes and who has access, do not enter it into that tool.
Therago integrates Swiss-hosted AI (Infomaniak, Geneva) with AES-256-GCM encryption per therapist. No data is transmitted to OpenAI, Google or Amazon. Discover Therago.
AI for therapists, without compromising confidentiality. Swiss-hosted, encrypted, compliant.
30-day free trial. No credit card. Data hosted in Switzerland.