GDPR · Data protection

GDPR and AI in Public Administration: Practical Checklist

February 26, 2026 · 5 min read · OptimTech

Why the GDPR matters when the public administration uses AI

AI projects in city councils and public bodies usually involve processing personal data: chatbots handling citizen inquiries, predictive systems for social benefits, automated analysis of administrative documents, and so on. The General Data Protection Regulation (GDPR) and complementary rules (e.g. the EU AI Act, whose obligations are phasing in, and Spain's National Security Framework, the Esquema Nacional de Seguridad or ENS) impose specific obligations on transparency, legal bases, data subject rights and security. Ignoring them is not only a legal risk but also a threat to public trust.

Below is a practical, verifiable guide for embedding compliance into your project from the design phase.

Practical steps before deploying AI

1. Data inventory and classification

  • Identify which personal data are used (identifiers, special categories, sensitive data) and their origin (internal systems, forms, third parties).
  • Classify by purpose: citizen services, assisted decision-making, profiling for resource allocation, etc.
  • Maintain a record of processing activities (Article 30 GDPR): document the controller, purposes, retention periods and transfers.
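As a rough sketch, an Article 30 record entry can be kept as structured data rather than free text, which makes it easier to audit and keep current. The field names below are illustrative, not prescribed by the GDPR:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of one entry in an Article 30 record of
# processing activities. Field names and values are illustrative.
@dataclass
class ProcessingActivity:
    purpose: str                      # e.g. "citizen inquiry chatbot"
    legal_basis: str                  # e.g. "Art. 6(1)(e) GDPR (public task)"
    data_categories: list = field(default_factory=list)
    recipients: list = field(default_factory=list)
    retention: str = ""               # e.g. "6 months, then anonymized"
    international_transfer: bool = False

chatbot = ProcessingActivity(
    purpose="Citizen inquiry chatbot",
    legal_basis="Art. 6(1)(e) GDPR (public task)",
    data_categories=["name", "query text"],
    recipients=["internal service desk"],
    retention="6 months, then anonymized",
)
```

Keeping one such entry per purpose makes the inventory in step 1 and the legal-basis documentation in step 2 directly verifiable.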

2. Determine the appropriate legal basis

  • As a general rule, public entities should rely on Article 6(1)(e) GDPR (performance of a task carried out in the public interest) or a legal obligation (Art. 6(1)(c)), not on legitimate interest when acting in a public authority role.
  • If special categories of data are processed (health, political opinions, etc.), check Article 9 and the relevant sector legislation that may permit the processing or provide alternative measures.

3. Carry out a DPIA (Data Protection Impact Assessment)

  • This is mandatory when the processing is likely to result in high risk (e.g. systematic profiling, automated decisions that affect rights).
  • Minimum content: description of the processing, assessment of necessity and proportionality, evaluation of risks to rights and freedoms, and measures to mitigate them.
  • If a high residual risk remains, consult the supervisory authority before processing (Article 36; in Spain, the AEPD).

4. Ensure transparency and data subject rights

  • Provide clear and accessible information (Articles 12–14): purpose, legal basis, recipients, retention periods, rights and how to exercise them.
  • For automated decisions with legal or significant effects, offer human intervention, a comprehensible explanation and the possibility to appeal (Article 22).

5. Contracts and liability with suppliers

  • Sign Data Processing Agreements (DPAs) in accordance with Article 28 GDPR, specifying subprocessors, technical and organizational measures, audit rights and obligations in the event of a breach.
  • Assess the provider’s technical capacity and guarantees: certifications, security testing, storage locations and international transfers.

6. Technical and organizational measures

  • Data minimization: collect only what is necessary for the specific purpose.
  • Pseudonymization and encryption: reduce risks and support compliance (Art. 32).
  • Access controls, operation logs and decision traceability for auditability and explainability.
  • Limited retention: clear retention periods and procedures for deletion or anonymization.
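Pseudonymization, one of the Article 32 measures listed above, can be as simple as replacing a direct identifier with a keyed hash so that records remain linkable for analysis while the key is stored separately from the data. A minimal sketch (key handling is deliberately simplified; in practice the key belongs in a secrets manager, not in code):

```python
import hashlib
import hmac

# Placeholder key: store it in a vault or HSM, never in source code.
SECRET_KEY = b"store-me-in-a-vault-not-in-code"

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a keyed HMAC-SHA256 pseudonym."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()
```

Because the same input always maps to the same pseudonym, analytics can join on the pseudonym without ever seeing the original identifier; whoever holds the key can still re-identify when legally required.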

7. International transfers

  • Avoid transfers outside the EU unless necessary. If unavoidable, use adequacy decisions, standard contractual clauses or approved supplementary measures.

8. Interface with the EU AI Act and ENS

  • If the AI system is classified as "high risk" under the EU AI Act, incorporate additional requirements: risk management, technical documentation, logging of operations and transparency labeling.
  • Integrate ENS controls (Royal Decree 311/2022) to ensure the cybersecurity of the solution in public environments.

Practical examples for common cases

  • Informational chatbot:
    • Legal basis: performance of a public task.
    • Key measure: initial notice about processing, limited logs and pseudonymization of queries for analysis.
  • Predictive system for scoring social benefits:
    • High risk: full DPIA, explanation of criteria, human review of automated decisions and an appeals policy.
  • OCR and case processing:
    • Minimize extraction of sensitive personal data; remove or anonymize unnecessary fields before large-scale analysis.
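For the OCR case, removing unnecessary identifiers before large-scale analysis can start with a simple redaction pass. The sketch below is illustrative only: the two patterns (a Spanish DNI and a nine-digit phone number) are examples, not a production-grade PII detector:

```python
import re

# Example patterns only; a real deployment needs a vetted PII detector.
PATTERNS = {
    "DNI": re.compile(r"\b\d{8}[A-Za-z]\b"),
    "PHONE": re.compile(r"\b\d{9}\b"),
}

def redact(text: str) -> str:
    """Replace matched identifiers with a category placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text
```

Running the redaction before documents enter the analysis pipeline means the minimization happens by design, not as an afterthought.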

Operational checklist (project summary)

  • Updated data inventory and record of processing activities.
  • Legal basis documented for each purpose.
  • DPIA completed and approved; consult the AEPD if required.
  • Information notices and channels to exercise rights implemented.
  • Contract with provider compliant with Article 28 GDPR.
  • Technical measures (encryption, pseudonymization, access control) applied.
  • Retention and anonymization/deletion plan.
  • Assessment of international transfers.
  • Review of applicable EU AI Act and ENS requirements.

Compliance owner: internal roles

  • Data controller: defines purposes and policy decisions.
  • DPO (Data Protection Officer): oversees the DPIA, communication with the AEPD and training.
  • Technical team: implements security and traceability measures.
  • Legal/procurement: reviews provider clauses.

Ensure coordination among these functions from the design phase onward.

Conclusion and recommended action

For public administrations, data protection is not an add-on but a central element of any AI project design. Immediate recommended action: launch a rapid audit (2–4 weeks) combining a data inventory, a preliminary risk assessment (DPIA screening) and a contractual review of suppliers. That diagnosis will allow you to prioritize concrete measures and demonstrate compliance to citizens and the supervisory authority.

OptimGov includes traceability and document management features designed to help meet many of these requirements, but the starting point is always the same: inventory, DPIA and clear contracts. Taking these three steps reduces legal risk and increases trust in municipal digital services.