Data Governance, the EU AI Act and the Future of Global Mobility
As the EU AI Act heralds a wave of AI regulation, immigration and Global Mobility teams need to get to grips with data governance principles.
Alex Schulte | 28 October 2024
Artificial Intelligence isn’t magic, though it might seem like it at times. The speed, fluency and precision of tools like ChatGPT emerge from inscrutably complicated processes of data retrieval and processing. To use AI is to put a vast nexus of data into motion. That’s why it’s impossible to regulate AI without paying attention to where all this information is coming from – and what, exactly, is being done to it.
So it follows that the EU’s new AI Act, the first major piece of AI regulation, devotes a lot of time to setting rules for data governance. If you haven’t heard this term before, it’s best to think of it as an umbrella for everything an organisation does to procure, manage, secure and dispose of the data it uses. As you might imagine, this is an extremely broad topic with implications for every department. In this piece, we will limit our scope to the data you call upon when using or creating AI tools.
Why Data Matters in Immigration and Global Mobility
Most people who work in the immigration field, whether they’re legal practitioners or in-house Global Mobility managers, aren’t data specialists. But as more workflows are reshaped by digitalisation and AI, everyone in this field will need to grasp the fundamentals of good data governance. The alternative is leaks and other embarrassing privacy mishaps, not to mention harsh penalties from regulators.
The kind of information that clients or employees provide in the course of relocations is among the most sensitive data there is. Personally Identifiable Information (PII) such as tax identification numbers, addresses, immigration status, bank details, medical records and employment history is already heavily protected by law.
The EU AI Act and GDPR
In 2018, the EU’s General Data Protection Regulation (GDPR) came into effect: a sweeping set of requirements outlining how organisations must process personal data. Here in the UK, the Data Protection Act 2018 quickly enshrined these rules in domestic law.
The GDPR and the EU AI Act are, in fact, mutually complementary. The AI Act builds on the GDPR’s foundational framework, making compliance with the new rules much easier for those already following the letter of the existing law.
Where the two legislative frameworks differ is in how they apportion compliance burdens. While the GDPR’s data protection rules apply universally, the EU AI Act assesses the risks of specific uses of AI systems and metes out obligations accordingly. Under this framework, the use of AI systems in immigration and asylum processes is classed as de facto high-risk, subjecting users and developers to the strictest transparency and accountability measures.
Our new report finds that the legislation provides various exemptions allowing teams to leverage AI in many conceivable use cases without these elevated duties. However, immigration legal teams and Global Mobility professionals must still update their data management frameworks to align with the EU AI Act and GDPR.
Data Governance and the AI Value Chain
The EU AI Act places different data governance obligations on you depending on your place along the AI value chain.
The legislation identifies six key entities in the development, distribution and use of high-risk AI systems.
Providers
A Provider is any party that develops AI systems or models to be put on the EU market.
Providers of high-risk AI systems are the most heavily regulated entities in the legislation and are subject to the most extensive data-related obligations.
- Validating data quality: Providers must ensure the datasets they use to train, validate and test models meet stringent quality criteria. The data should be representative, relevant, complete, and properly labelled.
- Record-keeping: Providers must keep detailed technical documentation concerning an AI system’s data governance across its lifecycle. This includes information on each dataset’s provenance, collection methodology, and any known limitations or biases. This metadata is crucial for ensuring transparency and allowing effective oversight of the AI system (see the illustrative sketch after this list). Providers must also ensure that their systems automatically record technical logs.
- Data Storage and Access Control: Providers must implement stringent measures to secure sensitive data, including encryption, access controls, and regular security audits. They must also establish clear data retention policies to ensure personal data is not kept longer than necessary.
- Technical Solutions: Where appropriate, providers should apply technical solutions, such as data augmentation, synthetic data generation, or dataset adaptation, to improve the quality and diversity of their datasets.
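To make the record-keeping duty feel less abstract, here is a minimal sketch, in Python, of how a provider might capture dataset provenance metadata alongside its technical documentation. Everything here is an assumption for illustration: the `DatasetRecord` fields, the example values and the JSON export are not terms or formats prescribed by the Act.

```python
"""Illustrative sketch of a dataset provenance record for technical
documentation. All field names are assumptions, not terms from the Act."""

from dataclasses import dataclass, field, asdict
from datetime import date
import json


@dataclass
class DatasetRecord:
    name: str                     # internal dataset identifier
    source: str                   # where the data was obtained
    collection_method: str        # how it was gathered (survey, export, etc.)
    collected_on: date            # when collection finished
    intended_purpose: str         # what the dataset is meant to train or test
    known_limitations: list[str] = field(default_factory=list)  # gaps, biases
    labelling_process: str = "unspecified"   # who labelled it and how


def export_record(record: DatasetRecord) -> str:
    """Serialise a record to JSON so it can be stored alongside the
    system's technical documentation for its lifetime."""
    payload = asdict(record)
    payload["collected_on"] = record.collected_on.isoformat()
    return json.dumps(payload, indent=2)


if __name__ == "__main__":
    record = DatasetRecord(
        name="visa-applications-2023",
        source="anonymised internal case management export",
        collection_method="database export, manually reviewed",
        collected_on=date(2024, 1, 15),
        intended_purpose="training a document-classification model",
        known_limitations=["under-represents non-EU applicants",
                           "no records before 2019"],
        labelling_process="two in-house reviewers, disagreements escalated",
    )
    print(export_record(record))
```

The point is simply that provenance, collection methodology and known limitations end up in a structured, exportable form rather than scattered across emails and spreadsheets.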
These rules are similar for those classed as Manufacturers – a party that incorporates AI systems into its products and/or provides, distributes or uses AI systems under its own name.
Deployers
A Deployer is any party that uses third-party AI systems under its own authority for professional activities.
Deployers have fewer obligations than Providers, but they are still expected to conduct due diligence on the tools they use.
- Input Data Quality: Deployers must vet the input data to ensure that it is relevant and appropriate for the system’s intended purpose.
- Monitoring: They are required to monitor the operation of the high-risk AI system in line with the provider’s instructions for use. This includes monitoring the quality and appropriateness of input data throughout the system’s use, and keeping logs of its activity (a simple sketch of what this might look like follows this list).
- Data Protection: When using high-risk AI systems that process personal data, deployers must ensure compliance with data protection regulations like GDPR.
- Reporting: If deployers have reason to believe that the use of the system may pose risks to health, safety, or fundamental rights, they must inform the provider or distributor and potentially suspend use of the system.
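As a rough illustration of the monitoring duty, the sketch below wraps calls to a stand-in AI system with a basic input check and a structured activity log. The `assess_case` function, the required fields and the log format are all hypothetical; a real deployment would follow the provider’s instructions for use and your own data protection policies.

```python
"""Illustrative sketch of deployer-side input checks and activity logging.
The AI system call is a stand-in; field names are assumptions."""

import json
import logging
from datetime import datetime, timezone

logging.basicConfig(filename="ai_activity.log", level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")

# Fields the (hypothetical) system expects as relevant, appropriate input.
REQUIRED_FIELDS = {"applicant_id", "visa_category", "country_of_origin"}


def assess_case(case: dict) -> dict:
    """Stand-in for a call to a third-party high-risk AI system."""
    return {"recommendation": "review", "confidence": 0.72}


def run_with_oversight(case: dict) -> dict | None:
    """Check input relevance, call the system, and log the interaction."""
    missing = REQUIRED_FIELDS - case.keys()
    if missing:
        logging.warning("Input rejected, missing fields: %s", sorted(missing))
        return None

    result = assess_case(case)
    logging.info("System used: input_keys=%s output=%s timestamp=%s",
                 sorted(case.keys()), json.dumps(result),
                 datetime.now(timezone.utc).isoformat())
    return result


if __name__ == "__main__":
    run_with_oversight({"applicant_id": "A-1042",
                        "visa_category": "intra-company transfer",
                        "country_of_origin": "BR"})
```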
Importers
An Importer is any party that brings AI systems bearing a non-EU Provider’s name or trademark into the EU market.
Their data-related responsibilities focus on verification and documentation:
- Verification: Importers must verify that the provider has implemented appropriate data governance practices, such as using high-quality training, validation, and testing datasets.
- Documentation: They must keep a copy of the EU declaration of conformity and technical documentation for 10 years after placing the system on the market. This documentation includes information about the data used to train and test the AI system.
- Information Provision: Importers must provide all necessary information and documentation to demonstrate the AI system’s conformity upon request from competent authorities.
Distributors
A Distributor is a slightly more opaque classification. It refers to any party, other than a Provider or Importer, that makes AI systems available on the EU market. Imagine a specialist company that supplies AI-powered tools, developed in the EU, to teams in Europe without putting its own name or trademark on them.
While distributors have fewer direct data governance obligations, they still play a role in ensuring compliance:
- Verification: Distributors must verify that the high-risk AI system bears the required CE marking and meets the data governance standards expected of entities higher up the value chain.
- Reporting: If a distributor considers or has reason to believe that a high-risk AI system is not in conformity with the AI Act, they must not make the system available on the market until it has been brought into conformity.
How to Get Your Data Governance Ready for the EU AI Act
If defining your place on this value chain and putting the right processes in place sounds like a tough task, don’t panic. The EU AI Act may have passed into law, but its provisions come into force in stages, giving organisations time to get compliant.
Article 10, the part of the EU AI Act covering data governance, will begin to apply on 2 August 2026. With almost two years to go, there should be ample time to layer these new rules on top of your GDPR compliance structure.
1. Appoint a Chief Data Officer (CDO)
Adopting these data governance practices will require you to orchestrate efforts across departments, a function that typically falls within a Chief Data Officer’s remit. If you don’t have a CDO in place, it’s worth considering whether you need to fill this role, or who else in your team might have the capacity to take it on.
2. Audit your data governance
If you’re not steeped in data governance best practices, chances are there will be a lot of low-hanging fruit. CDOs should conduct a current state assessment of their organisation’s governance frameworks to gauge their readiness for the EU AI Act. Strategy, infrastructure, processes, and governance mechanisms should all come under the microscope.
3. Learn the rules for cross-border transfers
Global Mobility depends on cross-border transfers of personal data. The EU AI Act and GDPR harmonise the rules for conducting these transfers within the EU market. But you’ll still need to meet the rules of other territories when sending data outside Europe – and understand how those transfers can rebound on your EU compliance.
Here are three strategies for maintaining high standards of data security, wherever you’re sending information to.
- Use approved data transfer mechanisms: As stipulated in the GDPR, you must rely on approved mechanisms like Standard Contractual Clauses (SCCs) or Binding Corporate Rules (BCRs) to protect personal data when transferring it outside the EU.
- Assess recipient countries’ data protection laws: Before transferring data to countries outside the EU, you should thoroughly assess the adequacy of data protection laws in the recipient country and implement additional safeguards where necessary.
- Keep detailed records: Maintain detailed logs of all cross-border data transfers, including the legal basis for the transfer and the safeguards in place (a minimal sketch of such a register follows below).
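As one hedged example of what such a register could look like, this sketch appends each transfer to a CSV file together with its legal basis and safeguards. The column names and the `TransferRecord` structure are illustrative assumptions, not a prescribed format.

```python
"""Illustrative sketch of a cross-border data transfer register.
Column names and example values are assumptions, not a prescribed format."""

import csv
from dataclasses import dataclass, astuple
from datetime import datetime, timezone
from pathlib import Path

REGISTER = Path("transfer_register.csv")
COLUMNS = ["timestamp", "destination_country", "data_categories",
           "legal_basis", "safeguards"]


@dataclass
class TransferRecord:
    timestamp: str
    destination_country: str
    data_categories: str     # e.g. "passport details; employment history"
    legal_basis: str         # e.g. "SCCs", "BCRs", "adequacy decision"
    safeguards: str          # e.g. "encryption in transit and at rest"


def log_transfer(record: TransferRecord) -> None:
    """Append a transfer to the register, writing a header on first use."""
    new_file = not REGISTER.exists()
    with REGISTER.open("a", newline="") as handle:
        writer = csv.writer(handle)
        if new_file:
            writer.writerow(COLUMNS)
        writer.writerow(astuple(record))


if __name__ == "__main__":
    log_transfer(TransferRecord(
        timestamp=datetime.now(timezone.utc).isoformat(),
        destination_country="US",
        data_categories="passport details; employment history",
        legal_basis="SCCs",
        safeguards="encrypted transfer; access limited to assignment team",
    ))
```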
Good data governance isn’t just important for avoiding regulatory penalties. It’s a table-stakes aspect of modern professional services.
Business relationships live or die by trust. If you can show your clients that you handle their sensitive information in line with the latest best practices, you stand a better chance of staying at the top of their list.
Data-driven technologies are about to upend Global Mobility, like every other sector. As this progresses, new rules of engagement will emerge – the EU AI Act is unlikely to be the last word on AI regulation. Getting to grips with it is a non-negotiable first step in an evolving compliance journey.
If you’ve not already read our guide to innovating in line with the EU AI Act, download your free copy now.