Overview

New technologies can improve how we work, but they must be implemented fairly. This guidance helps employers and unions achieve this together.

This document gives guidance to managers, trade unions and other worker representatives. The guidance is about using algorithms and artificial intelligence (AI) systems in devolved public sector workplaces in Wales to help manage workers and their work. Devolved public sector workplaces include local authorities, NHS trusts, Welsh Government and arm’s length bodies in Wales. The principles and process set out here can be used to guide dialogue and decision-making regarding the use of new technology in public sector workplaces.

This document is based on more detailed guidance written on behalf of the Workforce Partnership Council (WPC). The WPC is made up of devolved employers, trade unions and the Welsh Government. The WPC has agreed this guidance, which means it should be followed in all devolved public sector bodies.

Key terms

Key terms in this document are:

  • Algorithmic management system: any system using computer processes to take or support employment decisions. This includes automation, machine learning, statistics or artificial intelligence (AI).
  • Social partners: representatives of Welsh Government, employers/management and workers/staff in the public sector in Wales. This can be at national or organisation level.

General principles

The following general principles should be followed by employers and employees throughout negotiation, decision-making, initial use, and ongoing review of any algorithmic management system.

1. Social partnership

A commitment to working in social partnership. This aligns with the Social Partnership and Public Procurement Act 2023 and our way of working in Wales.

2. Putting human oversight and human interaction first

There must be human oversight of decisions made about using algorithmic management systems (a “human in command”). There must be clear lines of responsibility to managers for any decision that relies upon the system. Human interaction should also be part of the day-to-day running and decision-making of a system (a “human in the loop”).

3. Fair work

A commitment to the Welsh Government’s definition of fair work, including equality. Fair work means workers are fairly rewarded, heard and represented, secure and able to progress in a healthy, inclusive working environment where rights are respected.

4. Building capability

A commitment to building capability regarding the use of technology across all workforce levels.

5. Jobs

A commitment to protecting jobs, creating jobs, and investing in the workforce.

Before adoption and implementation

In this section, the guidance focuses on activities that should be done before a system is used. These activities make sure the General Principles are followed and there is clear communication from the beginning of the process.

Audit existing systems

First, social partners should conduct a full audit of where technology, including artificial intelligence, is being used.

This audit may prompt dialogue about established uses of technology in accordance with the principles set out above. Social partners may find the Algorithmic Transparency Recording Standard (ATRS) a useful way to create a comprehensive record of information about the different systems in use.

Build or buy?

Before starting development or buying a system, organisations should do a readiness check. This helps decide whether to buy a system from a supplier or build it themselves. Social partners should together identify whether technology is needed to solve a problem and whether the organisation is ready to adopt change in terms of IT, data and workforce skills.

Due diligence and consultation during buying or building systems

Organisations have a lot of experience buying or building systems for use in the public sector. As with the other parts of the process discussed in this document, employers and unions should work together. Organisations may find the Algorithmic Transparency Recording Standard (ATRS) useful to help structure discussions with unions about buying or building new systems. Consultation while writing an ATRS record would support transparency about decision-making early in the process.

There should be a collaborative approach to assessing and addressing any risks identified, which should include engagement with different teams within the organisation, workers, and experts in the field. Assessments conducted should include a Data Protection Impact Assessment, an Equality Impact Assessment, and an Algorithmic Impact Assessment.

Social partners should consider a range of risks, including to legal rights as well as broader risks like work intensification, loss of expertise and overriding human judgment. Organisations should also think about what happens if a contract with a technology supplier ends or if they need to stop using a system.

Worker representatives play an important role in buying or building new systems. They should be encouraged to share their skills and feedback. Managers should discuss worker concerns with a builder or supplier of a new system and seek any reassurances needed. It must be recognised that worker representatives may need outside advice or training to support their involvement in buying or building new algorithmic management systems. Time and funding should be given to enable this.

Staff training

Offering staff training is key to introducing a new system. Managers and workers should discuss this early in the process. There should be a whole-team approach. Time should be taken to identify where workers, including managers, will need training to understand any new system.

After a new system starts being used

Algorithmic management systems must be audited. Social partners should agree on a framework for auditing the system before it is used.

Questions to ask could include:

  • Does the system do what it should?
  • How does it affect work and working conditions in the organisation?
  • Are agreed ways to mitigate risk in place? Do they work?
  • Did any risks or impacts appear that were not planned for? Do these need to be lowered or dealt with in another way?
  • Do technical checks and bias audits give good results?
  • Do the system and its use follow any rules agreed between employers and workers?

Feedback from workers and worker representatives should be sought, collected and used to shape the system. Workers should be able to raise concerns without fear of any negative personal impact of doing so. Workplace representatives should have periodic access to the system to maintain transparency and to monitor how organisations are using it, including examining decisions made or supported by the system.

Where concerns or negative impacts are highlighted through monitoring, organisations should seek to address and improve the system, with support from the supplier where necessary. Organisations must remain open to pausing or stopping the use of the system completely where risks cannot be managed properly.

Human in command and human in the loop

Human oversight of algorithmic management systems is important. A senior manager should be responsible and accountable for any algorithmic management system, and for decisions made when using it.

Any manager who uses new technology should have enough training to understand the system, including how to interpret the system’s outputs correctly. They should be able to override or turn off the system if needed.

The system should involve humans in significant decisions about individuals. This is called keeping a “human in the loop”.

Organisations must protect people’s rights throughout the system’s life. This includes:

  • The right to get information about data processing, especially about automatic decisions.
  • The right to ask for access to your own data held by the organisation, and to correct it.
  • The right to ask for irrelevant or old data about you to be deleted.
  • The right to ask for a review of any automatic decision about you.

Partners may also agree extra rights, such as:

  • The right to raise concerns about a system without being negatively affected.
  • The right not to be harmed if you rely on an algorithmic management system.
  • The right to a personal explanation of any significant decision made about you using an algorithm before it takes effect.

Information sharing and further updates

Social partners should share information about the system and how it is used as much as is reasonably possible.

Where the organisation wants to make changes to a system, or updates are needed, it should make sure the General Principles set out above are followed.