Transparency must underpin algorithm accountability
22 May 2018
News
The House of Commons Science and Technology Committee has today launched a report on algorithms in decision making, which sets the agenda for the newly established Centre for Data Ethics and Innovation.
Algorithms, the fundamental basis of all computerised tasks, are becoming increasingly integral to healthcare, solving problems in research, diagnosis, prognosis and monitoring. Earlier this week, Prime Minister Theresa May spoke about making greater use of algorithms in the NHS to improve cancer diagnosis rates.
The report, Algorithms in decision making, acknowledges the huge opportunities algorithms bring to healthcare and the wider public sector, but calls for as much transparency as possible to ensure that algorithms work fairly and do not disproportionately affect certain groups.
Chair of the Science and Technology Committee Norman Lamb said:
Algorithms present the Government with a huge opportunity to improve public services and outcomes, particularly in the NHS…The Centre for Data Ethics & Innovation should review the operation of GDPR, but more immediately learn lessons from the Cambridge Analytica case about the way algorithms are governed when used commercially.
The Government must urgently produce a model that demonstrates how public sector data can be responsibly used by the private sector, to benefit public services such as the NHS. Only then will we benefit from the enormous value of our health data.
In written evidence submitted to the inquiry, the PHG Foundation highlighted that an emphasis on technical and non-technical solutions early in algorithm development is necessary to ensure that algorithms are as transparent as possible and their developers sufficiently accountable. The Foundation also argued that more specific advice is needed on how data controllers can comply with the GDPR, which comes into effect this week, and that the Information Commissioner’s Office (ICO) should engage directly with the algorithm development sector.
PHG Foundation’s Head of Humanities, Alison Hall, said:
The report acknowledges that the vital challenge for the development of AI in the UK is to develop a regulatory framework that simultaneously promotes innovation whilst securing vital trust and confidence. This can only be done by promoting a regulatory system that is trustworthy as well as transparent and the recommendations in this report will be critical in achieving this.
Today’s report joins a broader report on the economic, ethical and social implications of advances in artificial intelligence, published by the Lords Select Committee on Artificial Intelligence, to which the PHG Foundation contributed oral evidence.
The PHG Foundation is currently carrying out a project on regulating algorithms in healthcare, bringing together regulators, academics and developers to better understand the regulatory landscape for algorithms in a healthcare context.