The PHG Foundation is currently undertaking an extensive report, funded by the Wellcome Trust, on Transparency and explanation in Black Box Medicine. We have focused on the consultation questions that are most relevant to our own perspective.
- We commend the Information Commissioner’s Office for facilitating dialogue around how to meet the obligations for transparency and explanation. As the guidance suggests, meeting these obligations goes beyond simply satisfying the relevant legal requirements: it involves fostering trust and confidence across a diverse range of stakeholders.
- In our view, the draft guidance is a good attempt to demonstrate that different explanations serve different purposes, and that in a given context a variety of different explanations are necessary. There is a wealth of detail in the guidance, and we are pleased that the communication challenges of the decision recipient receiving multiple explanations throughout the process are addressed in step 7. The emphasis on layering explanations, and on ensuring a continuing dialogue rather than a one-way process, is key; we note that in the healthcare context, similar discussions have taken place about the nature of consent to care. Delivering robust and appropriate explanation will require a high level of staff engagement and expertise and sufficient resources, for which an institutional commitment is needed.
- Our focus is on decision making in healthcare. Here, healthcare professionals have a substantial role in the implementation of AI systems and, for the foreseeable future, are liable for their use. More weight could be given in the guidance to the role of professional guidance in framing the obligations of implementers to offer explanation. An example is the General Medical Council’s Duties of a Doctor guidance, which includes the requirement for a doctor to communicate the risks associated with a treatment or intervention.
The full response is available to read here.