Over the past few months PHG Foundation has been exploring the role of citizen generated data (CGD) in health, looking at themes such as patients' participation in their own care and the increasingly blurred line between lifestyle and medical devices. We touch on some of these issues in our briefing note What is citizen generated data?, and at a recent roundtable with cross-sector experts and representatives we discussed some of these topics in more detail. We will be analysing the key workshop findings in more depth in the coming months.
What opportunities does CGD present for health?
Delegates discussed various aspects of health that could be impacted by CGD. Opportunities for the health system include:
- Improving direct care, access to services, and pharmacovigilance
- Gaining insights on patient perspectives of healthcare services
- Enriching public health data flows and analysis
It remains to be seen how useful CGD will be in any of these contexts, but the many examples delegates offered, such as mining publicly available sources of CGD to identify at-risk individuals, highlighted growing interest and enthusiasm in this area.
The intention behind data production
One of the defining characteristics of CGD is the intent behind its production. Data produced with the primary intent of deriving health-related insights presents inherently different opportunities and challenges from data produced without any intention that health-related inferences will be drawn from it.
Social media and internet search data can be considered unintentional in this context, as individuals do not create the data with the primary purpose of it being analysed, by themselves or others, for health purposes. There are multiple technical challenges in using such publicly available data, and privacy and consent are also of concern. The perceived anonymity of the internet may create a false sense of security for users, so they may behave more candidly online, or in ways that are not really representative of them. An important consideration is whether, if data collected from the internet becomes routinely analysed for health purposes, people will alter their behaviour.
In contrast, the primary purpose of data produced through self-tracking tools is to draw some conclusion about health. With the ubiquity of smartphones and the abundance of digitally enabled wearable and environmental sensors, self-tracking is no longer confined to people with diabetes or enthusiasts in the Quantified Self movement. Many people undertake some form of tracking, whether logging their mood on a smartphone app or their daily step count with a smart watch. To the individual, tracking isn't necessarily about aiding a diagnosis or indicating ill-health; it's about understanding oneself in order to become healthier, stay healthy or optimise health. Understanding the motivations behind self-tracking may help us to determine the usefulness of the data.
For both passive and active data production, however, key challenges include the lack of data provenance and data biases.
Is self-tracked data a moving target?
New apps and wearables that enable some form of logging or tracking are constantly appearing on the market, meaning consumers have an abundance of choice. We know that users tend to engage with any one tracking tool over a relatively short period (weeks or months rather than years). One reason for this, suggested at the workshop, is that once an individual has answered their question, such as "do my activity levels alter my mood?", they may not feel inspired to continue tracking those parameters. They may then move on to their next question, which could involve different tracking tools.
It is also unrealistic to expect that most people will consistently and continuously use digital tools to track themselves over their lifetime, or that they will adhere to the same tracker and remain interested in the same parameters for long periods. How would the health service keep up with such a rapidly changing landscape, with vastly different data structures and quality as well as missing and patchy data flows? Might unintentionally produced data be more promising for use by the health system?
Public trust in data sharing
The challenges with care.data, a project intended to give researchers access to anonymised GP records, were discussed in relation to public trust. It was suggested that the public perceive that data sharing within the health system already happens and is generally beneficial, but people are more anxious about data sharing between the health system and commercial entities that hold CGD (e.g. Google, Fitbit). Indeed, research has examined public trust in commercial access to health data, but evidence is lacking on how the public would feel about the health system having access to CGD.
Citizens may accept that supermarkets analyse their shopping habits since they receive targeted coupons and deals. Delegates hypothesised that citizens would feel very differently about the health system seeing such data. Public perception and confidence in the use of data will largely depend on what the data will be used for, by whom and how it will be managed.
Citizens may also be concerned about CGD being fed into the health system and linked to sensitive information. Despite ongoing efforts to improve IT infrastructure and security, citizens may be apprehensive due to the perceived vulnerability of the NHS to cyberattacks and data breaches.
What next from the PHG Foundation on CGD?
Informed by the insightful discussions at the roundtable, we will be producing briefing notes on key topics over the next few months, including:
- Opportunities presented by CGD for public health, such as understanding the wider determinants of health and improving disease surveillance
- Ethical and societal challenges of using CGD, including diagnosis from a distance and safeguarding vulnerable individuals