The collection of data from patients’ medical records

“Personal data and its endless uses form one of the most fundamental issues of our time, which boils down to the relationship between the individual and power, whether exercised by the government or private organisations.”

So says journalist John Harris, who raises concerns about threats to privacy given the ever-increasing access to individuals’ online data. These concerns have been echoed by bodies like medConfidential (an organisation that campaigns for consent and confidentiality in health and social care, particularly in relation to the NHS and access to patients’ medical data).

The care.data project, for example, was set up to extract patient information held in different healthcare settings (such as GP practices and care homes) and then – combined with information from hospital records – enter it into a central set of databases at what was then the Health and Social Care Information Centre or HSCIC (now called NHS Digital). Over time, patient data from other settings, such as community health services and social care, was also to be included.

Plans to initiate care.data stalled following public disquiet. The original scheme proposed that data would be taken automatically each month from every patient’s record unless they explicitly opted out (the possibility of opting out was not widely advertised). This information would then be available not only to staff working in health services (including those planning and buying services), social care and academic research but also – on payment – to private businesses, such as insurance and pharmaceutical companies.

Although the project was suspended, it’s worth knowing what it aimed to do as another, similar scheme is in the pipeline, under the auspices of NHS Digital (see below). It’s also relevant to other issues, like government guidance that requires doctors to allow the Home Office access to the personal data of undocumented migrants who seek NHS treatment.

What were seen as the benefits of care.data?

It was argued that care.data would help the NHS find more effective ways of preventing or managing illness, monitor the risk of disease spreading, reduce inefficiencies and drive economic growth. NHS England (NHSE) claimed that care.data would be invaluable for the functioning of the NHS and for improving the quality of patient care through:

  • Giving commissioners of services accurate information for planning;
  • Monitoring services and improving the quality of health care provision;
  • Allowing insight into patient outcomes, patient experience and the efficiency of a service;
  • Comparing the quality of care provided by different NHS providers;
  • Providing the public with information on which to base health care decisions;
  • Offering people inside and outside the NHS information for medical research, clinical audit and public health planning.

NHSE also suggested that care.data would support economic growth by reinforcing the UK “as a global centre for life sciences and health services research”, and by supporting the development of “a vibrant market place” through making comparative data available to app developers and website designers (see NHS England’s Privacy Impact Assessment). This chimes with NHS England’s five year plan for the NHS and its aim of reducing demand on services by using new technology (such as phone apps) to encourage patients to take more responsibility for managing their own health care.

There are strong arguments for collecting patient data – such as the opportunities this allows for monitoring inequalities of access to health care or changes in the population’s health, such as rates of heart disease or cancer.

In addition, as the government relentlessly privatises NHS services, it becomes ever more important to have comprehensive, high-quality data with which to monitor the impact of its policies. If patients registered with GP practices owned by Virgin or Serco, for example, opt out of schemes like care.data, there will be no data about them and the companies will be protected from scrutiny. This gap already exists in data about private nursing and residential care homes.

What were the concerns about care.data?

NHSE itself acknowledged that care.data posed some risk to privacy and confidentiality, including threats from hackers attempting to access the data illegally. The data could be at risk in different ways at different stages: during the uploading of confidential data to the HSCIC, during the processing of confidential data within the HSCIC, and during the onward disclosure of data to other organisations.

There was also the possibility that, knowing that personal information would be taken from their medical records, patients could lose trust in the confidential nature of the health service. Patients might feel the need to withhold vital information from the clinicians treating them, at the risk of receiving less than optimal care.

In addition, when patients visited their GP, certain computer codes (e.g. for a particular health problem, test or referral) would be automatically added to their medical records. Such coding can often be inaccurate, so there was a risk of incorrect data being transferred to care.data. This has particular relevance given the access that health insurance companies might have to such data in future.

There is increasing involvement of private corporations, not just in the provision but also in the commissioning of health services. Given fears that the NHS may ultimately become a system based on private medical insurance, there were concerns that insurance companies could gain access to data made available by schemes like care.data through their involvement in commissioning, while at the same time being in the business, as insurance providers, of excluding the individuals who represent the greatest health insurance risk.

What was the Health and Social Care Information Centre? 

The Health and Social Care Information Centre (HSCIC) was set up as an Executive Non-Departmental Public Body responsible to Parliament rather than to any minister. Its responsibilities included:

  • Collecting, analysing and presenting national health and social care data;
  • Setting up and managing national IT systems for transferring, collecting and analysing information, as directed by the Secretary of State or NHS England;
  • Publishing rules to set out how the personal confidential information of patients should be handled and managed by health and care staff and organisations;
  • Creating a register of all the information they collect and produce in ways that ensure it will be useful to as many people as possible, while safeguarding the personal confidential data of individuals.

From 2013 there were growing concerns about its independence and its trustworthiness as the guardian of the personal details of every NHS patient in England. For instance, it was discovered that, from at least 2005, the HSCIC (and its predecessor, the NHS Information Centre) had been passing information about patients’ past and present addresses and GP registrations to the Home Office, to allow it to trace and deport people living in the UK without the right to do so. During 2016, the Home Office made 8,127 requests to trawl the personal data of NHS patients, leading to 5,854 people being traced. There had been no public consultation or parliamentary debate on this practice.

The HSCIC was superseded by NHS Digital.

What type of information would care.data extract from patients’ records?

Three levels of data were to be extracted and then stored by the HSCIC:

i) personal data where patients had been anonymised and the data aggregated or combined with that of other patients;

ii) personally identifiable data, where there was a legal duty to provide this, e.g. in a public emergency;

iii) pseudonymised data – i.e. data stripped of all features that would identify a patient, such as name or NHS number, and replaced by meaningless, fictitious identifying information that still allowed data about the same patient to be linked (e.g. when he or she used different services).

That said, identifying information such as a patient’s NHS number, postcode, date of birth, ethnicity, gender and GP surgery could still be made available to “approved analysts for approved purposes”, according to NHS England. It’s this level of data collection that caused the most concern.
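As an illustration only (this is not NHS code, and the key and NHS numbers below are invented), pseudonymisation of the kind described above can be sketched in a few lines of Python: a keyed hash replaces the NHS number with a stable but meaningless token, so records belonging to the same patient can still be linked across services.

```python
import hmac
import hashlib

# Hypothetical secret key held by the data controller. If it leaks,
# pseudonyms can be regenerated and re-linked to real patients.
SECRET_KEY = b"example-key-known-only-to-the-data-controller"

def pseudonymise(nhs_number: str) -> str:
    """Replace an identifier with a stable, meaningless token (HMAC-SHA256)."""
    return hmac.new(SECRET_KEY, nhs_number.encode(), hashlib.sha256).hexdigest()

# The same patient always receives the same pseudonym, so visits to
# different services can still be linked...
assert pseudonymise("943 476 5919") == pseudonymise("943 476 5919")
# ...but different patients receive different, unrelated tokens.
assert pseudonymise("943 476 5919") != pseudonymise("943 476 5870")
```

The catch, as the concerns above suggest, is that pseudonymisation only removes the direct identifier: if fields like postcode, date of birth and GP surgery travel with the pseudonym, a motivated analyst may still be able to work out who the patient is.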

Collected data might include:

  • ethnicity
  • the date registered with a GP surgery
  • medical diagnoses (including cancer and mental health) and any complications
  • referrals to specialists
  • prescriptions
  • family history
  • vaccinations and screening tests
  • blood test results
  • body mass index (height/weight)
  • smoking/alcohol habits

This information would be coded. How data would have been treated depended on which category it came under:

Green flow: This covered “anonymous” or aggregated data (data aggregation is any process in which information is gathered and then expressed in summary form, e.g. for statistical analysis). Despite being labelled anonymous, there was no guarantee that the original patient the data referred to couldn’t be identified. In addition, green flow data was not counted as ‘personal’. It therefore fell outside the Data Protection Act and could be freely given or sold on, without controls.

Amber flow: ‘Pseudonymised’ data would be available to specific, approved groups of users, initially just for commissioning uses and in line with relevant guidance. (Pseudonymising data involves giving an individual a pseudonym that is attached to all their data but not connected to their original identifying details.)

Red flow: This was identifiable data. It was only to be made available where there was a legal basis for doing so, e.g. with a patient’s consent or because of an overriding public interest in disclosure, such as the outbreak of a new disease (see NHS England’s Privacy Impact Assessment).

(There was some confusion about what happened when patients objected to their personal confidential data being uploaded to the HSCIC or beyond: for example, NHS England’s Privacy Impact Assessment stated that in these circumstances, the HSCIC would receive clinical data without any identifiers attached.)

Who would have had access to care.data?

Once the programme was up and running, a range of organisations would have been able to apply for access. These included commercial companies such as drug companies, health charities, university researchers, hospital trusts, medical colleges, think tanks, IT specialists and insurance companies. The MP David Davis expressed concern that, in addition, the police would have been able to access data from the HSCIC ‘by the back door’.

Amber flow data was to be made available to NHS England’s Area Teams, to Clinical Commissioning Groups and to Commissioning Support Units. As commissioning covers a wide range of activities – including monitoring, service planning, accounting and so on – and is increasingly carried out by private companies (see Commissioning Support Units), this meant that a wide range of people could have had access at this level.

Applications for access to sensitive data were to be considered by an independent body, the Data Access Advisory Group. Private firms, among others, would have been able to pay a fee to apply for access to sensitive or identifiable information. The assurance that ‘red flow’ data would only be available where there was a legal basis for this was contradicted by the scale of charges that the HSCIC published for buying data (2013/14). This listed different categories of information available, including a ‘bespoke abstract containing personal confidential data’, containing patient-identifiable data, sensitive data items or both. BUPA, a private healthcare organisation, for example, was one of the companies already cleared to access this ‘sensitive’ level of information.

What happened to care.data?

care.data was due to start in spring 2014. However, the scheme did not meet with widespread approval. The British Medical Association, for example, argued that care.data should not continue at that point because:

i) it lacked confidentiality and there was a possibility that individual patients could be identified from their data;
ii) it carried the risk of making patients reluctant to confide in their GPs;
iii) who might use the data in future was not well defined;
iv) it needed to be an opt-in system rather than an opt-out one;
v) there was a risk that the data might not only be used for its stated purpose (improving patient care) but could be sold for profit.

On top of this, it became known that patients’ data from hospital admissions (including their diagnoses, treatments, ages and the area where they lived) had been given (or sold) to the body that regulates actuaries (those who evaluate financial risks). This data was subsequently used by insurance companies to price their products.

Because of a massive loss of confidence, with one in every 45 patients choosing to opt out of the scheme, the Department of Health announced in July 2016 that it was to close the programme.

What next?

In July 2016, the National Data Guardian for Health and Care (Dame Fiona Caldicott) was asked to

  • develop new standards for data security for the whole health and social care system;
  • develop, together with the Care Quality Commission, a method of testing compliance with these new standards; and
  • propose a new consent/opt-out model for data sharing to enable patients to make an informed decision about how their personal confidential data will be used.

Her recommendations included:

  1. New data security standards for every organisation handling health and social care information, to address the main reasons for past breaches of security of paper-based and digital data. These standards will be simple for people to understand. They should apply across the entire health and social care system to support, rather than inhibit, data sharing. They will be fit for a future in which personal confidential data will be stored digitally and health and social care will be integrated.
  2. Ensuring all staff working in health and social care are properly trained in complying with data security standards, with extra training for people in leadership roles.
  3. Tougher penalties for intentional breaches of security.

Meanwhile, according to medConfidential, patients’ data from hospital admissions continues to be sold.


Enter NHS Digital

NHS Digital supersedes the HSCIC. It describes itself as providing national information, data and IT systems for health and care services, and as existing to help patients, clinicians, commissioners, analysts and researchers.

Its data collections come from many parts of health and social care, such as NHS Trusts, local authorities and independent organisations. Its national data sets collect information from care records, systems and organisations involved in a range of areas of health and care, in order to inform policy and to monitor and improve care. NHS Digital also provides technological infrastructure and aims to help different parts of health and care work together.

So far, it’s not clear if NHS Digital uses the same categories as HSCIC for the data it collects, or to whom (beyond the Home Office) it makes data available.

Is access to my confidential data legal? 

The Health and Social Care Act (2012) provides the legal basis for the extraction of personal confidential information in some circumstances – for example, where the public interest justifies disclosure. According to the Act, the common law duty of confidence can be set aside, for instance, in order to maintain effective immigration controls or to protect limited resources and public services from unnecessary financial and resource pressures.

In January 2017 a Memorandum of Understanding (MoU) was signed. Among other things, this requires NHS Digital to respond to Home Office requests for patients’ data (date of birth and NHS registration, gender and address) to enforce immigration rules. This is despite objections from the Commons Health Select Committee that this made undocumented migrants too afraid to access health care. According to Doctors of the World, this use of patients’ records marks the intrusion of a political agenda into the way our medical records are kept. It uses patient information in a way that, were a doctor to do it, they would be in breach of their ethical and legal duty of confidentiality. The legal basis for the disclosure of information under the MoU is currently subject to judicial review.

The Data Protection Bill currently going through Parliament (2018) introduces a framework that exempts government data handling from any oversight or scrutiny.

Additional information

care.data was an entirely different scheme from the Summary Care Record (SCR). The SCR was originally intended to give those providing direct care (e.g. clinicians in A&E departments or out-of-hours care) access to vital information (for example, about allergies, recent prescriptions, or bad reactions to drugs) across different settings. The SCR also provides details of a patient’s name, address, date of birth and NHS Number.

There is growing concern that access to patients’ SCRs will be widened, for example to pharmacies, including Boots or large supermarkets, and may be used for marketing purposes. Pharmacy 2U, an online pharmacist advertising ‘discreet and confidential services’, has been fined £130,000 for causing substantial damage and distress by selling on information provided by more than 20,000 patients when they registered to use its services. The information included name, date of birth, sex, and postal and email addresses, together with a list of health conditions patients might suffer from. Pharmacy 2U’s database lists were advertised for rental at a cost of £130 per 1,000 records. (20% of Pharmacy 2U is owned by EMIS, the company that provides computer systems to many GPs.) One company that bought records was operating a lottery from Australia while under investigation for fraud and money laundering. This company obtained the records of 3,000 men aged 70 or above – perhaps assuming a significant number of these to be frail or vulnerable – who were then targeted as “specially selected” to win “millions of dollars”.

If you have a Summary Care Record (around 94% of the population do) and you are concerned that your record may be misused or abused, you can opt out of the scheme. There’s a link to the official opt-out form, which you need to fill in and give to your GP. However, if you do this and you have any allergies or bad reactions to medicines, it might be wise to get a MedicAlert bracelet or similar to provide information should you need emergency care.

care.data and its replacement, and increasing access to the SCR, can be seen as part of a broader trend towards using new technologies to increase access to patient data. In many instances this is in patients’ interests – enabling clinical staff working in GP surgeries, ambulance services and A&E services to access a patient’s records (such as test results), wherever they are. There is also a push for patients to have increased access to their own records “in order to manage their healthcare needs” (for more details, see the Five Year Plan). The Secretary of State for Health has acknowledged that, in order to gain the trust of patients, there must be measures to ensure the security of their confidential data.


In July 2016 a CQC Report proposed 10 security standards to be applied in every health and care organisation that handles personal confidential information. These included measures to protect systems against data breaches, “ensuring that NHS leadership takes ownership and responsibility for data security and ensuring that organisations are as prepared as they can be to meet the challenges of the digital age”.

Subsequently, the National Data Guardian Review proposed a new, less complex system for ensuring that patients can opt out of their personal confidential data being used for purposes beyond their direct care unless there is a mandatory legal requirement or an overriding public interest. The Review recommended that the new system be put out for public consultation.

According to NHS Digital in March 2018:

“The Secretary of State has agreed that the national data opt-out will be introduced alongside the new data protection legislation on 25 May 2018. It has also been agreed to present the national data opt-out as a single question to cover both research and planning. Type 2 opt-outs (which currently prevent identifiable data from leaving NHS Digital) will be converted to the new national data opt-out when it is introduced in May. Patients with type 2 opt-out will be contacted directly about this change.”

More information to follow.

See also

For more information on NHS Digital see

March 2018
