– Collection of information from patient records

The care.data programme was set up to extract patient information from different healthcare settings, such as GP practices and care homes, and then, combined with information from hospital records, enter it into a central set of databases at the national Health and Social Care Information Centre (now called NHS Digital). Over time, patient data from other settings, such as community health services and social care, were also to be included.

It was argued that care.data would help the NHS find more effective ways of preventing or managing illness; monitor the risk of disease spread; streamline inefficiencies; and drive economic growth. The data was to be updated each month and taken automatically from every patient’s record, unless they explicitly opted out.

This information would then have been available to staff working in health, social care and research settings but also – on payment – to private businesses, such as insurance and pharmaceutical companies.

There were three levels of data that were to be extracted:

i) anonymised data, where patients’ identifying details had been removed and the data was aggregated or combined with that of other patients.

ii) personally identifiable data, where there was a legal duty to provide this, e.g. in a public emergency.

iii) pseudonymised data – i.e. data stripped of all features that would identify a patient, such as name or NHS number, replaced by meaningless, fictitious identifiers that still allowed data about the same patient to be linked (e.g. when he or she used different services). That said, identifying information such as a patient’s NHS number, postcode, date of birth, ethnicity, gender and GP surgery could be made available to “approved analysts for approved purposes”, according to NHS England. It’s this level of data collection that caused the most concern. Data collected from patients’ records was not only to be shared with NHS and social care staff, researchers and those commissioning (i.e. planning and buying) NHS services: it was also to be made available for sale to private companies.
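Pseudonymisation as described above can be sketched in a few lines of Python. This is an illustrative assumption, not how the HSCIC actually implemented it: here a keyed hash (HMAC) replaces the NHS number with a meaningless but consistent token, so records about the same patient can still be linked without revealing who they are. The key, record fields and NHS numbers below are all invented for illustration.

```python
import hmac
import hashlib

# Secret key held only by the data processor; without it, the original
# NHS number cannot be recovered from the pseudonym.
SECRET_KEY = b"example-key-do-not-use-in-production"

def pseudonymise(nhs_number: str) -> str:
    """Replace an identifier with a consistent, meaningless token."""
    return hmac.new(SECRET_KEY, nhs_number.encode(), hashlib.sha256).hexdigest()[:16]

# Two hypothetical records for the same (fictitious) patient.
record_a = {"nhs_number": "9434765919", "diagnosis": "asthma"}
record_b = {"nhs_number": "9434765919", "diagnosis": "eczema"}

# Strip the identifier and attach the pseudonym instead.
for record in (record_a, record_b):
    record["pseudonym"] = pseudonymise(record.pop("nhs_number"))

# The same patient's records still link, but the NHS number is gone.
assert record_a["pseudonym"] == record_b["pseudonym"]
```

The concern raised in the article follows directly from this property: because the pseudonym is consistent, anyone holding enough linked records (or the key) may be able to re-identify the patient.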

Collected data might have included:

  • ethnicity
  • the date registered with a GP surgery
  • medical diagnoses (including cancer and mental health) and any complications
  • referrals to specialists
  • prescriptions
  • family history
  • vaccinations and screening tests
  • blood test results
  • body mass index (height/weight)
  •  smoking/alcohol habits

This information would be coded. care.data was due to start in spring 2014. However, the scheme did not meet with widespread approval. The British Medical Association, for example, argued that care.data should not continue at that point because:

i) it lacked confidentiality and there was a possibility that individual patients could be identified from their data;
ii) it carried the risk of making patients reluctant to confide in their GPs;
iii) who might use the data in future was not well defined;
iv) it needed to be an opt-in system rather than an opt-out one;
v) there was a risk that the data might not only be used for its stated purpose (improving patient care) but could be sold for profit.

On top of this, it became known that patients’ data from hospital admissions (including their diagnoses, treatments, ages and the area where they lived) had been given (or sold) to the body that regulates actuaries (those who evaluate financial risks). This data was subsequently used by insurance companies to price their products.

Because of a massive loss of confidence, with one in every 45 patients choosing to opt out of the scheme, the Department of Health announced in July 2016 that it was to close the care.data programme.

In July 2016, the National Data Guardian for Health and Care (Dame Fiona Caldicott) was asked to

  • develop new standards for data security for the whole health and social care system,
  • together with the Care Quality Commission, develop a method of testing compliance with these new standards, and
  • propose a new consent/opt-out model for data sharing to enable patients to make an informed decision about how their personal confidential data will be used.

Her recommendations included:

  1.  new data security standards for every organisation handling health and social care information to address the main reasons for past breaches to security of paper-based and digital data. These standards will be simple for people to understand. They should apply across the entire health and social care system to support rather than inhibit data sharing. They will be fit for the future, where personal confidential data will be stored digitally, and health and social care will be integrated.
  2. ensuring all staff working in health and social care are properly trained in complying with data security standards, with extra training for people in leadership roles.
  3. tougher penalties for intentional breaches of security.

Meanwhile, patients’ data from hospital admissions continues to be sold.

What is NHS Digital (previously the Health and Social Care Information Centre)?

The Health and Social Care Information Centre was set up as a public body in April 2013. Its responsibilities included:

  • Collecting, analysing and presenting national health and social care data;
  • Setting up and managing national IT systems for transferring, collecting and analysing information, as directed by the Secretary of State or NHS England;
  • Publishing rules to set out how the personal confidential information of patients should be handled and managed by health and care staff and organisations;
  • Creating a register of all the information they collect and produce in ways that ensure it will be useful to as many people as possible, while safeguarding the personal confidential data of individuals.

Is access to my confidential data legal? 

The Health and Social Care Act (2012) provides the legal basis for the extraction of personal confidential information in some circumstances – for example, where the public interest justifies disclosure. According to the Act, the common law duty of confidence can be set aside, for instance, in order to maintain effective immigration controls or to protect limited resources and public services from unnecessary financial and resource pressures.  [1]

In January 2017, a Memorandum of Understanding was signed which requires NHS Digital to share confidential patient information with the Home Office for immigration enforcement. Even before this, over 2016, the Home Office made 8,127 requests to trawl the personal data of NHS patients for information, leading to 5,854 people being traced. There has been no public consultation or parliamentary debate on this practice. According to Doctors of the World, this use of patients’ records marks the intrusion of a political agenda into the way our medical records are kept. It uses patient information in a way that, were a doctor to do it, would put them in breach of their ethical and legal duty of confidentiality.

What was the HSCIC/NHS Digital going to do with patients’ information and who would have had access to it?

How data would have been treated depended on which category it came under:

Green flow: This covers “anonymous” or aggregated data.[3] Despite being labelled anonymous, there is no guarantee that the original patient to whom the data refers cannot be identified. In addition, green flow data is not counted as ‘personal’. It therefore falls outside the Data Protection Act and can be freely given or sold on, without controls.
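The aggregation behind “green flow” data, and one standard safeguard against the re-identification risk noted above, can be illustrated with a short Python sketch. The records, area codes and threshold are invented for illustration; the point is that a published count of 1 in a small area can point to a single person, which is why disclosure-control practice suppresses small counts.

```python
from collections import Counter

# Hypothetical patient-level records (not real data).
records = [
    {"area": "LS1", "diagnosis": "asthma"},
    {"area": "LS1", "diagnosis": "asthma"},
    {"area": "LS1", "diagnosis": "asthma"},
    {"area": "LS2", "diagnosis": "asthma"},  # a count of 1 risks re-identification
]

# Aggregate: count cases per area, dropping the individual rows.
counts = Counter(r["area"] for r in records if r["diagnosis"] == "asthma")

# Small-count suppression: counts below a threshold are withheld, because
# a count of 1 in a small area could identify one person.
THRESHOLD = 3
published = {area: n for area, n in counts.items() if n >= THRESHOLD}

print(published)  # only LS1 survives; LS2's single case is suppressed
```

Even with suppression, combining several aggregated releases can still narrow down individuals, which is why critics argued that “anonymous” was an overstatement.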

Amber flow: ‘Pseudonymised’ data[4] will be available to specific, approved groups of users, initially just for commissioning uses and in line with relevant guidance.

Red flow: This is identifiable data – it is only to be made available where there is a legal basis for doing so (e.g. with a patient’s consent or because of overriding public interest in disclosure, such as the outbreak of a new disease).  (see NHS England Privacy Impact Assessment.)

(There was some confusion about what happened when patients objected to their personal confidential data being uploaded to the HSCIC or beyond: for example, NHS England’s Privacy Impact Assessment stated that in these circumstances, the HSCIC would receive clinical data without any identifiers attached.)

Once the programme was up and running, a range of organisations would have been able to apply to gain access. These included commercial companies including drug companies, health charities, researchers at universities, hospital trusts, medical colleges, think tanks, IT specialists, and insurance companies. The MP David Davis expressed concern that, in addition, police would have been able to access data from the HSCIC ‘by the back door’. [5]

Amber flow data was to be made available to NHS England’s Area Teams, to Clinical Commissioning Groups and Commissioning Support Units.  As commissioning covers a wide range of activities including monitoring, service planning, accounting and so on, and is increasingly carried out by private companies (see Commissioning Support Units), this meant that a wide range of people could have had access at this level.[6]

Applications for access to sensitive data were considered by an independent body, the Data Access Advisory Group. Private firms, among others, would have been able to pay a fee to apply for access to sensitive or identifiable information. The assurance that ‘red flow’ data would only be available where there was a legal basis for this was contradicted by the scale of charges that the HSCIC published for buying data (2013/14). This showed different categories of information available, including a ‘bespoke abstract containing personal confidential data’, containing patient identifiable data, sensitive data items or both.[7] For example, BUPA, a private healthcare organisation, was one of the companies already cleared to access this ‘sensitive’ level of information.[8]

NHS England claimed that care.data would be invaluable for the functioning of the NHS and for improving the quality of patient care through:

  • Giving commissioners of services accurate information for planning;
  • Monitoring services and improving the quality of health care provision;
  • Allowing insight into patient outcomes, patient experience and the efficiency of a service;
  • Comparing the quality of care provided by different NHS providers;
  • Providing the public with information on which to base health care decisions;
  • Offering people inside and outside the NHS information for medical research, clinical audit and public health planning.

NHS England also suggested that care.data would support economic growth by reinforcing the UK “as a global centre for life sciences and health services research”, and by supporting the development of “a vibrant market place” by making comparative data available to app developers and website designers (see NHS England’s Privacy Impact Assessment). This chimes with NHS England’s five-year plan for the NHS and its aim to reduce demand on services by using new technology (such as phone apps) to encourage patients to take more responsibility for managing their own health care.

Rather differently, some researchers argued that if many patients opted out of care.data, this would undermine the possibilities for monitoring inequalities of access to health care or changes in the population’s health, such as rates of heart disease or cancer.

It was also argued that if many patients opted out, this would hide the impact of government policies to privatise the NHS. This is because the private sector currently has a really poor track record for data collection, even when carrying out work funded by the NHS. In addition, general practices owned by private companies such as Virgin and Serco would be protected from scrutiny if their patients opted out, as there would be no data about them. This gap already exists in data about private nursing and residential care homes.[9]

However, NHS England acknowledged that care.data did pose some risk to privacy and confidentiality, including threats from hackers attempting to access the data illegally.[10] The data may be at risk in different ways at different times: during the uploading of confidential data to the HSCIC; during processing of confidential data within the HSCIC; or during the onward disclosure of data to other organisations.

There is also the possibility that knowing that personal information will be taken from medical records could lead to patients losing trust in the confidential nature of the health service. Patients might feel the need to withhold vital information from the clinicians treating them, at the risk of receiving less than optimal care.

In addition, when patients visit their GP, certain computer codes (e.g. for a particular health problem, test or referral) are automatically added to their medical records. This coding can often be inaccurate, so there is a risk that incorrect data would have been transferred to care.data. This has particular relevance given the access that health insurance companies may have to such data in future.

One of the main ways in which care.data was to be used was in NHS commissioning (the planning and procurement of NHS services, now the responsibility of clinical commissioning groups or CCGs). CCGs have been delegating much of the work of commissioning to NHS Commissioning Support Units (CSUs), but recently a new ‘Commissioning Support Lead Provider Framework’ has been announced. This Framework includes some of the original CSUs but also a wide range of private companies that are becoming increasingly involved in the planning and buying of services. The list of these companies is dominated by management consultancy firms and corporations like the US health insurer UnitedHealth. Given that there are strong indications of plans to turn the NHS into an insurance-based system, many people are concerned that insurance companies could gain access to data made available by schemes like care.data through involvement in commissioning, while at the same time they are in the business, as insurance providers, of identifying individuals who offer the lowest health insurance risk.

Clearly, there are good reasons to have the best information possible to inform research into public health, new treatments and patient outcomes, as well as the planning of health and social care. At the same time, as we move into an era in which the number of people accessing and using medical data will rapidly expand, care.data might be viewed with suspicion for a number of reasons, not least that the HSCIC was established under the Health and Social Care Act (2012), which has extended the privatisation of the NHS. From what we know, care.data could have been a way of privatising patient data collection and analysis, and an integral step towards insurance-based care.

Now, the National Data Guardian Review is proposing a new, less complex system for ensuring that patients can opt out of their personal confidential data being used for purposes beyond their direct care unless there is a mandatory legal requirement or an overriding public interest. The Review recommends that the new system is put out for public consultation.

Additional information

care.data was to cost over £50m, and there were no indications that the HSCIC would carry out routine, in-house analysis of the data it collected. Meanwhile, cuts to the budget of the Office for National Statistics mean that researchers are losing access to a range of existing health statistics. In addition, the future of the decennial census (which provides valuable data on the population as a whole and is essential to good public health research) is uncertain.[11] Rightly or wrongly, these cuts and the potential loss of existing, valued sources of data raise suspicions that the intentions behind care.data were not what we were told (or not only what we were told).

care.data was an entirely different scheme from the Summary Care Record (SCR). The SCR was originally intended to provide access to vital information (for example, about allergies, recent prescriptions, or bad reactions to drugs) to those providing direct care (e.g. clinicians in A & E departments or out-of-hours care). The SCR also provides details of a patient’s name, address, date of birth and NHS number.

There is growing concern that access to patients’ SCR will be widened, for example to pharmacies, including Boots or large supermarkets, and may be used for marketing purposes. For example, Pharmacy 2U, an online pharmacist advertising discreet and confidential services, has been fined £130,000 for causing substantial damage and distress by selling on information provided by more than 20,000 patients when they registered to use Pharmacy 2U services. The information included name, date of birth, sex, and postal and email addresses, together with a list of health conditions patients might suffer from. Pharmacy 2U database lists were advertised for rental at a cost of £130 per 1,000 records. (20% of Pharmacy 2U is owned by EMIS, the company that provides computer systems to many GPs.) One company that bought records was operating a lottery from Australia while under investigation for fraud and money laundering. This company obtained the records of 3,000 men aged 70 or above – perhaps assuming a significant number of these to be frail or vulnerable – who were then targeted as “specially selected” to win “millions of dollars”.

If you have a Summary Care Record (around 94% of the population do) and you are concerned that your record may be misused or abused, you can opt out of the scheme. The official opt-out form needs to be filled in and given to your GP. However, if you have any allergies or bad reactions to medicines, it might be wise to get a MedicAlert bracelet or similar to provide information should you need emergency care.

care.data and increasing access to the SCR can be seen as part of a broader trend to make use of new technologies to increase access to patient data. In many instances this is to enable clinical staff working in GP surgeries, ambulance services and A & E services to access a patient’s records (such as test results), wherever they are. There is also a push for patients to have increased access to their own records “in order to manage their healthcare needs” (for more details, see the Five Year Plan). The Secretary of State for Health has acknowledged that, in order to gain the trust of patients, there must be measures to ensure the security of their confidential data.

In July 2016 the CQC Report proposed 10 security standards to be applied in every health and care organisation that handles personal confidential information. These include measures which will protect systems against data breaches, “ensuring that NHS leadership takes ownership and responsibility for data security and ensuring that organisations are as prepared as they can be to meet the challenges of the digital age”. It is not yet known how the DoH will respond.

[1] NHS England. Privacy Impact Assessment. Accessed 20.2.14.
[3] Data aggregation is any process in which information is gathered and then expressed in a summary form, e.g. for statistical analysis.
[4] Pseudonymised data involves providing an individual with a pseudonym that will be attached to all their data, but not connected to their original identifying data.
[11] Pollock A. Why the public should opt into care.data and out of data privatisation. 31.1.14.
[12] Molloy C. Privacy campaigners team up with leading public health figure to fix Hunt’s ‘complete nonsense’ on care.data.
Updated July 2016
