Experts from the National Cyber Security Centre have been supporting the development of the NHS COVID-19 contact tracing app, which will be launched on the Isle of Wight this week.
The app forms part of the government’s wider test and trace programme to tackle the coronavirus pandemic and pave the way to safely reducing current social distancing measures.
The privacy and security of app users’ data is a priority and the NCSC has been advising on best practice throughout the app’s development.
The NCSC has today published three documents relating to its work on the app:
- A technical paper, ‘High level privacy and security design for NHSX Covid-19 Contact Tracing App,’ written by NCSC Technical Director Dr Ian Levy.
- A blog post, ‘The security behind the NHS contact tracing app,’ also written by Dr Levy.
- An explainer setting out how the app will help slow the spread of coronavirus whilst protecting your privacy.
Head to GOV.UK for the latest government advice on the coronavirus.
There have been a few reports, the first one I saw being in the FT earlier this week, saying that the UK is now investigating using the Google/Apple APIs for its tracing app. Here’s an article from yesterday that isn’t behind a paywall: https://techcrunch.com/2020/05/07/uk-eyeing-switch-to-apple-google-api-for-coronavirus-contacts-tracing-report/ which says…
“Yesterday the FT reported that NHSX, the digital transformation branch of UK’s National Health Service, has awarded a £3.8M contract to the London office of Zuhlke Engineering, a Switzerland-based IT development firm which was involved in developing the initial version of the NHS COVID-19 app.
The contract includes a requirement to “investigate the complexity, performance and feasibility of implementing native Apple and Google contact tracing APIs within the existing proximity mobile application and platform”, per the newspaper’s report.”
I’m a computer scientist, not an epidemiologist, so I have no way of knowing whether the data forgone (i.e. not able to be collected) by going the Google/Apple route is sufficiently useful for modelling and/or pandemic control to be worth compromising privacy and having a more complicated app development cycle. One thing that does leap out at me, though, is that, despite the UK’s vast expertise and well-developed software engineering sector, HMG isn’t using a UK company to build this app, presumably because it felt no UK company was the optimal choice.
I suppose the counter-argument is that this is a massive global crisis with the potential for a once-in-a-lifetime impact on UK citizens and our economy, so you use the absolute best tool (software development company) for the job regardless of where it comes from. Even in that case, though, I find it sad that it might have been felt that selecting the best UK organisation out there would have compromised the work compared with choosing an overseas operation.
Thanks, Julian.
“Zuhlke Engineering, a Switzerland-based IT development firm which was involved in developing the initial version of the NHS COVID-19 app”… “The contract includes a requirement to ‘investigate the complexity, performance and feasibility of…’” etc.
I’m no computer scientist so it’s all a bit complex for me but isn’t this a bit poacher turned gamekeeper?
Also… a bit concerned, as you are, that HMG haven’t contracted a UK firm to do the work.
Also… as this is basically a national security issue, shouldn’t GCHQ do the job? They would be more than competent and would be totally neutral. And it wouldn’t cost as much.
I don’t understand how these apps will work. How does the app’s “positive” for Covid-19 get activated? Is that done at a doctor’s office, government testing facility, etc., with a code or something?
Some of the other contact notification systems seem to have the owner of the phone activate a positive Covid-19 response themselves. If that is so, then what stops pranksters, idiots, people with mental problems, etc., from turning on their phone’s Covid-19 positive notification and going around people and crowds to mess with them?
The app’s “positive for Covid-19” trigger gets activated by the user of the app, who declares him- or herself positive. For apps based on the Apple/Google API (which the currently-in-testing NHS app isn’t), there is the option not to allow that unless the user enters a valid one-time unique passcode. The idea is that a valid passcode is only issued by test centres when they report a positive test result to someone who has been tested, which stops idiots hitting the “I’m positive” button after one too many beers or whatever. I’ve heard rumours that the current version of the NHS app being trialled on the Isle of Wight doesn’t have that safeguard, but that may not be correct or, even if it is, some similar protection might be introduced if/when it rolls out nationally.
Even with test centres issuing the authorisation codes needed to declare a positive, there is of course still the possibility that someone positive might think it funny to give their passcode to someone else so that they can declare a false positive and mess with the system (though only one person could register the false positive, as it is a single-use passcode). But I suspect the likelihood and consequences of that are far, far less than multiple drunken idiots pressing the “I’m positive” button on an unprotected system.
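The single-use passcode idea described above can be sketched in a few lines. This is a toy illustration, not the real NHS/NHSX implementation — the class and method names are my own invention — but it shows the key property: a code issued by a test centre can be redeemed exactly once, and the issuer only ever stores hashes of codes, not the codes themselves.

```python
import hashlib
import secrets


class PasscodeIssuer:
    """Toy sketch of a single-use passcode scheme (hypothetical design,
    not the actual NHS app's mechanism)."""

    def __init__(self):
        self._unused = set()  # store only hashes of issued, unredeemed codes

    def issue(self):
        """Called by a test centre when reporting a positive result."""
        code = secrets.token_hex(4)  # 8 hex chars: short enough to type in
        self._unused.add(hashlib.sha256(code.encode()).hexdigest())
        return code

    def redeem(self, code):
        """Called when a user enters the code into the app; succeeds once."""
        digest = hashlib.sha256(code.encode()).hexdigest()
        if digest in self._unused:
            self._unused.discard(digest)  # burn on redemption: single use
            return True
        return False
```

So a shared or leaked passcode buys at most one false positive: the second person to enter it (including its rightful owner) finds it already burned.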
So with these passcodes the government would be aware of exactly who is infected and who isn’t, and could store that in a database accessible by other agencies, etc. Well, there goes one’s privacy.
Complex programs like this can take a very long time to develop and even given proper testing time can still have many security holes. Rushing something like these apps out is a recipe for disaster. If they ever make these things mandatory I will put away my iPhone and go back to using my old BB or even older flip phone.
Well no, not “there goes one’s privacy” because of the app. If you go to a testing centre, get swabbed, and give a name and address for your results to be sent to, then obviously you are known to the government. People being tested in hospital or at the drive-in centres have had no privacy ever since the crisis started. Complaining about that seems to me no different from saying “I went to my GP yesterday to have a cholesterol check – there goes my privacy”. There was never, and has never been, any privacy around the positive tests in the first place. It is the ability (or rather lack of ability) to see who people using the app have come into contact with, and where they have been, that is the focus of the privacy debate.
As far as the app goes the passcode that the testing centre issues never leaves the user’s phone so can’t be associated with the set of keys uploaded once the passcode is entered by the user into their instance of the app. All that the government servers see on each “I’m positive” event is a set of random keys that the user’s phone has been sending out (sort of, it’s actually more complicated and secure than that – https://blog.google/documents/69/Exposure_Notification_-_Cryptography_Specification_v1.2.1.pdf).
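The “random keys the phone has been sending out” work roughly as follows: the phone holds a random daily key and derives from it a fresh short identifier to broadcast every 10 minutes. The sketch below shows the shape of that derivation. Note the loud caveat: the real scheme in the linked specification uses HKDF to derive an AES key and encrypts the interval number; HMAC-SHA256 here is a deliberate stdlib-only stand-in, so this is illustrative, not interoperable.

```python
import hashlib
import hmac
import os

INTERVAL_SECONDS = 600  # identifiers roll over every 10 minutes


def new_daily_key():
    """A fresh random 16-byte daily key (a 'Temporary Exposure Key')."""
    return os.urandom(16)


def rolling_identifier(daily_key, interval_number):
    """Derive the short identifier broadcast over Bluetooth for one
    10-minute interval. Simplified: the real spec derives an AES key
    via HKDF and encrypts the interval number; HMAC is a stand-in."""
    msg = b"EN-RPI" + interval_number.to_bytes(4, "little")
    return hmac.new(daily_key, msg, hashlib.sha256).digest()[:16]
```

The point of the construction is that the broadcast identifiers look random to bystanders, but anyone later given the daily key can recompute that day’s identifiers — which is exactly what makes the matching step possible.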
All other phones running the app check in with the central government server at regular intervals to download the list of latest uploaded exposure keys from people who tested positive. Note that this database of keys is simply a collection of 16-byte random numbers with no associated ID or location data. From those keys, the apps on other users’ phones do a local check against the set of “I came into contact with” keys stored on each user’s phone, to see whether they came into contact with any phones owned by infected people over the last 14 days. Those “I came into contact with” keys are never uploaded to the government server in the Google/Apple scheme, whereas they are in the home-grown NHS scheme.
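That local check can be sketched as a simple set intersection: expand each downloaded daily key into the full day’s worth of identifiers, and see whether any of them match an identifier this phone actually heard. Again a hedged, self-contained illustration (HMAC-SHA256 standing in for the spec’s HKDF/AES derivation, function names my own):

```python
import hashlib
import hmac


def rolling_ids(daily_key, day_start_interval, intervals_per_day=144):
    """All 144 identifiers a phone would have broadcast under one daily
    key (one per 10-minute interval). Simplified derivation."""
    return {
        hmac.new(daily_key, b"EN-RPI" + i.to_bytes(4, "little"),
                 hashlib.sha256).digest()[:16]
        for i in range(day_start_interval,
                       day_start_interval + intervals_per_day)
    }


def check_exposure(published_keys, observed_ids):
    """published_keys: list of (daily_key, day_start_interval) pairs
    downloaded from the server. observed_ids: identifiers this phone
    heard from nearby phones. The intersection happens entirely locally
    -- observed_ids never leave the phone in the Google/Apple scheme."""
    for key, start in published_keys:
        if rolling_ids(key, start) & observed_ids:
            return True
    return False
```

This is why the server only ever learns the anonymous keys of people who declared positive, never the contact graph: the “who did I meet” half of the data stays on each handset.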
Apart from the small caveat at the end of the paragraph above all the above is for apps built on the Google/Apple API. I have no detailed knowledge of how the current home-grown NHS app works.