About The Blog

Debate at the intersection of business, technology and culture in the world of digital identity, both commercial and government, a blog born from the Digital Identity Forum in London and sponsored by Consult Hyperion





  • Creative Commons

    This work is licensed under a Creative Commons Attribution - Noncommercial - Share Alike 2.0 UK: England & Wales License.

    Please note that by replying in this Forum you agree to license your comments in the same way. Your comments may be edited and used but will always be attributed.


Out of control, up to a point

By Dave Birch posted Nov 17 2009 at 12:04 PM

[Dave Birch] I re-read an excellent post over at Emergent Chaos. It reflected an important discussion between two people, both of whom I take very seriously. To paraphrase and simplify horribly, Bob thinks that social structures maintain privacy, while Adam thinks that technological structures do.

In a world where some people say "I've got nothing to hide" and others pay for post office boxes, I don't know how we can settle on a single societal norm. And in a world in which cheesy-looking web sites get more personal data (no really, listen to Alessandro Acquisti, or read the summary of "Online Data Present a Privacy Minefield" on All Things Considered...), I'm not sure the social frame will save us.

[From Emergent Chaos: Bob Blakley Gets Future Shock Dead Wrong]

The lack of a "norm" is a good point here, and I have to say it made me think. We should be developing tools that allow people to construct their norms (within boundaries, obviously) but not setting out a norm so that the tools can only implement one model. For this reason, amongst others, I tend to come down on the more technological side of this argument, which is why I'm so keen to see privacy as part of customer propositions and privacy-enhancing technologies as part of the systems being built in both public and private sectors.
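To make the point concrete: a tool that lets people construct their own norms, within boundaries, is essentially a per-user disclosure policy that the system enforces rather than leaving to convention. Here is a minimal Python sketch of that idea; the policy fields, the "regulator" floor and all names are my own invention for illustration, not drawn from any real system.

```python
from dataclasses import dataclass

@dataclass
class PrivacyPolicy:
    # Each user chooses their own norms; there is no single societal default.
    share_with_marketing: bool = False
    share_with_clinicians: bool = True
    allow_anonymised_research: bool = False

# The "boundaries": purposes the system decides regardless of user preference.
SYSTEM_FLOOR = {"share_with_regulator": True}

def may_disclose(policy: PrivacyPolicy, purpose: str) -> bool:
    """Enforce whatever norm this user chose; deny anything not explicitly allowed."""
    if purpose in SYSTEM_FLOOR:
        return SYSTEM_FLOOR[purpose]
    return getattr(policy, purpose, False)

# Two users, two different norms, one enforcement mechanism.
alice = PrivacyPolicy(allow_anonymised_research=True)
bob = PrivacyPolicy(share_with_marketing=True)

print(may_disclose(alice, "share_with_marketing"))    # False: Alice's choice
print(may_disclose(bob, "share_with_marketing"))      # True: Bob's choice
print(may_disclose(alice, "share_with_regulator"))    # True: system boundary
```

The design point is that the mechanism is norm-neutral: Alice's "nothing to share" and Bob's "nothing to hide" run through the same code.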

So why was I thinking about the informative e-discussion between Adam and Bob? Well, it's because some evidence to support my point of view (!) appeared in the British press today. The story concerns the (entirely predictable) leaking of confidential patient data from the National Health Service (NHS) multi-billion-pound IT system.

In the security breach, an employee [...] was not authorised to access individual patient records. After the person left, however, NHS Hull discovered that the person "inappropriately accessed identifiable medical records". The trust says: "A total of 358 patients [registered at] GP practices have been affected by this."

[From Police probe breach of NHS smartcard security as e-records launched in London | 16 Nov 2009 | ComputerWeekly.com]

The response has, of course, been immediate and proportionate. The person responsible has been prosecuted, the person in charge of security has been fired, and the relevant Director has been demoted. Only joking. None of this has happened, but the police are "probing".

Kath Tanfield, a director at NHS Hull who is in charge of IT, says: "It is shocking to us that an individual who takes on a public service role and who agrees to abide by strict confidentiality agreements should go on to abuse their position and violate patients' rights to privacy".

[From Police probe breach of NHS smartcard security as e-records launched in London | 16 Nov 2009 | ComputerWeekly.com]

Indeed. But it's shocking to me that a £20 billion IT scheme, led by the best brains in British management consultancy (the project was famously run by Richard Granger, an ex-Deloitte partner who had failed his school computer studies exam, much to the amusement of our gutter press), should be so badly put together that an employee can access records they are not authorised to see. Preventing exactly that strikes me as one of the minimum requirements to place on the security of any system. In fact, I'd go further: any system without this bare minimum functionality cannot be called "secure" in any sense, and anyone who calls it "secure" should be prosecuted for conspiring to defraud the public purse. So the Director in charge of IT can put her faith in confidentiality agreements, social norms and best behaviour, but I'd prefer to trust the laws of mathematics.
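The minimum requirement in question is nothing exotic: the system itself should decide whether a user may open a record, deny by default, and log every attempt. A toy sketch in Python makes the shape of it clear; the role names, user IDs and patient IDs here are invented for illustration and have nothing to do with the actual NHS system.

```python
# Deny-by-default record access: the user's authorised role, not the mere
# possession of a smartcard, determines whether a patient record opens.
AUTHORISED_ROLES = {"treating_clinician", "records_officer"}

audit_log = []  # every attempt is recorded, whether allowed or refused

def open_record(user_role: str, user_id: str, patient_id: str) -> bool:
    """Allow access only for listed roles; anything unrecognised is refused."""
    allowed = user_role in AUTHORISED_ROLES
    audit_log.append((user_id, patient_id, user_role, allowed))
    return allowed

print(open_record("treating_clinician", "u17", "p001"))  # True: authorised role
print(open_record("admin_clerk", "u99", "p001"))         # False: refused and logged
```

With a check like this in the path of every record lookup, the Hull breach becomes a refused, logged attempt rather than a press story, which is precisely the sense in which technology, not a confidentiality agreement, does the work.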

Incidentally, as far as the NHS goes, my prediction is holding true.

the trajectory appears to be "don't worry, it's secure because all the users have a personal smart card" to "there's no risk in people sharing smart cards" to (someday soon, I'm sure) Scott McNealy's famous "you have no privacy, get over it".

[From Digital Identity Forum: The public and confidence]

This just isn't good enough. Technology can deliver more, and we should be demanding it.

These opinions are my own (I think) and are presented solely in my capacity as an interested member of the general public.

