Privacy and Security during Lunch Hour

June 27th, 2011


Boris Jamet-Fournier

 

At the Computers, Freedom, and Privacy forum (CFP), the lunch break is no small affair: there is food for everyone, stomachs and brains alike, and it is first-rate. So it is at mealtime, or even naptime, that the session "Security and privacy: how far should we go?" catches the attendees. Despite this unfavorable slot, the debate was, from my point of view, one of the most successful of the whole conference, for at least three reasons.

First, the topics on the menu are appetizing. Accessible to beginners and experts alike, the discussion sets out to explore the cases in which, as so often happens, security and privacy collide. The format, moreover, is stimulating and leaves no room for idle chatter: six topics to cover in a few minutes each, quick question-and-answer exchanges between the two speakers, and immoderately efficient moderation, so as to make the most of every one of this debate's sixty minutes. Finally, the participants are seasoned and fascinating; even if that is no rarity at CFP, the quality of the speakers deserves mention. Daniel Solove, author of the recent "Nothing to Hide: The False Tradeoff Between Privacy and Security," teaches law at George Washington University. Peter Swire, for his part, has worked closely with the Obama administration[1] on subjects as varied as web communication and economic policy; in the right circles, it is said, he is nicknamed "the Dean of privacy."

 

So here we are, ready to tackle six major themes that run through many of the sessions offered at CFP this year. Conceived as a toolbox of ideas, this session gives us keys to a better understanding of every debate about security and privacy.

 

The first myth to slay (and this is the two guests' avowed goal) is the legend that we must choose between security and respect for privacy (Americans call it the "all-or-nothing fallacy"). Bruce Schneier, one of the high priests of computer security, whose talk thrilled the CFP audience shortly before the forum closed, is known for denouncing the position of a US administration which, since the tragedy of 9/11, seems to believe that respecting citizens' privacy is a luxury we cannot afford if we want to eliminate the enemies of freedom.

It is easy to see why the two speakers, both American, insist on the absurdity of this myth; in their country it has very practical consequences. The Patriot Act, adopted 50 days after September 11, 2001, curtailed many individual liberties[2], officially to protect the United States against the terrorist threat. Yet our two guests agree: privacy and security cannot be set against each other; a society that celebrates everyone's right to control their own identity is no less safe for it.

For Daniel Solove, the real choice before us is subtler: do we want a security-obsessed police state, with no respect whatsoever for individuals' privacy, or a secure state in which checks and balances ensure that the imperative of safety does not sweep everything away? Solove and Swire obviously prefer the latter. Both reject the myth of the security-versus-privacy tradeoff, whatever the most fervent supporters of the War on Terror may think. Peter Swire goes even further, arguing that the primacy of security actually carries its own contradictions. American domestic security agencies such as the FBI or the CIA want access to every telephone conversation on the North American continent; but pushed a little further, doesn't that idea also mean that those organizations' own communications should be accessible to the public, Swire asks? He sees in this a way of confronting the police paradigm with its own contradictions: "the best enemy of all-out security," he says, "is even more security."

 

On encryption and data security in a globalized world ("encryption and globalization"), Solove and Swire agree once again. The arrival of emerging powers, usually relegated to supporting roles, is destabilizing the Internet of the pioneer countries, they tell us. The question of encrypting and securing data (as it travels from one point to another of a computer network, and of the Internet in particular), which preoccupied us in the mid-1990s, has been settled for more than a decade in the most advanced countries. But of course the Internet has this fabulous and fearsome property that it knows no borders; can the growing role of India, China and Russia on the global network compromise our data, given that these countries apply far weaker encryption[3] standards than those we use today? The problem seems to worry both experts, who point out that India caps data encryption at insufficient levels while China relies on encryption algorithms of dubious reliability.
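To make the stakes of capped key lengths concrete, here is a back-of-the-envelope sketch (my illustration, not the panelists'; the attacker's testing rate is a hypothetical assumption):

```python
# Illustration only: why capping key lengths weakens encryption.
# The brute-force search space doubles with every key bit, so a cap
# at a short key makes exhaustive search feasible.

def brute_force_keyspace(bits: int) -> int:
    """Number of keys an attacker must try in the worst case."""
    return 2 ** bits

# Hypothetical attacker testing one billion keys per second.
RATE = 10 ** 9
SECONDS_PER_YEAR = 3600 * 24 * 365

def worst_case_years(bits: int) -> float:
    """Worst-case exhaustive-search time, in years, at the assumed rate."""
    return brute_force_keyspace(bits) / RATE / SECONDS_PER_YEAR

# A 40-bit key (a level once typical of export-restricted products)
# falls in well under an hour; a modern 128-bit key is out of reach.
print(f"40-bit:  {worst_case_years(40):.6f} years")
print(f"128-bit: {worst_case_years(128):.3e} years")
```

The exponential gap, not any subtlety of the algorithms, is what makes a legal ceiling on key length so consequential.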

But the question of access to data also arises within our own borders, as we saw above. Solove and Swire have one example in mind: in the United States, the authorities have long used wiretapping for law-enforcement purposes, and they dream of a world in which IP-based communications, such as Skype conversations, would be as easy to tap as landlines are today. Beyond the technical complications this would create for service providers[4], such a change would also require amending the law governing cooperation between the state and communications companies[5].

 

The third theme still does not set our two speakers against each other, even if it brings out differences of approach on a crucial subject: what to make of the argument that someone with "nothing to hide" should not worry about protecting their privacy? Comic examples and outrageous quotes abound here; suffice it to recall that Google's former CEO (and current chairman) Eric Schmidt stated in December 2009 his conviction that "if you have something that you don't want anyone to know, maybe you shouldn't be doing it in the first place." The remark, which might sound logical, earned the captain of industry attacks and mockery from scores of observers. Bruce Schneier, mentioned above, declared at the time: "Protecting my privacy shields me from abuses by those in power, even if I have nothing to hide. Too often, people want to pit security against privacy. But in truth the choice before us lies between a free society and a surveillance society. A regime of fear, whether it results from outside attacks or from the constant pressure of a police state, remains a regime of fear, remains a tyranny. True liberty is a state that is secure, but without cameras on every street corner; it is security plus the preservation of the private sphere. […] That is why we must do everything to protect our private sphere even if we have nothing to hide." Note, in passing, that this quote also answers, point for point, the questions raised above about the "all-or-nothing fallacy."

On the "nothing to hide" theme, then, there is no ambiguity for the two guests: the argument must go, and the sooner the better. To back up his view, Solove addresses an essential point: the very definition of "privacy." Contrary to common belief, protecting one's privacy is not only a matter of hiding, destroying or withholding information. That is, of course, one dimension of the concept. But it is not the only one.

Caring about which people or organizations can access your personal information held by a third party (for instance, wanting to know which companies may dip into the details supplied on your MySpace page) is a matter of privacy.

Controlling your image (for instance, refusing to have your portrait plastered on every billboard in the village, even if the photograph in question is public) is a matter of privacy[6].

Not having to give a reason when you buy this or that book about terrorism or pedophilia is a matter of privacy.

In short, the "nothing to hide" argument simply cannot cover everything the concept of privacy encompasses. Swire agrees, and reinforces Solove's thesis by approaching the question differently. When it comes to political engagement, it is often hard to claim one has "nothing to hide"; by its very nature, making one's political opinions public is a risk, he tells us. But Eric Schmidt does not seem to see that imposing the logic of absolute transparency on civic engagement may well kill democracy.

 

With the session's fourth topic, differences of opinion between the two guests start to show. The question is deliberately provocative: are social networks, which enable both the creation of citizen movements (the "Arab revolutions" are, of course, on everyone's mind) and the profiling of individuals for the benefit of private or governmental entities, a step forward for civil liberties? A clear-cut answer is of course impossible, at least today, since we have only a few years' hindsight on these complex phenomena. The discussion therefore quickly turns to political marketing on the Internet, where, for once, we face a genuine dilemma: how to reconcile everyone's right to privacy (here, the right to use the Internet without being constantly solicited by parties hunting for supporters and votes; yet another sidelight on the definition of privacy) with the free-speech imperative of political organizations, whose prerogatives are protected in the United States by the First Amendment? It is a debate in which each country's specificities matter enormously, and not only in legal terms.

In France, the holding of the Socialist and Green primaries recently aroused the suspicions of the authorities, and notably those of the Commission Nationale de l'Informatique et des Libertés (CNIL).

In the United States, telephone subscribers may register their number on a "do-not-call" list to escape the unwanted solicitations of ever more aggressive and ever better informed telemarketers. Could a similar model be envisaged for the Internet, notably through the "do-not-track" system? Daniel Solove is incensed at the communication strategies of today's political campaigns, and in particular at "robocalls," those recorded and all too often dishonest messages that multiply every summer before a US election. For him, legislation is needed to put an end to these intrusions, which, in form as in substance, harm democratic debate. How will these practices evolve, he asks, once they are combined with the power of the Internet and the billions of personal data points that can be gleaned there, if lawmakers do not step in? In the audience, and on Peter Swire's side, there seems to be a firmer attachment to the celebration of free speech, which is near-absolute for political enterprises in the United States.
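As a concrete aside (my sketch, not the panel's): the "do-not-track" preference is, in practice, just an HTTP header the browser sends, which sites may choose to honor. A minimal check might look like this, where the header-dictionary shape is a simplifying assumption:

```python
# Sketch of honoring the "do-not-track" signal, which browsers
# send as the HTTP header "DNT: 1". Whether a site actually
# respects it is entirely up to the site, which is the policy problem.

def should_track(headers: dict) -> bool:
    """Return False when the client has opted out of tracking."""
    return headers.get("DNT") != "1"

print(should_track({"DNT": "1"}))  # opted-out visitor: False
print(should_track({}))            # no preference expressed: True
```

The contrast with "do-not-call" is that the phone list is centrally enforced, whereas this header is purely advisory.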

 

It is the same concern for defining "privacy" that occupies the two speakers during their exchange on the fifth theme: the influence of a changing digital environment on the Fourth Amendment to the Constitution[7]. According to Daniel Solove, this amendment, adopted in 1791, is notoriously ill-suited to current realities. Since 1967 and the Supreme Court's ruling in Katz v. United States, Fourth Amendment protections apply only if the person searched can claim a "reasonable expectation of privacy", for instance by being at home or in a telephone booth. The whole problem lies in defining that "expectation"; given how data is handled in the digital era, and given users' ignorance of those very practices, the salutary protections the Fourth Amendment provides against a police-state drift in government action are largely called into question.

 

Finally, Swire and Solove debate "data minimization", a concept much in vogue among specialists but little known to the average Internet user. The idea is very simple: to prevent valuable information from being lost, stolen or sold when it should not be, the first measure to apply is to collect only the data strictly necessary for a specific project or transaction. Plainly put, there is no reason to give my children's names or the list of my diplomas to a third party offering to sell me vanilla ice cream or a plane ticket. Data minimization is one of the foundations of "privacy by design", which holds that data protection must be decided at the level of the architecture of information systems. Just as seismic risk is taken into account at every stage of a building's construction, this theory wants personal data to be protected "at the source" by systems whose very purpose is to compartmentalize information and to provide an environment conducive to respecting users.

Easy as these broad principles may seem to apply, the dizzying fall in storage costs, together with progress in data aggregation, which makes assembling all this information ever faster and more efficient, is preventing data minimization and privacy by design from taking hold, either in practice or even in the minds of the major players in commerce and computing, much to the dismay of our two guests. As Solove explains, data minimization remains far more an ideal, a state of mind, than a true norm: quantifiable, measurable, enforceable. For now, privacy advocates are left hungry.

 

 

* *

 

After this hour-long legal-and-digital feast, the audience, sated, can only applaud the two participants' convincing performances and rejoice that such a discussion could inspire the professionals attending CFP this year. It can also welcome the interest that a few media outlets and political figures, some of whom were present at the conference, seem to be taking in these crucial questions.

How can one fail to see, however, that the perspectives on privacy and security are many (legal, commercial, military, regulatory, political, civic, scientific...) and that the interests of the various players are often different, if not opposed? How can one fail to see, too, that the technicality and complexity of the debates are major obstacles to consensus, and therefore to action? And how can one fail to see, finally, that by the twenty-first edition of the Computers, Freedom, and Privacy forum, it is regrettable that questions of conceptual definition should still pose a problem?

 

This session was perhaps not the meal promised in the introduction; a nice, tasty appetizer would have been a happier metaphor. Let us hope that the debates opened by the two guests, the coming editions of CFP, and the joint efforts of a whole community of specialized academics and professionals will produce a main course hearty enough to give them strength to face the many present and future challenges in privacy and security. We have... a lot on our plate!


[1] With Larry Summers and Julius Genachowski in particular.

[2] Notably citizens' right to privacy.

[3] The Académie Française recommends the French term « chiffrement » in place of the anglicism « cryptage » used in the original.

[4] Skype, in this case, would have to rethink its model from the ground up to make this possible.

[5] A reform of the Communications Assistance for Law Enforcement Act in the United States, for example.

[6] French law, and the Cour de cassation in particular, holds that there is no image right, and therefore no privacy protection, when the photograph is public.

[7] This text protects Americans against unreasonable searches and seizures, requiring a warrant (and a serious justification) for any search.


 

 

From the Chair: Lillie Coney and the mission of CFP

June 16th, 2011

I spoke with Lillie Coney of EPIC, the Chair of this year's CFP at the Georgetown Law Center in Washington DC – the culmination of two years' hard work for her and her team. What was it all for?

“The issues raised by the Computers, Freedom and Privacy conference are becoming more and more central to our way of life, and to what will happen when technology is integrated with everything everywhere. Yes, life with technology is easier and safer, but the impact on freedom and privacy needs to be considered. That’s our mission.

“As human beings, we haven’t figured out how to cope with the digital age. We don’t have instinctive, well-rehearsed codes of conduct to guide us in our privacy behaviors online. If I’m walking past a house and the drapes aren’t pulled, I might avert my eyes so as not to stare into someone’s room. If I go onto the property and look in through the window, that’s trespass.

“But digitally, we don’t know to avert our eyes, and there’s no law of trespass in cyberspace. We are only just starting to learn the culture of the 24-hour digital world.

“It’s interesting that it’s the children who are leading the way. That’s why as well as taking account of gender, ethnicity, disability and orientation at this year’s conference, we’ve also included young people and children. Parents often struggle a lot of the time, learning from each other. But we made a whole track about youth and privacy at this year’s conference.

“This year we’ve experienced the civil rights movement in the Middle East, which in so many ways has had parallels with the civil rights and women’s movements in the 1960s in the US. We had more time, it’s true: in the Arab countries and North Africa, they’re dealing with a lot of tough questions all at the same time. We’re really glad to have guests from the Middle East and Africa at the conference this year.

“It’s very dangerous to do this, just like the civil rights movement in the US was a dangerous time. To decide to go out and tell the powers that be you’re going to change things, and then take on the literal re-shaping of a whole area of the world – that’s democracy, and it’s hard. At the same time, cyberspace is a new frontier and they’ve also got to put privacy protections in place. In cyberspace, how do you define friend and foe?

“It’s good to bring the group together. People here working hard in government agencies don’t get the chance to go to meet these pioneers. I travelled – to Belgium, to Africa – and a lot of people worked very hard to get these panels together. I’m glad we’ve done it. This year’s conference has had a very deliberate focus on freedom.

“CFP is the chance to bring a whole community together, every year for a few days. There isn’t anything like it. It derives from the ACM – in fact Graham Chapman, who first began it, passed away earlier this year – and engineers are always motivated by the hard problem. Two hundred years ago, engineers figuring out how to build the bridges of the new industrial age used to come together to test out their ideas on each other. They were trying to work out how to reconstruct things that were lost – the architecture of Rome, at that time. They formed guilds and worked it out.

“CFP aims to do the same. It’s not just a case of can you build this thing, but should you? If you’re thinking about doing it, what about the freedom and privacy implications? We want to get young people thinking about digital policy and technology. We want to nail policy as it relates to computing; but it’s also wider than that.

“We’re also looking at the decisions made today that will impact the future, and how to live in a digital world.”

 

Christina Zaba

Keynote address, Senator Patrick Leahy: “Now is the time to bring privacy into the digital age”

June 16th, 2011

Thursday 16th June, 1.30 pm

Chief Justice Earl Warren warned in 1963: “The fantastic advances in the field of electronic communication constitute a great danger to the privacy of the individual.”

In the past month, smartphones have been collecting and storing location information and I’m concerned about this. A recent survey found that 38% of smartphone users in the US cited privacy as their Number 1 concern.

In my state of Vermont, with its hundreds of acres of land, families have known me since I was in high school. If a strange car shows up on a Saturday, they report it. A farmer sitting out on his porch asks, are you a friend of his? In Vermont, privacy comes naturally.

But hundreds of people have been attacked, attacks going even into the Senate in the last week or so. So if we’re going to find a successful course into the future, we need to address this. There’s so much good electronic communications can do, but we cannot give up our privacy.

What can we do?

We need to modernise the legal framework for the digital age. I was working on the Electronic Communications Privacy Act Amendment 2011 last week: the first changes to the Act in 25 years. When the Act was first drawn up, no-one could have anticipated the change in technology to come. But many of the assumptions we made about new technology are no longer valid. Americans now use electronics as their primary form of communication – in the cloud, too.

So we need corresponding privacy arrangements. If I came into your house I’d need a warrant. So government should have to obtain a search warrant to gain access to electronic communication.

This amendment would require the government to obtain a search warrant before going into your communications. Whether people are coming from the right or the left, the Act needs updating. If it isn’t updated, we won’t move forward.

Since 2005, 533M electronic records in the US have been involved in security data breaches.

In Moscow I hear about State-sponsored hacking. We cannot have that here.

I’m working to establish a national data breach notification standard – a safeguard for our personal information.

I’ve introduced this bill four times. Each time the threat has been greater so I’m hoping this fourth will be the time.

It’s a bipartisan effort – next week we’ll be working with the administration. Republican or Democrat, it needs to pass – we need the best privacy available.

We are different as a nation. We do have freedoms, and we don’t want to give them up. President Kennedy once said: “Change is the law of life, and those who look only to the past or present are certain to miss the future.”

These discussions at Computers, Freedom and Privacy are extremely important. Now is the time to bring privacy into the digital age. If we don’t do it now, what is the next generation going to do?

 

Christina Zaba

 

 

Keynote address, Bruce Schneier: The Rhetoric of Cyberwar

June 16th, 2011

Thursday 16th June 2.00 pm

 

I recently took part in a debate on the motion: “In cyberwar, the threat has been greatly exaggerated”. Marc Rotenberg and I were on one side, and I thought this would be easy. It turns out we lost the debate – the debate about language. We failed to convince the audience that the threat of cyberwar is grossly exaggerated.

And we are losing. The hype is taking hold. “Cyber 9/11”, “cyberarmageddon”: this is what gross exaggeration looks like. Reasonable policymakers are talking about cyberwar and cyberthreats in these extreme terms.

In 2007 Estonia was the victim of a series of denial-of-service attacks. It came at a time of tension between Russia and Estonia, so Russia was blamed. In the event, a 22-year-old man of Russian origin, who objected to a Russian statue being taken down, was arrested. So maybe cyberwar is so easy, children can do it.

A similar denial of service attack in Georgia in 2008 precipitated an actual land invasion. We don’t know if that was state sponsored, or kids playing politics.

The problem we have is that there is no good definition of war in cyberspace.

Clearly we are not fighting an actual war in cyberspace right now. This is a rhetorical war – the war on drugs, the war on terror. And generally we kind of know the difference.

But here we are. It started with the war on terror and now it’s cyberwar, and the two are kind of blending. And that’s confusing.

It’s not that we’re fighting a cyberwar, but we’re increasingly seeing warlike tactics being used in cyberconflicts.

 

In cyberspace you don’t have a clean division between conflict and war. You have tactics and weapons that used to be used by nation states, now used across borders. Technology is spreading capability, and that blurs things.

Ghostnet was discovered in spring 2009 and we never learned who did it. We backfill the perpetrator from the victim list – so we assume China was behind it.

In July 2009 there was a denial of service attack in Korea which seems to have come from China, or London, or Florida.

In January last year, Google announced they had been victims of an attack and again it was China – of course other states can go through China too.

There’s Stuxnet, Burma, Wikileaks, Anonymous vs. HBGary, where a group of people took down a company – and we learned that the company made cyberweapons. Or 1991, when the US inserted malicious code into printers which were sent to Iraq and helped disable machines before the Gulf War.

These are not criminally but politically motivated attacks.

 

If you think about it, if you’re attacked in the real world you can call on a lot of people, depending on two things – who’s attacking you and why.

In cyberspace you know neither. There isn’t a clean division that you can see immediately.

Two weeks ago a hacker group told NATO not to bug it. There’s meant to be a division between a bunch of kids and NATO. It’s like Al Qaida and that’s why there is this rhetoric.

A lot of this is the new buzzword APT – Advanced Persistent Threat. It’s worth talking about non-financially motivated attacks.

If it’s a financially motivated attack, it’s easy – the strongest guy wins.

But if you’re Microsoft or Sony people are going to work overtime to get into you. In conflict, what matters are the advantages the two sides have. In the Civil War, one side had a strong weapons advantage; in World War I we had a strong defender advantage, and right now on the internet, the attacker has the advantage.

The politics worries me more. We are in the early years of a cyberwar arms race. There’s a lot of cyberwar rhetoric, a lot of money being spent and it really does have all the hallmarks and the dangers of an arms race.

We know who won this power struggle – a government of free enterprise supporting a critical infrastructure. This is important because metaphors matter.

To the police, we’re citizens to protect; to the military, we’re a population to be subdued.

War changes the debate. War changes the solution space. Things you would never agree to in peacetime, you agree to in wartime because we’re “at war”.

This affects the whole debate about wholesale surveillance. In peacetime, if the government goes to AT&T and says they want to surveil everyone, they’re asked, where’s your warrant? Not in wartime.

Which leads to the eavesdropping debate – and the FBI’s big problem, Skype.

With gmail it’s easy to eavesdrop. If the FBI wanted to eavesdrop they’d get a warrant. With Skype, it’s impossible – it’s end-to-end encrypted. There is no way to eavesdrop except to dismantle it and make it less useful.

We saw this debate last year with RIM. The United Arab Emirates said BlackBerries are annoying us because we can’t eavesdrop. RIM said we can’t do that, we don’t have the capability. They said nonsense, you’re doing it in Russia. So RIM put servers inside those countries, so that the data wouldn’t leave them.

The cyberwar rhetoric matters a lot in the kill switch debate. It kind of puts me in mind of a big red button on Obama’s desk. Whether he should get that kill button depends on whether we’re a nation at peace or a nation at war.

If we knew who you were, if you did something wrong we could get you (though that’s technically impossible) – so to identify you is not unreasonable if we’re at war. It seems to make sense, in wartime.

None of this says we should abandon the US cybercommand. War expands to fill all spaces. And I like seeing the debate about who controls the critical infrastructure. But if you’re in Syria, say, and someone drops a bomb on your head, you can look up and see the colors on the plane. But with cyberwar, you don’t know if anyone’s attacking and you don’t even know you’re being attacked.

Meanwhile the US is blocking cyberwar treaty efforts. I’m sure the US military has had conversations with Google and Amazon, with the big cyberguns in this country that can be turned into weapons.

And mercenaries seem to have come back, and they’re there in cyberspace as well.

We need to figure out how to deal with that and with non-state actors. How the country and the world deals with it is really important.

What’s important is the rhetoric – because the nation does not believe that the danger of cyberwar is greatly exaggerated.  People are seeing these attacks not in terms of crime or peacetime espionage, but in terms of war.

And if that’s the case we start losing a lot of debates. Privacy in wartime is a luxury.

So I want us here to fight the war metaphor at every turn, because it affects the debate at every turn. A lot of money is riding on it. There’s a reason they exaggerate the threat of cyberwar: it directly affects their bonuses. And the full body scanner manufacturers are now trying to get their stuff into stadiums.

That money is going to push the debate and push the threat. The threat is real. But it’s like the Cold War. Yes, they’re threats. But this metaphorical normal state is not war.

 

Christina Zaba

 

Plenary: Cybersecurity, Freedom and Privacy

June 16th, 2011

Thursday, June 16, 2011; 8:50 AM – 10:30 AM

This panel will explore the comprehensive approach to cybersecurity being proposed by governments both in the US and around the world. These global initiatives seek to protect digital information systems and the information they manage from all threats. The category of threats will include those faced by governments, consumers, corporations, critical infrastructure, and networked local, state, and national government agencies. The challenges are not only domestic, but international in scope. This panel will explore the topic of Cybersecurity Freedom and Privacy as they outline the roles that governments, companies, users, and advocates can play or should play in attempting to create a free and safe Internet. Some Questions for the Panel: Policing cyberspace is it necessary? And if so, who should be responsible: one agency or many? Will either diplomacy, military, law enforcement, advocacy community or users win the day or will they each be needed to make a 24-7 world work in harmony?

Panel organized by Lillie Coney: Associate Director, Electronic Privacy Information Center (EPIC).


Moderator: Marc Rotenberg: Executive Director, Electronic Privacy Information Center (EPIC) 


Panelists: Dr. Mouhamadou Lo: Legal Adviser, Computing Agency of Senegal, Presidency of the Republic of Senegal
Dr. Lo’s Presentation: English; French

Pradeep K. Khosla: Dean of the College of Engineering, Carnegie Mellon University 


Ross Anderson: Professor of Security Engineering, University of Cambridge 


Timothy Edgar: Senior Legal Adviser to the Information Sharing Environment, Office of the Director of National Intelligence 


Joe Onek: Principal, Raben Group 


Stuart Shapiro: Security Scientist, MITRE Corporation

 

PRADEEP KHOSLA

Marc has asked us to talk about security but since the session before this was about privacy, I want to talk first about future privacy concerns.

So we all love personalization – that’s exactly what’s happening on the web.

But as much as we love it on the web, there’s a significant amount of information being collected.

You’ll see specific ads presented to you – how does the system know?

It’s based on your IP address and cookies.

There’s a new class of services out there that can feed this information to companies.

What this implies is that the system knows a lot about you, all encoded in your IP address and cookies.
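Khosla’s point can be made concrete with a toy simulation (all names and the `AdServer` class are invented for illustration, not any real ad network’s code): a single third-party cookie is enough to tie visits to unrelated sites into one profile.

```python
# Toy simulation of third-party-cookie tracking (hypothetical names).
import uuid

class AdServer:
    """Stands in for a third-party ad network embedded on many pages."""
    def __init__(self):
        self.profiles = {}  # cookie id -> list of (site, page) visits

    def serve_ad(self, cookie, site, page):
        # Assign a persistent identifier on first contact.
        if cookie is None:
            cookie = str(uuid.uuid4())
        self.profiles.setdefault(cookie, []).append((site, page))
        return cookie  # the browser stores this and sends it back next time

ads = AdServer()
cookie = None
for site, page in [("news.example", "/politics"),
                   ("shop.example", "/shoes"),
                   ("health.example", "/symptoms")]:
    cookie = ads.serve_ad(cookie, site, page)

# One identifier now ties together a cross-site browsing history.
print(ads.profiles[cookie])
```

The aggregation Khosla worries about is exactly this dictionary, kept by a party the user never knowingly visited.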

Alessandro talked about faces, but we can imagine that knowing your name and knowing your face will be connected.

All of your personal information is out there. It’s going to be really significant.

The only gap now is that it’s all in different servers – but when will it be aggregated and a complete profile created?

It’s an issue that has been concerning me for a while now and we need to talk about it.

 

STUART SHAPIRO

All I say here is my own view and does not reflect the view of my employer.

The debate seems to be dominated by two viewpoints.

First, cyberarmageddon.

Or, overblown hype, a bunch of hogwash.

The actuality is somewhere between the two extremes.

Depending on what label you want to use, the advanced cyberthreat is real and can cause damage. On the other hand, the majority of cyberattacks are the same old ones, not very sophisticated, depending on people doing stupid things.

So to the extent that the proposals out there can get people and organizations to stop doing overtly stupid things (storing passwords in clear text, using WEP, not changing default passwords), then we can look forward.
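Shapiro’s first example – clear-text password storage – has a standard fix: store only a random salt plus a slow salted hash. A minimal sketch using Python’s standard library (the iteration count is illustrative, not a recommendation):

```python
# Never store the password itself: store (salt, slow salted hash).
import hashlib, hmac, os

ITERATIONS = 100_000  # illustrative work factor

def hash_password(password, salt=None):
    salt = salt or os.urandom(16)  # random per-user salt
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password, salt, digest):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(candidate, digest)

salt, stored = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, stored)
assert not verify_password("guess", salt, stored)
```

Even if the database leaks, the attacker gets salted hashes rather than the WEP-era equivalent of a plaintext list.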

But myopic – it’s really proposals to do more of the same – “batten down the hatches and monitor and all will be well”.

But that won’t solve advanced cyberthreat – if you’re a high-value target they’ll probably get in at some point.

In the move to more monitoring I have privacy concerns.

Marc asks for solutions – so from a tech standpoint we need to look at resiliency. Traditionally this has focused on reliability: can you keep functioning in the face of various kinds of failures? Now we’re looking at resilience from a security standpoint. How can you architect your internal networks? What things can you put in place to continue to function securely even though the attacker’s in and roaming around?

Another thing we need to think about is simplification – it’s too complex and a practical impossibility to make secure something so complex. How can we streamline?

From a privacy standpoint, what concerns me is that the remedies being proposed are the same old, same old: we’ll have good oversight, we’ll have privacy officers. But if you’re talking about the Government sticking its fingers into fundamental communication infrastructure, you need to look beyond ordinary oversight. If this is going to go on I’d like some hard-core oversight that’s independent of the executive branch, because to the extent that it’s in the executive it’s going to be suspect.

 

TIM EDGAR

Marc: we particularly appreciate Tim’s participation – he has a significant role and agency

Tim: I’m here in two capacities, as a lawyer and as someone who worked in the cyber-office in the White House on privacy issues.

Instantaneous communication has transformed our lives – and has also transformed government. It’s not necessarily as clean and easy to use as what we see in the commercial world but powerful. Ability to share information across agencies more powerful than it’s ever been.

After 9/11 this became more widespread and the sharing of information has proliferated. It was a real shock to the State Department when the Wikileaks material was released, because there was a real belief that everything was locked down.

In retrospect it’s not surprising – you can set up a closed network, but you’re still using computers and software. Being able to airgap a system is not necessarily a protection.

So how can we preserve the benefits of the data revolution while addressing these threats? We know as we put on more and more granular data, the threats to that data being destroyed, attacked, stolen or compromised become more serious.

Superficially sharing and security look like they’re in conflict. But share it in the right way, time, with the right controls – whether sensitive or classified information or not.

Role-based access, ID management and auditing can protect info as much as share it. We believe information is a shared national asset, but we have to do it in a framework that protects principles.
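Edgar’s triad could look like this in miniature – a hedged sketch where the roles, records and clearance levels are all invented for illustration: access is decided by role, and every attempt, granted or denied, lands in an audit log.

```python
# Minimal role-based access control with auditing (hypothetical data).
audit_log = []  # every access attempt is recorded here

RECORDS = {"cable-17": {"classification": "secret", "body": "..."}}
ROLE_CLEARANCE = {
    "analyst": {"secret", "unclassified"},
    "clerk": {"unclassified"},
}

def read_record(user, role, record_id):
    record = RECORDS[record_id]
    allowed = record["classification"] in ROLE_CLEARANCE.get(role, set())
    # Auditing protects the data: misuse leaves a trail even when sharing.
    audit_log.append((user, role, record_id, "granted" if allowed else "denied"))
    return record["body"] if allowed else None

assert read_record("alice", "analyst", "cable-17") == "..."
assert read_record("bob", "clerk", "cable-17") is None
```

The point is not the three-line policy check but the log: sharing widely and watching every access are complements, not opposites.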

Privacy by design – I would challenge this group and ask, why is this a new idea?

Why is that something we’re still talking about as a new idea? We should be beyond that. Not because there isn’t commitment by government or industry to these standards, but once we’ve agreed – we need a core set of standards that we can design systems to meet. Telling engineers you’re supposed to meet LAWS not helpful but STANDARDS may be more useful.

Use a standards-based approach. I’m often called on as a lawyer to say whose rules are being violated? That’s beside the point. Are we creating best-practice systems?

 

JOE ONEK

Marc: Joe has had an extraordinary career, hoping he can pick up my request for solutions

Joe: I’ve been focusing on government intrusions into privacy. I’ve been retained to represent two unions in a private-sector intrusion. Hunton & Williams were engaged to data-mine and find material to discredit my clients – via social media and computer malware. The effort was only aborted when one of the defence contractors got into a spat with the hacker group known as Anonymous, and Anonymous posted all the emails on a website. And that’s how we found out.

The private sector is larger and more diverse, with more diverse interests in intruding into privacy – whether a larcenist’s interest or another, or using intrusions of privacy against political and advocacy organizations, which are extremely vulnerable. Even if you don’t hack, you can use social media to find enormous information about their supporters. Find out by association – some supporters will do silly and stupid things (or you can try by dirty tricks to induce it) – the purpose of the false persona in the case above was exactly that: to discredit.

In the venomous environment we’re now in, there will be more.

So what is to be done?

From legal standpoint, a focus on intrusions for economic damage – the law should be responsive. With respect to the technical side, I’d like to believe more can be done at the front end, but I know there are limits and you can’t guarantee the defence will win.

Finally the human element. More can be done to educate people. Campaigns can and do work. The chart we saw earlier is an example. More can be done to act in ways that will enhance security. People will cause harm though, and there’s no known cure. But in legal and technical fields, things can be done and should be.

Marc: We try to make things possible at CFP – not just an academic conference, but a chance to take up points of policy. Now it’s a special pleasure to introduce Mouhamadou Lo, who’s doing extremely important work.

 

MOUHAMADOU LO

I’m the legal adviser to the executive and I’m also responsible for formulating all laws concerning information in Senegal. I also work for the West African association and for the French-language association for the protection of data in Senegal. And I also work on how to set up an international legal instrument concerning personal data.

So it’s a pleasure to share with you what we do on the other side of the Atlantic Ocean – as you know cybercrime is not specific to any area of the world. So I would like to take a little more than 5 minutes to share with you what I’ve spent 8 hours flying over to tell you.

My presentation can be summarized:

Introduction
Context of regulation
And the stakes of the regulation

Senegal is a West African country, bordered by Mauritania, Mali and Guinea, with a population of 13M, and we have advantages in respect of connectivity. We also have very well-performing communication structures, and we’ve been independent since 1960, with a presidential government and several political parties – it’s a stable country. ECOWAS, the Economic Community of West African States, consists of 15 countries, and there are several languages – English, French, and Portuguese.

There are over 200M people in this area.

With respect to regulation of cybercriminality and personal data, we have growth in the use of information technology in the public and private sector.

Second, there’s a multiplication of abuses in the use of new technology and information. As a lawyer I see we’ll have to set up an appropriate legal environment for what’s happening.

Within ECOWAS, the heads of state in 2004 declared their attachment to freedoms and the fundamental rights of people, especially privacy in using files and processing personal data. The legal basis for that is Article 33 of the ECOWAS treaty, which goes back to 1995 [?], for a common policy on electronic communications.

The first challenge was to set up a cybercriminal policy to master the economic, legal and institutional barriers and to ensure the coherence of a legal environment throughout, but also to guarantee the respective rights and basic freedoms of private life, to fight against the digital divide, and to render information systems less vulnerable, to avoid criminality, cyberattacks etc. So after talking about the challenges, we had to set up legal texts. We went through several steps to do so.

First step was a study to see how to harmonise telecoms regulations

Second step was an audit

Third step was creating policy drafts to establish guidelines

Three drafts:

  1. personal data
  2. cybercrime
  3. e-transactions

 

Guidance law

Personal data act

Cybercrime

Electronic transactions

Cryptology

 

Two directives were adopted – one to do with personal data and the other on electronic transactions. The one on cybercriminality is ongoing.

On cybercrime, we created new types of incrimination relating to information technology, systems and computer data. Illicit content is taken into account, as are the activities of those who furnish access and publicity. In addition, we also updated the definitions of infractions relating to criminal protection, for incrimination against theft and fraud, and also suppressing property crimes.

Moreover, we modernized the rules of penal and criminal law; criminal procedure was upgraded and new procedures for co-operation were defined.

In respect of personal data Senegal organized collections in how the use of data is co-ordinated, rights, procedures, sanctions. The texts also provided for setting up a data protection act in each country and support the free flow of information in order to facilitate the liberalization of information in our country.

Access must be necessary, proportional and respect legislation. The files of the security services must respect the authority and the shelf life (two years in Senegal) and the exercise of rights – the deletion of data.

The objective is to set up information technology which will be an economic, social and cultural good and a tool of good governance, and we want to create an international fight against cybercrime. There is co-operation between Senegal and the other countries of ECOWAS, and Senegal is talking to the Council of Europe about the Budapest Convention, connected to the G8 work, not yet completed.

Perhaps the solution of the problem will be the establishment of an international instrument.

Illustrate with two facts.

Cybercrime concerns people in the Middle East, Europe, the Americas and the Near and Far East more, because the victims of cybercrime tend to live in those countries; the cases we’re dealing with now tend to involve people in those countries.

E.g. Strauss-Kahn – a young Senegalese woman saw her picture widely disseminated, naming her as the victim of the alleged DSK crime. Her only fault was to leave a picture on Facebook which appeared to correspond to the alleged victim.

Thank you.

Marc: The last introduction is to a long time friend and someone I admire very much, a leading expert in technology field but great attribute is to understand relationship between technology and public policy.

ROSS ANDERSON Professor of Security Engineering, University of Cambridge

10 yrs ago we realized you had to do the economics as well, and that there may be a role for public policy.

Security engineering; report on the resilience of the internet

Cybercrime policy – transparency, fraud reporting, international

2009 Database State – privacy and public sector systems, here on Monday a pre-conference on healthcare privacy – America in same position now that Britain was in 1990s – you’ve had a stimulus bill.

The fundamental problems of health IT turn out to be similar to the problems of intelligence – of scale, content and control.

A long time ago you had access to a few hundred patients – when you aggregate these you scale and lose control.

Architecture was wrong.

One of the things we’ve been warning against in our Database State report is that England has created the Secondary Uses Service on 50M people, accessible to 800,000. I predicted a professor would leave a laptop on a plane – several laptops were stolen from a plane this week, containing details on 8.6M people, now in the tabloid press.

Intervention from the Catholic bishops’ conference. They took the view that religious women had the right not to allow their medical data to be used for abortifacients.

Consent to sharing medical records for research issue – an extraordinarily powerful tool.

Control matters. We’ve seen in the UK that records become hosted – now medical records are kept on a server farm run by BT or Emis.

Why will this change the world?

Four years ago a policeman went to visit a woman who ran a pregnancy charity and demanded all the records of patients under 16 (a crime). She told him to go away, but what happens when all our records are kept in hosting facilities? Will the office manager do the same?

So scale, context, control – they come together in ARCHITECTURE. Architecture matters, engineering, economics, law, policy all come together. Understanding of architecture seems to be absent.

Our Government’s Chief Information Officer isn’t at the top table when the drafts are being roughed out. It’s about incentives, architecture, and possible to learn from experience elsewhere. Don’t repeat the mistakes we made.

Research across states found that health information exchange was freer where there were strong privacy laws.

 

—————————————————————————————–

 

Broadcast from NPR this morning:

A cyber barrage!

Nothing new – have been researching for more than 20 years – but the latest are good attacks – highly sophisticated with advanced tools and software.

Hacktivists which want to make a point.

Cybercriminals who want money.

Cyberspies who want to steal data.

Why would a cyberspy want to break into IMF data?

Would know ahead of time which country’s currency is likely to rise – and other important secrets. The IMF has long favoured physical protection over protecting its computer networks. Hackers were able to penetrate the inner sanctums of IMF institutions.

Some well-financed private groups are involved in gathering data to sell.

On a level previously unimaginable.

——————————————————————-

 

QUESTIONS

 

QUESTION: Not sure government understands the risks of the government build-out of health information systems. What it means when the most intimate information about everyone is available where they don’t even encrypt, and the individual has no control because consent was eliminated in 2002 under HIPAA. Now in the control of covered entities, and there’s not even a data map. SO WHAT DO WE DO ABOUT HEALTH RECORDS?

(Tim Edgar) What we consider are the appropriate uses of data – that’s a distinct question. Once we’ve decided what IS OK, how do we enforce it? I would say the national security establishment is probably more aware than much of the rest of the government. We’re intelligence agencies. So one of the reasons that those agencies can supply useful technical assistance for defensive purposes is that they figure: well, we know how we do it, so our best adversaries do too. These ambitious goals are there for generally very good reasons in terms of anticipated benefits to society, often potentially quite compelling. But you have to build in the appropriate time to have them done in the right way – or adjust or move the goal.

(Stuart Shapiro) People think there’s a security risk only if PII is involved. But PII can be useful if you want to leverage it for a social-engineering attack to get what you really want (sensitive information that has not much to do with individuals).

(Ross) not just protecting defence and intelligence communities – mission to protect the world. The first place would be the medical records.

(Tim) Absolutely right – those kinds of vulnerabilities we’re uniquely sensitive to. A lot of talk of DHS role in protecting civilian networks. How far will it work for NSA to help? Probably quite a lot.

(Marc) gave evidence at reading of Bill – inclusion of strong principle of data minimization. Not only reducing risk at the outset but when a breach does occur, there’s less that can go wrong. Important role of data minimization in reducing risk.

 

QUESTION: Secretary Chertoff said: It’s alright to give us all the data, we’re the good guys. National establishment feel they have an entitlement to all the data and then the architecture built around that. HOW DO YOU ADDRESS THE QUESTION OF ENTITLEMENT AND GET IT IN THE ARCHITECTURE?

(Ross) I would prefer it if provider systems remained provider specific. They tend to work according to the buyer – privacy, dependability, functionality, control and skill. A test case brought by the UK equivalent of Debra’s organization concerned the estranged husband of a lady who had her arm broken when she was in hospital. If you let the things be designed without public review you run these risks.

 

QUESTION: POLAND IS CREATING A DATABASE STATE. IS CREATING MEDICAL RECORDS HELD BY PATIENTS A SOLUTION?

(Ross) I favour provider-held records – they align the doctor’s and patient’s interests. The Leviathan is driven by administrators, control and convenience. The Netherlands is more user-controlled. The current UK government was not willing to make a dent in the healthcare system, but it’s failing. Large systems don’t work.

(Stuart Shapiro) We have really lousy models – I’m a big admirer of the FIPs, but they’re non-normative and we don’t have a framework for talking about risks and harms.

Stephanie: Do you have a strategy for controlling spam in those countries which are developing telecommunications in West Africa?

(Lo) Cybercrime has more to do with foreigners than with Senegalese people themselves. In fact the attacks on Senegal and other countries, and all the spam, are sent from our country. In the streets there are cybercafés, and people sit in those cafés all day sending spam round the world. That shows that cybercrime is worldwide.

QUESTION: WHAT IS YOUR POSITION ON PEOPLE STORING DATA UNENCRYPTED?

(Tim) One of the things I’ve noticed is that law enforcement is frustrated by encryption, but it’s useful in preventing crime. I’ve seen a shift – encryption poses issues, the FBI is concerned about going dark, but there’s an appreciation that it may be a valuable tool.

(Stuart Shapiro) to the extent that cyberthreat is a genuine threat, we may be taking hard decisions in order to thwart those attacks and give up some of the access that law enforcement enjoys.

(Marc) In 1996 report supported encryption on balance.

——————————————————————-

NOTE: EPIC published a first report on e-deceptive political campaign practices in 2008, republished in 2010.

Christina Zaba

 

 

Keynote Address: Alessandro Acquisti

June 16th, 2011

 

 

Alessandro Acquisti: Associate Professor, Heinz College at Carnegie Mellon University

Thursday 16th June 8:30 AM – 9:00 AM

The future of our personal information

People with less privacy value privacy less. They can be manipulated.

Today I’m going to extend these initial studies and present mostly unpublished research, and then conclude with very preliminary results that talk about the future of our personal information.

There is a vast scholarship of literature which links privacy with control – even authors who don’t usually link the two concepts. We were wondering if more control leads to more privacy.

We did numerous studies and experiments, randomly assigning conditions to different subjects – we wanted to make our subjects feel more or less in control of the publication of their answers to a survey on their behavior.

Subjects were told: You don’t need to answer these questions, but if you do, you implicitly grant us permission to publish.

Some of the questions were sensitive.

And for some of them we gave something more – we added a little box up there – saying “permission given to publish”.

So we made some subjects have the impression that they would have more control.

We expected them to be more likely to click on the box and to answer the questions.

Explicit control – the non-sensitive questions were the most likely to be answered by the subjects.

The more you make people feel in control the more they are likely to reveal more sensitive information to strangers.

We’ve been monitoring Facebook disclosure since 2005 – what we’ve discovered (this is unpublished work I’m doing with Ralph Gross and Fred Stutzman) is that people are becoming more careful. BUT this is only half of the story – I call it a sleight of privacy.

The bottom line of what I’m showing is that from 2005 to 2010 the average user has been revealing much more info – to whom? TO FACEBOOK ITSELF.

So on one side users are revealing less to other users, but more to Facebook itself and implicitly to advertisers. They aren’t aware there is a silent watcher.

The scientific evidence is that control doesn’t solve all privacy problems. In a normative sense it does have an effect, but it’s not enough, because you can manipulate people.

Why does this matter?

Last summer there was an interesting debate on the Internet and the art of forgetting.

There will be a trail somewhere.

How will people, today or tomorrow, see information revealed 5 years ago?

There’s a saying – everyone will leave a skeleton on Facebook. But will people care?

Will this information be good or bad? How far will we discount information about others according to how old or new it is, how good or bad it is?

So I did an experiment – the story of Mr A.

One group of subjects were given the story and asked whether they liked it and what they thought.

This was a neutral condition.

But then we added 4 more conditions. In all of them he found a wallet with more than $10,000 in cash; then either:

  1. he gave it to the police and this happened more than 5 years ago
  2. he gave it to the police and it happened recently
  3. he did not report it to the police and this happened more than 5 years ago
  4. he did not report it to the police and it happened within the last 12 months

 

We wanted to see how our subjects liked Mr A depending on what information they were given.

On the y-axis, 1 to 7 – subjects were asked how much they liked Mr A, where 1 is “I don’t like him at all” and 7 is “I like him a lot”.

On the x-axis, the age of the information – 5 years ago or 12 months ago.

And the neutral condition.

 

Sure enough, our subjects liked Mr A just about average, 4 or 5. Then there were the two groups of subjects who had the wallet story.

Where he returned the wallet recently, they liked him much more; if it was 5 years ago, it made no difference. But the negative story remained negative.

So if it’s something good in the past, you discount it – but if it’s negative, you remember it always; the feeling does not fade.

The Future of personal information

In 2000, 100 bn photos were shot worldwide.
In 2010, 2.5 bn photos per month were uploaded by Facebook users.
In 1997, the best face recognizers in the FERET program scored an error rate of 0.54.
In 2006, the best recognizer scored 0.01.

The future is going to be an augmented reality

Offline data is you in the street, but your online data – your face is the link to that

We first downloaded main Facebook profile images and then went to popular online dating sites and tried to use face recognizers to link the two.

On dating sites they like to be pseudonymous, but by linking to Facebook we got a 15% success rate.

Then an offline comparison. We asked students to stop by our little desk, take a photo with a webcam and then fill out a survey; while they were filling it out, unknown to them, we compared them online – we could identify 30% of the students who stopped by our desk.
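The linking step Acquisti describes might be sketched like this – a toy illustration only, where the embedding vectors and names are invented, not his actual recognizer: a face recognizer reduces each photo to a vector, and an unlabeled photo is matched to the labeled profile whose vector is most similar.

```python
# Toy face-matching via cosine similarity on (invented) embedding vectors.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Labeled profile photos, already reduced to embeddings (hypothetical).
facebook = {"alice": [0.9, 0.1, 0.3], "bob": [0.1, 0.8, 0.5]}

# Unlabeled webcam shot taken at the desk, embedded the same way.
webcam_shot = [0.88, 0.12, 0.31]

# Link the anonymous photo to the closest labeled identity.
best = max(facebook, key=lambda name: cosine(facebook[name], webcam_shot))
assert best == "alice"
```

Real systems add detection, alignment and a confidence threshold, but the privacy problem is already visible here: the labeled photo is the join key between an anonymous face and a name.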

Now we’re trying to do it on an iPhone. This is the future of augmented reality in this world – a stranger can identify you.

Choice is not enough – the problem is too large. Facebook has become the new Real ID.

I do believe one of the largest threats isn’t cyberspace but augmented reality.

 

Christina Zaba

 

 

Do Not Track: Yaaay or Boooh?

June 16th, 2011

Tuesday, June 14, 2011; 11:00 AM – 12:30 PM

Moderator:

Jim Harper: Director of Information Policy Studies, CATO; Member, Data Privacy and Integrity Advisory Committee of the Department of Homeland Security 


Panelists:

Ryan Radia: Associate Director of Technology Studies, Competitive Enterprise Institute

Berin Szoka: Founder, TechFreedom

Chris Soghoian: Graduate Fellow, Center for Applied Cybersecurity Research; Ph.D. Candidate, School of Information and Computing at Indiana University

Harlan Yu: Ph.D. Student, Department of Computer Science, Center for Information Technology Policy at Princeton University

Andy Zeigler: Program Manager, Microsoft’s Internet Explorer Engineering Team


Dr. Ed Felten: Chief Technologist, Federal Trade Commission

“Do-Not-Track” is the idea that Web surfers should have easy-to-use protection against tracking by ad networks. This panel will explore the merits and demerits of the Do-Not-Track concept, the various technical options for implementing Do-Not-Track, and the legal and regulatory implications of each.

“From a tech point of view, the Do Not Track train has left the station.”

CHRIS:

Deleting your cookies and cache doesn’t stop fingerprinting. Do Not Track finally allows you to say no – it’s a single method by which consumers can tell companies not to pass on their behavior to ad agencies.

ED:

There are no silver bullets. The technologies that exist to protect you from being tracked online are imperfect but they still go ahead, because there’s such advantage in tracking you. Do Not Track doesn’t really protect you.

BERIN:

It’s not about tracking, it’s about use specification – ‘do not collect my data for certain purposes’.

ANDY:

It’s a Do Not Track feature. Tracking protection enforces the user’s choice – websites are connected, they have embedded content – but there’s a privacy and security impact in including that content in websites. Every time you do that, you share information.

A web page has links, hypertext. But if you go to any more complex page, like the Wall Street Journal, you’ll automatically be invited to get content from other web sites – e.g. you could have a Google map embedded right on the page. And it happens automatically.

When all those websites see you visiting, they get insight into your travels on the web that’s more than just one website. A variety of different techniques are used to track you. If all those websites include the same content, they have a line of sight that you’ve gone there. Tracking protection uses lists: when you go to a website, all content on that page is filtered through the tracking protection list, so the browser won’t load the tracking content.
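Zeigler’s list-based filtering can be sketched roughly like this – the list entries and helper function are hypothetical, not IE’s implementation: before any embedded resource is fetched, its host is checked against a blocklist of tracker domains, and blocked resources are simply never requested.

```python
# Simplified tracking-protection-list filter (hypothetical list entries).
from urllib.parse import urlparse

TRACKING_PROTECTION_LIST = {"tracker.example", "ads.example"}

def should_load(url):
    host = urlparse(url).hostname or ""
    # Block a listed domain and any of its subdomains.
    return not any(host == d or host.endswith("." + d)
                   for d in TRACKING_PROTECTION_LIST)

page_resources = [
    "https://news.example/article.css",       # first-party: loaded
    "https://ads.example/pixel.gif",          # listed: never requested
    "https://cdn.tracker.example/beacon.js",  # subdomain of a listed entry
]
loaded = [u for u in page_resources if should_load(u)]
```

This is why the panel calls it a “Do Not Load” mechanism: the tracker never sees the request at all, which is stronger but blunter than a header the tracker is asked to honor.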

ANDY:

It’s a potential Do not Load mechanism.

JIM:

The leading proposal right now is to put a signal in the header that your browser sends out.

BERIN:

In some ways, tracking protection lists are a sledgehammer – a very inflexible mechanism. If you don’t load the third-party domain, the company can’t decide how to serve that ad. If you get the Do Not Track header, the company could fall back to e.g. a contextual ad. From a commercial standpoint it might do too much.

ANDY:

Tracking protection is not an alternative to the Do Not Track header. They both have pros and cons.

Do Not Track is a browser option, adding an HTTP header to requests. It signals to the technology company that the consumer doesn’t want tracking. So for companies, how do they deal with that? Consumers have tools in their hands now…

This isn’t a debate about Do Not Track vs. other privacy controls; this is a debate between the Do Not Track header and tools like Ad Block Plus. They will make it seem like there are other options, but that’s not a real debate, nor how it’s shaping up. If they continue, consumers will use blocking and then no ads are served at all.

If you haven’t seen the Keynes Hayek rap video, see it – by Russ Roberts – “I don’t want to do nothing …I want plans for the many and not for the few”. We need a mosaic of user empowerment tools.

My point is not that we should do nothing, but that we should have a fundamental debate – plans by the many and not by the few. Do we make something that works for everyone, or have the government put its finger on the scales of how this is used?

We need a system that empowers users: what’s at stake is not just advertising, but commercial messages. What principles should underlie a Do Not Track system? A no-cost optout is not scalable. We need a policy conversation, an economic conversation. It’s not just about ad revenue – they’re targeting messaging too.

JIM:

Question: is the legislation adequate? Should there be a legislative mandate to build Do Not Track headers into the browser (but we already have that)? Do we need legislation?

If a company makes a direct promise to consumers and then breaks the promise, and harms the consumer, the FTC will be interested.

CHRIS:

Last year we discovered that companies would dig through your browser history.

A:

Sounds like something that could be objectionable.

CHRIS:

It’s about the value of things – tension and the sharing of data. Is there an understanding about what’s going on? If you go to a website the fact that you visited it is validly collected.

CHRIS B:

I think fingerprinting is nefarious, but the first party obtaining information that you visited that site – that’s fine as long as they don’t share it.

ANDY:

I don’t think people are thinking about quid pro quo or what the data awareness is.

BERIN:

A lot of the debate assumes it’s only fair if the consumer understands what’s going on, but the economy doesn’t work that way. Think of the essay by Leonard Read, ‘I, Pencil’. It’s a complicated process. We’ll never have complete understanding. But are there certain practices or acts that we could agree are objectionable and defy consumer expectations?

I don’t own the fact that I’m in this room and everybody sees me. But I can stop you from mentioning me.

Or another example: You have shrouded the color of your underwear by wearing pants.

BERIN:

What do consumers understand about the quid pro quo? A lot don’t understand what’s happening in the background when a website loads. Consumers don’t want this kind of tracking going on. As a tech matter, Do Not Track is a better, more permanent mechanism to do what we’ve been doing in the past. If the only thing we get is that the Do not Track header is a replacement for cookie opt-outs, we’re already in a better place because consumers have better control. From a tech standpoint the Do Not Track train has left the station. It’s a better technology to empower consumers.

What do consumers want?

ANDY:

When you look at a web page you see an ad, a search box, you have a feeling this isn’t part of it. A lot of people don’t think about those issues at all. If you tell a user what’s going on when you visit a website, many would say that’s creepy – others will say I love ads.

BERIN:

What consumers think is an empirical question. One, consumers don’t want to bother. Two, consumers think they’re better protected by the technology or the law than they are. They already think certain things are illegal or impossible. To me it depends.

ANDY:

I’m not arguing against DNT, but what consumers want is important. First, what do consumers know; second, what would a rational, reasonably informed person know? How do you build a user-empowerment tool that lets people choose if they’re privacy-sensitive, that works for them? If you look at polls, and ask people, you’re missing the key point about trade-offs.

RYAN:

Rational ignorance. It might not matter – it might even be a good thing in some respects that people don’t find out about privacy. Time is scarce and the world is full of risks. People haven’t taken the time because they don’t perceive it to be worth their time. Lots of people are blissfully ignorant of how tracking works. When they hear about it they think it’s creepy, but once they find out, they might think the trade-off is worth it.

ED:

We should give consumers choice and let them decide?

JIM:

Sounds like a gotcha question.

HARLAN:

The beauty of Do Not Track is it’s a one-click one-stop shop to defend against all these activities.
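
For context, the Do Not Track mechanism being debated is a single HTTP request header, `DNT: 1`, sent with every request once the user opts out. A minimal server-side sketch of honouring it might look like the following (hypothetical code, not any vendor’s actual implementation; the `should_track` helper is purely illustrative):

```python
# Sketch: honouring the "DNT: 1" request header on the server side.
# The header name and the value "1" come from the Do Not Track proposal;
# everything else here is illustrative.

def should_track(headers: dict) -> bool:
    """Return False when the client has opted out of tracking via DNT."""
    # HTTP header names are case-insensitive, so normalise the keys first.
    normalised = {k.lower(): v for k, v in headers.items()}
    return normalised.get("dnt") != "1"

print(should_track({"DNT": "1"}))         # False – skip tracking cookies
print(should_track({"User-Agent": "x"}))  # True – no preference expressed
```

Reading the signal really is a one-liner; as the panellists argue, the hard part is deciding what a site is obliged to do once it sees it.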

CHRIS:

But Harlan, if it’s a one-stop click that purports to block tracking, that is not the kind of user empowerment we want. We’re arguing for a form of user empowerment with some tool that’s more granular, tailored to real problems, intended to give people diverse options that reflect their privacy preferences.

JIM:

Harlan – I can see the red in Chris’s eyes – consumers have an option that’s not too easy to use. Previous versions of Internet Explorer had to be re-enabled every time; what is the right amount?

ANDY:

And people complained about the interface – if it’s primarily block or don’t block, that’s not where you want to be.

HARLAN:

Our mothers are not going to want to spend two hours configuring their browser options.

CHRIS:

Are you saying somebody should stop Mozilla from doing this?

RYAN:

We’re expressing that the government has forced them to do something suboptimal. If we look just for government diktat we’re missing the bigger picture. Capitol Hill likes to bully companies. Parental controls – parents can be given tools that allow them easily to subscribe to whitelists/blacklists and to curate them – that makes sense – simple options – “here are the sites you should block if you’re concerned about companies.”

JIM:

Where are we with tracking protection lists?

ANDY:

iegallery.com – an aggregation site offering any number of tracking protections
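
For readers unfamiliar with the tracking protection lists under discussion: in Internet Explorer 9 these are plain-text files that users subscribe to from galleries such as iegallery.com. The sketch below follows the published msFilterList format, but the domains are made-up illustrations, not real trackers:

```
msFilterList
: Expires=30
# Block third-party requests to these (hypothetical) tracking domains
-d tracker.example.com
-d ads.example.net
# Always allow this (hypothetical) domain, even if another list blocks it
+d cdn.example.org
```

A `-d` rule blocks requests to a domain, a `+d` rule allows them (allow rules win across lists), and the `Expires` line tells the browser how many days to wait before refreshing the list – which is why curation and trust in the list publisher matter.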

JIM:

Is this reminiscent of spam wars, – are any of those problems emerging?

ANDY:

That’s why having more aggregation sites is a good thing – download – trust – economics – vision.

CHRIS:

There used to be spam, not a problem now. Google does it for me. What about the spammers being denied Viagra ads, losing money – are we going to sympathise with them? No. It’s a trade-off.

JIM:

So this is the death of the internet, is it not?

RYAN:

The internet is a non-rival good – with tracking, people are not losing anything of value. Info sharing and advertising support the internet economy. Ads have always supported media. The problem is with the economic analysis. So when you start changing the way the marketplace works, the reality is, building these systems costs money. You’re probably not going to get much revenue, and the stakes are high. Display advertising – $7 bn.

Is it worth consenting to be tracked in order to have access to the site?

NB. There were more yays than boos…

Christina Zaba

 

A Clash of Civilizations: The EU and US Negotiate the Future of Privacy

June 16th, 2011

Tuesday, June 14, 2011; 9:00 AM – 10:30 AM

Moderator:

Barry Steinhardt: Founder, Friends of Privacy USA; Senior Advisor and Trustee, Privacy International; Member of the DHS Data Privacy and Integrity Advisory Committee 


 

Panelists:

Jan Philipp Albrecht: Member of the European Parliament from the German Greens

Mary Ellen Callahan: Chief Privacy Officer, Department of Homeland Security


Edward Hasbrouck: The Identity Project

Viviane Reding (via pre-recorded message): Vice President, European Commission

Frank Schmiedel: First Secretary, Washington D.C. Delegation of the European Union, Political, Security & Development Section

 

Europe and the US have very different ideas on what laws and institutions are needed to protect privacy. High-level negotiations between the EU and US will set the international standards for the use of personal data by governments and the private sector. Government officials and experts from both sides of the Atlantic will explore those differences and discuss what is at stake for Americans and the world.

“No good can come of it and it needs to be reformed.”

 

BARRY STEINHARDT:

Negotiations are now going on between the EU and the US over exchange of personal data in the context of national security, law enforcement and terrorism.

That context includes those directly involved in the negotiations; it also involves the SWIFT exchange of bank messages, and passengers’ records.

So much of the data being exchanged is not just that relating to law enforcement and national security, but is commercial. An awful lot of the data of US citizens is in Europe, and similarly, a lot of Europeans’ data is in the US.

The current US/EU negotiations, if successful, will set a global standard for the transfer of personal information across the world. Asia, Latin America and other countries are looking to the negotiations as a potential bellwether. While a large part of the exchanges are commercial, the discussions are not purely about private exchanges of data. A second set of parallel discussions is also going on.

 

 

JAN ALBRECHT (Member of the European Parliament from the German Greens):

During the last two years of debate on questions of privacy and security and the law enforcement sector, one thing I have learnt is that although there may be different environments of policy-making, and different perspectives, the principles in a way are quite similar. We share, in the EU and the US, the principle that the  state always has to be less intrusive when it comes to fundamental rights intrusions, the right to privacy and a principles-led rule of law across a whole range of private- and public-sector bodies.

This panel is named “Clash of Civilisations”, but I don’t think that applies. There is no clash of civilisations when it comes to values and fundamental principles. But there are different perspectives in the US and the EU.

I have the impression that in the EU there seems to be a more vivid remembrance of situations when those values were eroded and not so present as we wanted to have them.

Sometimes people ask me, isn’t it a little strange that you’re always pointing out the problem of the public sphere, when all the people are themselves using Facebook, putting their data on the internet, and so on?  But I am sure there’s a big difference between people doing that privately and public bodies doing it, because the relationship between people and the state is based on fundamental rights, checks and balances. I can choose not to be part of Facebook and e-commerce, but I cannot choose not to be part of the state.

There is no possibility of getting out of this. I give legitimacy to my government and institutions, and also I give executive power to them to enforce principles and legislation that we have co-decided in a democracy.

So I don’t think there is a problem or any fundamental conflict between the EU and the US, apart from a conflict in policy-making.

For both sides, we’re in favour of effective laws and security. But what is effective law enforcement? What is an effective way of lowering crime rates? What is effective and necessary to achieve fewer attacks and dangerous situations?

Crime science in the past has shown clearly that more surveillance measures have never really led to lower crime rates. That’s an interesting point of view which has disappeared from the debate, especially in the US.

We were all affected by 9/11. Europe also felt that our [security] priorities weren’t right, while on the other side of the Atlantic, democratic engagement has been affected by more surveillance measures, and even more if they are invisible –or, if you like, theoretical.

Even if I just feel the weight of surveillance, it’s enough. We can see people changing their behavior when they think they are under surveillance.

It should be that on both sides of the Atlantic we share the same democratic principles. And so we should invest, for example, in education and social security [provision].  Doing both these things would result in the lowering of crime rates and the strengthening of  democratic society. This is what enables individuals to create a secure way of living together – instead of huge surveillance measures which are threats to an open society.

In Europe people don’t want to be part of a policy of fear. We want a free self-determination, so that we can create secure societies without being surveilled the whole time. All across Europe, the constitutions of countries have rejected long retention periods for the personal information of non-suspects. Profiling techniques is another problematic area.

These are not unbalanced privacy laws. They are the result of the devastating presence of Fascism in all European countries throughout the Second World War. As a German, I am deeply thankful to America and its people, that they helped us overcome the dark times. I was always grateful for the American view of a free individual in a free society where we can all have freedom of speech.

I grew up a few kilometres from the German border. I knew what it  was to grow up at the end of the world. We Germans swore “Never again”. We didn’t only mean no more violence, no more mass murder, but also, never again to a repressed and unfree people.

What we are struggling for today is exactly this. The EU and the US are far away from these horrors and scenarios. People also think data processing and technology could never lead to harm. The Romanians, the Portuguese and many other nations all trusted – and they all got dictatorships. Computers were used by the Nazis for identity profiling.

We shouldn’t forget these terrible examples from the past. We need common rules and standards to protect liberties. We should create a society in an effective and sustainable way. This is an opportunity to invest, not in surveillance but in real programmes, social security, education and more.

As the EU rapporteur for the new agreement on data protection, I’m not willing to say that this is a clash of civilisations. But there is a clear position in between the policy fields of both the EU and the US. We have to build up common rules, the EU and the US, for the sake of worldwide development. We have to set up high standards and preserve good rules – standards and rules we can share on both sides of the Atlantic.

 

MARY ELLEN CALLAHAN (Chief Privacy Officer, Department of Homeland Security and involved in the negotiations):

This is an important dialogue, with different elements of government-to-government exchanges of information.

We do have similar principles for data protection and privacy, derived from OECD principles, and have behind us decades of successful information sharing with member states and the US, using fair information practice and principles, codified by the Department of Homeland Security in a robust way.

We share the same basic principles and tenets. It’s like we’re family – and members of a family sometimes have different approaches – in this case because of different government structures.

The EU has a primarily parliamentary-based executive, so the ruling parties are focused on an independent data protection commissioner. In the US the systems are very robust – 32 branches, all active in their elements.

Congress decided to first create my office, and then encourage privacy officers in each department, to make sure that privacy principles were embedded in the system. I investigated our own inspector general for a privacy breach. So we aim to get privacy protections in up front and then look after them. We have a great deal of oversight, with a Government Accountability Office and our own inspector general.

So negotiations between the US and the EU are ongoing, intended to establish mutual recognitions, and exchange for law enforcement, criminal justice and public security purposes, with some differences because of different government structures. The framework within which we engage is important, to be the leader with our EU colleagues in terms of government data sharing.

 

EDWARD HASBROUCK (Identity Project and authority on passenger name records):

I would like to know what this umbrella agreement is, whose interests it serves, what would be needed to move it forward, and what ought to be included in it.

At the moment what we have is an asymmetry in public and political debate, which reflects a legal asymmetry between us. Though the debate is ongoing in the European Parliament, there’s been no public debate in the US. Why not?

From the US side, it will be presented as an executive agreement, not a treaty, negotiated in secret. It will not be presented to Congress. It will be a press release.

But it will be binding in the EU and will be approved by the European Parliament. So it will have to be adequate.

DHS and the President can’t bind Congress to any action. Congress has already said it is not prepared to change any US law in respect of passenger data, and the same principles were expressed [by the EU], but there’s a fundamental difficulty in enforceability. The ICCPR is largely moot and unenforceable.

So the proposed agreement would be enforceable in the EU (and could implicate the EU’s treaty rights), but not in the US. Neither DHS nor other agencies need it, so the question is, why make it?

We don’t need to override any data protection law in the US. The ICCPR is not effectual here, and PNR (Passenger Name Records) are already outsourced to the US. SWIFT had mirror servers in the US, and e-commerce and comms data is all moving through servers in the US. The US can already get them, though to what extent we don’t know – the FISA provisions ensure secrecy.

So DHS doesn’t need this. Privacy advocates have nothing to gain: the US has said it’s not going to change the law. So who needs it?

First, businesses do who are already transferring data in the commercial sphere to the US in violation of existing EU law, who know they’re at risk of liability under EU law. So the agreement would function as an immunity law.

Second, this law would let EU data protection authorities, reluctant to enforce laws, off the hook – a police transfer of data to US would just be doing business.

There is a false distinction between commercial and government data: commercial transfers of data to the US would be tantamount to putting the data into the grasp of the US government without accountability. This would be a law to remove the possibility of liability. It removes the possibility that EU data protection authorities might exercise their existing mandate.

So unless the US is prepared to make this a treaty (and the EU has said it would need to be a treaty that respected ICCPR), in reality the only thing being negotiated now is how much current protection will be signed away by the EU.

So what’s needed?

The US is not negotiating in good faith. They have already said these are not going to be binding laws. The US will negotiate if and only if they fear US companies will feel pain from the existing law.

So the EU authorities need to begin to actually impose sanctions costly enough to affect the annual reports and bottom line of those companies.

If airlines don’t provide the US authorities with a passenger manifest, they won’t let planes land. The US authorities know that the opposite isn’t true.

The initiative to do this won’t come from parliament, but by grassroots actions by European citizens demanding they enforce European privacy laws that sound good to the US in theory, but which in practice are hollow, such as the safe harbor scheme.

If it were a treaty, and based on the removal of Europe from the ICCPR, beyond that what would be needed would be to recognise a principle that mere effectiveness in policing is not a sufficient argument to sacrifice standards of proportionality and necessity.

If you break into people’s houses you will find crime, but that doesn’t justify house-to-house searches.

There are existing legal mechanisms, and if there’s a suspicion it can be a part of a specific investigation. The US authorities want access to the dragnet surveillance data of non-suspects, without regard to the threat to our security posed by embedding surveillance capabilities in the infrastructure. No good can come of it and it needs to be reformed, and I hope there will arise a European grassroots movement to oppose it.

 

VIVIANE REDING (Vice-President, European Commission)

[A short speech by video link introducing the issues. Not transcribed: please see live stream]

 

FRANK SCHMIEDEL (First Secretary, Washington D.C. Delegation of the European Union Political, Security and Development Section):

We have also read with interest a letter from NGOs expressing worry that the EU may be signing away rights and lowering the standard of privacy in the US.

I’m not as cynical as Ed. We’re very happy at the vigorous debate in the US on privacy and data protection, because it’s a chance to reach a higher level of protection. Now is the time for creative thinking: how we can improve data protection and overcome differences.

In the EU there is a widespread perception that these data exchanges are largely a one-way street. I keep joking that the US authorities know our citizens better than we do.

So we need more transparency, more protection of our rights and irrespective of country of residence, more reciprocity, more sharing of intelligence derived from this data to prevent security threats. We are building our own, similar systems, and will come to your door and ask you for your data and expect reciprocity. So we need to find solutions that respect our respective legal orders, and apply restraints so that we can make these data exchanges (which won’t go away) legally and politically sustainable for the future.

To paraphrase President Kennedy: Don’t ask when it comes to data, what Europe can do for you – ask what you can do for Europe.  Data exchanges won’t go away, so let’s make them sustainable and reciprocal.

Discussion follows

MARY ELLEN:

But Europe uses wiretaps. We’re not allowed to. It’s a fair point – we talk about fundamental rights, but what does it mean in practice and what’s the enforceability of rights?

For example, when you go to the EU and stay in a hotel, your information will likely be transferred to the police.

This is a worthwhile discussion and CFP is a really important dialogue – about transparency. The US leads on transparency, but what we don’t do is privacy assessments. There’s a lot to learn on both sides.

JAN:

We’re working on our own data analysing systems; this is a debate we’re having on both sides of Atlantic at the same time.

We’d be stupid if we just let some Europeans do the work on privacy and civil liberties – we need to do it on both sides. The Europeans are also going in the direction of measures that the US took up after 9/11.

But there’s a difference between the classical law-enforcement information-sharing agreements based on using information and tracking suspects on the one hand, and a new form of electronic investigation on the other. We need to analyse the way someone is using blanket retention data. Police have always done profiling as part of an investigation, but there’s a huge difference between looking face to face at a police officer while being examined, with the chance to ask for information and redress, and, on the other side, being analysed without knowing it, by a black-box computerised system, with a policeman taking the result afterwards and judging me on that basis.

Such foreseeability is a fundamental principle of democracy and legitimate government.

EDWARD:

Our fundamental rights are at stake here, not primarily our data protection rights.

It’s essential to get the ICCPR into this debate. A data transfer will be used that will impact people’s fundamental rights – if you get a plane or transfer money, that will subject you to a variety of other consequences.

So far, the US side is only willing to say “we won’t make decisions on the basis of this data”. That’s not sufficient. We need to know what decisions, to what degree, are going to be made on the basis of this data.

FRANK SCHMIEDEL:

Information-sharing is essential to fight crime, but we must not forget that citizens’ rights must be protected when it concerns the gathering of all information about Europeans for security purposes.

Now there are negotiations about data protection, trying to overcome gaps and discrepancies, and to secure a solid standard. It’s crucial we can agree not only to share, but also to protect such data. So comprehensive data protection agreements with legally enforceable privacy standards will be essential for any sharing.

Since the end of March, our negotiations have included comprehensive talks – not easy, given the constitutional and legal differences. We can build on this to achieve a real step forward in protecting personal data.

MARY ELLEN:

It’s the US’s intent to have an ‘umbrella agreement’ to cover both case and policy sharing. The motto of this conference is “The Future is Now”, and it is.

AUDIENCE QUESTIONS:

Q. ONCE DATA IS TRANSFERRED FROM THE EU TO THE US ARE THERE SUFFICIENT PROTECTIONS AGAINST THE ONWARD TRANSFER OF THAT DATA?

Mary Ellen:

I don’t think that’s an appropriately placed question. The US has specific authorities on how it collects and uses this information – it’s a myth in the EU that there’s one big US database. The fear is unfounded.

We don’t get or give data to Google. Under DHS systems policy all protections under the system of records are taken seriously, and SORNs and PIAs are indeed accurate.

Jan:

In Europe there’s a very strict purpose limitation on the use of personal data: sometimes we have even forbidden links. There needs to be a separation between intelligence services and the police, which has already been overcome at the European level.

The concern though is, who decides if the data has to be disclosed to institutions or states.

I think Europeans would also insist on the originator having an opportunity to co-decide if his or her data should be onward-transferred or not. We have explicit data protection in the private sector, so in the case of onward transfer it will be very important to define what is adequate, and define the principles governing any onwards transfer to third states.

If we don’t do this, our principles will erode, because there are other frameworks and systems existing in the world, and if we open them all up together it will be problematic.

Frank:

The Privacy Directive 95/46 had an adequacy requirement, which meant you could transfer data from Europe to other countries if the destination is deemed adequate, preventing circumvention of the EU rules by putting the data in third countries. There is still a concern as to which countries our data goes to, and how protected it is there.

Jan:

The existence of the Patriot Act has always been a challenge to us. Such an Act would be impossible in the EU constitutional framework.

Mary Ellen:

It’s a fair point, and we will try to be more transparent. I agree that the originating country should know when there is an onward transfer of data.

Ed:

In the US, transparency, if any, is provided by the provisions of the Privacy Act for ‘accounting of disclosures’, but DHS have exempted virtually all of the Privacy Act lawsuits. Even for US citizens DHS is exempted from the provisions of the Privacy Act for data disclosures. US companies will only change when they’re forced to, when they see themselves or their competitors hit by painful sanctions from EU data protection authorities.

Jan:

The question arises of whether there is a directive which would be applied to situations where a US company is acting in the EU. There are provisions in this directive, and it’s often questioned whether all those provisions are fulfilled.

We just heard Commissioner Viviane Reding say that companies operating in the EU have to comply with EU rules, and that EU citizens when taking services in the EU can trust that EU data protection will be applied. So it’s a question of how to really enforce and comply with our own legislation. At the moment we’re debating a question of applicable law: many people in the EU think that if we don’t want a race to the bottom among big companies, we need applicable law.

Ed:

On papersplease.org we analyse a draft of the PNR agreement letter referenced earlier, sent by a number of organisations, including the Center for Digital Democracy.

We have asked the data protection authorities for my PNR, but only one out of three provided it, and then only when they were leaned on by the data protection authorities.

Jan:

It’s about a public data process, and how to share. We need a stronger data protection common framework on both sides of the Atlantic, since digitalisation forces us to have common rules. The internet will not stop at the EU’s borders – we are all fans of the Arab revolution and the open internet, and we won’t do censorship of data protection rules. It’s very important to build up alliances across the Atlantic – we need common data protection and activist groups.

Christina Zaba

Live webcast of this session

 

 

Cameron Kerry: “We cannot let this moment pass”

June 15th, 2011

MORNING, TUESDAY 14TH JUNE

First up for Conference’s keynote speech, Cameron Kerry, General Counsel in the Department of Commerce, didn’t mince his words.

“There’s one risk and driver across all scenarios: trust,” he said. “A safe and trusted computing environment will promote broadband growth; privacy breaches will undermine it.

“We are closer to a darker scenario. The frequency, scale, and depth of recent cybersecurity breaches, and the spread of measures which interrupt the free flow of information, all serve to undermine our capacity.

“As we move to a cloud computing world, the principal barrier to adoption is confidence – confidence in security, and in privacy.

“It’s like the situation we once had with the early adoption of e-commerce, when the key barrier was concern about security in credit card transactions.

“Confidence in encryption overcame that problem. But today, if credit card companies can’t keep data secure, if the gatekeepers to the system can’t keep it from being penetrated, then e-commerce, estimated today at $10 trillion, will never reach the predicted $24 trillion by 2020.

“Our response must begin yesterday. What course the internet takes, what scenarios we’re in, are in the hands of all stakeholders. We cannot let this moment pass.”

Christina Zaba

 

“They touched the wrong girl’s hoo-ha”: Susie Castillo wins EPIC award

June 15th, 2011

Former Miss USA Susie Castillo became a folk hero across America when she took on the TSA’s intrusive “pat-down” policy with a personal campaign and a petition to Congress.

On 13 June the Electronic Privacy Information Center (EPIC) presented her with the EPIC Citizen Activist Award for her work, at their Champion of Freedom awards dinner at the Fairfax Hotel on Massachusetts Ave on the eve of CFP 2011.

“The very most important thing is our constitutional rights and freedoms,” she told hundreds of guests assembled at the dinner. “People have died to protect the freedoms we give up every time we go through a body scanner.

“It’s not just about our rights, but our health. How can the TSA say that it’s 100% safe when they don’t know?

“My video about this issue went viral unexpectedly, but I was glad to take the campaign forward. I felt a responsibility to do the right thing – that’s what my mom taught me to do, and I encourage others to do the same.”

Introducing Susie and praising her achievements, Ralph Nader noted: “We are giving awards to voices in the wilderness. Congress has not done enough to oppose the dragnet enforcement process – a process which is both ineffective and inefficient. We shouldn’t have to rely on people like Susie to remind us of constitutional due process.”

Congressmen Jason Chaffetz and Rush Holt were also honored for their dedicated work for privacy and freedom, and the Wall Street Journal for its series “What They Know” on Internet marketers and profiling – “the most extensive reporting on privacy in the history of American journalism”.

EPIC exists to focus public attention on emerging civil liberties, privacy, First Amendment issues and other privacy matters, and has had a close relationship with CFP from the start.

Democracy Now reports EPIC’s lawsuit asking a federal judge to stop the TSA’s full body scanning procedures with an immediate injunction.