Saturday, March 28, 2009

Fordham, filtering

Intermediaries as Legal Filters

Moderator: Rob Frieden, Pioneers Chair and Professor of Telecommunications and Law, Penn State University

Notice and takedown is reactive; what about proactive options, like filtering? This relates to the issue of net neutrality. Proactive filtering may serve intermediaries’ commercial interests: deep packet inspection allows price discrimination and quality-of-service discrimination. Other examples of filtering: Eudora rated the provocativeness of an email on a red-pepper scale. His wife, who also works at Penn State, stopped getting his emails; he had a link to his blog in his sig, and Penn State’s filtering software automatically disallowed terms like “blogware.” Once filters are available, what does that do to the notice and takedown/safe harbor regime?
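
A minimal sketch, with a hypothetical blocklist, of the kind of naive keyword filter this anecdote describes: bare substring matching over the message text, with no notion of context, so a harmless word in a signature is enough to get the whole email dropped.

```python
# Hypothetical blocklist; real filters are larger but often just as naive.
BLOCKED_TERMS = {"blogware", "casino", "viagra"}

def is_blocked(message: str) -> bool:
    """Flag the message if any blocklisted term appears anywhere in it."""
    body = message.lower()
    return any(term in body for term in BLOCKED_TERMS)

# A blog link in a signature line is enough to kill the whole email:
sig = "Rob Frieden | my posts: http://example.edu/blog (powered by blogware)"
print(is_blocked(sig))  # True: the message is silently dropped
```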

Dr. Ian Brown, Senior Research Fellow, Oxford Internet Institute & Honorary Senior Lecturer, University College London

Three examples of floundering towards information law: copyright infringement, gossip, and child abuse.

The E-Commerce Directive: provisions on the liability of intermediaries, similar to the DMCA. Telecom lawyers in the 90s were very good lobbyists. Art. 12 “mere conduit”; Art. 13 “caching”; Art. 14 “hosting”; Art. 15 no general obligation to monitor. The only mandatory exception to the reproduction right for temporary copies is for intermediary transmission.

Movement to a 3-strikes regime under which users lose internet service after three accusations of infringement. Driven by France. Constitutional objections: disproportionate; infringes rights to privacy, expression, association, education, commerce, and civic engagement under the European Charter of Fundamental Rights. Also procedural problems: transparency, public examination of evidence, impartiality, etc. Europe has been slower to enforce constitutional norms than the US, though. It might be 20 years before we get a ruling on that—a case on the UK’s DNA database, filed in 1991, was only recently decided, finding an infringement of the rights of people who hadn’t been convicted.

Another problem: the Data Protection Directive, driven by Germany, which has obvious historical reasons for feeling very strongly about protecting personal information. The Directive applies to companies as well as governments, and even to individuals, except for a carveout for exclusively personal/domestic activities like keeping an address book. The only real case out of this directive, Lindqvist, involved gossip on a website about a church member who had a broken leg, which counts as sensitive medical information. The Data Protection authority took her to court, and posting on a website was ruled not to be a personal/domestic use. How far can we regulate individuals as well as data controllers? Individuals are within regulators’ sights.

Blocking child abuse images. The British Telecom system blocks access to pages on a secret blacklist; the government has imposed similar blocking on other retail ISPs in various ways. Wikipedia had a problem because one article showed an album cover deemed to be a sexual image of a child. UK users were blocked from accessing the article. The blocking was implemented by routing all UK traffic through a couple of proxy addresses, so Wikipedia thought that everyone in England was coming from those few IP addresses; and because Wikipedia bans vandals’ IP addresses, English users were unable to edit Wikipedia for several days. Now blacklists are being considered by the EU, including topics like sites promoting terrorism and discussing bomb-making, with little consideration of constitutional issues.
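
A minimal sketch, with hypothetical addresses, of why the incident unfolded that way: when a national filter proxies all traffic, the origin site sees one egress IP for millions of users, so an IP-based vandal ban sweeps in everyone behind the proxy.

```python
# Hypothetical addresses drawn from the documentation ranges (RFC 5737).
PROXY_EGRESS_IP = "192.0.2.1"   # the national filter's outbound address
banned_ips = set()              # the site's anti-vandalism banlist

def source_ip(user_ip: str, behind_filter: bool) -> str:
    """The address the origin site actually sees for a given user."""
    return PROXY_EGRESS_IP if behind_filter else user_ip

def ban_vandal(seen_ip: str) -> None:
    banned_ips.add(seen_ip)

def may_edit(seen_ip: str) -> bool:
    return seen_ip not in banned_ips

# One vandal behind the filter gets banned by IP...
ban_vandal(source_ip("198.51.100.7", behind_filter=True))
# ...and every other user behind the same proxy is blocked from editing too.
print(may_edit(source_ip("198.51.100.8", behind_filter=True)))  # False
```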

What role can tech play in protecting fundamental human/constitutional rights? Were the recording industry attacks on P2P systems an unexpected boon for free speech, by spurring a boom in research and experimentation with systems that are decentralized and harder to control?

Can we design for privacy? Data minimization: is your data really necessary? Limit personal data collection, storage, access and usage. Anathema in the US, but a key principle in the EU. Encrypt data when it’s in the cloud, decrypt it only under user control; protect it against companies and governments.
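
A minimal sketch of the “decrypt only under user control” idea, using the third-party Python cryptography package; key management (backup, sharing, revocation) is deliberately elided.

```python
from cryptography.fernet import Fernet  # pip install cryptography

key = Fernet.generate_key()        # generated and kept on the user's device
f = Fernet(key)

plaintext = b"sensitive personal data"
ciphertext = f.encrypt(plaintext)  # only this ever goes to the provider

# The cloud stores ciphertext it cannot read; only the key holder decrypts.
assert f.decrypt(ciphertext) == plaintext
```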

Final thoughts: if code embeds values, we need to think about embedding constitutional values in global computation and communication systems. Why are cypherpunks so disgusted with regulation and so committed to designing disruptive tech? Disgust with corruption in the broad sense: lobbies driving legislation.

Wendy Gordon, Visiting Professor of Law, Fordham Law School; Philip S. Beck Professor of Law & Paul J. Liacos Scholar in Law, Boston University School of Law

Her concern is methodology.

Grokster seemed to threaten the possibility of a thriving filterless internet. Tim Wu said: when the Court pinned liability on intent, it was ducking the hard questions of how important substantial noninfringing use is. Does it ever make sense to use an intent test? Her own view is that copyright is 90% about increasing the store of public knowledge, and 10% about just claims of desert. Morality’s role is often overstated, but it has a place. Is there a defense of an intent orientation?

There are certain bad acts that most of us would hesitate to undertake even if they had long-term good consequences: e.g., the trolley problem. Maybe what Grokster was encouraging really was terrible. Kids who downloaded may have been engaged in moral wrongs. But the underlying infringement by the kids is judged on a strict liability standard, so Gordon doesn’t think it should be ranked with things that ought to be prohibited at all costs.

Also, it’s highly probable that claims of property in music cause harm. The standard: how would people do in the absence of the property? How would kids have fared but for this music that surrounds them, coupled with a prohibition on copying? Many are worse off than if they’d never heard the music—if someone sends music out into the world and it affects others, people need some liberty to reuse it, even sometimes with exact copies. (She gave me a shout-out!) Exact repetition is part of every religion, every doctrine, the Pledge of Allegiance, every ritual. Copyright owners are not merely conferring a benefit that they can withdraw at will; others act in reliance when they integrate music into their psyches. Thus we can’t condemn uploading/downloading as a blanket matter.

Without that moral core, Grokster’s behavior doesn’t seem so evil that it excuses us from taking long-term consequences into account.

Dawn C. Nunziato, Associate Professor of Law, George Washington University Law School

Her topic: how broadband service providers are and should be regulated when they discriminate against legal content and applications. Backdrop: the FCC’s August 2008 ruling that Comcast was unlawfully discriminating against P2P filesharing protocols. Communications providers as common carriers: common carriage is designed to facilitate transportation/communication without discrimination—the postal service, telegraph service, etc. are not permitted to engage in acts of discrimination. That’s how narrowband internet was initially regulated in the 1990s. How should cable broadband be regulated, then? If it’s a common carrier, discrimination would not be allowed.

In 2002, the FCC decided that cable broadband was not a common carrier. What regulation, then? It was classified as an “information service,” which means minimal if any regulation. Brand X: the Supreme Court upheld this. DSL and other broadband providers said: what about us? The FCC said in 2005: you’re all exempt from common carriage regulation. At the same time, the FCC issued broadband policy statements, according to which internet users should enjoy the freedom to access their choice of legal content and the freedom to run the applications of their choice. These policy documents do not establish rules and are not enforceable, yet the FCC pledged to act if they were violated. And these freedoms are subject to broadband providers’ discretion to engage in reasonable network management practices.

Nunziato has documented discrimination against legal content—allegations of Comcast censoring political email; Verizon prevented NARAL from sending text messages to willing subscribers; Comcast blocked P2P filesharing using deep packet inspection and then allegedly lied about it. The FCC characterized this as like opening mail to see whether Comcast wanted to deliver it, and argued that Comcast was doing this to protect its own interests in, e.g., video on demand.

Comcast complains: but we’ve been deregulated! FCC: you’re not subject to common carriage regulation, but we nonetheless have ancillary jurisdiction to regulate you, as Brand X said. Comcast: ancillary to what? FCC: to the 1996 Telecom Act, which set forth an internet policy of openness, and to other areas.

Where are we now? A mess, caused by FCC’s decision to exempt broadband providers from common carriage regulation. These are the pipelines for the internet, and they have no business interfering with free speech.

Frieden: points out that FCC regulated cable before it had statutory authority; ancillary because it had jurisdiction over broadcast TV and cable TV had the potential to affect broadcast.

James Grimmelmann, Associate Professor of Law, New York Law School

Search engine amplification often worsens lots of internet problems, but search engines shouldn’t be targets for solutions. People have always been jerks; now, with large-scale anonymity, they can be jerks on an unprecedented scale. Site operators have the technical power to mask or muzzle the jerks. But website operators aren’t the only intermediaries in the picture. Web pages aren’t megaphones blasted at horrified recipients: mostly people choose to visit, usually by searching. Search amplifies hate.

If we gave search engines more duties, we might hide harassment without unmasking speakers or shutting down webhosts. But: good search engines help people find the info they want, not the info other people want them to find. Ability to find info is essential to our ability to make our own decisions; it’s also economically important. Search is too important to muck up, so we need to be careful in regulating.

Good search favors active users, and so does good information policy. If you make search less useful, users can’t as easily lead self-directed lives. Crippling search gives content creators and third parties unwarranted power over search users—that is, over everyone who uses the internet. Notice and takedown would make some information unfindable, removing it from the commons. It would also be a slippery slope to making search engines responsible for whatever speech is online. We need internet-wide, general-purpose search engines—they have been key to the last decade. Fundamentally, search engines don’t want to mislead their users with half-truths and libel. (Unless, I’d say, the users want to find the half-truths and libel—compare the argument that eBay doesn’t like counterfeiting; sure it doesn’t, at the point at which counterfeiting starts to interfere with eBay’s own credibility and profitability, but users who are thrilled with knockoffs don’t cause any problem for eBay.)

Search engines also don’t have a relationship with site owners that allows counternotification as a ready response. Regulating search would be a pretense of allowing speech—a “free speech zone” where no one finds the information who doesn’t already know about it. But speech is also about the audience—if people want to find the information, they should be able to do so. Tampering with search is second-best: if we don’t like the content, target the site, which will have better information about the quality of the content and its value.

Jerry Lewis, Chief Privacy Officer at Comcast: agrees with Nunziato that the law is a mess. There are real underlying technical reasons to engage in network management—congestion issues. The old management technique was content-agnostic but focused on the applications generating congestion (P2P services). The new technique is content- and protocol-agnostic: it looks at heavy users and manages them directly. You can always claim free speech interests as a basis for regulation; he’s not convinced they’re important here.
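
A minimal sketch, with hypothetical thresholds, of the protocol-agnostic approach described: rather than inspecting what protocol a flow uses, rank subscribers by recent usage during congestion and deprioritize only the heaviest. This illustrates the mechanism, not Comcast’s actual implementation.

```python
CONGESTED = True      # hypothetical: set by a separate congestion detector
HEAVY_USER_GB = 5.0   # hypothetical per-window usage threshold

recent_usage_gb = {"subscriber_a": 0.2, "subscriber_b": 7.4, "subscriber_c": 1.1}

def traffic_priority(subscriber: str) -> str:
    """Deprioritize (not block) heavy users, and only while congested."""
    if CONGESTED and recent_usage_gb.get(subscriber, 0.0) > HEAVY_USER_GB:
        return "best-effort"   # queued behind normal-priority traffic
    return "normal"

for s in sorted(recent_usage_gb):
    print(s, traffic_priority(s))
```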

Comcast didn’t censor political speech; it has a spam feature, and a number of customers hit the “this is spam” button on these emails. Once Comcast looked at the emails and saw they were political, they were whitelisted again in relatively short order. Comcast blocks half a billion spam messages a day, and there’s a process to fix the inevitable errors.
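
A minimal sketch, with a hypothetical threshold and addresses, of report-driven spam blocking of the sort described: enough “this is spam” clicks gets a sender blocked, and manual review can whitelist the sender again, overriding the report count.

```python
SPAM_REPORT_THRESHOLD = 100   # hypothetical

spam_reports = {}   # sender -> count of "this is spam" clicks
whitelist = set()   # senders restored after human review

def report_spam(sender: str) -> None:
    spam_reports[sender] = spam_reports.get(sender, 0) + 1

def should_deliver(sender: str) -> bool:
    if sender in whitelist:   # manual review overrides the report count
        return True
    return spam_reports.get(sender, 0) < SPAM_REPORT_THRESHOLD

# Many recipients flag a political mailing, and delivery stops...
for _ in range(150):
    report_spam("news@example-campaign.org")
print(should_deliver("news@example-campaign.org"))  # False

# ...until review finds it legitimate and whitelists the sender.
whitelist.add("news@example-campaign.org")
print(should_deliver("news@example-campaign.org"))  # True
```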

Scafidi: Gordon’s argument sounds like anyone who has created something and declines to release it is acting immorally. Or maybe it’s immoral to create a bad story that horrifies people, or a jingle that sticks in people’s heads. Under this rationale, is there any justification for copyright? You may benefit enough from hearing something that even a withheld copy isn’t immoral. Isn’t it better to have loved and lost than never to have loved at all?

Gordon: What she’s talking about is part of an elaborate mental conception of the moral core of copyright. In none of this is she questioning the instrumentalist/consequentialist structure of copyright—Congress can further progress in science through copyright. But she is interested in what the moral minimum of copyright must be. One branch: is downloading a violation of the moral core of copyright sufficiently serious to avoid consequentialist weighing over whether Grokster should be shut down?

Also, she is not arguing for a slippery slope. If you take Lockean “enough and as good” seriously, that wouldn’t erode all copyright. The typical commercial copier indeed receives a net benefit even after paying licensing fees. Scafidi overstates the dangers of the “enough and as good” condition—the condition that the property claimant not do harm to others, and that if he does harm, he does not have an absolute claim to the property.

Nissenbaum for Brown: Privacy right internationally needs to be stated at a high enough level of generality that a culturally specific definition isn’t imposed on everyone. For Grimmelmann: search engines are important, yes, but how far are you willing to go? Can we regulate search engines at all?

Grimmelmann: Purely comparative point—we should go after the site that hosts the content first and preferably.

Brown: Then what do you do if the site is outside your jurisdiction but the search engine isn’t?

Grimmelmann: That’s Yang’s point. As the regulator accountable to my own citizens, I go after the search engine. But for the good of the internet overall, we need harmonization and a set of international standards.

Goldman: an example of internet filtering and the move to increasing obligations on intermediaries: cutting off online gambling by going after the payment processors. What do we think of that?

Brown: Regulation is nothing if it can’t be effective. If the US is not going to block access to specific sites, as the UK/EU are trying to do, then you need an alternative. Note that US trade partners are going after the US over these moves!

Grimmelmann: He thinks this is a procedural issue. Are intermediaries subject to potentially inconsistent and unpredictable litigation? But if you do it through the government, you get a secret blacklist, and the process may not be sufficiently transparent and accountable. Assuming we want to ban access to X, how do we go about identifying X and blocking it in an effective and procedurally fair way?

Q: Court records—used to be friction-heavy, so openness had limited costs, but also limited benefits. Now that dissemination is frictionless, does that show that openness was always a mistake?

Grimmelmann: The courts have a responsibility to make privacy decisions; practical obscurity can no longer perform a protective function, so courts have to strike new balances. Most things ought to be immediately publicly available, sensitive information can be request-only, and truly sensitive material can be sealed.

Brown: Grimmelmann’s suggestion is like the end-to-end principle—decisions are made at the endpoints, not by the intermediaries.

Q: Is trade secrecy a barrier to accountability, for example with filtering/search algorithm decisions?

Brown: In the EU, courts would never allow IP to override human rights concerns.
