Today brings the ACLU reply in the legal case of Edelman v. N2H2. This is a legal action about being able to conduct research concerning censorware, free from the chilling effect of being sued. I'm following the case, for obvious reasons.
My (nonlawyer) guess is that the core of the reply is this:
N2H2 has also made public statements regarding its intention to assert all legal rights against Edelman if he engages in his proposed research. In N2H2's latest 10-Q quarterly filing with the Securities and Exchange Commission, N2H2 stated:
We intend to defend the validity of our license agreement and to enforce the provisions of this agreement to protect our proprietary rights. We also intend to assert all of our legal rights against Edelman if he engages in future activity that violates the agreement or our proprietary rights.
N2H2 Form 10-Q (filed Aug. 13, 2002), at 22 (emphasis added).4 Further, in the Wall Street Journal, N2H2 spokesman David Burt said the company would defend its license and intellectual property rights:
We think that our rights to protect our intellectual property and our software licensing agreements are valid. And we do intend to defend them.
Suit Seeks Exemption to Digital Copyrights, WALL ST. J., July 26, 2002, attached at Exhibit 2.
Ed Felten has two posts which I think, taken in contrast, make an unexpected point - Wishful Thinking, roughly regarding universal copy-control in hardware, and "leeway", about making laws function effectively.
That is, in a way, the spokesman for Hollings, regarding controls, is more correct than is being granted:
Andy Davis, a spokesman for Mr. Hollings, said the technology-minded critics of the bill were "missing the thrust of the senator's argument," which is that there is need for more protection of copyright works if online content and broadband Internet access are to flourish
This is a "politics" reply, which focuses on the short, snappy soundbite, i.e., what-about-the-children, it's-against-theft, motherhood-and-apple-pie, etc. But that shouldn't blind us to the existence of an argument underneath it all.
But I think this is being mistaken for a killer argument that any mandatory copy-control proposal must fail, because it must blindly be applied in the most extreme and literal sense. That's appealing to the technical mindset, because one can treat all general-purpose computers as equivalent at some abstract level. But it's a much weaker argument in practice.
What the spokesman doesn't want to say, because it would be horrible press-speak, is the following: "Look, this isn't about talking dog collars. It's about locking down what 99.9% of the population uses for business or entertainment. The hard problem is coming up with a solution that works for all of Hollywood and Intel and Microsoft. The practical difficulty is there, not in dog collars."
I don't think it's necessarily correct to believe that this is an unsolvable problem, on the postulate that there can be no leeway in the required control. For their purposes, there can be - their goal is working the difference between theory and practice.
My anticensorware investigations could hardly say it better
Worth reading, on this topic:
I commented about this in an earlier blog entry. To get an idea of what sort of onerous terms are being put in licenses, take a look at this one, just to use Websense's form for looking up sites on Websense's blacklist:
By using the Websense Site Lookup, you warrant that you are a Websense customer or you are evaluating Websense Enterprise software as a prospective Subscriber. You acknowledge and agree that the information contained in the Websense Site Lookup is to be accessed or used strictly for your own internal use as a Subscriber of Websense Enterprise or in evaluating the performance of Websense Enterprise to facilitate your purchase decision. You may not transfer or assign your right to access or use the information to any other party for any reason. Any other access or use of the information contained in the Websense Site Lookup is strictly prohibited. Websense may terminate your right to access or use the information contained in the Websense Site lookup at any time and for any reason.
Someone needed to know today where my N2H2 censorware report BESS's Secret LOOPHOLE (censorware vs. privacy & anonymity) and similar material had been a factor in the CIPA (Federal censorware law) court decision.
So I dug out exact quotes from the expert witness reports.
In addition, a variety of services on the Internet provide proxy servers, translation servers, and other methods by which a user might retrieve Internet content via a third party rather than directly from the content provider. The use of such devices may stem from an interest in privacy, since proxy servers can prevent web server operators from gathering a variety of facts about a web user. Proxy servers may also provide other helpful services, such as translation of web content into other languages, addition of links to sources of related content elsewhere on the web, removal of unwanted or potentially-hazardous software code otherwise present in some web pages, or removal of advertisements. However, such servers also provide a possible means of circumventing the restrictions of popular blocking programs. Thus, it has been documented that blocking programs seek to prevent access to these proxy servers even when such blocking is not requested by the administrator of a blocking program and even when such sites are not within the specific descriptions of categories requested for blocking. 27 My testing found multiple examples of blocking of these sites, including translation service tranexp.com and privacy service idzap.com.
27 "BESS's Secret LOOPHOLE." <http://sethf.com/anticensorware/bess/loophole.php>
Similarly at least one of the programs tested blocked each of privacy service anonymizer.com, the web-based translation service tranexp.com, and online dictionary voycabulary.com. These sites (and the other web-based services referenced in Appendices A and B to my first report) all offer a large amount of valuable content, and research of others indicates that many other similar web-based services are also restricted by blocking software. 10
10 "BESS's Secret LOOPHOLE." <http://www.sethf.com/anticensorware/bess/loophole.php>
Google itself has a handy list of country Googles. I didn't find any of them banning stormfront.org besides the already-noted France and Germany, and the newly-noted Sweden.
If anyone wants to play too, here's the Googlely list, in text form:
One so-far unremarked aspect of the Google country-based site-bans, is that they apply to image searching too.
Compare an image search for stormfront.org in Germany
(two results, from www.suffolk.edu)
With the same image search for stormfront.org in the US
(many more results, from the stormfront.org site)
Directory searches are similarly affected.
This is yet more confirmation that the restrictions are implemented as a post-processing step using very simple patterns of prohibited results.
Following up on yesterday's Googlemania, I ran tests for bans of similar Nazi/racist sites on Google Italy (http://www.google.it). I didn't find any sites excluded there. I also tried searching "stormfront" and "hitler" on German Altavista (http://de.altavista.com), and didn't see a difference.
In a fascinating report:
"Localized Google search result exclusions - Statement of issues and call for data"
authors Jonathan Zittrain and Benjamin Edelman examine sites excluded by Google from localized country-specific searching. Regarding search strategy, they suggest:
Since the filtering documented above may be but a part of such practices, we hope to augment our public database of such examples by seeding new searches with sites already known to be restricted, perhaps because someone simply searched on a known site and was surprised to find no results.
I found that cross-correlation between, for example, US and German results for sites related to the banned sites turns out to be extremely productive.
With just a little programming, I found banned in Germany:
(these are in addition to what had already been found)
Update - and for Google France, other sites, and directories such as:
Cross-correlation of results from sites which link to the banned sites also turns out to be fruitful, though it requires some care to weed out many old and repeated results.
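The core of the technique is just a set difference over the two countries' result lists. A minimal sketch (the URLs here are made-up sample data; real result lists come back paginated and noisy, but the idea is this):

```python
# Cross-correlate result lists from two country versions of Google:
# any URL visible in the US results but missing from the German
# results is a candidate for the German blacklist. The lists below
# are hypothetical sample data, not real search output.

def candidate_bans(us_results, de_results):
    """Return URLs present in the US results but absent in Germany."""
    return sorted(set(us_results) - set(de_results))

us = [
    "http://www.example-banned-site.org/",
    "http://www.harmless-site.com/",
    "http://www.another-candidate.net/",
]
de = [
    "http://www.harmless-site.com/",
]

print(candidate_bans(us, de))
```

In practice the raw difference also contains stale and moved pages, which is exactly where the weeding-out work comes in.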
For example, such cross-correlation found, for Google France:
In a fascinating report:
"Localized Google search result exclusions - Statement of issues and call for data"
authors Jonathan Zittrain and Benjamin Edelman examine sites excluded by Google from localized country-specific searching. For methodology, they use:
A note on search criteria: The authors' searches use standard Google search syntax to request 1) pages on the specified web site (using the site:stormfront.org restriction), and 2) pages that lack a phrase of gibberish (using the exclusion syntax -asdfasdf), since some search term must be specified. Similar searches for other sites confirm that these search criteria provide a reliable estimate of the number of pages indexed by Google on a given web site.
This methodology has a notable flaw - it cannot find any blacklisted item which is less than domain-level. For example, one item blacklisted from Germany is the home page of the Holocaust denier Arthur R. Butz, at URL:
This can be seen by comparing the German search using "allinurl" syntax http://www.google.de/search?q=allinurl%3Apubweb.acns.nwu.edu%2F%7Eabutz%2F
Versus a similar US search using "allinurl" syntax
The German search will return nothing, while the US search finds the relevant pages.
However, this item cannot be found with the "site:" syntax. A "site:" search argument is treated by Google as a domain name, and "pubweb.acns.nwu.edu/~abutz/" is not a domain. Thus, "site:pubweb.acns.nwu.edu/~abutz/" will never match anything.
Moreover, comparing site:pubweb.acns.nwu.edu search results between Germany and the US will NOT display any numerical difference in results. This is because, as noted previously, the Google database seems to be identical for all countries. It is only the displayed search results which are affected.
Around 6,000 pages are indexed for pubweb.acns.nwu.edu. Since at most 100 search results can be displayed at a time, there will still be far more than 100 displayable results even when the Holocaust-denier pages are removed.
Of course, if someone tried to retrieve all 6,000 pages, at some point, a difference due to banned pages would be visible. But that's an impractical, or at least very involved, task.
Thus, "allinurl" searches, when used with care as to what they mean, are a much better methodology for searching for banned items.
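As an illustration, the paired country queries can be generated programmatically. This is my own sketch, nothing Google-specific beyond the public search-URL form; the percent-encoding of ':' and '/' matches the hand-built URLs above, though whether '~' gets escaped depends on the encoder:

```python
from urllib.parse import urlencode

def allinurl_search(google_host, path):
    """Build an 'allinurl:' search URL for a given Google country host."""
    return "http://" + google_host + "/search?" + urlencode({"q": "allinurl:" + path})

# The Butz home-page example, queried in Germany and in the US.
de_url = allinurl_search("www.google.de", "pubweb.acns.nwu.edu/~abutz/")
us_url = allinurl_search("www.google.com", "pubweb.acns.nwu.edu/~abutz/")
print(de_url)
print(us_url)
```

Fetching both URLs and comparing the result counts is then the same comparison done by hand above.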
Again, it's important to note that the separate components can appear anywhere in the URL, so "allinurl:stormfront.org" means "stormfront" and "org" in the URL, not just the string "stormfront.org" as might be naively thought.
Update Oct 26:
The "info:" Google search operator is a good way to ask yes/no questions, which works for domains, directories, and pages.
Compare the German search using "info:" operator http://www.google.de/search?hl=en&q=info%3Apubweb.acns.nwu.edu%2F%7Eabutz%2F
Versus a similar US search using "info:" operator
Again, the German search will return nothing, while the US search finds the specific page. However, keep in mind it's possible for both "info:" searches to return nothing, depending on the vagaries of the database. That is, a not-found result in another country's search, combined with a found result in the US search, is definitive evidence. But a not-found result in both the other country's search and the US search may simply indicate the particular URL is not indexed.
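Spelled out as a decision rule (my restatement of the paragraph above), the two "info:" lookups combine like this:

```python
def interpret_info_lookups(found_in_us, found_in_country):
    """Interpret a pair of 'info:' results for the same URL.

    found_in_us: True if the US Google returned the page.
    found_in_country: True if the country-specific Google returned it.
    """
    if found_in_us and not found_in_country:
        return "definitive: banned in that country"
    if found_in_us and found_in_country:
        return "not banned"
    # If even the US search finds nothing, the URL may simply not
    # be indexed, so the country result proves nothing either way.
    return "inconclusive: URL may not be indexed"

print(interpret_info_lookups(True, False))  # definitive: banned in that country
```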
Note searches for the German page caches ("cache:") DO work, even with banned sites (yet another proof that the database is identical and that the results ban is a post-processing step).
In a fascinating report:
"Localized Google search result exclusions - Statement of issues and call for data"
authors Jonathan Zittrain and Benjamin Edelman examine sites excluded by Google from localized country-specific searching. In discussing results, they conjecture:
The implication of these results -- confirmed in our subsequent searches on google.com versus google.fr and .de for the terms at issue -- is that the French and German versions of Google simply omit search results from the sites excluded from their respective versions of Google.
This implication can be refined and clearly demonstrated by observation of more sophisticated searching. The following example uses the "allinurl" syntax of Google, which searches for URLs which have the given components (note the separate components can appear anywhere in the URL, so "allinurl:stormfront.org" means "stormfront" and "org" in the URL, not just the string "stormfront.org" as might be naively thought).
Consider the following US search:
This returned: Results 1 - 25 of about 1,670.
Now compare with the German counterpart:
This returned: Results 1 - 9 of about 1,670.
Immediate observation: The rightmost (total) number is identical. So identical results are in the Google database. It's simply not displaying them. How is it determining which domain results to display?
Note which "stormfront.org" site URLs are visible on the German page:
What do these all have in common?
They all have a port number after the host name.
The exclusion pattern obviously isn't matching the :number part of the URL.
It's matching a pattern of "*.stormfront.org/", as in the following, which are displayed in the US search but not the German search.
Thus, the restrictions appear to be implemented as a post-processing step using very simple patterns of prohibited results.
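A toy model of such a filter reproduces the port-number loophole exactly. This is my own reconstruction of the apparent behavior, using a shell-style wildcard to stand in for whatever pattern language Google actually uses:

```python
import fnmatch

# Hypothetical exclusion pattern, per the "*.stormfront.org/" form
# observed above, applied to result URLs after the (identical)
# database has been queried.
BANNED_PATTERNS = ["http://*.stormfront.org/*"]

def visible(url):
    """Would a naive post-processing filter display this result URL?"""
    return not any(fnmatch.fnmatchcase(url, p) for p in BANNED_PATTERNS)

# A plain result URL is suppressed:
print(visible("http://www.stormfront.org/whites.html"))    # False
# But a URL with an explicit port slips past, because ":8080"
# interrupts the ".stormfront.org/" part of the pattern:
print(visible("http://www.stormfront.org:8080/archive/"))  # True
```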
Update: See also my explanation "Google Censorship - How It Works"
In a fascinating report:
"Localized Google search result exclusions - Statement of issues and call for data"
authors Jonathan Zittrain and Benjamin Edelman examine sites excluded by Google from localized country-specific searching. In discussing the data, they state:
Many such sites seem to offer Neo-Nazi, white supremacy, or other content objectionable or illegal in France and Germany, though other affected sites are more difficult to cleanly categorize.
The purpose of this note is to point out that one reason for certain sites being affected is that they were formerly in such an objectionable category. Even though the domain has changed owners since then, they apparently remained blacklisted. For example, consider the site:
1488.com - "Chinese Legal Consultation Network"
However, years ago, this domain was apparently a Neo-Nazi site:
Look at the upper-left-hand corner, "The Swastika Homepage"
Then the domain went up for sale:
"This Domain - for sale"
Then it became the current Chinese site:
Similarly www.14words.com was once a White-supremacist site:
But it is now just an empty domain. That's why it comes up with nothing but a homepage for a hosting company.
The implication here is that the blacklist is not re-examined or updated with any particular care, if at all.
Update: See also my explanation "Google Censorship - How It Works".
The ideas of solving the copyright logjam from Ed Felten's proposal of compulsory licensing music, and Ernest Miller's riposte of compulsory licensing pornography, seem to me to touch on something subtle - statistical payments.
Consider the typical hassle with micropayments - every penny, perhaps even every micropenny, has to be shipped around and accounted for. That's true by definition. Nobody has figured out how to do this with any efficiency.
Perhaps we're looking at the problem the wrong way. Maybe a better way is to have some system that just comes out roughly even in an overall statistical sense. I think that's what's being groped at (pun unintended) with the above ideas.
I don't know how to do this myself (if I did, I'd be doing it ...). But my suggestion is, indeed, to not special-case it to music, or pornography, or any particular item. Rather, if we somehow lifted the constraint that every transaction must be paid for, and focused merely on ensuring that overall there was enough payment in the system somewhere to make it viable, could we have a workable protocol?
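One concrete shape this could take (my own toy illustration, not a protocol anyone has proposed in this form) is probabilistic payment: instead of accounting for every tenth of a cent, each use pays a whole cent with proportional probability, so only the aggregate comes out right:

```python
import random

random.seed(1)  # deterministic for the demonstration

PRICE = 0.001    # each use notionally costs a tenth of a cent
COIN = 0.01      # but actual payments happen only in whole cents
N = 100_000      # number of uses

# Each use pays one cent with probability PRICE/COIN. No individual
# transaction needs sub-cent bookkeeping, yet the expected payment
# per use is exactly PRICE.
paid = sum(COIN for _ in range(N) if random.random() < PRICE / COIN)

owed = PRICE * N
print(round(owed, 2))  # 100.0 dollars owed in aggregate
print(round(paid, 2))  # close to 100 - the law of large numbers
                       # does the accounting
```

The open question, of course, is everything else: preventing cheating on the coin-flips, aggregating across many sellers, and so on.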
Richard Stallman has a great essay about the issues surrounding "Trusted Computing"/Palladium/DRM, etc:
Notably, it touches on many issues that have been discussed in the past few weeks, such as general vs. restricted purpose: (my emphasis)
"Treacherous computing" is a more appropriate name, because the plan is designed to make sure your computer will systematically disobey you. In fact, it is designed to stop your computer from functioning as a general-purpose computer.
The language aspect:
The presentation made frequent use of other terms that we frequently associate with the context of security, such as "attack," "malicious code," "spoofing," as well as "trusted." None of them means what it normally means. "Attack" doesn't mean someone trying to hurt you, it means you trying to copy music. "Malicious code" means code installed by you to do what someone else doesn't want your machine to do. "Spoofing" doesn't mean someone fooling you, it means you fooling Palladium. And so on.
And all the wonderful things we'll be able to do with these new capabilities:
Making sharing impossible is bad enough, but it gets worse. There are plans to use the same facility for email and documents -- resulting in email that disappears in two weeks, or documents that can only be read on the computers in one company.
There are proposals already for U.S. laws that would require all computers to support treacherous computing, and to prohibit connecting old computers to the Internet. ... To oppose treacherous computing, we must join together and confront the situation as a collective choice.
This quote should be better known:
"I must say that, as a litigant, I should dread a lawsuit beyond almost anything short of sickness and death."
-- Judge Learned Hand, from "The Deficiencies of Trials to Reach the Heart of the Matter", in 3 "Lectures On Legal Topics" 89, 105 (1926), quoted in Fred R. Shapiro, "The Oxford Dictionary Of American Legal Quotations" 304 (1993).
In further replying to Ed Felten regarding Seth Schoen Makes a Doubleplusgood Point, and to my point about Restricted-Purpose Language - note the description of Newspeak matches the purpose of Digital-Rights-Management, on the intended expression side in addition to the suppression side.
That is, for Newspeak, we have:
Its vocabulary was so constructed as to give exact and often very subtle expression to every meaning that a Party member could properly wish to express, while excluding all other meanings and also the possibility of arriving at them by indirect methods.
Don't we have, exactly, for Digital-Rights-Management:
Its vocabulary was so constructed as to give exact and often very subtle expression to every business model that a content industry member could properly wish to sell access, while excluding all other access and also the possibility of arriving at them by indirect methods.
Remember, this was and will be heavily backed by law. Un-dmca codespeak doubleplusungood for the speaker.
This works, though:
Digital-Rights-Management ... the Newspeak of the 21st century.
I've been through N2H2's (a censorware company) motion to dismiss the case Edelman v. N2H2 (regarding the right to examine censorware blacklists). Let me write just one note today.
On page 3 of the motion, N2H2 starts off a section by writing:
"Edelman, who has in the past been paid to examine N2H2's system opposed to Internet filtering, ...."
Now, now, N2H2, you're not a charity. Moreover, it's not as if employees of N2H2 haven't been paid by parties seeking to impose mandatory censorware laws, such as: (emphasis added)
Courts: Library Filtering Of Internet Sites Found Unconstitutional
Filtering companies reacted defensively. "I think they are just holding filters up to too high a standard," said David Burt, director of public relations for N2H2. Although Burt's company is officially neutral on CIPA, he was a paid consultant to the Justice Department on defense strategy.
National Journal's Technology Daily, 05-31-2002
Note: Myself, I've never gotten paid for any of my anticensorware investigations, but I wish I had!
I believe that code is speech, and I believe that its status as speech is not just a legal technicality but a deep truth about the social value of code. What the code-regulators want is not so different from what the speech-regulators of 1984 wanted.
I agree with all of this!
But I'd say the comparison works well for exactly the opposite reasons as intended. Newspeak doesn't conjure up images of the idea that you can't make a language where certain concepts are unexpressible, therefore the Party was silly and stupid to even try. Rather, it conveys images that you can have an official system which is restrictive and oppressive and works to impoverish the vast majority of the population. That is, the comparison to Newspeak is not "it can't work", but "it can work, so beware".
Suppose we remove the literary flourishes from the description of Newspeak. That is, rather than proclaiming:
The purpose of Newspeak was ...to make all other modes of thought impossible. It was intended that when Newspeak had been adopted ... a heretical thought ... should be literally unthinkable ...
Let's have a more qualified, less hyperbolic:
The purpose of Newspeak was ...to make all other modes of thought cumbersome and onerous. It was intended that when Newspeak had been adopted ... a heretical thought ... should be difficult to articulate, easy to be derided and mocked, readily attacked when conveyed to others.
This lacks the punch and flourish of the stark statement of impossible. But it's a much more accurate description of what would likely be the case in practice.
And the idea of computer-language libraries in fact supports this point. What's one big problem with C and C++? The fact that there are so many different libraries which provide similar, but not quite identical, functionality. Merely having the ability to extend the language by new definitions is not adequate. There must also be a process to have those definitions accepted in "society" as common, otherwise the process of communication breaks down. Every time a program needs to be ported from one library to another, it's a proof that there's a big difference between having the ability to express something, and doing it in a fashion which can be effectively used by other people.
Let's also remember that the strictures of Newspeak weren't going to be enforced by its language merits. Rather, people who started creating unauthorized language-extensions were going to quickly become unpeople - rather like the idea of the DMCA, etc., that programmers or researchers who publish unauthorized expressions are going to be fined/jailed.
I've now gone through the Eldred oral argument transcript. If I were to put my finger on the key point, I think it's here:
JUSTICE BREYER: Why -- I mean, I think you have a point on this equity principle. I wonder, is there any review there? That is, suppose you have a statute, as this one arguably is, where 99.9 percent, many billions of dollars of benefits, are going to the existing holders of copyright on grounds of equity, and the effect of the statute in eliciting new works is near zero. I mean, that would seem -- where this equity idea is the camel and the production idea is the gnat, and is there any -- can we say something like that, or does Congress have total leeway in respect to --
GENERAL OLSON: Well, it --
JUSTICE BREYER: -- who they want to give the money to, basically?
When Breyer asks "... is there any review there", he's putting the Constitutional question in a nutshell - basically, is the Court going to say that Congress has gone too far? Several justices seem to think Congress has, but are they going to make that law? I'm uneasy with predicting that the conservatives are going to do something which is bad for Disney.
I'm heartened, though, to see these comments about "limited times":
JUSTICE SCALIA: General Olson, you say that the functional equivalent of an unlimited time would be a violation, but that's precisely the argument that's being made by petitioners here, that a limited time which is extendable is the functionable, functional equivalent of an unlimited time, a limited time that 10 years from now can be extended, and then extended again, and extended again. Why -- their argument is precisely that, a limited time doesn't mean anything unless it means, once you have established the limit for works that have been created under that limit, that's the end.
Great minds think alike? :-)
Just a general remark that's been on my mind recently: perhaps I'm stating the obvious, but Microsoft/Palladium/TCPA/"Trusted Computing", etc., etc., is not being developed because it's such a neat-o nifty-keen geeky concept to play around with, "Gee, whatever could we use this to do ...". It's not to add abstract capabilities to a computer system, as an experiment in the advancement of computation. It's for certain reasons, having to do with rights and permissions and control.
I think programmers tend to forget that practical aspect, in a rush to play with the concepts. But we should know better than anyone how wide a gulf there can be, between concept and implementation.
I have to disagree strongly with the idea that the best example for "The Fallacy of the Almost-General-Purpose Computer" is "The Fallacy of the Almost-General-Purpose Language". In fact, I'd say this example undercuts the point, and actually strongly argues the reverse.
I think we get too wrapped-up in the idea of "impossible", along the lines of the idea that Newspeak was to make it impossible to speak frankly about politics. Yes, right, nothing will ever make it "impossible". But my own experiences with Libertarianism thoroughly convince me that it's certainly common to have a political language that makes it very difficult to express certain thoughts. I can't remember how many times a Libertarian has told me that a concept is invalid, because the English sense of the word used to describe the concept doesn't have that sense in the specialized argot of Libertarianism. As in, for example "censorship means ...". The problem is that the word "censorship" has several different meanings in English, but only a single meaning in Liberspeak ("by the government"). Thus in so many conversations, it's a massive chore to convince the Libertarian that just because their definition is restrictive, doesn't make the concept invalid _per se_. And the Libertarian is likely to endlessly repeat some variant of the idea that because the word in Liberspeak has only a specific Libermeaning, other concepts are invalid. It's not utterly and completely beyond human achievement to explain the differences between Liberspeak and English. But wow, it's an amazingly difficult task, and requires a great deal of analytic and writing skill. It's the best example I've ever seen of how Newspeak would actually function in action.
There's a computer-language version of this too. After all, what's the whole point of the Software-As-Speech argument? Programming languages are designed to make it easy to express certain abstract concepts, where English or other languages don't work well. It's not impossible to express the concept in those same languages, but it is much harder and more error-prone. And then it follows that other concepts may be more difficult to express in the programming language. I remember a parody song where the punchline was "We're a string-processing in FORTRAN shop". Why is that considered hilarious? Because FORTRAN, as a language, is so ill-suited for string-processing that doing it is typically so difficult as to be a joke. Now, it's not impossible to do string-processing in FORTRAN - but it is certainly cumbersome and hard.
So in the abstract, what Hollywood wants might be impossible. But I'm starting to think the focus on the impossibility is leading to ignoring a much more frightening practicality.
One of people's first reactions to the increase in communications from the growth of the Internet has always been roughly: "Oh my God - there's too much information available - we've got to find some way to control it, some means whereby people who shouldn't have certain information can be prevented from being able to read it."
This reaction was not, as sometimes imagined, exclusive to governments concerned with political subversion. In fact, it was a very standard reaction by many people, regarding many types of information (sex, racism, etc.)
It's entirely logical, even expected, that copyright-based businesses should have exactly this reaction too, when faced with exchanges of information which they feel are threatening - namely, that which has not been paid-for.
Either you make a general-purpose computer that can do everything that every other computer can do; or you make a special-purpose device that can do only an infinitesimally small fraction of all the interesting computations one might want to do. There's no in-between.
Here's my try at such an explanation, geared to Washington concepts:
Suppose you want telephone calls answered, for an office. You can either hire a human and have that person be a receptionist, or buy an automated telephone answering machine. The human receptionist who has the task of answering telephone calls will also be able to answer letters or do any other clerical task. The automated telephone answering machine will never be able to do anything other than answer telephone calls. There is no in-between, where there's a machine which will do all general clerical work, but nothing else.
Moreover, to continue the analogy, the human receptionist, as a consequence of general-purpose ability, will also be able to tell unauthorized people who has been telephoning the office. And perhaps even what the telephone calls contain (copying!). An automated telephone answering machine will never be able to do that either (on its own).
These are simply two sides of the same coin of general-purpose ability. Note this problem has been well-known since ancient times - rulers would maim servants in various ways (e.g. cutting out the tongues of slaves) in brutal attempts to prevent what might nowadays be called unauthorized information transfer. Recent legislative proposals are perhaps the modern equivalent of those crippling practices.
I received another 250 or so hits from a mention in a news compilation from "Heise Online", a German site. I think the article is noting poor coverage of Internet issues, and links to me in a sentence reading "Are there no unbanned books?"
Another German-based site, stop1984.com, had my report on their news list, which was good for around 100 hits.
I'm without PR in my own land. I received more coverage here from websites in Germany than websites in the US!
It's DMCA exemption rulemaking time again.
The Copyright Office is preparing to conduct rulemaking proceedings mandated by the Digital Millennium Copyright Act, which provides that the Librarian of Congress may exempt certain classes of works from the prohibition against circumvention of technological measures that control access to copyrighted works. The purpose of this proceeding is to determine whether there are particular classes of works as to which users are, or are likely to be, adversely affected in their ability to make noninfringing uses due to the prohibition on circumvention. This page will contain links to published documents in this proceeding.
Last time, I was one of the people who persuaded the Librarian of Congress to grant one of two exemptions, for censorware.
But the exemptions only hold for three years each time, and now we have to do it all over again:
There is a presumption that the prohibition will apply to any and all classes of works, including those as to which an exemption of applicability was previously in effect, unless a new showing is made that an exemption is warranted. Final Reg., 65 FR 64556, 64558. Exemptions are reviewed de novo and prior exemptions will expire unless the case is made in the rulemaking proceeding that the prohibition has or will more likely than not have an adverse effect on noninfringing uses. A prior argument that resulted in an exemption may be less persuasive within the context of the marketplace in the next 3-year period. Similarly, proposals that were not found to warrant an exemption in the last rulemaking could find factual support in the present rulemaking.
Amusingly, the last time around they said in part (emphasis added):
A number of commenters urged that a broader encryption research exemption is needed than is contained in section 1201(g). See, e.g., C185, C30, R55, R70. Dissatisfaction was expressed with the restrictiveness of the requirement to attempt to secure the copyright owner's permission before circumventing. C153. See 17 U.S.C. 1201(g)(2)(C). Most of the references to statutory deficiencies regarding encryption research, however, merely state that the provisions are too narrow. See, e.g., PH20.
That last reference was me. Wow, can I give them examples this time!
I don't usually bother pointing to articles which are publicized several orders of magnitude better than this poor blog. The following message is a mailing-list gem. Anyone who isn't overloaded with the Eldred copyright case by now should read law professor Peter Junger's excellent analysis of the First Amendment strategy.
It's also useful to keep in mind the "First Amendment" discussion section in the case's most recent opinion.
I decided to see if I could come up with a good, optimistic, well-grounded argument for the commons side to prevail in the Eldred copyright case. The best place to look seemed to be the dissent from the last round of court hearings. Two judges dissented; what were their reasons? You can't go too far wrong quoting a bona fide dissent. The most promising material seemed to be the following:
Contrary to my colleagues, I do not accept that it is sufficient for Congress to merely articulate some hypothetical basis to justify the claimed exercise of an enumerated power. The Copyright Clause only bestows the power "to promote the progress of science and useful arts." In exercising this power, Congress "may not overreach the restraints imposed by the stated constitutional purpose," which is "the promotion of advances in the 'useful arts.' " Graham v. John Deere Co., 383 U.S. 1, 5, 6 (1966). I accept that extending copyright terms for future works may well increase creative efforts at the margin. Once a work is published, however, extending the copyright term does absolutely nothing to induce further creative activity by the author--and how could it? The work is already published. A simple finding by Congress to the contrary is not sufficient to demonstrate that the exercise of that power is "necessary and proper." As the Supreme Court noted in Lopez and again in Morrison, that Congress concluded a given piece of legislation serves a Constitutional purpose "does not necessarily make it so." United States v. Lopez, 514 U.S. 549, 557 n.2 (1995) (citation omitted); United States v. Morrison, 120 S. Ct. 1740, 1752 (2000).
Bluntly, the losers from those decisions were going to be gun-control advocates in the former, and violence-against-women activists in the latter. Here, the biggest loser would be Disney. Maybe that's an overly political view. But it's something to think about.
Well, everybody's talking Eldred. So I might as well do it too.
As the saying goes, prediction is difficult, especially about the future. Here are my worries about the Eldred case:
There's something interesting in the logic the Supreme Court uses in copyright vs. the First Amendment, e.g. where in the past they've claimed in the Harper & Row case:
In our haste to disseminate news, it should not be forgotten that the Framers intended copyright itself to be the engine of free expression. By establishing a marketable right to the use of one's expression, copyright supplies the economic incentive to create and disseminate ideas.
It doesn't sound as if they're going to be amenable to First Amendment arguments, stirring as those may be.
The Court can duck the issue of "limited times" becoming finite-yet-unbounded, by saying the issue isn't absurd yet. If they have to face it again, in another twenty years, when (likely, not if) copyright terms are extended another twenty years, then that's someone else's problem.
There's an avenue for the Court to slap down the copyright changes as exceeding Congress's power. But the famous recent time it did so, the Lopez case, was about guns, a topic which stirs a passion in many conservatives that copyright cannot match.
I've basically been trying to think like a conservative Supreme Court justice, and not found reason for optimism.
Another thought on Eldred:
The following is not a strong argument, because the legal contexts aren't identical. But I think there's a kernel of an idea here. Is there a way to have "limited times" be thought of in the same manner as the Constitution's Fourth Amendment prohibition against "unreasonable searches and seizures", or the Eighth Amendment's "Excessive bail shall not be required, nor excessive fines imposed, nor cruel and unusual punishments inflicted"?
Those prohibitions would be nullified in practice if one adopted the view that "unreasonable" is up to Congress to determine, or that "excessive", "cruel", and "unusual" are all purely a matter of legislative discretion. Though a problem here is that, on occasion, that very nullification may arguably have happened.
Another obvious flaw here is that those prohibitions are restrictions, but the copyright clause is a grant of power, so they can't be treated identically.
So I'm not claiming this idea works as stated. But maybe it'll inspire something better.
October 9 is Free Mickey day (that's free as in speech, not free as in beer). I've been musing more about the "limited times" issue in copyright. Someday, I think the question of "how much" is going to have to be answered strongly and directly by the intellectual-commons side. The finite-yet-unbounded side has given their answer - infinite in practice, never expiring as long as one penny of profit remains. But I'm thinking the opposition might someday need to have a specific rebuttal.
I was writing about trusted computing and the claim that trusted computing systems give you new features without taking away features you had before. ...
In particular, the suggestion is that you can run any software which you could run before. ...
So there seems to be a clear technical sense in which you can do what you did before and you are only gaining capabilities and not losing them.
Still, people who believe this may still believe that Palladium is not a good thing overall for many users, or will still introduce disadvantages. How can that be?
To argue that point by analogy, you'd want to find examples of where gaining something, or possessing something you didn't possess before, is a disadvantage to you in the end.
[Long list follows in entry]
Why overcomplicate things? It seems very simple to me: any system which allows control where it didn't exist before can be said to be "gaining" or "possessing" the brand-new ability to enforce that control. Concretely, consider a Libertarian-esque "ability to sell oneself into slavery". Now, you don't have to sell yourself into slavery. But if it's an option - that is, if you gain or possess something you didn't have before, namely the option of selling yourself into slavery - it should be clear how it can be a disadvantage. An option to give up rights can leave you worse off than not having such an option, via an expectation or arrangement that makes such giving-up of rights commonplace (which is exactly what these systems are designed to do: enforce the giving-up of usage rights).
A more realistic example is gaining the unchecked ability to request physician-assisted suicide in the case of serious illness. One might ask, as long as it's an option, how can it be a disadvantage? Well, think of a possible interaction with "cost-containment". Suppose an insurer offered a lower premium if you agreed in a contract that, if suffering a terminal illness past a certain point, you would request physician-assisted suicide instead of medical treatment ("cost-containment" with a vengeance ...). While this is a somewhat macabre example, the economic logic of it should be clear. As well as the way it could turn out to be a disadvantage.
More humorously, to become, with one click, Bill Gates' Towel Boy, may not be a blessing.
Myself, I'd just say something like "Gaining the ability to sell yourself into slavery is not necessarily good for you."
I'm still pondering various nastiness in legal briefs. I recalled a part of the DeCSS case where the plaintiffs even introduced Slashdot comments as evidence against the defendants. I went and looked up that portion, and it even had an invocation of prohibitions against reverse engineering.
The following is from the declaration of John Hoy (president of the DVD Copy Control Association, Inc. - "DVD CCA") in the DeCSS case:
... To my knowledge, all of the software licensees of CSS technology, including Xing, require end users to enter a "click wrap" license which specifically prohibits reverse engineering.
8. Moreover, the Internet postings of those individuals who were developing and/or discussing the means to hack through the master keys to gain access to CSS technology, referred to in the Shapiro Reply Declaration, demonstrate that they knew, or had reason to know, that such actions were wrongful.
9. For example, postings on slashdot.org as early as July 1999 clearly establish the state of mind of the hacker community. The following is a sample of posts made on July 15, 1999:
(much Slashdot flaming follows)
Hmm - "the state of mind of the hacker community"? It's amazing what gets into these things.
The nastiness in the material from N2H2 in Edelman v. N2H2 reminds me a great deal of the judicial flaming done by Judge Kaplan in the DeCSS case. In particular, the remarks about the plaintiffs and defendants here:
In the final analysis, the dispute between these parties is simply put if not necessarily simply resolved.
Plaintiffs have invested huge sums over the years in producing motion pictures in reliance upon a legal framework that, through the law of copyright, has ensured that they will have the exclusive right to copy and distribute those motion pictures for economic gain. They contend that the advent of new technology should not alter this long established structure.
Defendants, on the other hand, are adherents of a movement that believes that information should be available without charge to anyone clever enough to break into the computer systems or data storage media in which it is located. ...
His sympathies were clear ...
I read through the entire N2H2 response (the motion to dismiss) in the Edelman v. N2H2 case. I'm not sure what to write, what I can write. It's times like these that I think about what's at stake, and the costs involved. I often tell people "This is not a game". There's nothing like reading legal briefs full of lawyer-flaming to bring that home.
The Internet filtering company N2H2 Inc. is asking a judge to dismiss a lawsuit that a Harvard University law school student brought against the firm.
In a filing in August with the Securities and Exchange Commission, N2H2 stated that it will take legal action against those who threaten its trade secrets.
I believe it is helpful to read all The Wit And Wisdom of N2H2 as encompassed in that filing (my emphasis below):
On July 25, 2002, Benjamin Edelman filed suit against us in the U.S. District Court for the District of Massachusetts. Mr. Edelman is purportedly a computer researcher who seeks to conduct a quantitative analysis of the accuracy and comprehensiveness of our "Bess" and "Sentian" Internet content filtering products. If Mr. Edelman downloads our filtering software to conduct his analysis, he will be required to enter into our standard license agreement. The license agreement prohibits users from copying or decrypting our software and from using or disclosing confidential information that belongs to us and cannot be obtained through normal use of the software. Mr. Edelman's proposed activities would violate these provisions of the license agreement or applicable law, or both. He seeks a declaratory judgment that he cannot be held liable for breach of certain provisions of the license agreement as a result of his proposed activities. In addition, Mr. Edelman seeks a declaration that he will not be prosecuted for violations of the Copyright Act of 1976, the Digital Millennium Copyright Act, or laws protecting trade secrets if he conducts his proposed analysis. Finally, Mr. Edelman seeks to enjoin us from initiating litigation against him on the basis of his proposed activities. We intend to defend the validity of our license agreement and to enforce the provisions of this agreement to protect our proprietary rights. We also intend to assert all of our legal rights against Mr. Edelman if he engages in future activity that violates the agreement or our proprietary rights. To the extent that this matter is resolved in Mr. Edelman's favor, however, it could have a material adverse affect on our business, future results of operations, financial position and cash flows. Even if Mr. Edelman's claims are not successful, the litigation could result in substantial costs to the company and divert management's time and attention away from business operations.
I've mentioned the following before, but this portion, right from the horse's, err, mouth, deserves repeating (my emphasis):
Our filtering services have been accused of overbreadth by free speech groups.
In a recent federal court case, a federal appeals court held that certain provisions of the Children's Internet Protection Act resulted in an unconstitutional restriction of freedom of speech. These provisions required public libraries receiving federal funds to install Internet filtering programs like N2H2's on all of their computer terminals. The basis for this ruling is, in part, that such programs are overbroad in the types of speech that they filter out. This ruling is currently on appeal to the United States Supreme Court. To the extent that this decision is upheld, it will negatively impact our ability to market our products to libraries without modification, which could be time-consuming and costly.
Again, they said it, not me ....
It looks like Michael Moore's latest message is another false positive for spam by SpamAssassin. This might be a repeat of the situation discussed earlier, in SpamAssassin and Crypto-Gram. The mailing reads:
"Michael Moore's Mailing List" ... 10/01/02 1:57PM
October 1, 2002
"YOU ARE EITHER WITH US, OR YOU ARE FIRED!"
I was going to write you a letter about what a pathetic liar George W. Bush is -- but then I figured, hey, why waste your time telling you something you already know!
[body of message snipped]
If you wish to be unsubscribed from this mailing list, please click the link below and follow the instructions.
Now, SpamAssassin (version 2.31) sees in part:
SPAM: DEAR_SOMEBODY (-0.7 points) BODY: Contains 'Dear Somebody'
SPAM: DEAR_FRIEND (3.1 points) BODY: How dear can you be if you don't know my name?
(net 3.1-0.7 = 2.4 points)
"Dear Friends," is the problem here.
SPAM: CLICK_BELOW (1.5 points) BODY: Asks you to click below
SPAM: UNSUB_PAGE (2.6 points) URI: URL of page called "unsubscribe"
Well, yes, it had unsubscribe instructions, that's generally considered good practice for a mailing list.
SPAM: DOUBLE_CAPSWORD (1.1 points) BODY: A word in all caps repeated on the line
"YOU ARE EITHER WITH US, OR YOU ARE FIRED!"
There may be a few other adjustments; my copy of the Michael Moore message is from a website, so it isn't pristine with regard to the "Subject" line and mail headers. Still, the above is basically enough to have the message marked as spam with the default setting (5 points). Now if it gets automatically reported to the distributed spam-killing systems, it'll again get killed from those systems too.
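For the arithmetic-minded, here's a minimal sketch (in Python, using the rule names and scores exactly as quoted above) of how this style of additive scoring pushes the message over the default 5-point threshold. The scoring model is the assumption here; this isn't SpamAssassin's actual code.

```python
# Rule hits as quoted from SpamAssassin 2.31's report on the message.
# Scores are simply summed; a message is flagged as spam when the
# total reaches the configured threshold (5.0 by default).
RULE_HITS = {
    "DEAR_SOMEBODY": -0.7,   # negative scores count against "spamminess"
    "DEAR_FRIEND": 3.1,
    "CLICK_BELOW": 1.5,
    "UNSUB_PAGE": 2.6,
    "DOUBLE_CAPSWORD": 1.1,
}

THRESHOLD = 5.0  # SpamAssassin's default required score


def is_flagged(hits, threshold=THRESHOLD):
    """Return True if the summed rule scores reach the spam threshold."""
    return sum(hits.values()) >= threshold


total = sum(RULE_HITS.values())
print("total score: %.1f -> flagged: %s" % (total, is_flagged(RULE_HITS)))
# → total score: 7.6 -> flagged: True
```

Note that even without the DEAR_FRIEND misfire, the legitimate-mailing-list traits alone (unsubscribe link, "click below", shouted headline) already contribute 5.2 points - which is the whole problem.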
Once more, I hate the spam-wars.