DMCA 1201 Exemption Transcript, California, May 14 - Censorware

                           UNITED STATES OF AMERICA

                                  + + + + +

                             LIBRARY OF CONGRESS

                                  + + + + +

                              COPYRIGHT OFFICE
                                 SECTION 1201

                                  + + + + +

                         ACCESS TO COPYRIGHTED WORKS

                                  + + + + +

                                 May 14, 2003

                                  + + + + +

           The hearing was held at 9:00 a.m. in the UCLA Law School

Moot Courtroom, Los Angeles, CA, Marybeth Peters, Register of

Copyrights, presiding.


MARYBETH PETERS                  Register of Copyrights
DAVID CARSON                     General Counsel of Copyright
CHARLOTTE DOUGLASS               Principal Legal Advisor

ROBERT KASUNIC                   Senior Attorney of Copyright
STEVEN TEPP                      Policy Planning Advisor

                             NEAL R. GROSS
                           1323 RHODE ISLAND AVE., N.W.
(202) 234-4433             WASHINGTON, D.C. 20005-3701         (202) 234-4433

[HTML'ization by Seth Finkelstein, along with certain technical corrections.]
[See also the DMCA 1201 Exemption Hearing, April 11.]

Page 2





           James Tyre . . . . . . . . . . . . . . . . . . . . . . . .    6

           Steve Metalitz . . . . . . . . . . . . . . . . . . . . . .   16

Questions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .   26





           Brewster Kahle . . . . . . . . . . . . . . . . . . . . . .   72

           Marian Selvaggio . . . . . . . . . . . . . . . . . . . . .  115

           Barbara Simons . . . . . . . . . . . . . . . . . . . . . .   80

           George Ziemann . . . . . . . . . . . . . . . . . . . . . .   92

           Steve Metalitz . . . . . . . . . . . . . . . . . . . . . .   96

Questions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  105




           Gwen Hinze . . . . . . . . . . . . . . . . . . . . . . . .  154

           Robin Gross  . . . . . . . . . . . . . . . . . . . . . . .  170

           Steve Marks  . . . . . . . . . . . . . . . . . . . . . . .  176

           Mark Belinsky  . . . . . . . . . . . . . . . . . . . . . .  184

Questions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  190

Page 3


9:10 a.m.

MS. PETERS: Good morning. I'm Marybeth Peters, the Register of Copyrights. And I would like to welcome everyone to the first day of hearings in Los Angeles in this Section 1201 anti-circumvention rulemaking.

The purpose of this rulemaking proceeding is to determine whether there are particular classes of works as to which users are, or are likely to be, adversely affected in their ability to make noninfringing uses if they are prohibited from circumventing technological measures that control access. That's quite a sentence.

Today we have several sessions. And the first one will deal with filtering software. The second will deal with malfunctioning, damaged and obsolete technological protection measures, as well as security research and the public domain. And the afternoon session will deal with copy-protected CDs.

You should know that the comments, the reply comments and the hearing testimony will form the basis of evidence in this rulemaking which, in consultation with the Assistant Secretary for Communications and Information of the Department of Commerce, will result in my recommendation to the Librarian of Congress. The Librarian must make a determination before October 28, 2003 on whether or not there

Page 4

will be any exemptions to the prohibition during the next three-year period.

The entire record of this rulemaking, as well as of the last 1201 rulemaking, is on our website. We will be posting the transcripts of all hearings approximately one week after each hearing.

The transcripts as posted are uncorrected, but each witness does have an opportunity to correct the transcripts.

Let me take this moment to introduce the rest of the Copyright Office panel. To my immediate left is David Carson, who is our general counsel. To my immediate right is Rob Kasunic, who is senior attorney and advisor in the Office of the General Counsel. To his right is Charlotte Douglass, who is a principal legal advisor to the General Counsel.

I'm going to try to change this. Last time I said to the far left was Steve Tepp. That's the far left. And he said, I've never been characterized that way, Marybeth. So, to the left of the General Counsel is Steve Tepp.

MR. TEPP: That's even worse.

MS. PETERS: Whatever. Policy planning advisor in the Office of Policy and International Affairs.

The format of each hearing is that each panel has 3 parts. First, the witnesses present their testimony, and obviously this is your chance to make your case and your

Page 5

chance to rebut the other side's case. Then we get to ask questions and, hopefully, they will be equally tough for each side. You should not take any of our questioning as an indication of what we think. This is just the exercise by which we dig out information. Even our facial expressions should not in any way be taken to reflect what we think. Because the truth is at this moment we have made no decision, and we haven't even sat down amongst ourselves to talk about any particular exemption or what the evidence is. So it's all totally wide open.

If in fact that hasn't already happened, there's an opportunity at the end of the panel for each of you to question each other. Mostly what's happened is that during our questioning you sort of question each other.

Obviously, because we have some time constraints here, we do reserve the right to ask each person who testifies to answer additional questions. And, obviously, those questions and the answers will be made available to everybody.

I want at this point to thank David Nimmer of UCLA, who was instrumental in getting these very nice facilities for us, and actually to thank UCLA for all the work in making this possible.

So, without further ado -- I should mention that Jeff Joiner has joined us, and he's an attorney with NTIA, the National Telecommunications and Information Administration. So

Page 6

he's representing the Assistant Secretary whom I referred to as having a consultation role in this process.

The first panel is dealing with filtering software. And the witnesses are James Tyre from Censorware Project and Steve Metalitz, who filed on behalf of many copyright owners a very extensive statement.

So we start with the proponent of an exemption and then we go to the other side. So we will start with you, Mr. Tyre.

MR. CARSON: The microphones.

MS. PETERS: Oh, yes, the microphones. The microphones are actually not to project the sound to everybody who is here. The microphones are solely to assist the recorder. So, when you speak, as when we speak, you need to really speak out so that everybody in the room can, in fact, hear you. Okay? Thank you.

MR. TYRE: Thank you. My name is James Tyre, as you indicated. I'm here on behalf of the Censorware Project.

I'm probably at least a little bit of a mystery both to you on the panel and to Mr. Metalitz because, unlike the people who spoke in Washington, all of whom I know fairly well, and also unlike Mr. Metalitz, I was unable to submit written comments. So I come here as a bit of a blank slate. And that being the case, I want to tell you just a little bit about myself and what the Censorware Project is, to put the

Page 7

testimony I'm going to give in perspective.

I am a lawyer here in the Los Angeles area. I have been in practice since 1978. Much of my practice, though not all of it, has been devoted to First Amendment issues. And it was the First Amendment aspect of censorware that brought me into this particular field and got me interested in it: first, really as something interesting just to explore; then working with it; then starting to think about the legal ramifications of it.

The Censorware Project is a group currently consisting of four people: myself, Jonathan Wallace, Jamie McCarthy and Bennett Haselton. Originally there were two others, including Seth Finkelstein, from whom you heard a great deal when you had a session in Washington. Seth has not been a part of the group since about 1998/1999, somewhere in that area. But certainly he was essential when we started the group.

What happened is that it was around 1995 when censorware began to become an issue. Seth told you that he had been on the Internet since 1985. He had been seeing a lot of changes in it. I cannot tell you that I'm much of an Internet veteran. But fairly shortly after I did get onto the Internet, I happened upon an email discussion group that had to do generally with issues of censorship regarding the Internet, and specifically with

Page 8

censorware. And I got interested in it, not so much in the sense that I was immediately thinking about filing a legal case or anything of that sort, but I got interested in the implications, specifically the First Amendment implications and, at some point, other possible theories that might be available for use with censorware. And, obviously, the First Amendment implications would apply only if the censorware was being used in a public institution.

We have never taken the position, and I don't know anyone who has ever taken the position, that if a family chooses to use censorware in the home or if a private corporation chooses to use it at the workplace, there are any First Amendment issues there. We may criticize it because we don't like what censorware does, but we make no claims that there's any particular legal significance to it.

In any event, it was in 1995/1996 when this was really a hot topic, and it became quickly apparent that there was a group of us that had a fairly common interest. And I should also indicate that one of the other witnesses from whom you heard a lot in Washington, David Burt, was a part of these discussions. I believe I first encountered him on the Internet in 1996 or possibly 1997.

So many of us who have been working in this field, regardless of which side we're on, are old acquaintances. Whether we're friends or not is a different

Page 9

story, but we've known each other for quite a long time.

But what happened was, and I know you've heard a little bit about the Mainstream Loudoun case in Virginia. That case, actually, was essential to how the Censorware Project came into being. And it's actually a good illustration of the kind of work we do and the effect it has.

Jonathan Wallace, one of the founding members of the Censorware Project, like myself, is also an attorney. And he had done some writing on his own site, "The Ethical Spectacle," about what he viewed as some of the legal issues involving censorware. And it was a very good essay he wrote. This would have been probably in 1996/1997. And it was about that time that the public library in Loudoun County, Virginia was considering putting in censorware, and specifically a particular version of X-Stop called the Felony Load. A lot of censorware companies and censorware products have changed names, so I just indicate that the product then known as X-Stop was manufactured by a company called Log-On Data Corporation. That product actually is the product of one of the three companies that signed on to David Burt's comments, that being 8e6 Technologies. At some point the company changed its name. So we're talking about a product of that company.

But there was a group in Loudoun County called

Page 10

Mainstream Loudoun. It was extremely concerned with the implications of censorware being used in their libraries. So the head of that group sent an email to Jonathan Wallace and said, we really like what you've written in your essay, but can you help us? Can you give us something more tangible? And, again, this was before the Censorware Project as a group existed. But Jonathan contacted two people, myself and Seth Finkelstein, and said, can we do something to help these people? The answer was yes.

You've heard about some of the decryption work that Seth Finkelstein did. At that time he decrypted the X-Stop blacklist. He and I together pored through that list looking for the flaws in it, and we fed the results of our work to Jonathan Wallace, who wrote a scathing article about X-Stop.

One of the interesting things was that X-Stop was a fairly new product on the market at the time. And it had gotten a number of glowing endorsements from quite a number of people, including especially David Burt, who at that time was still a librarian not working for N2H2.

And we put out that report. And everybody went, in effect, "Oh, my God." And everybody who had endorsed that product, including David Burt, ran away from it as fast as they could. Everyone except the Loudoun County Public Library system.

Page 11

So, the lawsuit was filed, with a lawyer by the name of Bob Corn-Revere representing the plaintiffs, who were library patrons. Shortly thereafter a group of website owners whose content was being blocked in the libraries, represented by Ann Beeson of the ACLU, intervened on the plaintiffs' side in that case. The lawsuit went forward.

David Burt makes a technically correct but very misleading statement in his joint reply, to the effect that there's nothing in the court record to indicate that the Censorware Project in general or Seth in particular had anything to do with developing the evidence in the case. That statement is 100 percent correct and 100 percent misleading. Because what happened was Seth decrypted the list not just once, but on many, many, many different occasions, because you want to see what happens as they find out about new bad blocks: whether they unblocked them, what new they've added to the blacklist, things like that. Through the Censorware Project we were analyzing the lists, we were going through the lists. We were feeding the bad blocks to the appropriate people involved in the case.

So it may well be that the court record says that library patron X has a declaration that says "I found these 6 bad blocks using the library terminals and, thus, using X-Stop as installed in the libraries." Guess where he found out where to look at those websites?

Page 12

That was the impetus for how the Censorware Project was formed. The three of us worked on that, and then we added in three other people as we went on to other projects.

The first project we did as a group was a dissection, also based on decryption, of CyberPatrol, which you've heard a good deal about, specifically in the context of the Microsystems lawsuit. A lot of these products, as I said, have changed names over the years, and CyberPatrol, along with another product, SurfWatch, has now been merged into a product called SurfControl, which I'll be talking about a little bit today. So I want to sort of keep the players straight.

It's interesting, one of the things that's said in the joint reply comment -- and for this purpose, when I'm talking about the joint reply, hopefully you will just assume that I'm focusing on the joint reply filed by Mr. Burt. I have no intention of slighting or ignoring Mr. Metalitz's comment, and I will address some of the things he has said. But I'm sure he would agree that there's a great deal more detail, and properly so, in the joint reply of the censorware companies than in that which Mr. Metalitz put together.

Mr. Burt said, and I believe this was actually in his testimony, as opposed to in the joint reply -- he was asked: have reports based upon decryption ever really helped you at all? And he said, "No, they don't help us at all." And, of

Page 13

course, I'm paraphrasing. I don't have an exact quote in front of me. Because, he said, they just talk about a few sites here and there. They're really not of any use to us.

Well, there's this interesting little phenomenon, because every time we have done a report, regardless of what the software is -- and we have done major reports upon CyberPatrol, X-Stop, SmartFilter, WebSense and -- I'm missing one. There's one other, I'm temporarily blanking on it. But five of them. Every time we've done a report, within 2 days the appropriate censorware company has gone through our reports, whether they were based on decryption or some other techniques, and guess what? The sites that we said were bad blocks suddenly are off the list. It's folly to say that the censorware companies do not pay attention to what we do and that they put little credence in the reports that are based upon decryption or other techniques.

We started the Censorware Project in 1997. We've been doing this since then. We're strictly a volunteer group. We all have real jobs, other things to do.

These kinds of reports, frankly, are a great deal more difficult to do than they used to be. I remember the good old days when a censorware blacklist might have 10,000 or 15,000 items on it. It was big news in the industry when the first censorware blacklist had 100,000 items. Now, according to David Burt's testimony a month ago, and I believe

Page 14

him, the N2H2 blacklist has 4 million items on it. It's hard work to go through these lists. So it's not as easy to do these kinds of reports as it used to be. But, every report that we have done based upon decryption and based upon other techniques we have used, has been taken very seriously by the censorware companies and by other people.

My primary purpose today is to go through and counter some of the statements that Mr. Burt made, both in his written comments and in his oral testimony. And really focus on one broader issue.

You've heard testimony that, in essence, there are three types of ways of doing this sort of work. The first way is to start off by decrypting the encrypted database and, having decrypted it, analyze it by whatever means one does, drawing whatever conclusions and making whatever report one wants to make based upon that. That's what's at issue here today.

But what's relevant to whether this exemption should be extended for another 3 years isn't just that question. I think one thing that's unique about this particular class, both as the exemption was granted 3 years ago and if it should be granted again for the next 3 years, is that nobody disputes that the study of censorware is an incredibly important, very legitimate course of study. There is nothing silly about it. There is nothing frivolous about it. It is

Page 15

socially important. It is legally important. No one has ever disputed those contentions. Certainly David Burt never has, and I don't think that Mr. Metalitz will, though I certainly don't presume to be able to read his mind.

The only question here is whether the importance of being able to continue doing decryption-based studies, as opposed to other techniques, is sufficient to justify the continuation of the exemption. So when I get into my testimony -- and I realize you want to keep the opening statement short, and I've spent a fair amount of time just giving you some of my background, so I'll hold off on this until we get into the question period -- I do want to spend a fair amount of time focusing on the specific issue of the benefits of doing decryption study versus doing what is called either database querying or sampling versus what has been called log file analysis. And in some cases log file analysis really is nothing more than a subset of database querying or sampling. In some cases it's a little bit different.

One project we as the Censorware Project did is a little bit different. We've done them all, so I'm in a position that not many are in to speak to the benefits and detriments of all of them. And I'd like to spend the bulk of the time, hopefully once we get into the questions, talking about the differences, specifically talking about the weaknesses of database querying. And as a subset of that, very much

Page 16

talking about the weaknesses of the URL checkers, which you've heard a lot about, and which N2H2 and some, but by no means all, of the other censorware companies offer.

And with that, I suspect, I've talked more than enough for what you want to hear as an opening statement, so I will defer to Mr. Metalitz and then get to questions later.

MS. PETERS: Okay. Thank you very much, Mr. Tyre.

Mr. Metalitz?

MR. METALITZ: Thank you very much. It's a pleasure to be back here.

I was thinking back to the last time that I was in this position before this panel, which was 3 years ago in Palo Alto. And much has changed since then. We live in a different world, some might say, than we did in the summer of 2000.

And on a less consequential scale, things have changed in the nature of this proceeding as well. And if I might, if I could just take a minute for some general observations before I turn to the subject of filtering software.

I really want to talk about three things that have changed that are quite relevant to this proceeding and that I hope will be reflected in the decision that ultimately results from this proceeding.

Page 17

The first change, of course, is that the prohibition that we're talking about, 1201(a)(1), is now in force, and it wasn't three years ago. So, you know, I think this proceeding can now turn to what Congress said should be its main focus, which is determining whether a substantial adverse impact on the availability of works for noninfringing uses is actually occurring, rather than focusing, as was inevitable in the 2000 proceeding, on speculation or prediction about what would occur once the prohibition went into effect.

So I think that the burden that the proponents of exemptions must carry in this proceeding is the same as in 2000: they have the burden of persuading you to recommend to the Librarian that an exemption be granted for a particular class of works, but they also need to come forward with concrete evidence of the substantial adverse impact that is actually occurring and that is caused by the presence of 1201(a)(1).

Similarly, if they challenge the interpretations that you have made of the statute, whether these be procedural ground rules for the proceeding or the substantive conclusions that you reached in 2000, that is also a burden of persuasion that they must undertake and they would need to persuade you why you were wrong in some of the conclusions that you reached last time.

The second thing that has changed is that we now

Page 18

have some court decisions that have really vindicated the interpretations that you recommended to the Librarian in 2000, and that he adopted, on some key aspects of Section 1201. Of course, there haven't been any court decisions directly on Section 1201(a)(1), but the decisions on other aspects of the statute have clearly established a point that is consistent with your conclusions three years ago, and that is that fair use, one of the noninfringing uses we're talking about here, does not encompass a guarantee of access to copyrighted material by a preferred method or in a preferred format. That's stated very clearly in the Corley decision in the Second Circuit, echoed in the ElCom decision in the District Court here in California. And I think it's quite consistent with the conclusion that you reached 3 years ago.

The third change that has occurred over the last 3 years, and one that I will come back to later on today and tomorrow, is that there has been a huge expansion of availability of all kinds of works in digital formats for noninfringing uses. Really we can speak of a digital cornucopia that is now available to the American public to a much greater degree than was the case 3 years ago. And much of this is attributable to the use of formats and distribution methods that rely upon technological protection measures, and particularly upon access controls. And we've given some examples in our reply comments.

Page 19

We'll talk more about the DVD tomorrow. We'll talk about online music distribution this afternoon, as well as the software field: entertainment software, business applications, digital and online delivery of text and databases. The fact is that today, measured against 3 years ago, we have far more material in digital form available to far more people than we did 3 years ago.

And the significance of this is really twofold. One, your mission is to determine whether the availability of these materials for noninfringing uses has been substantially adversely affected by Section 1201(a)(1). And this includes the availability through licenses, through permitted uses and other types of noninfringing use. So if those have increased, then the availability of these works has also increased and you need to take that into account.

Second, I want to emphasize that, as you recognized in your conclusions in 2000, you are really performing here not a one-sided calculation, but a net calculation. And even in instances where you find some adverse impact on the availability of works for noninfringing uses, you also have to look at the degree to which technological protection measures have facilitated this use. It is a net calculation, and I think Congress was correct when it said the question here is whether on balance there has been an adverse impact on the availability for noninfringing use that is

Page 20

substantial enough to justify an exemption.

So this is a question I'm going to come back to, not really as a promotion for what the 17 organizations that I represent here have done in terms of making material available to the public, but simply as a way to shed light on the balance that you need to strike in the proceeding that we're engaged in.

Well, let me turn now to the question of filtering software and just briefly summarize our position on this.

First of all, the exemption that's been proposed is verbatim the same, or almost the same, as the one that is in existence now. So it presents squarely the question of how you should proceed in judging whether the exemption should be recognized for an additional 3 years. And I think nothing is clearer from the legislative history, and also from your prior conclusions, than that this is a de novo determination. The burden remains on the proponents. And the fact that there has been an exemption in effect for the current 3 years does not weigh in the balance as to whether there should be a new exemption recognized for an additional 3 years.

I think with regard to filtering software, unlike the other exemption that we'll talk about later on today, I think at least some of the proponents of the exemption have made an effort to shoulder that burden and

Page 21

tried to present you with information to demonstrate how the exemption has operated in practice and why it is needed, why it is still needed or why it should be renewed. I think Mr. Tyre's presentation also was along that line. But I did want to underscore the de novo nature of the determination and the fact that the burden remains on the proponents to bring forward, again, concrete evidence about what is actually occurring.

Now, in the 2000 recommendation that was adopted by the Librarian, you essentially had an uncontested proceeding. I think the conclusion virtually states that, and there are several conclusions that were drawn there. For example, that people who wanted to make fair use, of the type of comment and criticism use that Mr. Tyre has talked about, of these lists of websites had no alternative but to decrypt them. That there was no other legitimate way to obtain access to this information. And you also had no other evidence before you at that point, according to your conclusion, that these technological protection measures were at all use-facilitating or that granting an exemption for decrypting them would decrease their availability in any way.

I think all of those points are now very hotly contested in the proceeding before you. You have an extensive submission from several of the companies, and you had testimony April 11th. And I know Mr. Tyre will be rebutting

Page 22

some of that testimony as well. My point is simply that you now have the issue joined before you, and I think you're in a position to determine whether the proponents of the exemption can carry the day. But certainly the record before you raises a question about whether you can, in fact, find out without decryption whether any given site is blocked by one of these programs. And you also have evidence, which I'm sure Mr. Tyre will comment on, that there has been a great deal of research and comment and criticism that's been undertaken of these programs by methods that do not involve circumvention of technological controls.

Now, one other factor that I think is extremely relevant here is what use has been made of this exemption during the period since it came into force in October of 2000 up until today. I think that at least as of the beginning of this hearing the record was quite murky about that, as I read the transcript of the April 11th hearing. It wasn't clear what the witness testifying there actually had done.

Now Mr. Tyre's testimony describes a little bit of what he did, and perhaps he will pursue that further to clarify whether those acts of decryption took place before or after the exemption came into force. But as we pointed out in our reply comments, it is relevant what use is being made of this, how often it's being used, how many people are using it.

Page 23

And I hope you can develop the record on that before you reach a conclusion about this exemption.

Now, I'm not sure that the organizations that filed our joint reply comments really have much light to shed on how some of these contested issues should be resolved. But I do want to just refer to three aspects of the evidence as it stands now that I think are relevant.

First, I think you have to determine whether what the proponents are seeking is the preferential or optimal means of obtaining access to this information for their fair use purposes, or whether, by contrast, they have sufficient access to it now -- is it sufficiently available for them to carry out these types of activities without circumventing? And this, of course, has to be gauged in the light of the conclusion that you reached in 2000, and that the courts reinforced in the ensuing two years, that fair use does not necessarily mean fair use in the preferred or optimal format. It does not guarantee access to material in a preferred or optimal format.

The second issue is the scope of the adverse impact. Is it de minimis or widespread? And, again, this gets to the question of what actually is being done under the shelter of this exemption today.

And the third point, which I hope the record will be developed on, is whether whatever adverse impact there is can be ameliorated or even eliminated in other ways, such as

Page 24

through private agreements. And I thought there were some tantalizing hints of this in the testimony you heard on April 11th about the potential availability of these lists to bona fide researchers under agreement with the proprietors, the people that compiled them and that have the copyright interest in them.

I think Mr. Tyre is right that some of these reports have been taken very seriously, and there may be a very active interest on the part of some of these companies in cooperating with researchers, which might correspondingly reduce the need for any exemption in this area.

Now, finally, I just want to come to our main concern about this exemption. And I hope I don't get too deeply into the arcane and metaphysical question that I'm sure we will grapple with today and tomorrow, which is what is a particular class of works in terms of the statute. I think this is actually a simpler question: whether this class that you recognized in 2000 is too broad. I'm going to assume for now that the class you recognized fits the criteria of the statute. In other words, it describes a particular class of works.

And I want to emphasize this point, because we do live in a different world today than we lived in in the year 2000. And I think our concerns about computer security and about protection of the safety and security of our computer networks are heightened today, contrasted to where we might have been in the year 2000.

We know that filtering software that may fit the description that appears in the exemption as it now exists is one of the key tools in keeping our networks safe and secure. And many of those filtering software packages may include lists of websites that are the sources either of viruses or of spam, which of course is a scourge that we're all having to deal with increasingly now.

In other words, programs that I really don't think anyone, even in Mr. Tyre's group, would consider censorware may be swept within the ambit of this exemption, with potentially very serious consequences in terms of compromising the security and safety of computer networks.

Now, of course, there's no evidence in this record whatsoever that there has been any substantial adverse impact on the availability of copyrighted materials for noninfringing uses, or that there would be any if the act of circumventing access to those types of security software lists were to be prohibited. So there's really no basis for extending or maintaining such a broad definition of this particular class of works, with a breadth that would include those kinds of security programs.

And I think one thing that I hope the panel will do, and I think Mr. Tyre and his group could probably make a very important contribution here, is to more narrowly focus this exemption. If you conclude, based on the testimony that you hear and the contested issues that are before you, that it is justified and that the proponents have met their burden with respect to censorware, then I think the exemption needs a definition of censorware. The exemption needs that in order to more tightly focus it on the area where the need for it has been shown.

And, again, because of the name of this project, I'm sure Mr. Tyre can provide you with a proposed definition of censorware that might be useful to you and that might fit better within the definition of a particular class of works that Congress urged you to look at.

So, I will conclude there and be glad to try to answer any questions you may have either about my general remarks or about the filtering software exemption. Thank you.

MS. PETERS: Thank you.

Let me start the questioning, and actually you asked the questions that I sort of had identified.

Mr. Tyre, you talked about the three ways in which people try to deal with what's in the files of CyberPatrol or whatever. And you mentioned decrypting and analyzing, and then database inquiry and log file analysis. Could you tell us why the database inquiry and the log file analysis are not sufficient, and why the decryption method is not only the preferred, but the only way that you can do what you want, if you can do that? And comment a little bit about Mr. Metalitz's issue with regard to whether special agreements would work.

MR. TYRE: Okay. I'd be perfectly glad to talk about that. I think that's the main reason why I'm here today, as a matter of fact. And this actually does go both to what Mr. Metalitz has said today and what he said in the joint reply, and also to what happened in the Washington testimony.

I'm going to break it down into segments. And let me refine one thing that you just said.

We have never contended that the other methods of doing this kind of work, those based upon any technique other than decryption, are completely inadequate. We've done studies using log file analysis and database querying ourselves. There are lots of things you cannot find out using those methods. They are not nearly as good as decryption and analysis based upon decryption. But we are not saying, and I want the record to be clear on this, that they are useless.

MS. PETERS: So you think they're too limited?

MR. TYRE: Yes.


Now, I want to start off with database querying or sampling, and I want to start even more focused than that, with the specific question of so-called URL checkers. Mr. Burt told you about these, and he gave screen shots in his joint reply comments of the URL checkers that four censorware companies have: his own, N2H2; WebSense; SmartFilter; and SurfControl, which is what used to be CyberPatrol. They're web interfaces. You can go to them, you can type in a URL, and it'll tell you it's not blocked, or it's blocked in this category, or it's blocked in that category. Great. What's the problem?

Problem number one: Mr. Burt used very careful language to tell you about those four and no others. If you want to take a look at my Exhibit 2 in your booklets, this is a little survey I did on Monday, just confirming results I already knew.

I checked the nine major censorware companies. How many of those censorware companies even offer URL checkers? Exactly the four that Mr. Burt mentioned and not one more. Four out of nine offer them.

And I should note that two of the three companies who signed onto Mr. Burt's joint reply comments, 8e6 Technologies and Be Safe Online, do not offer them. So we've got nine major censorware companies, and five don't even have them. So let's completely throw them out for purposes of talking about URL checkers. That's half the industry right there.

Now, there are other players than just these nine, but I chose the nine major players because I didn't want to make this list too extensive. And between these nine we have most of the field covered.

Then I want to talk specifically about one particular URL checker, that being the URL checker of WebSense. And I ask you to flip over quickly to Exhibit 3. WebSense's URL checker is different from that of all the others. Because with all the others, N2H2, SurfControl, you just go there, you type in to your heart's content, and you get whatever results they give you. Not WebSense. WebSense, as you can see from the form here, makes you register using a real email address; you can't even use a webmail address or something like that. You also can't use certain other kinds of addresses, because they consider those to be addresses for home users, not for serious business Internet users. That's an interesting assumption on their part, but that's the assumption they offer. And it's spelled out right here in this little exhibit. It's one of the reasons why it's printed out.

So as long as you have a good enough email address to satisfy their criteria, then they will email you a password and if they email you the password, then and only then can you access their URL checker.

And if you look at the very bottom of page 1 of Exhibit 3, going over to page 2, you'll find their terms of service. And their terms of service say, in a nutshell, that you can use this if you are a customer or you're seriously considering becoming a customer of WebSense.

So the minute I clicked on that, I violated their agreement. They can sue me if they want; I'm saying it openly. I have no intention of ever becoming a WebSense customer, but that's what I had to do to get access to their URL checker.

Then here's the real flaw in WebSense. Let's go to Exhibit 4. It's a big exhibit, you do not have to look at all pages.

The first URL I called up on their URL checker, just because it might amuse you, was your own website. And you'll be happy to know that you are classified as a government site in their web checker. It might have made for a good joke if you were classified as a porn site, but they got this one right.

MR. CARSON: There's a lot of scurrilous information in there.

MR. TYRE: Now, if you want, at your leisure, you can go through the next 21 pages. I don't really care. What I want you to see right now, this is a test I ran going through this, just manually entering URLs at random. For the purposes of this test I don't care whether their classification of any particular website was right or wrong. What I do care about, and I've replicated this experiment more than several times, this was not an anomaly, is what happens after running the 21 lookups you see in the first 21 pages of this exhibit. You get to page 22, and please forgive me if I have to squint a lot when I'm reading things, but I don't have a whole lot of eyesight.

But on page 22, the WebSense site lookup tool says: "Your organization has exceeded the maximum number of lookups for a single day. Please try again tomorrow. WebSense has implemented a limit to ensure the use of the master database for WebSense customers and prospects only. Thank you for your understanding." Twenty-one a day. That's very helpful. I hope the record reflects I was being highly sarcastic in saying that.

I think we can pretty well discount WebSense URL checker as a valuable research tool. So now we're down to only three companies out of nine that have even potentially valuable URL checkers.

The next exhibit, Exhibit 5: all of these were done from N2H2's URL checker. These were not done to show any particular problem with N2H2's URL checker. It has had problems in the past; those problems apparently do not exist anymore, so I'm not going to talk about them.

I created these exhibits to illustrate in a fairly tangible fashion what some of the problems with database querying are. And for purposes of this, it does not matter whether in this particular case I happened to be using a URL checker, as I did for this exhibit, or whether I happened to have a running copy of N2H2 and was doing more extensive database querying. The problem is the same.

In the CIPA trial, CIPA being the Children's Internet Protection Act and the formal case being American Library Association v. United States, there was expert testimony, and this necessarily was very rough, that there are approximately 2 billion webpages out there. That was a year ago. We don't need an expert to sit here today to tell us that that same expert would give us a much larger number today. And it wasn't actually 2 billion webpages, it was 2 billion indexable webpages: only those pages that can be found and indexed by search engines, which is a subset of the entire web.

I could explain that if you want, but I think the figure of 2 billion by itself is big enough to make one of my points.

Then you have something like N2H2, which has a database of 4 million entries, according to David Burt. That doesn't necessarily mean that they block 4 million websites. Those 4 million entries could block, for all we know, 7 or 8 million websites. For example, as all of the censorware companies do, they have blocks in certain of their blocking categories on the free web page services. All of them block Geocities, or what used to be Geocities; it now goes under a different name. It's in at least one of their blocking categories. That's only one entry in their database, but that entry in their database puts a block on however many tens of thousands or maybe even hundreds of thousands of pages there are on Geocities, as I still prefer to call it because I'm just used to saying that.

You think about those numbers: 4 million entries in the database, 2 billion webpages. Not websites, webpages. How is one going to devise a statistical sampling for a database query that is going to find, in any truly meaningful way, what the problems in the database are?
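[Editorial note: the arithmetic behind Mr. Tyre's sampling point can be sketched as follows. The figures are the rough ones cited in testimony; the code is illustrative only and is not part of the hearing record.]

```python
# Rough figures cited in testimony: ~2 billion indexable webpages,
# ~4 million entries in one vendor's database.
INDEXABLE_PAGES = 2_000_000_000
DATABASE_ENTRIES = 4_000_000

# Even if every entry named exactly one page, a randomly chosen page
# would match a database entry only about 0.2% of the time.
hit_rate = DATABASE_ENTRIES / INDEXABLE_PAGES

# Expected matches in a random sample of 10,000 URLs: about 20.
expected_hits = 10_000 * hit_rate

# Probability that 21 random lookups (the daily limit described above)
# turn up even one database entry: a little over 4%.
p_any_hit = 1 - (1 - hit_rate) ** 21

print(hit_rate, expected_hits, round(p_any_hit, 3))
```

On these assumptions, random sampling tells a researcher almost nothing about which entries are mistaken, and that is before accounting for entries that block whole hosts or subdirectories rather than single pages.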

And this next set of exhibits is intended to illustrate, for any database querying method, not just for N2H2's URL checker, that there are problems that can be solved by decrypting and looking at the list, but that cannot be solved effectively simply by database querying.

Now you'll see on the first page of Exhibit 5 I called up the Peacefire site to see how it was classified. And it's classified as not currently categorized in the N2H2 database. Great. Peacefire's clean. Don't have to worry about it. Move on to the next domain name, right? Wrong.

Turn to the next page. Go to a subdirectory in the Peacefire site. That subdirectory is blocked by N2H2 as a loophole site. And I believe you heard just a little bit about what a loophole site is, so I'm not going to further burden the record with that. I just chose that one because I happened to know that it was there, not because I want to further burden the record talking about what loophole sites are.

So, what do you do when you build a database for purposes of doing a database inquiry? Do you do it just with domain names? Do you do it with directories? Do you do it with subdirectories? How do you build that database, and how do you even know which subdirectories you ought to include in the database? This is a problem.

Another example of the same problem. And I'm glad they're sitting behind me, because I wouldn't want to be talking behind their backs. On the next page of the exhibit I called up the EFF site. They're clean. Not categorized. Wrong. Turn to the next page: their Blue Ribbon Campaign, which they've been running since perhaps 1993/1994, is, in the world according to N2H2, a drug site. And I thought it was important that you know N2H2 thinks it's a drug site, because later today and tomorrow you're going to be hearing a lot from EFF personnel, and you really ought to know the quality and caliber, at least according to N2H2, of who you're dealing with. Who in their right mind who has ever looked at the EFF Blue Ribbon site could possibly think it's a drug site? How could one imagine searching that particular subdirectory? And yet there it is in the N2H2 database, a drug site. So I have a bunch of druggies sitting behind me, according to N2H2.

Now, I told them I was going to tell a joke at their expense. I can't see behind me to see if they're laughing or they're staring at me.

Now, we turn to the next one, and we get to a very interesting example. The next page in the exhibit is a site under .uk, UK being the country code for the United Kingdom. That's the basic root domain. And we see that N2H2 blocks it in the games category.

So suppose I want to find out how that website is blocked, or whether it's blocked, because I happen to be the owner of that website, which I'm not. I type in the website address. I see, okay, it's games. I don't care if it's blocked in games. I only care if it's blocked in a category that a public library likely would use. So I won't do any more searching, because I'm not concerned with the games category. Once again, please turn to the next page. We start going down to a subdirectory level. We've got -- uh-oh, censorware. And guess what: that's illegal. So depending upon where we are on that site, we have N2H2 taking the same site and categorizing it under two completely different categories. If I was just setting up a random database, how would I know, particularly if I didn't have the knowledge and experience that I have, that gosh, they may classify part of the site one way, and they may classify another part of the site a different way?

And then I want to turn to the final example where I'm going to walk you through a series of 4 pages to show just how far you have to dig to find some of these.

This next site is in .au, AU being the country code for Australia. The root domain name gets a clean bill of health from N2H2.

Let's go down one directory, to the next page. Clean bill of health. No problem.

Let's go to the next page, down one more subdirectory level. Well, that censorware page is okay. No problem.

Let's go to the last page of the exhibit, going really deep into that site. Uh-oh, we've got profanity there.

Now, how far have we had to dig into that site to find something N2H2 blocked? How could anybody in the real world, as opposed to in some completely theoretical world, even think to go down that far in the directory structure of that website to look to see if there's a block or not? Maybe Danny Yee, the owner of this site, might think of that. But I have no clue who else would think of that.

And if you're wondering, well, how did I know this if nobody else would think of it? There was some dispute about whether Seth Finkelstein had decrypted the N2H2 blacklist. I asked Seth to find me examples to prove a point I wanted to make here today. He did not give me the entire decrypted blacklist. I do not have it. I have never asked for it. But I specified to him what I wanted: find examples. He sent me examples.

These examples that I just gave to you came from Seth's decrypted blacklist which Mr. Burt claims Seth never decrypted. That's how I know about these examples, and it's unlikely I ever could have found them without Seth having decrypted the blacklist and given me these examples.

MS. PETERS: So you're basically saying that decryption is the only way to have gotten this?

MR. TYRE: Sure. For this purpose, yes.


MR. TYRE: Suppose hypothetically I had a list of every domain name in every top-level domain, whether it be the big three, .com, .org, .net; whether it include the sponsored TLDs; whether it be yours, .gov, .mil; whether we get into country codes such as .au or .uk. Suppose I had the list of every single one of those: could I write a script that would feed every single one of those through N2H2 or SurfControl or so forth? I personally couldn't, but I know many people who could.
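[Editorial note: the kind of script Mr. Tyre describes might look like the sketch below. `check_url` is a hypothetical stand-in for whatever query interface a given product exposes; no real vendor API is assumed, and the demo data is invented.]

```python
# A sketch of feeding a large domain list through a product's URL
# checker and recording which domains come back blocked.

def check_url(domain):
    """Hypothetical stand-in for a vendor's URL-checker query.
    Returns the blocking category, or None if not categorized."""
    blocked_demo = {"example.org": "games"}  # invented demo data
    return blocked_demo.get(domain)

def survey(domains):
    """Return {domain: category} for every domain the checker blocks."""
    results = {}
    for domain in domains:
        category = check_url(domain)
        if category is not None:
            results[domain] = category
    return results

print(survey(["example.org", "example.net"]))  # {'example.org': 'games'}
```

As the testimony goes on to note, even a complete pass over every domain name would miss blocks applied only at the directory or subdirectory level.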

Let me very quickly say that I personally do not do decryption, because I do not have the technical skills for it. It is a very, very skilled thing to do. And I do not have those skills, but I know a lot about the results of it, because I've worked with people who do it.

But let's get back to what I was saying. If I feed through every single domain name in the world, regardless of what the TLD is, it's going to give me a picture. It's not going to tell me everything, because it's not going to tell me whether a particular site, instead of being blocked at the domain level, is going to be blocked at a directory level or at a subdirectory level three levels down. It's not going to tell me with that snark.freeserve. site whether it's going to have one kind of block at the root or main level and another kind of block at a lower level. These are the reasons why database querying is not as effective as decrypting the entire blacklist and going through it.

One uses tools to go through it. One can't simply read a blacklist, or else one would go crazy; and by the time one finished reading it, it would be completely out of date in any event. But the only way to find blocks at this level of granularity is by doing decryption.
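[Editorial note: "tools" here means ordinary scripts run over the decrypted list. The sketch below, using an invented entry format, shows the sort of pass that surfaces the granularity problems described above, e.g. a site categorized one way at its root and another way in a subdirectory.]

```python
# Given a decrypted blacklist as (path, category) pairs, find hosts
# that appear under more than one category at different path depths.
# The entry format is invented for illustration.

def conflicting_hosts(entries):
    """entries: iterable of (url_path, category) pairs.
    Returns {host: categories} for hosts with 2+ distinct categories."""
    by_host = {}
    for path, category in entries:
        host = path.split("/", 1)[0]  # strip any directory component
        by_host.setdefault(host, set()).add(category)
    return {h: c for h, c in by_host.items() if len(c) > 1}

demo = [
    ("example.co.uk", "games"),          # root blocked as games
    ("example.co.uk/files", "illegal"),  # subdirectory blocked as illegal
    ("example.net", "porn"),
]
print(conflicting_hosts(demo))  # only example.co.uk has two categories
```

No manual read-through of a multimillion-entry list would catch such conflicts, which is the point about needing tools rather than eyeballs.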

Let me give you another example. This is an example from the past, but it's a good example of why database querying is not good.

In most of the studies we do at Censorware Project, we look for so-called overblocking, where blocks are wrong, where they're bad blocks. Occasionally we've done the other side, where we look at underblocking, where they don't block what they're supposed to. We did a study with N2H2 where we did both, but that's one of the few times we've done both sides of it. But there's a very famous example that we did with CyberPatrol.

A site for a youth soccer league in Massachusetts. You all know what youth soccer leagues are. You can all pretty well imagine what would be on the website of a youth soccer league. Here are the teams, here are the standings, here's the schedule, here are the age groups, all that. Who would think to put that into a database query as part of a sampling?

CyberPatrol blocked it. Why did CyberPatrol block it? Because it talked about teens age 13 to 15. Uh-oh, that could be sexual. Could be child pornography. Could be a variety of other things. It wasn't.

And the funny thing about that was, when we exposed that block, CyberPatrol, as did all of the other companies, went back and unblocked it. Then they went back and they reblocked it. We exposed the fact that they reblocked this site; they unblocked it, went back and reblocked it. Not because they're malicious, but because they do most of this by computer robots, not by human review, and the computer robots are stupid. Computers are not smart for this kind of work. They never have been. Some day they may be, but they surely are not today.

So we did that a second time. They unblocked it, they reblocked it. I won't tell you exactly how many times we went through this cycle, but eventually I decided to have some fun with this.

I wrote an open letter, you know, styled as a memo from the President of CyberPatrol to the PR Director for CyberPatrol, who was actually on one of these discussion lists I was telling you about, and was very active in the discussion. At that time people from all sides really were talking about this. Her name was Susan Getgood. And the memo said something to the effect of: "Susan, they're killing me. You've got to find a way that we don't keep reblocking this site. Those Censorware Project guys are just driving us nuts. Fix our program. Do something."

They kept reblocking it. They kept unblocking it. Eventually they fixed the problem. And that story is not just a fun little story, but it's an answer to a question that was raised in the first hearing. You know that during the first hearing Seth Finkelstein did have on one or two occasions access to the N2H2 blacklist. But then N2H2 stopped letting him have it, not surprisingly, but they stopped. Was it enough for him to have it once? To analyze

[Editorial note from Seth: James Tyre means here that N2H2 let me have an ordinary product demo, not the plain text of the blacklist. Later, they would not even allow me an ordinary product demo]

it once, yes. Was it enough for him to determine how many new mistakes they kept making, whether the mistakes are isolated instances, whether they're a problem at the system level? The only way you can do that is if you keep doing this over and over and over again.

In the Mainstream Loudoun case we went through probably 8 or 9 different iterations of X-Stop, because it was important to see not only whether, in the course of discovery, the bad blocks that were being revealed were being unblocked, which for the most part they were, but also what new bad blocks were being added. It's like the old Jay Leno commercial for Doritos: "we make more." It's guaranteed: every time the censorware companies add more to their blacklists, there are going to be more mistakes on them. You have to have continuous access to the list to find out what's on it. It's all fine and good to know what was blocked two months ago, but that doesn't tell you what's blocked today and how systemic the problems are.

Now, that's why, combining those factors together, doing database querying, although it has its uses, is not as effective as doing decryption and having the ability to do the decryption as frequently as possible.

MS. PETERS: I asked about private agreements, and you just basically said that Mr. Finkelstein had the list but no longer does. Is that a comment on what agreements might be reached, that maybe you can get an agreement to get it once, but having continuous access is a problem?

MR. TYRE: The practices vary somewhat from company to company. But the normal practice is that you fill out a form, you give them your information. Anytime I've ever done this, I've used truthful information, no fictitious identity. And I believe that the same is true for Seth and other people I know who have done this. You fill out the form, they don't do any particular checking on it, you just enter your information. As soon as it's entered, you can download the 30-day trial.

The only time I've known of when that was not the case was with a product called SmartFilter, when their salesperson, after I registered, actually called me. And before he called me, he did a search on me, and he saw I was a member of the Censorware Project and saw what the Censorware Project did. And he still let me have a sample. It's the only time I know of that's ever happened, when a company has agreed to let someone like the various members of the Censorware Project -- I think I'll pass on defining whether we're reputable or not; that's for others to decide -- actually have something like that with knowledge of who we are.

David Burt's testimony in Washington was very specific: with a reputable lab, such as Consumer Reports or something along those lines, we've talked about this within N2H2, but we've not really decided. Maybe if they let us be present while they do their testing, maybe if they sign a nondisclosure agreement, then maybe we'd let them have the information, and we'd give it to them in a decrypted form; we wouldn't even make them go through the trouble of figuring out how to decrypt it. So that was a maybe; he was in no position to say that, yes, faced with a request like that, the company would agree to it.

And if you're talking about folks like us, folks who are not a reputable lab such as Consumer Reports, even though what we do is far more in depth than what Consumer Reports does: there are many maxims of jurisprudence. One of those maxims of jurisprudence here in California, which is in our civil code, is that the law does not require idle acts. I can tell you that if I were to go to a censorware company today, or if Seth were to go to a censorware company today, or if certain other people were to, and say this is who I am, this is why I want it, it would be the ultimate idle act. They would never agree.

MS. PETERS: So your answer is no?

MR. TYRE: If I remember the question, yes.

MS. PETERS: Can this problem be ameliorated through private agreements?

MR. TYRE: In my opinion, no. First of all, I don't think the censorware companies ever would agree. And second, if part of the agreement was an NDA, then what would be the point? Our purpose is to expose the flaws.

MS. PETERS: Okay. One last question; I don't want to hog it all. Mr. Metalitz said that even if the case is proved, the class is too broad, the focus is on censorware, and asked whether you can come up with a definition. Is it possible to come up with a definition for censorware that distinguishes it from the broader class of filtering software that would deal with security and other things?

MR. TYRE: Well, I'm going to turn that around a little bit. And I'm doing this not just as a lawyer's trick, but because from the first moment I read Mr. Metalitz's comment, I had an idea of what he was talking about, but I wasn't sure. I've asked a lot of people, not just other censorware people, but computer security people who are among my clients. And no one has been able to figure out exactly what is meant by what Mr. Metalitz wrote, and exactly what definition, if any, would satisfy his request.

So I'm going to suggest to this panel that the burden should not be on me or any other proponent of this exemption to limit the proposed exemption. The burden should be on Mr. Metalitz, as the one who proposed this amendment or limitation, or whatever you want to call it, to specify in writing, in a form that can be analyzed as opposed to being just a theoretical construct, exactly what it is that he does or does not want. And your having indicated at the beginning that there will be a chance for supplemental comments after this is over, I think that's the appropriate forum to do that in. I don't think it's appropriate today.

Again, not because I'm playing games, but seriously because no one, including computer security experts who are clients of mine, really understands it. I'm very uncomfortable taking on the burden of trying to deal with it at all before I see something more tangible from Mr. Metalitz.

MS. PETERS: Okay. Do you want to comment at all?

MR. METALITZ: Yes. Sure. We have put something in writing to describe the filtering software that was covered by the evidence that's been presented here, and it's on page 13 of our joint reply comments: "Filtering software used to prevent access to Internet sites containing material deemed objectionable to children or otherwise inappropriate for some segment of the public or for display in a public setting."

Now, that may not be a very good definition, and I would think that people who have the word "censorware" in their name would probably have a sharper definition of what kinds of material they're talking about. But the burden, of course, is on the proponent throughout this proceeding, and this panel can't recommend an exemption unless there's evidence to support it that shows a substantial adverse impact on the availability of some copyrighted work for noninfringing purposes. So I would suggest that, you know, we've taken a stab at it, and I'm sure Mr. Tyre can do a lot better. But we just think that whatever finding is made here ought to conform to the evidence, and not extend much more broadly to get into areas that aren't covered by the evidence.

MS. PETERS: We may do that with a question. The way supplemental comments come in is if we actually come up with questions on which we believe we need further input. So, we'll handle it that way.

MR. TYRE: May I quickly respond to that?

MS. PETERS: Yes. Sure.

MR. TYRE: Certainly we can provide a more precise definition of censorware. I don't have one in writing in front of me, but that can be done. That's not the problem.

The problem is dealing with the other aspects of what Mr. Metalitz proposes, with these things other than what would be defined as censorware. And one of the specific reasons why that's a problem is that there's been so much consolidation in the industry, in the relevant industry segment, that it's not a surprise that you have companies such as Symantec which are offering integrated products that consist both of traditional censorware and of firewall protection, antivirus protection, things of that nature.

And what I'm asking for, and I don't know whether I'll get it, is something from Mr. Metalitz that tells us how we deal with something like that, how we deal with an integrated product. And further, how we ensure that what I would call a pure censorware company, such as N2H2, does not suddenly grasp onto this newly limited category and, by making a few minor changes to its database, turn itself into a company that in addition to doing censorware has some minor security functions, some minor virus protection, and all of a sudden, because of however this definition may work, because of imprecise wording or any other reason, finds itself no longer subject to an exemption, assuming of course that there's going to be an exemption at all.

So I'm really troubled by how all of this will play out. And that's why, though I may not get my wish, I am wishing that you will put the burden on Mr. Metalitz to give us something far more concrete to consider than what has been given.

MS. PETERS: I've basically hogged the questions. So, David, how about you.

MR. CARSON: Let me just suggest to you, don't assume we're going to put a burden on you or Mr. Metalitz. But it would be in your interest to provide a more precisely defined class, showing what you would like to see if we were to go in that direction.

I assume you're not saying that there is a reason why people should be able to have access to lists of what virus-stopping software blocks? Is that true, or is that of interest to you?

MR. TYRE: Speaking for myself and for the Censorware Project, that is not of interest to us. Whether it would be of interest to other security researchers, I have no knowledge or comment.

MR. CARSON: Right. But they haven't come forward in any event, so that's not really before us, I don't think.

I'm not sure I've heard a precise answer to this question, and I think it's perhaps an important one. Can you tell us how people have, since October 28, 2000, been taking advantage of the exempted class for compilations consisting of websites blocked by filtering software applications?

MR. TYRE: That's an easy question to answer and it's a difficult question to answer because there's not really a whole lot that I can say about that that wasn't already said in Washington.

MR. CARSON: Well, not a whole lot was said, unfortunately, in Washington.

MR. TYRE: I'm quite well aware of that. I have

Page 49

gone through that transcript more than once.

Mr. Burt contends that Mr. Finkelstein hasn't even done the work that he says he's done. I personally got a rather large chuckle about Mr. Band's comment about the Iraqi Information Minister. I sincerely hope that this panel does believe that Mr. Finkelstein has, in fact, done what he says he has done. And I've told you straight out that some of what I've presented to you today is based upon the work that Mr. Finkelstein has done, and that is specifically the decryption work on N2H2, not other work that has been done.

There really isn't a great deal that I personally know of that has been done in the last 3 years, but I think there are a couple of reasons for that. And I think there's also a quick response I want to make, related to that, to one of the remarks that Mr. Metalitz made in the beginning. And that is that I believe he has incorrectly stated what the appropriate considerations are for the Copyright Office and for the Librarian of Congress.

There's no doubt that what has or has not been done in the last 3 years is a relevant factor. You'll never hear me say otherwise. But Mr. Metalitz indicated in his opening statement today that that's the only relevant factor. I believe that's incorrect. Both from reading the statute and from reading your notice of inquiry, I believe that regardless of whether it's an exemption that never has existed or it's a

Page 50

request to in effect renew an exemption that already has been granted, such as this one, the focus is the same. The focus is "either/or," an either, not an "and": either what has happened before or what is likely to happen in the future.

MR. CARSON: Could I stop you for a second? Do you dispute that, Mr. Metalitz?

MR. METALITZ: If I understand what Mr. Tyre is saying, no I would not say that what is actually occurring now is the only relevant factor. But Congress said that should be the main focus of this proceeding.

MR. CARSON: So you don't dispute -- I'm sorry. Go ahead.

MR. METALITZ: And now that the prohibition is in effect, I think it's highly relevant what use is being made of it.

MR. CARSON: But you don't dispute that at least in theory, even if nothing were happening now, if we could predict that it's more likely than not that in the next 3 years it's going to happen, it's perfectly relevant for us to come up with an exemption if that's where it takes us?

MR. METALITZ: Yes. If it meets the criteria that are in the statute and legislative history. And I think you spelled out in the conclusion in 2000 what the burden would be in that situation.


Page 51

MR. CARSON: Sorry for interrupting you. I just wanted to clear it up. Please go ahead with your --

MR. TYRE: That's quite all right. It was useful.

Now, let's get back to that. I cannot cite to you any specific examples that are not already in the record. I'd love to be able to, but I'm not going to make up facts that don't exist. What I can tell you is that there's sort of a unique dynamic that's at play here, and this was not really discussed at the Washington hearing.

This whole exemption has many unique qualities about it, not the least of which is that it's one of the two exemptions that you granted 2 years ago. Most of the proposed exemptions that were requested then were rejected. And so this is one that at least to some extent has had the opportunity to be field tested.

But you've heard a great deal of testimony already about how hard this work is. And I'm not talking about what's been said about the legal risks involved. I'm talking about the fact that this is extremely difficult work, to figure out how to decrypt these programs in the first place. This is not work for an amateur. This is work for trained professionals who focus specifically on knowledge of cryptography. There aren't a whole lot of people who are capable of doing this kind of work, and it's a continuing arms

Page 52

race as one version of the program gets decrypted, then the censorware companies respond as you would expect them to. They make better encryption so then you need more skill to decrypt it. It's hard work. It's time consuming work.

I cannot say this of my own personal knowledge, but having gone through this with people who have figured out how to decrypt this - Seth being one of them, not the only one - I have pretty solid knowledge of how much is involved in doing this.

Given how hard the work is, there's another factor that comes into play here. Sure, it's true that this exemption has been on the books since October of 2000. But 2 months later or 3 months later, in December 2000, CIPA was passed, the Children's Internet Protection Act. And then, I believe -- I'm not even sure if it was the day after the legislation was signed. It may have even been the day before it was signed. I don't recall, I don't care. The twin lawsuits by the American Library Association and the ACLU were filed challenging the constitutionality of CIPA. And those lawsuits were on a fairly fast track. You know they went to trial. You know they were decided. Approximately a year ago the three-judge trial court found that CIPA was unconstitutional as applied to public libraries. The matter since has been argued in the Supreme Court. And at some point before you make your final rulemaking, the Supreme Court presumably will decide

Page 53

that case.

I make no prediction on what that decision will be. But I think it plays an important psychological dynamic here because everyone has said on both sides - Mr. Burt said I think, I know Mr. Band said it, I know Mr. Finkelstein said it - that what does or does not happen in the CIPA case will have an impact on how this work is done in the future. And by that I mean specifically decryption work where you can get into some of the in depth things such as the loophole sites that you cannot get into simply by doing database querying or log file analysis.

The people who do this, do this in their spare time. They put in an awful lot of time to do it. And there has been a feeling on the part of those people, myself included: is it really worth investing a lot of time now when this major court case is out there, and this major court case may have a huge impact on what the relative value of this work is in the future? That's a psychological issue. That may or may not resonate with you, but it's a real issue. The fact that CIPA became law and was challenged in the courts within a few months of when this exemption came into effect is one of the reasons why there hasn't been a lot of this work done in the last 2 years. But by the same token, knowing that the Supreme Court will be deciding the case within the next month - or at least in theory it should be - I'm certainly

Page 54

not going to tell them what to do - there is a good likelihood, which is the standard, that once the CIPA case is decided and we know again where the landscape is, those who have been in the field and those who may be interested in getting into the field will resume their work.

MR. CARSON: I'm going to follow up on a question that the Register asked you with respect to the experience of getting access voluntarily from the censorware suppliers to those lists. Have there been cases where the Censorware Project or people in a similar situation have tried to get access to those lists and it's been flat out refused?

MR. TYRE: I'm sorry. I did not hear the last part.

MR. CARSON: Have there been cases where the Censorware Project or people in similar situations have requested access to lists of blocked websites and that access has been refused?

MR. TYRE: Yes.

MR. CARSON: Okay. Give me some idea of the nature and quantity of those attempts?

MR. TYRE: Well, you already have in the record that N2H2 flat out turned down Seth Finkelstein once.

MR. CARSON: Yes, that's once.

MR. TYRE: Once.

MR. CARSON: I'm trying to get a sense of

Page 55

quantity of the problem, the nature of the problem.

MR. TYRE: There was a time when I tried to get one and, honestly, I'm blanking on which product it was. There are so many of them, they sometimes blend together. And they turned me down.

A lot of times you can get it the first time because their system is automated. You give them legitimate information, 2 minutes later you're eligible to download it, you download it. It's the second time that's the problem.

You do it the first time, then we go out and we do a report. You do it a second time, no. They'll not give it to you. Sometimes there are other ways of getting a hold of it. But if you ask for it, will they give it to you? No.

MR. CARSON: And you're telling us that based upon a single experience of Mr. Finkelstein and a single experience by you, is that correct?

MR. TYRE: Two experiences, plus having dealt with all these companies and knowing that, particularly after we've done a particularly scathing report on them, if we asked for it again, they'd just laugh at us.

MR. CARSON: And the two specific experiences were both with a single company, N2H2, is that correct?


MR. CARSON: Oh, I'm sorry. Mr. Finkelstein was

Page 56

with N2H2 and yours was with?

MR. TYRE: Yes. I apologize for not remembering which mine was with. There's been a lot of consolidation in the industry and I'm not specifically remembering what it was. But I will state for a fact that it was not N2H2. I have never made that request of N2H2.

So we have two instances, two companies and I'd be willing to make a rather substantial wager that that doesn't answer your question. But if I were to go ask the other companies, I'd know what the answer would be.

MR. CARSON: So you're asking us to make judgments based upon your prediction, based upon your experience?

MR. TYRE: Oh, no. I know to a moral certainty what the responses will be. I'm not asking you to --

MR. CARSON: You think you've shown us two moral --

MR. TYRE: I'm not asking you to take that as evidence.

MR. CARSON: Okay. All right. Thank you.

MS. PETERS: How about going to Steve.

MR. TEPP: Okay. Thank you.

Just sort of following on what we've already been talking about, Mr. Tyre, when we were in Washington Mr. Finkelstein was asked about how many people take advantage of

Page 57

this exemption. And notwithstanding your comments about the CIPA case and whatever chilling effect you think that has, you made a comment about the limited number of people who have the technical skills to do this given the level of detail of knowledge that's required.

Mr. Finkelstein told us he thought about 6 people were using this exemption. Do you think that the number -- needless to say, that's an extremely small number given the population of the United States. What in your estimation is the number of people who are capable of and interested in doing this, so that, for example, if the CIPA decision goes the way you and your colleagues would like, what should we expect to see in the next 3 years should this exemption be renewed?

MR. TYRE: I'll give you somewhat of an anecdotal example of that. I've been involved in a number of the DMCA lawsuits, including the 2600 case as an amicus and the Felten case as one of the attorneys for Ed Felten and his researchers at Princeton and Rice. I've done a lot of speaking on the DMCA. And it's reasonable to conclude that my views on the DMCA do not coincide with those of Mr. Metalitz. But we're not here to talk about that today.

What I think is absolutely fascinating is that I believe there's a conference called Crypto which takes place on an annual basis in Santa Barbara. It is considered by many

Page 58

to be the leading conference of cryptographers in the world. People come from all over the world to that conference. Of course, one of the reasons why is that it's in late summer in Santa Barbara and it's hard to find a better place to be at that time of year, but still, the talent that is assembled there is extraordinary. That's your class of the people who could get into this field if they wanted to get into this field.

When I was there speaking, one of the persons there, a nationally known expert on computer security, Matt Blaze, came up to me afterwards and said to me "Wow, Jim, you're my hero." Not because of anything I had done with the DMCA, but because of my Censorware Project work. I didn't have the heart to tell him that I wasn't the person who was actually doing the decryption. I do not have those technical skills, as I've said before. But he, and quite a number of people at that conference, were more interested in talking with me about censorware decryption work than they were in talking with me about the DMCA. Because the DMCA is just lawyers, and cryptographers don't want to talk to lawyers. They want to talk to people who are doing work. And I've got these cryptographers, world famous cryptographers, coming up to me and saying tell me about censorware. What can we do? How can we help? Is this something that we can get into?

Will any of them actually do it if the exemption is renewed for another 3 years? I don't know. If it is, oh, I

Page 59

can put together a very long list of people who I would want to talk to if I wanted to expand the field of people who have the appropriate skill set to learn how to do this and to get involved in this. Because we could use more than those we have.

MR. TEPP: Okay. Well, just to get a sense of the value of your anecdote, how many people come to this conference in Santa Barbara on average?

MR. TYRE: Several hundred minimum, maybe more. When I did my speaking gig there we were in an auditorium that I would guesstimate sat about 200. The house was packed, standing room only. They hadn't come to listen to me talk about censorware. They came to listen to me talk about the DMCA at this particular session. That was the sole purpose of that session. So there had to be at least 250 to 300 people in that room, and they were maybe not from every single continent in the world, but most of them.

MR. TEPP: Okay. Thanks.

One other thing in a similar sort of vein. You referred earlier to how the reports that have been done almost invariably result in one of the companies whose product is being analyzed making corrections in line with the critique in the reports. Can you give us a sense of how many reports have been done in the last 3 years, or more precisely since October 29, 2000?

Page 60

MR. TYRE: Okay. Yes. Zero. If that's precise enough for you.

We haven't done it, in large part, for the reason that I mentioned. Seth is not the only member of the Censorware Project (and as I've indicated he is a former member; he has not been a member since before October 2000, or anytime in 2000) who is capable of doing this kind of work. But for the reason that I mentioned, there has been a feeling that, given the focus on the CIPA case, there is maybe not the energy level that there was to continue doing these kinds of reports. Given the energy that's involved in them, given the time consumption that's involved in them, we haven't done any.

Will that change once CIPA is decided and if the exemption is renewed? I think it will. I believe strongly that it will. But our last report, which happened to be on Mr. Burt's company N2H2, was in 2000 -- it was in 2000. I'm not certain when in 2000 it was. It may or may not have been after October 2000. But with that one qualification, we have not done any.

MR. TEPP: Okay. Thank you.

One last question, this one for Mr. Metalitz. Looking at the opposite side of the equation, the potential harm done to right holders over the past 3 years and should the exemption be renewed prospectively in the coming 3 years,

Page 61

when we look at the situation that's been described -- you talked about the burgeoning number of copyrighted works available on the Internet; Mr. Tyre's talked to us about the explosion of the number of sites on filtering lists; and there appear to be several filtering companies -- it doesn't appear at first blush to be an industry in distress. Can you comment for us about what, if any, harm there might be should this exemption be renewed for the coming 3 years?

MR. METALITZ: In terms of the health of the censorware industry, I'm not sure I can add anything to what Mr. Burt has submitted in his testimony. He's much more knowledgeable about that than I am. I'm not sure that the balance sheets of the particular companies or whether they've consolidated or not is necessarily the right test. But I don't have any information really that would shed much light on that with regard to the censorware companies.

MR. TEPP: Or does it have any effect on the 17 entities that you're representing today?

MR. METALITZ: I'm not sure if any of the companies that are involved here are members of any of the associations that I represent. To my knowledge, they are not. So I don't know that it has any direct impact on them. And I think I'm not really the person to ask about that.

MR. TEPP: Well, you're the closest we've got today, so I thought I'd give it a try. Thank you.

Page 62

MS. PETERS: Okay. Thank you.


MR. KASUNIC: Okay. I have just a couple of questions, mostly for Mr. Metalitz, mostly because we haven't heard him talk as much. And in the interest of time I'm going to censor myself today.

MR. TYRE: You can't do that. You have to speak freely.

MR. KASUNIC: Mr. Metalitz, you had mentioned that this is a net calculation and we do have to look at the overall balance. And in line with that last question, just so we're absolutely clear: if we do find any evidence of more than de minimis harm, then we would be looking to what the adverse effect on the industry would be. And one thing we do have in the record, in N2H2's annual report, was a statement that this exemption final rule will not affect the value of lists of blocked websites. So there's a statement that this would seemingly have no adverse effect on the value of these lists. Is there nothing else to add in terms of what harm the exemption has had or is likely to have in the next 3 years?

MR. METALITZ: Well, I think you're using harm to the industry as a shorthand for the statutory standard, really, which has there been any adverse impact on the availability of this copyrighted material for noninfringing purposes. And I think the record shows that a lot of this

Page 63

material is available for the noninfringing purpose that Mr. Tyre wants to promote or at least a close cousin of that purpose. Because the record shows that a lot of evaluations, criticism and comment about these products has taken place.

Now, I don't deny that it's possible there could be more of that criticism and comment, of that noninfringing use that we're talking about, if the exemption were extended. But this really gets into the question of to what extent the exemption has contributed to that availability.

Obviously, whether the extension of the exemption or the renewal of the exemption would have a specific impact on the bottom line of a particular company is a somewhat different question. They obviously could be related, and I don't really know what significance to ascribe to the statement that you just read that came from one of their securities filings. That partly would have to do with how diversified their business is, and I just frankly don't know the answer to that question.

MR. KASUNIC: Okay. Well, in line with that then, in your reply comment you state that we should be looking at -- and this is a follow-up on what Mr. Tepp was asking -- how many members of the public, how often, and how much they expect to utilize this in the next 3 years. But given the limits that may be placed on harm and probably the very small number of people who could accomplish or make use

Page 64

of any recommendation we make to continue the exemption, what possibility of adverse effect would you foresee in the next 3 years that we haven't seen in the last 3 years?

MR. METALITZ: Well, I think you maybe -- if I can suggest, you might be looking at this through the wrong end of the telescope. I think the question is: if the exemption is allowed to come into force -- excuse me. If the prohibition is allowed to come into force for these products, for these works, which it has never done because the Librarian issued an exemption on October 28, 2000; if the prohibition comes into force, will it have a substantial adverse impact on the availability of this material for noninfringing uses? I think that's the question that's before you. And only if you find that it will have a substantial adverse impact can you justifiably extend the exemption.

Now, the number of people who can do it, and how often they do it, and what use they make of the exemption is relevant because Congress said if you find that the adverse impact is de minimis, then you should not recommend an exemption. It doesn't necessarily mean that if only six people can do it, the impact is necessarily de minimis. But I think it's a factor that you would want to take into account.

MR. KASUNIC: But isn't the question there whether the prohibition is causing an adverse effect on noninfringing uses?

Page 65


MR. KASUNIC: Not whether people, if there is an exemption, will be able to accomplish it? If this is a theoretical exemption anyway in some instances -- if so many people will not be able to accomplish, take advantage of, the exemption because of the technological savvy that would be required to effect the exemption -- can we use that technological hurdle as a barrier to finding the exemption in the next 3 years?

MR. METALITZ: Well, I think the problem with that reasoning is that it seems to say that the stronger the encryption, the lower the bar to recognizing an exemption. If you had an encryption that only two cryptographers in the world were competent to break, does that necessarily mean that the harm of recognizing an exemption would be de minimis? So I don't think it really correlates necessarily with the number of people who are able to do it.

I think the focus has to be on what substantial diminution of the public's access to, or the availability of, this material for noninfringing uses is attributable to 1201(a)(1); there's a causation element in here as well. And if in fact it only impedes a very few people from taking an action that, according to the testimony today, hasn't resulted in any reports that would fall within this category of noninfringing use during the past 3 years, then I think that's a relevant issue

Page 66

for you to look at in deciding whether the statutory standard has been met.

MR. KASUNIC: Well, the last thing I just want to clarify, I raised this in Washington but since it was in your reply comment, I just wanted some clarification.

What authority do you believe that we have -- at one point of your reply comment you mentioned that if we do find an exemption, it should be limited in some way. Where do you find that we have authority for placing conditions on an exemption, such as requiring permission from the company beforehand? How would that be possible in terms of designating a particular class of works on which we could fashion such conditions or such limitations?

MR. METALITZ: That's a big question that I'm sure we'll be returning to during the day and tomorrow. I think the primary way in which this exemption -- if you decide to recognize it -- ought to be limited is by shaving down the category of works to which it applies, so that it only applies to censorware, whatever the right definition of that is (and I'm sure Mr. Tyre can do a better job than I can of giving you one), and that it not apply to all these other types of security-related and other lists of websites that would appear in filtering software.

Now the reply comment does mention this issue of consent or whether there's a likelihood that access to this

Page 67

information would be granted, or whether there's in effect an exhaustion requirement that someone using the exemption would have to first ask for permission. I think that's probably better looked at in terms of trying to decide whether there's a basis for an exemption at all. And the testimony I heard, and I don't know that this is correct, is that basically it's very easy for someone to get at least one free bite at this database without going through decryption. It seems relevant to me, and it indicates that perhaps means other than an exemption would help to cure whatever adverse impact you find in this area. But, obviously, that's a contested issue before you and people's views are going to differ on it. But I think that that's where that evaluation would best fit.

MS. PETERS: Okay. Thank you.

Charlotte, do you have a few questions.



MS. DOUGLASS: I have one question here, Mr. Tyre, and one to Mr. Metalitz.

We talked a little bit, a lot actually, about whether or not it would make any sense to request permission from different companies because you wouldn't be able to get it. It seems to me that when we met in April there was talk from Mr. Burt of probably maybe an industry-wide agreement or an industry-wide consensus that there might be a possibility

Page 68

that they would be in a position to give you the lists. But you've read the testimony as well. Is it your sense that an industry-wide agreement would also be as useless as asking company by company? If, for example, Mr. Burt represents a number of, say, the nine big -- did that make any sense to you?

MR. TYRE: I do understand the question.

MS. DOUGLASS: Okay. Okay.

MR. TYRE: And with respect, I think it slightly misstates what he said.


MR. TYRE: And I actually can't see if he's sitting here behind me or not, but I almost hope that he is.

MS. DOUGLASS: I don't see him.

MR. TYRE: But first off, he made it very clear that in this context he was speaking only about his own company, N2H2. He was not speaking about either of the two companies that joined him in the joint reply, 8e6 Technologies and Be Safe Online. And he certainly was not speaking on behalf of any of the various other censorware companies such as WebSense, SmartFilter, SurfControl. We've all heard the list beforehand.

What he said, as I understand it, is that they've had some internal discussions, never resolved, within N2H2 that maybe if a reputable research organization such as Consumer Reports came to them and maybe if they agreed to an

Page 69

NDA, and maybe if they agreed to certain other factors, then they would let them have it.


MR. TYRE: There is zero chance on the face of this earth that, regardless of how reputable I might be in your eyes or in anybody else's eyes, Mr. Burt would consider me to be reputable. There is zero chance that I would agree to sign an NDA. Because what's the point of it if I sign an NDA? That's a nonstarter.

MS. DOUGLASS: Okay. Thank you for clarifying that.

Now, Mr. Metalitz, if an entire community of users consisted of a group, say, of about ten people, all of whom sought to do what was more or less clearly noninfringing work, and they all experienced the same problem, would you say in your estimation that this ten-person group is an insignificant number, by definition? Or could it, in light of the importance and indispensability of the research that they're doing and the fact that they constitute the entire community, be considered?

MR. METALITZ: I don't think there's any litmus test or any magic number below which it's automatically de minimis. I think you have to look at the type of noninfringing use that they're talking about. And my impression, anyway, is that they're really talking about

Page 70

criticism and comment, the types of reports, whether they're formal reports or not, or critiques of these various products. And I think that output is probably what you should be looking at more than the number of people that have contributed to the output. But, again, with this type of fair use -- and I'm assuming this is fair use -- like any type of fair use for purposes of this proceeding, it is not necessarily the case that one needs the preferred or optimal means of access in order to make fair use of the material. So you have to consider whether this is sufficiently available through other means that don't require conduct that's covered by 1201(a)(1) in order to justify the exemption. I don't think there's any magic number or any per se rule that would flow from that.

MS. DOUGLASS: Sure. I was just getting at the sort of numerical calculus.

Thank you very much.

MS. PETERS: Okay. Comment?

MR. CARSON: Yes. Just wanted to clarify something.

I didn't mean to be unfair to you, Mr. Tyre. I made the comment about how you perhaps ought to think about and get back to us with a stricter definition of what censorware is, or of what it is that you want us to exempt aside from the current class, which is this list of websites that are blocked by filtering software. But the same goes for you, Mr.

Page 71

Metalitz. You're the one who is proposing we narrow it down. I think it would serve your interests if you come up with the best definition you can of what you think we ought to be narrowing it down to, understanding that when you're doing that you're not necessarily asking us to exempt anything at all, but if we're going in that direction, this is what you want. We'll look at what you've both given us and we'll decide whether to do anything, and if so how to narrow it down, if at all.

MR. METALITZ: We'll certainly do that.

MS. PETERS: Thank you very much.

The first panel is concluded. We'll take a 10 minute break and be back starting at 11:15. We're already significantly behind.

(Whereupon, at 11:05 a.m., a recess was taken until 11:23 a.m.)