CENSORWARE

How well does Internet filtering software protect students?

It's every school technology coordinator's worst nightmare: A young student sits down in front of a networked classroom PC to do some research for a class assignment about the presidency. Opening a web browser to go to the White House web site, she types www.whitehouse.com, unaware that the web site housed at 1600 Pennsylvania Ave. exists in the .gov -- not .com -- domain. The mistake is minor, but what happens next isn't: This student's online field trip to the Oval Office is about to be rudely hijacked to a commercial pornography site featuring graphic photos of sex acts on its front page.

The anecdote might be hypothetical, but the "White House" porn site is real. Vendors of Internet filtering and blocking software are fond of telling this story because it illustrates how easily children can be inadvertently exposed to online pornography. In declaring the Communications Decency Act unconstitutional last June, the U.S. Supreme Court placed the onus on schools to shield students from online indecency, and makers of so-called censorware have been quick to respond.

Their solution is simple: Buy our software, and your students will be safeguarded from exposure to pornography, hate speech, violent imagery, and other inappropriate content on the Internet.

But vocal opponents of censorware see a different picture. They see schools abdicating their supervisory role to software companies that are ill-equipped to discern which sites are educationally appropriate and -- in some cases -- are motivated by conservative agendas to block students from liberal points of view. In addition to blocking access to a great deal of educationally valuable information, critics say, censorware products provide no guarantee against porn or other truly objectionable material because the vendors can't hope to keep tabs on a web that -- by some estimates -- doubles in size every six months.

In the midst of the spirited debate over the use of censorware, however, one consensus among school technology leaders emerges: Schools should consider their goals and options carefully and conduct hands-on evaluations of several competing products before making a judgment about whether -- or how -- to filter Internet access for students.

How filters work

Censorware products typically use a combination of several filtering and blocking strategies, and school officials can often choose which of these strategies to enable or disable. The least sophisticated weapon is keyword blocking, which compares the text of web pages and other Internet content against a list of undesirable words and then either removes the words or blocks the offending page altogether.

The simplicity of the keyword blocking approach can easily lead to cases of mistaken identity, though. On the lookout for words such as "XXX," "sex," and "dykes," censorware products have blocked web pages such as those for Superbowl XXX, Mars Explorer, and the University of Kansas Medical Center's Archie R. Dykes Library, to name just three examples.

One product, CYBERsitter, yanks offending words from web pages without providing a clue to the reader that the text has been altered. The mangled text that results from this intervention might change the meaning and intent of a sentence dramatically. For example, because "homosexual" is in the list of CYBERsitter's forbidden words, the sentence, "The Catholic church is opposed to all homosexual marriages" appears to the user as, "The Catholic church is opposed to all marriages." (Brian Milburn, CEO of Solid Oak Software, the maker of CYBERsitter, declined to talk to Electronic School for this story.)
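
To make the approach concrete, here is a minimal sketch of how keyword filtering of this kind might work. The word list, matching rules, and function names are hypothetical illustrations, not any vendor's actual implementation.

```python
import re

# Hypothetical word list for illustration only; real products ship much
# longer, proprietary lists and vary in how they match.
BANNED_WORDS = ["xxx", "sex", "dykes"]

def is_blocked(text: str) -> bool:
    """Block a page outright if any banned word appears as a substring."""
    lowered = text.lower()
    return any(word in lowered for word in BANNED_WORDS)

def strip_words(text: str, words) -> str:
    """Silently delete matching words, leaving the reader no clue."""
    for word in words:
        text = re.sub(word, "", text, flags=re.IGNORECASE)
    return re.sub(r"\s+", " ", text).strip()

# Substring matching produces the kinds of false positives described above:
print(is_blocked("Superbowl XXX halftime highlights"))              # True
print(is_blocked("http://mars.example.gov/marsexplorer/"))          # True ("marSEXplorer")
print(is_blocked("Archie R. Dykes Library, University of Kansas"))  # True

# Silent word removal quietly changes the meaning of a sentence:
print(strip_words("The Catholic church is opposed to all homosexual marriages.",
                  ["homosexual"]))
# -> "The Catholic church is opposed to all marriages."
```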

A more sophisticated approach, used by many vendors, is to block individual web pages by specific URLs. Typically, vendors use automated web crawlers to search for suspicious pages. Human reviewers then look at each page in turn and rate it accordingly. For example, Cyber Patrol, a popular product that also licenses its database to several other vendors, rates sites according to the following categories: violence/profanity, partial nudity, full nudity, sexual acts, gross depictions, intolerance, satanic or cult, drugs and drug culture, militant/extremist, sex education, questionable/illegal and gambling, and alcohol and tobacco.

Most vendors allow schools to pick and choose which categories they wish to block, but none permit educators to view the full list of blocked sites, which vendors encrypt to prevent misappropriation by competitors or students. Schools have no way of knowing whether a particular site is blocked -- or why -- without trying a site and seeing what happens. This is an important limitation, many educators say, because vendors often incorrectly categorize sites.
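
A rough sketch of the idea follows, using a made-up blocklist and a few category names modeled on those listed above; an actual product's encrypted database and matching rules would differ.

```python
# Hypothetical URL blocklist keyed by category; real vendors distribute an
# encrypted database that administrators cannot inspect directly.
BLOCKLIST = {
    "http://example.com/casino/":        "questionable/illegal and gambling",
    "http://example.org/photo-gallery/": "full nudity",
    "http://example.net/teen-health/":   "sex education",
}

# Categories this hypothetical school has chosen to block.
BLOCKED_CATEGORIES = {"full nudity", "sexual acts",
                      "questionable/illegal and gambling"}

def allow(url: str) -> bool:
    """Allow the request unless the URL is rated in a blocked category."""
    category = BLOCKLIST.get(url)
    return category is None or category not in BLOCKED_CATEGORIES

print(allow("http://example.com/casino/"))         # False: category is blocked
print(allow("http://example.net/teen-health/"))    # True: "sex education" not selected
print(allow("http://example.org/brand-new-site/")) # True: never reviewed, so it slips through
```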

"We have about a dozen people looking for sites," says Susan Getgood of Microsystems Software, the maker of Cyber Patrol. As is typical for the industry, Microsystems Software does not require that its site raters have backgrounds in library science, but they must be either a parent or a teacher, Getgood says. With the aid of automated web crawlers, it's "highly doable" for Cyber Patrol to keep track of bad sites, she says. "But no site is added to the list unless it has been viewed by a human being," Getgood adds.

Yet at the time this was written, Electronic School discovered by simple trial and error that Cyber Patrol blocked access to the "Educators' Home Page for Tobacco Use Prevention" on a web site run by the Maryland Department of Health and Mental Hygiene's Local and Family Health Administration. On the other hand, Cyber Patrol allowed access to a web site called "How to Tell Right From Wrong," featuring half a dozen graphic photos of aborted fetuses -- presumably because the Cyber Patrol reviewers were unaware of the page's existence.

Critics argue that these twin drawbacks are inherent flaws in censorware products: Some sites that should be accessible get blocked, and some sites that should be blocked manage to slip through. Vendors respond that schools can add to or delete from the list of blocked sites as they see fit or as need arises. But some educators wonder whether a student is likely to ask a teacher to unblock a site that deals with a personally sensitive issue such as teen pregnancy, abuse, homosexuality, or sexually transmitted disease. Indeed, students might not realize that such sites exist.

Rating systems

Ratings are another approach to blocking. The Platform for Internet Content Selection (PICS) protocol, which has been adopted by Microsoft's Internet Explorer web browser and will likely be adopted by Netscape Navigator as well, has enabled several rating systems. The RSACi rating system (developed by the Recreational Software Advisory Council) and the SafeSurf rating system depend on Internet publishers to rate their own web pages, while the Net Shepherd rating system is based on ratings by third parties.
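
As a rough illustration, the sketch below reads an RSACi-style PICS self-rating (nudity, sex, violence, and language, each on a 0-4 scale) and compares it with locally configured limits; the label text, the limits, and the parsing shortcut are all simplified assumptions, not a full PICS implementation.

```python
import re

# Hypothetical local policy: maximum acceptable RSACi level per category
# (n = nudity, s = sex, v = violence, l = language, each rated 0 through 4).
MAX_LEVELS = {"n": 0, "s": 0, "v": 2, "l": 1}

# An RSACi-style PICS label roughly as a publisher might embed it in a page's
# <META http-equiv="PICS-Label"> tag; the rating values here are made up.
LABEL = '(PICS-1.1 "http://www.rsac.org/ratingsv01.html" l r (n 0 s 0 v 3 l 2))'

def label_allowed(label: str, limits: dict) -> bool:
    """Return True only if every rated category is at or below the local limit."""
    match = re.search(r"r \(([^)]*)\)", label)
    if not match:
        return False  # unrated or unparseable: treat as blocked
    tokens = match.group(1).split()
    ratings = dict(zip(tokens[0::2], (int(v) for v in tokens[1::2])))
    return all(ratings.get(cat, 0) <= limit for cat, limit in limits.items())

print(label_allowed(LABEL, MAX_LEVELS))  # False: violence level 3 exceeds the limit of 2
```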

PICS is unlikely to be a realistic solution for schools anytime soon, though, as only a small proportion of web sites have been rated so far. Critics say publishers have little incentive to rate their own sites, that self-rating takes too much time and effort, and that the rating categories do not apply to many sites. Even the White House, which promotes web ratings as a means to protect children, had not rated its own site as this article was written. The MSNBC news site recently abandoned an attempt at self-rating as unworkable.

Third-party rating systems have problems, too, because any system that depends on strangers to apply subjective ratings to a vast universe of web pages runs the risk of being out of touch with local community norms. When Electronic School performed a sample search for the word "breast" using an online demo of the Net Shepherd product, three of the hits were links to photos of nude breasts.

"It would appear that the opinion of the person who reviewed the site is that these images are not offensive to them," said Ron Warris, vice president of technology for Net Shepherd, Inc., when told of this result. Warris added that he would have the pages rerated.

"You have to settle for an approximate match" when relying on a third-party rating system, says Paul Resnick, an associate professor at the University of Michigan's School of Information and the chairman of the PICS working group at the World Wide Web Consortium, the MIT-based organization that authored the PICS standard. "That is the nature of relying on someone else's judgement about material."

Installing censorware

Censorware can be installed in several ways. Client-based censorware is designed to be installed and configured on each computer for which Internet access is to be restricted. Periodic updates of the list of blocked sites must be downloaded manually to each computer, which can quickly become a large administrative task. (Some client-based products do allow updates to be performed over a local-area network, however.)

For schools or districts with a large installed base of networked computers, proxy server-based products can be a more manageable and technically sophisticated solution. In this configuration, the blocking takes place on a special server that is located "upstream" from the classroom computers on the school network and that updates itself automatically from the vendor's online database of blocked sites. A proxy server also has the added benefit of speeding up access times by storing frequently accessed pages in a local cache.

Using proxy-based filtering in combination with a network operating system that assigns each user a logon ID and password, such as Windows NT Server, schools can set up different filtering criteria for different groups of students. This solution can go a long way toward age-appropriate filtering, for example by allowing only high school students to access sites that have been placed by the filtering vendor in the "safe sex" category.
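
A simplified sketch of that arrangement follows. The group names, category assignments, and lookup tables are hypothetical; a real proxy product would consult the network operating system's account database and the vendor's rated site list.

```python
# Hypothetical per-group policy, keyed by the logon group the network
# operating system reports for the requesting user.
GROUP_POLICY = {
    "elementary":  {"partial nudity", "full nudity", "sexual acts",
                    "violence/profanity", "sex education"},
    "high_school": {"partial nudity", "full nudity", "sexual acts"},
}

# Hypothetical category lookup, standing in for the vendor's rated database.
SITE_CATEGORY = {
    "http://example.org/safer-sex-faq/": "sex education",
    "http://example.com/photo-gallery/": "full nudity",
}

def proxy_allows(user_group: str, url: str) -> bool:
    """Decide at the proxy whether this user's group may fetch the URL."""
    category = SITE_CATEGORY.get(url)
    blocked = GROUP_POLICY.get(user_group, set())
    return category is None or category not in blocked

print(proxy_allows("elementary",  "http://example.org/safer-sex-faq/"))  # False
print(proxy_allows("high_school", "http://example.org/safer-sex-faq/"))  # True
```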

But the proxy server solution has drawbacks, too: Each computer's web browser must be manually configured to direct its requests through the proxy server, a time-consuming task when there are a large number of networked computers to set up. And wily students might be able to route their browsers around the proxy server. To prevent this, some school districts use a firewall in combination with a proxy server. A firewall -- a hardware or software filter that guards the intersection of the school's network and the Internet -- can be configured to disallow any traffic that does not pass through the proxy server. This solution also provides the side benefit of protection against hacker intrusion from the outside.
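
The logic of such a firewall rule can be sketched roughly as follows; the addresses and port numbers are invented for illustration, and an actual firewall would express the same policy in its own rule syntax.

```python
# Hypothetical addresses: the filtering proxy and the web ports it relays to.
PROXY_ADDRESS = "10.0.0.5"
WEB_PORTS = {80, 443}

def firewall_permits(source_ip: str, dest_port: int) -> bool:
    """Let web traffic leave the school network only if it comes from the proxy."""
    if dest_port in WEB_PORTS:
        return source_ip == PROXY_ADDRESS
    return True  # other traffic is governed by separate rules

print(firewall_permits("10.0.0.5", 80))   # True: request relayed through the proxy
print(firewall_permits("10.0.3.77", 80))  # False: a browser routed around the proxy
```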

To block or not to block

How many school districts are using Internet filtering and blocking software? Exact figures are hard to come by, but in a recent poll of 295 teachers, technology directors, school board members, and other educators attending the national Technology+Learning conference, 51 percent said they were currently using censorware for all or some students in their district.

Not surprisingly, educators are divided on the efficacy and appropriateness of the use of Internet blocking and filtering software in schools.

"Using a computer that had Surfwatch installed on it, I was able to download information on how to build a bomb, how to contact a satanic cult, how to sabotage various systems within a building, read up on neo-Nazi propaganda, and learn how to commit crimes using cellular telephones," says Bill Lowenburg, a librarian and technology trainer in the Stroudsburg (Pa.) Area School District. "On the other hand, I was not able to access the English Server at Carnegie Mellon University, because it apparently had 'objectionable' content on it."

Yet many school technology coordinators argue that the inexact science of Internet filtering and blocking is a reasonable trade-off for greater peace of mind. Given the political reality in many school districts, they say, the choice often comes down to censorware or no Internet access at all.

"It would be politically disastrous for us not to filter," says Joe Hill, supervisor of math and technology at the Rockingham County (Va.) Public Schools. "All the good network infrastructure we've installed would come down with the first instance of an elementary school student accessing some of the absolutely raunchy sites out there. Parents trust that schools are safe sites for their children in all ways, and that includes the Internet. It is much better to err on the side of caution in blocking sites."

Kerry Day, technology specialist for the North Sanpete School District in Mount Pleasant, Utah, agrees.

"A conservative group called the Eagle Forum recently tried to persuade the state legislature to cut off all Internet access to public schools," Day says. Although that effort was not successful, he says, "it wouldn't take too many incidents for them to have enough ammunition to succeed."

Politics aside, schools and communities need to carefully consider all their options when making decisions about implementing censorware, says Karen Schneider, a government librarian and library-press columnist. Last year, Schneider headed up The Internet Filter Assessment Project (TIFAP), a six-month-long evaluation of more than a dozen censorware products by a group of librarians scattered across the Internet. The TIFAP study -- which provided the basis for Schneider's newly released book, A Practical Guide to Internet Filters -- concluded that filters hamper legitimate information gathering unless administrators disable keyword blocking and all blocking categories except for those that cover pornographic sites.

"I try to tell people, 'Slow down and think carefully about the impact of what you're doing," Schneider says. "Give these tools as much scrutiny as you would any other purchase, because they do affect what information is available. And if you're looking for guarantees -- there are none."

Whose agenda?

The concept of local control of curriculum through the process of neighborhood citizens serving on school boards is a long-standing and cherished tradition in American public education. But many argue that with censorware in place, school districts give up ultimate control of what students can and can't see.

"The problem with filtering is that you let one group or organization set your agenda," says Carol Simpson, a library technology administrator in the Mesquite, Texas, public schools. "When filters block animal-rights sites because of 'gross depictions' but not antiabortion sites for the same reason, we're not dealing with a pornography filter, we're dealing with a political filter. I tell people, 'Do you want some software company in San Francisco deciding what your kids can see?'"

The program that has come under the most fire from free-speech advocates over the past year is CYBERsitter, which blocks the sites for the National Organization for Women as well as the Gay & Lesbian Alliance Against Defamation. Until recently, the program also blocked the web site of a teen anticensorship group called Peacefire, which is critical of CYBERsitter's blocking policies.

Peacefire's webmaster, Benjamin Jenkins, is a 17-year-old senior at Community High School in Ann Arbor, Mich. The school is involved in a project with the University of Michigan to develop "a new way of teaching science, which includes computer technology highly integrated into the curriculum," Jenkins says. As a sophomore, Jenkins was hired by the university to maintain the computers and network involved in the program. Internet use at the school has successfully relied on education and enforcement of an Acceptable Use Policy, Jenkins says: "Students are informed of the tentative nature of our connection to the Internet--they respect that, and behave responsibly online.

"We have always felt that filtering software is not only ineffective, but also a violation of the trust between students and staff," Jenkins adds. "Unfortunately, most of the censorware companies block anything controversial, not just pornography. I find it very discouraging that this includes information like suicide prevention, safe sex, and gay youth resources."

Indeed, it is at the high school level that the most serious free-speech issues arise over the use of censorware, says Ann Beeson, a national staff attorney for the American Civil Liberties Union. As counsel for plaintiffs in ACLU v. Reno, Beeson was a primary architect of the landmark case in which the Supreme Court last year declared the federal Communications Decency Act unconstitutional.

"The basic problem is that the filters aren't perfect and they tend to overblock," Beeson says. Although the extent to which students have First Amendment rights is not clear, Beeson says, older minors have a more clearly defined need for information on topics such as safer sex, AIDS, and gay and lesbian issues. And even if the use of censorware doesn't put a school in a worse position legally, Beeson says, it does create a false sense of security.

"As a practical matter, schools are not worse off for trying to screen," agrees Jonathan D. Wallace, a New York-based attorney and software executive and author of the book Sex, Laws and Cyberspace. "But the single most important thing is that filtering software is a placebo. These issues can be handled perfectly by a teacher standing in the classroom, seeing what's on the screen. It's complete self-deception to think we can make a software program that can make these kinds of decisions for us."

And as for the hypothetical case of the student who mistakenly ends up at the "White House" porn site? Carol Simpson puts it this way: "That's what the 'Back' button on the web browser is for."

By Lars Kongshem

Lars Kongshem is an associate editor and webmaster of Electronic School and The American School Board Journal.

INTERNET FILTERING, BLOCKING, AND MONITORING PRODUCTS

Access Management Engine. Bascom Global Internet Services, Inc.

AUP Action Tools. iTECH, Inc.

BESS Internet Filtering Service. N2H2, Inc.

BorderManager. Novell, Inc.

ChoiceNet. Livingston Enterprises, Inc.

Click and Browse Junior. NetWave, Inc.

Cyber Patrol. Microsystems Software, Inc.

Cyber Sentinel. Security Software Systems, Inc.

CYBERsitter. Solid Oak Software, Inc.

Cyber Snoop. Pearl Software, Inc.

Disk Tracy. WatchSoft, Inc.

EdView. EdView, Inc.

Elron Internet Manager. Elron Software, Inc.

GuardiaNet. Landmark Community Interests, LLC.

I-Gear. Unified Research Laboratories, Inc.

InterGate. Internet Products, Inc.

Internet WatchDog. Algorithm, Inc.

The Library Channel. vImpact, Inc.

Library Safe Internet System. NetFilter Technologies.

Net Nanny. Net Nanny Software International, Inc.

NetRated. PC DataPower.

Net Shepherd. Net Shepherd, Inc.

NetSnitch. NetSnitch, LLC.

SafeSurf. SafeSurf.

Sequel Net Access Manager. Sequel Technology.

SessionWall. AbirNet, Inc.

SmartFilter. Secure Computing Corp.

SurfCONTROL. JSB Computer Systems Ltd.

Surf Guard. Smart Team, Inc.

SurfWatch. Spyglass.

Triple Exposure. Innovative Protection Solutions Corp.

The Wall. Raptor Systems, Inc.

WatchGuard. WatchGuard Technologies, Inc.

WebChaperone. WebCo, Inc.

WebNOT. Raptor Systems, Inc.

WebSENSE. NetPartners Internet Solutions, Inc.

Web Traffic Express. IBM Corp.

WizGuard. WizGuard Company.

X-STOP. LOG-ON Data Corp.

 

RATING SYSTEMS

PICS. Platform for Internet Content Selection.

RSACi. Recreational Software Advisory Council.

Net Shepherd.

SafeSurf.

evaluWEB.

 

FURTHER READINGS

ACLU. American Civil Liberties Union.
- Cyberliberties page
- Fahrenheit 451.2: Is Cyberspace Burning?

ALA. American Library Association.
- Statement on library use of filtering software

CDT. Center for Democracy and Technology.
- Free speech issues

Censorware search engine.

CPSR. Computer Professionals for Social Responsibility.
- Filtering FAQ

EFF. Electronic Frontier Foundation.

Enough is Enough.

EPIC. Electronic Privacy Information Center.
- Censorware page
- EPIC Report: "Faulty Filters"

The Ethical Spectacle.
- The Censorware page
- Blacklisted by Cyber Patrol

Family PC magazine review of filtering software.

Filtering Facts.

IFEA. Internet Free Expression Alliance.

Inventory of filtering technology.

PC Magazine review of filtering software.

Peacefire.

SAFE. MIT Student Association for Freedom of Expression.
- Information about Labeling and Rating Systems

TIFAP. The Internet Filter Assessment Project.

Reproduced with permission from the January 1998 issue of Electronic School. Copyright © 1998, National School Boards Association. This article may be saved to disk, printed out for individual use, or reproduced in quantities of less than 100 copies for academic use only, provided this copyright notice remains intact on each copy. This article may not be otherwise transmitted or reproduced without the consent of the Publisher. For more information, contact Magazines Coordinator Jo Surette, (703) 838-6739.
