Internet Filtering: An Interpretation of the Library Bill of Rights
American Library Association
In the span of a single generation, the Internet has revolutionized the basic functions and operations of libraries and schools and has exponentially expanded both the opportunities and the challenges these institutions face in serving their users. During this time many schools and libraries in the United States have installed content filters on their Internet access. They have done so for a variety of reasons, not least of which is the requirement to comply with the Children’s Internet Protection Act (CIPA) in order to be eligible for federal funding or discounts through the Library Services and Technology Act, Title III of the Elementary and Secondary Education Act, and the Universal Service discount program (E-rate), or to comply with state filtering requirements that may also be tied to state funding. Their rationale for filtering is that filtered access is better than no access at all.
CIPA specifically requires public libraries and schools seeking E-rate discounts for Internet connections to install technology protection measures, i.e., content filters, to block two categories of visual images that are unprotected by the First Amendment: obscene images and images of child pornography. These are categories of images the Supreme Court has consistently ruled outside the constitutional protection of the First Amendment. For minors under the age of 17, CIPA also requires those libraries and schools to block a third category of images, those deemed "harmful to minors," which are constitutionally protected for adults but not for minors. CIPA does not require libraries and schools to block any other constitutionally protected categories of images, or any constitutionally protected categories of speech.
Research demonstrates that filters consistently both over- and underblock the content they claim to filter. Filters often block adults and minors from a wide range of constitutionally protected speech. Content filters are unreliable because computer code and algorithms are still unable to adequately interpret, assess, and categorize the complexities of human communication, whether expressed in text or images. In the case of websites containing sexually explicit images, the success rate of filters is frequently no better than chance. Moreover, the use of content filters cedes vital library and school resource and service decisions to external parties (private companies and contractors), who then exercise unknown and unaccountable influence over basic functions of the library or school and over users' access to its resources and services.1 Beyond this research, the experience of librarians and educators working within the constraints of CIPA confirms that filters are unreliable and are routinely circumvented by technologically adept users.
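How easily automated classification goes wrong can be seen in a toy example. The sketch below is hypothetical and far simpler than any commercial product, but it illustrates the underlying problem: a filter matching on keywords denies a protected health resource while passing content it was meant to catch.

```python
# A toy keyword-based filter, illustrating why such approaches both over- and
# underblock. This is a hypothetical sketch, not any vendor's actual algorithm.

BLOCKED_KEYWORDS = {"sex", "breast", "nude"}

def is_blocked(page_text: str) -> bool:
    """Block a page if any word on it matches a listed keyword."""
    words = page_text.lower().split()
    return any(word.strip(".,;:!?\"'") in BLOCKED_KEYWORDS for word in words)

# Overblocking: a constitutionally protected health resource is denied.
print(is_blocked("Early screening improves breast cancer outcomes."))      # True

# Underblocking: explicit content that avoids the listed terms passes.
print(is_blocked("Explicit adult material, spelled s3x to evade lists."))  # False
```

Commercial filters use more sophisticated classifiers than this, but the research cited above indicates they fail in the same two directions.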
Most content filters are designed and marketed for a much broader market than libraries and schools, and they offer options for filtering wide categories of protected speech, such as objectionable language, violence, and unpopular or controversial opinion, as well as entire categories of Internet-based services such as e-mail and social media. In addition, many content filters operate on an “opt-out” model: the filter defaults to “on,” and content remains blocked unless someone takes action to turn the filter off. Categories are frequently set to default to the most stringent settings and may be adjusted only by administrative intervention.
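A hypothetical configuration sketch (the category names and settings below are illustrative assumptions; real products treat such definitions as proprietary) shows how an “opt-out” design concentrates control at the administrative level:

```python
# Hypothetical sketch of an "opt-out" filter configuration. Category names and
# defaults are illustrative assumptions, not any product's actual settings.

DEFAULT_CATEGORIES = {
    # Far broader than the three image categories CIPA actually names:
    "adult_content": "blocked",
    "objectionable_language": "blocked",
    "violence": "blocked",
    "controversial_opinion": "blocked",
    "social_media": "blocked",
    "webmail": "blocked",
}

def set_category(categories: dict, name: str, setting: str, is_admin: bool) -> None:
    """Every category defaults to 'blocked'; only an administrator may relax one."""
    if not is_admin:
        raise PermissionError("Only an administrator may adjust filter settings.")
    categories[name] = setting

# A librarian without administrative rights cannot unblock social media:
# set_category(DEFAULT_CATEGORIES, "social_media", "allowed", is_admin=False)
# raises PermissionError
```

Under such defaults, the least restrictive configuration is never the starting point; it must be deliberately requested and administratively granted.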
Unblocking for adults on request was a key factor in the Supreme Court decision upholding CIPA in public libraries.2 In actual practice, unblocking has proved inconsistent in some libraries because of the unwillingness or inability of libraries to unblock when requested, especially when system administrators are outside of library administrative control. While some filtering systems allow librarians at the local or end-user level to modify the filter settings, others restrict that authorization to the highest administrative levels, creating lengthy delays in the processing of user requests to unblock erroneously filtered content.
The same situation occurs in schools. Such delays represent de facto blocking for both library users and K-12 students, because few users have the flexibility or time to wait hours or even days for resources to become available. This dilemma is exacerbated by the secrecy surrounding the category definitions and settings maintained by the filtering industry, frequently under the guise of trade secrets. There are also user privacy concerns when users must identify themselves and their interests in order to ask for specific websites to be unblocked. Both adults and students researching highly personal or controversial topics will be reluctant to subject themselves to administrative review in order to gain access to information that should be freely available to them.
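The workflow described above can be modeled as a simple request pipeline. The sketch below is a hypothetical illustration (the field names, function names, and turnaround estimates are assumptions, not drawn from any institution's actual procedures); it makes visible how escalation builds both the delay and the privacy cost into the process.

```python
# Hypothetical sketch of an escalated unblock-request workflow. All names and
# turnaround estimates are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class UnblockRequest:
    url: str
    requester_identity: str  # privacy cost: the user must self-identify
    stated_reason: str       # and disclose the topic of their research

def process(request: UnblockRequest, librarian_can_unblock: bool) -> str:
    if librarian_can_unblock:
        # Local authority: the request is handled at the point of need.
        return f"{request.url} unblocked at the local level (minutes)."
    # Escalation to system administrators outside library control: a turnaround
    # of hours or days is de facto blocking for most users.
    return f"{request.url} queued for administrative review (hours to days)."

request = UnblockRequest("example.org/teen-health", "student42", "class assignment")
print(process(request, librarian_can_unblock=False))
```

Granting librarians local unblocking authority removes both the delay and a layer of administrative review, which is why the policies recommended below emphasize minimal delay and respect for user privacy.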
In schools, the CIPA requirements have frequently been misinterpreted, resulting in overly restrictive filtering that blocks many constitutionally protected images and texts. Educators are unable to use the wealth of Internet resources for instruction, and minor students are blocked from content relevant to their school assignments and personal interests. Interactive websites and social media sites are frequently restricted and thus unavailable to educators for developing assignments that teach students to live and work in the global digital environment. In many cases students are prevented from creating and sharing their documents, videos, graphics, music, and other original content with classmates or the wider world; valuable learning opportunities are thus lost. These situations occur when librarians, educators, and educational considerations are excluded from the development and implementation of appropriate, least-restrictive filtering policies and procedures. Minor students, and the librarians and educators responsible for their learning experience, should not be blocked from websites or web-based services that provide constitutionally protected content meeting educational needs or personal interests, even though some may find that content objectionable or offensive. Minors and the adult educators who instruct them should be able to request the unblocking of websites that do not fall under the categories of images CIPA requires to be filtered.
CIPA-mandated content filtering has had three significant impacts on our schools and libraries. First, it has widened the divide between those who can afford to pay for personal access and those who must depend on publicly funded (and filtered) access. Second, when content filtering is deployed to limit access to what some may consider objectionable or offensive, minority viewpoints, religions, and controversial topics are often swept into the categories deemed objectionable or offensive. Filters thus become tools of bias and discrimination, marginalizing users by denying or abridging their access to these materials. Finally, when overblocking occurs in public libraries and schools, library users, educators, and students who lack other means of Internet access are limited to the content allowed by unpredictable and unreliable filters.
The negative effects of content filters on Internet access in public libraries and schools are demonstrable and documented. Consequently, consistent with previous resolutions, the American Library Association cannot recommend filtering.3 However, the Association recognizes that local libraries and schools are governed by local decision makers and local considerations and often must rely on federal or state funding for computers and Internet access. Because adults and, to a lesser degree, minors have First Amendment rights, libraries and schools that choose to use content filters should implement policies and procedures that mitigate the negative effects of filtering to the greatest extent possible. The process should encourage and allow users to ask for filtered websites and content to be unblocked, with minimal delay and due respect for user privacy.
1 Kristen R. Batch, “Fencing Out Knowledge: Impacts of the Children’s Internet Protection Act 10 Years Later” (ALA OITP & OIF Policy Brief No. 5, June 2014).
2 United States v. American Library Association, Inc., 539 U.S. 194 (2003).
3 “Resolution on the Use of Filtering Software in Libraries” (1997) and “Resolution on Opposition to Federally Mandated Internet Filtering” (2001).
Adopted June 30, 2015, by the ALA Council.
See Also
- “Guidelines to Minimize the Negative Effects of Internet Content Filters on Intellectual Freedom” (2017) provides specific steps to reduce the impact of filtering on intellectual freedom.