On Sept. 15, 2016, AL Live! hosted a webinar featuring (in order of presentation) Bob Bocher (Fellow, ALA’s Office for Information Technology Policy), Doug Archer (Peace Studies and Global Affairs librarian at University of Notre Dame’s Hesburgh Libraries), Michael Robinson (Chair of the ALA Intellectual Freedom Committee’s Privacy Subcommittee and Head of Systems at the University of Alaska Anchorage’s Consortium Library), and Deborah Caldwell-Stone (Deputy Director of the ALA’s Office for Intellectual Freedom). I served as moderator.
This lively, well-attended workshop focused on four key issues.
There’s money on the table
First, as Bob Bocher relayed, 2014 E-rate changes have put a lot more money on the table. The funding has increased from $2.4 to $3.9 billion annually. The practical result, he said, is that now all applications will be funded. The savings to a library can be significant. E-rate provides discounts of 20-90% on:
- Telecommunication services (Category 1)
- Internet access (Category 1)
- Internal connections (Category 2)
If libraries do not accept E-rate money (or other applicable federal funds), the Children’s Internet Protection Act (CIPA) does not apply to them. However, with the acceptance of E-rate money, libraries will find themselves subject to two CIPA provisions. CIPA compliance requires that libraries adopt a “technology protection measure” (filtering); they must also adopt an Internet use policy. CIPA’s filtering mandate applies to Internet access and internal connections (routers and hubs), but not to generic telecommunications services. The E-rate changes result in sufficient funding, but also come with a loss: E-rate discounts for Plain Old Telephone Service (POTS) are being phased out. Meanwhile, Bocher reported that ALA’s Office for Information Technology Policy (OITP) is reviewing CIPA requirements and focusing on ways to more easily disable filters on request. For example, libraries with authentication systems can implement a process to allow adult patrons sitting at an internet-connected workstation to self-select that they want unfiltered access.
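The self-select process Bocher describes could be as simple as a per-session flag tied to patron authentication. The sketch below is a hypothetical illustration of that workflow (all names are invented, not any vendor’s actual system): only an authenticated adult patron can toggle unfiltered access for their own session, which is the disabling-on-adult-request that CIPA permits.

```python
# Toy sketch of a self-service filter-disable workflow.
# All class and function names are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class Session:
    patron_id: str
    is_adult: bool          # determined at authentication time
    unfiltered: bool = False

def request_unfiltered(session: Session) -> bool:
    """Grant unfiltered access only to authenticated adults;
    CIPA permits disabling the filter on an adult's request."""
    if session.is_adult:
        session.unfiltered = True
    return session.unfiltered

adult = Session("p123", is_adult=True)
print(request_unfiltered(adult))   # True

minor = Session("p456", is_adult=False)
print(request_unfiltered(minor))   # False
```

The point of the sketch is that the decision logic is trivial once authentication already distinguishes adult and minor cardholders; the library, not the filter vendor, controls the policy.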
The consequence of this new funding is clear: many libraries may reconsider their decision not to filter.
ALA has a history with filtering
Doug Archer recounted a brief history of ALA’s stance on filtering. In the early days of CIPA, filtering had many spectacular failures. People were unable to get to breast cancer treatment information, for example. ALA filed a lawsuit to overturn CIPA. However, the Supreme Court upheld CIPA, and decided that filtering was permissible provided that it blocked access to three specific categories of imagery unprotected by the First Amendment (see Law and Enforcement, below), and that the filtering could be promptly disabled upon adult request.
More recently, the Office for Information Technology Policy (OITP) and the Office for Intellectual Freedom (OIF) issued an updated report on filter effectiveness. (See Fencing Out Knowledge, Policy Brief No. 5, June 2014, by Kristen Batch.) The two original problems of filtering software remain. First, it underblocks: it fails to consistently block access to the three categories of illegal imagery. Second, it overblocks: it continues to prevent access to perfectly legal and constitutionally protected material. As a consequence, the official ALA position is that it cannot recommend the use of filtering, because filtering violates First Amendment rights.
But as Archer took pains to explain, ultimately the decision of libraries to filter or not to filter is local, and may be driven by a host of concerns: local politics, financial need, or when required by state law. While ALA cannot recommend filters, it supports libraries whether they filter or not.
But if libraries do decide to use filters, they should take active steps to minimize their failings. These steps include:
- select the most flexible filter possible (which probably means a more expensive filter),
- maintain as much local control as possible in implementation and operation of the filter, and
- use the lowest settings possible. That is, only block the three illegal kinds of imagery. Don’t activate all the bells and whistles of content categories the filter might offer.
Michael Robinson then revealed something that hasn’t gotten much play in the library world: some kinds of filtering represent a significant privacy and security threat.
He explained that filtering has three basic approaches:
- Access to specific domains or URLs is blocked (blacklist) or exclusively permitted (whitelist).
- Access is blocked according to specific protocols or ports (http, https, ftp, ssh, proxies, streaming, etc.)
- Access is blocked based on real-time inspection of web pages: keywords, phrases, or patterns in the content; types of embedded content (such as media or scripts); the source of the content (e.g., YouTube); or metadata of embedded content (such as the name of a JPEG file).
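The first and third approaches can be sketched in a few lines. The toy filter below is a hypothetical illustration only (the blocked domain and keyword lists are invented); real filters rely on large, vendor-maintained category databases rather than hand-built lists.

```python
# Toy illustration of two filtering approaches: a domain blacklist
# and keyword-based real-time content inspection. The lists here
# are placeholders, not real filter categories.
from urllib.parse import urlparse

BLOCKED_DOMAINS = {"example-blocked.test"}   # blacklist approach
BLOCKED_KEYWORDS = ("blockedword",)          # content-inspection approach

def is_blocked(url: str, page_text: str = "") -> bool:
    """Return True if the request should be blocked by either check."""
    domain = urlparse(url).hostname or ""
    if domain in BLOCKED_DOMAINS:            # blacklist hit
        return True
    lowered = page_text.lower()
    return any(kw in lowered for kw in BLOCKED_KEYWORDS)  # keyword hit

print(is_blocked("https://example-blocked.test/page"))       # True
print(is_blocked("https://example.org/", "harmless text"))   # False
```

Note that the keyword check requires reading the page content, which is exactly what becomes impossible over encrypted connections unless the filter breaks the encryption, as described next.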
Many websites have moved or are moving to https, a secure, encrypted protocol for communication between the user’s browser and the remote site’s web server. Today, about 50% of the web is encrypted. This is a best practice to ensure that patron communications – with their bank or doctor, for instance – cannot be intercepted. But https means that while the domain name remains visible, the specific page beneath it is hidden. To deal with this, many filters offer the ability to break encryption in order to apply the “real-time inspection” approach to https pages. That is, the filter inserts itself between the browser and the website to examine traffic, presenting a certificate in which it poses as the target website. Financial, commercial, legal, and medical information – along with usernames, passwords, account numbers, and other personally identifiable information – is thereby exposed. In technical terms, this is a “man-in-the-middle” attack – a hack that compromises both personal and system security. The intent of the vendor may not be malicious, but how would we know what is being recorded and with whom it is being shared? (Note also that in states that have patron privacy laws, these laws very likely apply to web use too.)
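One way library staff can check for this kind of interception is to inspect the certificate a site presents from inside the library’s network: under HTTPS inspection, the certificate’s issuer will be the filter vendor’s own certificate authority rather than a recognized public one. The sketch below, using Python’s standard-library ssl module, shows the idea; the hostname is a placeholder, and the live fetch is commented out since it needs network access.

```python
# Sketch: examine a site's TLS certificate issuer. On a network
# where a filter performs HTTPS inspection, the issuer will be the
# filter's own CA rather than a public certificate authority.
import socket
import ssl

def issuer_of(cert: dict) -> dict:
    """Flatten the 'issuer' field of ssl.getpeercert() output
    (a tuple of relative distinguished names) into a plain dict."""
    return {k: v for rdn in cert.get("issuer", ()) for (k, v) in rdn}

def fetch_cert(host: str, port: int = 443) -> dict:
    """Connect to a host over TLS and return its certificate."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.getpeercert()

# Live check (requires network; hostname is an example placeholder):
# print(issuer_of(fetch_cert("example.com")).get("organizationName"))
```

If the organization name printed is the filter vendor rather than a public CA, the library’s HTTPS traffic is being decrypted in transit.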
Law and Enforcement
Deborah Caldwell-Stone then summarized specifically what is actually illegal:
- First, CIPA requires that for adults, the filter must be set to block visual images that are obscene or child pornography.
- Second, CIPA requires that for minors (under the age of 17), the filter must be set to block visual images that are obscene, child pornography, or harmful to minors.
CIPA does not require:
- Blocking access to narratives or other text-based material.
- Blocking access to controversial viewpoints or subjects.
- Blocking access to social media sites or search tools.
- Tracking or monitoring users’ web surfing habits.
Child pornography is probably the easiest to define: it involves photos of minors engaged in sexual activity. Obscenity consists of sexually explicit content deemed to lack any literary, artistic, political, or scientific value; this is a complex finding of the courts. “Harmful to minors” includes constitutionally protected content that is legal for adults to view, but not protected for minors. It, too, is a finding of the courts. CIPA does not use the term “pornography,” which has no legal definition and can include content that is not illegal under CIPA.
Who oversees enforcement of CIPA? That job falls to the Federal Communications Commission, which has given libraries wide latitude on how to implement the requirements. Enforcement is a civil, administrative matter – not a criminal proceeding. In other words, libraries have the opportunity to make unique local and innovative decisions. They don’t have to accept the defaults of a proprietary, commercial system.
To summarize: All currently extant filtering software fails to fully comply with the provisions of CIPA. Furthermore, filters introduce unconstitutional restrictions on our patrons. Yet ALA supports its members and their local decisions, and recognizes that the availability of new E-rate money may urge a reconsideration of their previous decisions. If libraries do choose to implement filters, they need to implement them carefully, restricting them to only what CIPA requires, and avoiding the man-in-the-middle security breach.
Author: James LaRue, Director, Office for Intellectual Freedom & Freedom to Read Foundation