Filters

What is Filtering Software?
How Does Filtering Software Work?
What are Some of the Uses for Filtering Software?
How Well Does Filtering Software Work?
Who Makes Filtering Software?
Why is Filtering Software Controversial?  

What is Filtering Software? 

Internet filtering software is designed to enable organizations or individuals to restrict access to specific types of websites that may be deemed dangerous or inappropriate for a user or group of users.  Filtered websites typically include sites that host malicious code, such as phishing sites, and sites with content that is inappropriate for a workplace or home, such as pornographic websites or online casinos. 

There are three basic ways to accomplish Internet filtering: “real time filtering,” which blocks web pages based on an automated evaluation of their content; “white listing,” which allows access only to a “white list” of approved sites; and the most common form, “list-based” filtering.  Some filters combine two or even all three methods, or offer all three as options, as does CyberSitter.[i]

How Does Filtering Software Work? 

List-based Filtering
Nearly all of the major filtering vendors rely primarily on list-based filters.  A list-based filter uses a database of websites with assigned categories.  A filtering database, in its simplest form, consists of a file of records. Each record or entry in the database contains two fields or elements: a field containing the Uniform Resource Locator (URL) of a web page or an Internet Protocol (IP) address, and a field containing one or more subject categorizations. Below are some sample entries from a hypothetical filtering database:

URL Field                           Category Field
Hustler.com                         Pornography
News.yahoo.com                      News
Sports.yahoo.com                    Sports
192.168.1.1/bank-login-page.exe     Phishing

The above example oversimplifies many current filtering databases, which usually contain other fields such as dates and technical references.  

The advantages of a list-based filter are many.  First, a list-based filter is highly customizable: an administrator can select one or more of dozens of categories to block or monitor for the users on the filtered network or local computer.  Second, a list-based filter is generally more accurate than real-time filtering, because URLs can be manually inspected before they are added to the database.  Finally, a list-based filter is much faster than a real-time filter because no artificial intelligence-related processing is needed at the time of the request.  The primary disadvantage of a list-based filter is that the filtering company must know a website exists before it can categorize it.
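
In code, the core of a list-based filter is simply a lookup of a URL’s host against the category database, followed by a check against the categories the administrator has chosen to block.  The Python sketch below is a minimal illustration of that idea; the database entries and the set of blocked categories are hypothetical (mirroring the table above) and do not represent any vendor’s actual implementation.

    from urllib.parse import urlparse

    # Hypothetical filtering database: hostname or IP address -> assigned categories
    FILTER_DB = {
        "hustler.com": {"Pornography"},
        "news.yahoo.com": {"News"},
        "sports.yahoo.com": {"Sports"},
        "192.168.1.1": {"Phishing"},
    }

    # Categories the administrator has chosen to block on this network (illustrative)
    BLOCKED_CATEGORIES = {"Pornography", "Phishing", "Sports"}

    def is_blocked(url: str) -> bool:
        """Return True if the URL's host appears in the database under a blocked category."""
        host = (urlparse(url).hostname or "").lower()
        categories = FILTER_DB.get(host, set())
        return bool(categories & BLOCKED_CATEGORIES)

    print(is_blocked("http://sports.yahoo.com/nfl"))   # True:  "Sports" is blocked
    print(is_blocked("http://news.yahoo.com/world"))   # False: "News" is allowed
    print(is_blocked("http://unknown-new-site.com/"))  # False: not yet in the database,
                                                       # the main weakness of list-based filtering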

Real Time Filtering 

Real time filtering is also sometimes referred to as “artificial intelligence” or “dynamic filtering.”  Real time filtering is a descendant of the earliest and simplest form of filtering, “word blocking,” which blocks websites based on words and phrases.  Some early filters in the mid-1990s relied heavily on word blocking, with products that would block every page containing certain words and phrases like “free porn” or “nude pics.”  Overreliance on word blocking sometimes produced unanticipated blocking of websites, as occurred during the release of the explicit Starr Report in 1998. [ii] 
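
A mid-1990s-style word-blocking filter can be approximated in a few lines.  The sketch below is hypothetical and the phrase list is illustrative only; it also shows how overblocking happens, since any page that merely quotes a blocked phrase, news coverage included, gets rejected.

    # Hypothetical list of phrases an early word-blocking filter might reject
    BLOCKED_PHRASES = ["free porn", "nude pics"]

    def word_block(page_text: str) -> bool:
        """Block the page if any listed phrase appears anywhere in its text."""
        text = page_text.lower()
        return any(phrase in text for phrase in BLOCKED_PHRASES)

    print(word_block("Local weather forecast for the weekend"))                # False
    print(word_block("Click here for free porn and nude pics"))                # True
    print(word_block("The senator criticized sites offering free porn"))      # True, even though
                                                                               # this is news coverage:
                                                                               # the kind of overblocking
                                                                               # seen with the Starr Report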

Real time filtering now uses many other website clues in addition to words to make an on-the-fly determination of a website’s content.  Real time filtering is usually not offered as a stand-alone product, but rather as an add-on to a list-based filter, providing an added layer of protection.  A good example of this is SurfControl’s Enterprise Security Suite, which offers a “Dynamic Threat Database” in addition to “Real-time Threat Prevention.”[iii] 

White List Filtering 

A white list is really an “inverse filter”: instead of blocking a list of websites, it blocks every website except those on an approved list.  The advantages and disadvantages of this approach are obvious – a white list is highly secure, but very limiting.  Consequently, white list filtering is typically used for introducing children to the Internet, or for specialized workplace applications, such as a company workstation that is only used to access company websites.  A good example of a white list filter is The Children’s Internet:

 TCI and a team of educators screen, approve and index every Web page and then use proprietary SafeZone Technology® to “lock” the pages into the system, so parents can be confident their children are safe from harmful, dangerous or offensive Internet content.[iv] 
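
In code, a white list simply inverts the usual test: every request is blocked unless the host appears on the approved list.  The sketch below is a hypothetical illustration and is not based on The Children’s Internet or any other product; the approved hosts are made up for the example.

    from urllib.parse import urlparse

    # Hypothetical list of approved sites; everything else is blocked by default
    WHITE_LIST = {"www.nasa.gov", "kids.nationalgeographic.com", "intranet.example.com"}

    def allowed(url: str) -> bool:
        """Allow the request only if the host is explicitly approved."""
        host = (urlparse(url).hostname or "").lower()
        return host in WHITE_LIST

    print(allowed("http://www.nasa.gov/missions"))  # True:  on the approved list
    print(allowed("http://random-site.com/"))       # False: blocked by default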

What are Some of the Uses for Filtering Software? 

Filtering software is extremely flexible, and has many uses for securing networks and homes.  Among the most common uses are: 

Protecting Computers from Websites Hosting Malware 

Protecting computers from malware has become a strong driver of filtering adoption in recent years, as a result of the increased use of the web as a malware attack vector.[v]  Most business filtering companies now stress malware protection as the most important value proposition for filtering, as does Secure Computing: 

The best way for organizations to protect against online threats is via a layered approach with prevention management as a key component. Because SmartFilter® manages and controls initial access to the Web, it is the best solution for preventing users from accessing risky sites that host Spyware, phishing scams, spam and other malicious content.[vi] 

Filtering companies that sell this type of filtering include: Secure Computing, SurfControl, Websense, 8e6, and Blue Coat

Protecting Children from Harmful Content 

The filtering industry originated in the mid-1990s in response to the new problem of “cyber porn” and other content inappropriate for children.  The problem still exists today, as a 2007 survey published in the journal Pediatrics found that, “Forty-two percent of Internet users aged 10 to 17 surveyed said they had seen online pornography in a recent 12-month span. Of those, 66 percent said they did not want to view the images and had not sought them out.”[vii] 

Use of filtering by families has increased in response, as a 2005 survey by the Pew Internet & American Life Project found that “More than half (54%) of internet-connected families with teens now use filters.”[viii] 

Filtering companies that sell this type of filtering include: Content Watch, AOL, Microsoft, bSafe, CyberSitter, and Symantec

Protecting Companies from Liability 

In the early 2000s, a series of court cases held companies liable for creating a “hostile work environment” by allowing employees or customers to openly view pornography in the workplace. In 2003, the Minneapolis Public Library paid $435,000 to settle such a case.[ix] Consequently, liability protection became a leading driver of filtering adoption by companies, and is frequently cited by filtering vendors, as in this example from Websense: 

Inappropriate use of internet resources can expose organizations to legal liability. Content filtering software helps organizations define and enforce internet use policies that prevent employees from engaging in inappropriate behavior.[x] 

Filtering companies that sell this type of filtering include: Secure Computing, Websense, 8e6, and Blue Coat

Enhancing Productivity by Preventing “Cyber-Slacking” 

Various surveys over the years have shown that a significant number of employees with Internet access visit sites that are not related to work.  A 2005 survey by Salary.com put this number at 44%, and a Harris survey put it at 50%. [xi]  List-based filters are particularly well-suited to addressing cyber-slacking, as the employer can select categories of non-productive sites such as “Sports” or “Shopping” to block or limit.  

Filtering companies that sell this type of filtering include: Secure Computing, Websense, 8e6, and Blue Coat

Saving Bandwidth  

Internet use not related to work can cost companies money – particularly bandwidth-intensive applications like streaming media and file-sharing networks.  Secure Computing notes that: 

Organizations can spend tens of thousands of dollars a month providing this costly resource. As new users are added and activity increases, many simply buy a “wider pipe” to handle the increased load. But how much of an organization’s resources are used for non-essential bandwidth-intensive content like streaming audio, video, MP3’s and other downloads?[xii] 

Filtering companies that sell this type of filtering include: Secure Computing, Websense, 8e6, and Blue Coat

Preventing Inappropriate Use of Public Internet Workstations 

When public places such as Internet cafés, public libraries, and copy stores began offering public Internet access in the 1990s, many of these organizations decided that certain Internet content was inappropriate for public viewing.  The issue of filtering in public libraries was particularly controversial among free speech advocates, and it remains so today. 

Filtering companies that sell this type of filtering include: Secure Computing, SurfControl, Websense, 8e6, and Blue Coat

Restricting Access to Illegal or Culturally Sensitive Websites by Governments 

According to the OpenNet Initiative, dozens of governments filter web content.  The ONI states that: 

The number of states that limit access to Internet content has risen rapidly in recent years. Drawing on arguments that are often powerful and compelling such as “securing intellectual property rights,” “protecting national security,” “preserving cultural norms and religious values,” and “shielding children from pornography and exploitation,” many states are implementing extensive filtering practices to curb the perceived lawlessness of the medium.[xiii] 

How Well Does Filtering Software Work? 

Filtering research literature is diverse, and not all researchers agree upon appropriate methodology and sampling techniques. Most filtering studies attempt to measure effectiveness by replicating user behavior, such as searching for websites that a filter should block, and then measuring the degree of “underblocking.”  Some filtering studies also measure “overblocking” by searching for content that should not be blocked by the filter. 

Most tests of “underblocking” have found that pornography filters typically block between 90 and 97 percent of pornography.[xiv]  Underblocking tests of non-pornographic materials such as hate speech and violence have produced poorer results than pornography tests.  Tests of “overblocking” have generally shown that 5-10% of all blocked sites are blocked in error,[xv] though when viewed as a portion of all Internet traffic, overblocked sites have been shown to represent far less than 1% of traffic. 

University of Michigan Professor Paul Resnick, one of the authors of the well-regarded Kaiser Family Foundation study “See No Evil,” has written a helpful article on the topic of filtering research in the journal Communications of the ACM entitled “Calculating Error Rates for Filtering Software.”  
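
The two error measures can be made concrete: an underblocking rate is the share of objectionable test sites that slip through the filter, while an overblocking rate is the share of blocked sites that should not have been blocked.  The sketch below uses invented numbers purely for illustration; it is not drawn from any of the cited studies and reflects only one of several ways researchers calculate these rates.

    def underblocking_rate(objectionable_tested: int, objectionable_missed: int) -> float:
        """Fraction of sites that should have been blocked but slipped through."""
        return objectionable_missed / objectionable_tested

    def overblocking_rate(total_blocked: int, wrongly_blocked: int) -> float:
        """Fraction of blocked sites that should not have been blocked."""
        return wrongly_blocked / total_blocked

    # Invented numbers for illustration only
    print(f"Underblocking: {underblocking_rate(1000, 60):.1%}")  # 6.0% missed, i.e. 94% blocked
    print(f"Overblocking:  {overblocking_rate(500, 35):.1%}")    # 7.0% of blocks were errors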

This website maintains a database of filtering effectiveness tests. 

  

Who Makes Filtering Software? 

Filtering companies are usually divided into two broad areas: Business (or “enterprise”) and Consumer.  These categories are often subdivided further into Business/Education, Consumer/Client, and Consumer/ISP.  The Business/Consumer division usually reflects a technology differentiation as well, since business products are usually server-based and consumer products are usually client-based (run on a single PC).  This technology division is increasingly breaking down, however, as filtering, like other software, is being offered as a service over the Internet. 

Some of the companies that sell filtering software for the home include: 

bSafe, Centipede Networks, Content Watch, CyberSitter, FilterGate, GuardWare, Internet Safety Security Software, and Symantec 

Some of the companies that sell filtering software for business, schools, and libraries include: 

Secure Computing, Websense, 8e6, Blue Coat, St. Bernard, Trend Micro, Aladdin, Burstek, Barracuda Networks, ClearSwift, Content Watch, CornerPost, Cymphonix, Dan’s Guardian, Fortinet, Pearl Software, Shalla Secure, and URLBlacklist.com 

   

Why is Filtering Software Controversial? 

Filtering software has been controversial since its inception in the mid-1990s.  Some civil liberties advocates object to the entire idea of filtering software, because they consider filtering to be censorship, particularly in institutional settings, as does the group EPIC:[xvi] 

When the U.S. government requires blocking in public schools and libraries, the government mandates censorship in direct conflict with the U.S. Constitutional guarantees to free expression and freedom of association.  

Other critics, like the Online Policy Group, argue that filters are not effective: 

However, blocking technology can’t work because it often fails to block content it aims to block, often blocks other content it did not intend to block, and causes other administrative, technical, and ethical difficulties.[xvii] 

  

Still other critics worry that filtering software can result in social or political bias in the filtering databases themselves: 

Blocking technology blocks “controversial” materials related to certain issues or communities disproportionately more than other materials, thus unfairly discriminating against whole communities of people accessing, publishing, or broadcasting on the Internet.[xviii] 

Defenders of filtering software reply that numerous tests and product reviews show that filtering software is effective, and that the use of filtering software by governments and institutions can be beneficial to end users. 


[i] CyberSitter website, “Advanced Features,” http://www.cybersitter.com/advanced.htm

[ii] Wired News, “Will Filters Nix Starr’s Report?,” 09.10.98, http://www.wired.com/culture/lifestyle/news/1998/09/14950

[iii] SurfControl website, http://www.surfcontrol.com/Default.aspx?id=346&mnuid=1.4

[iv] The Children’s Internet website, http://www.thechildrensinternet.com/product.html#faq

[v] See “The Web as an Attack Vector,” Websense, http://www.websense.com/docs/WhitePapers/AvoidingNewestSecurityThreatswebisanattackvector.pdf

[vi] Secure Computing website, http://www.securecomputing.com/index.cfm?skey=1441

[vii] Wolak, Janis, et al. “Unwanted and Wanted Exposure to Online Pornography in a National Sample of Youth Internet Users.” Pediatrics 119 (2007): 247-257.

[viii] Pew Internet & American Life Project, Nov. 2005, http://www.pewinternet.org/pdfs/PIP_Filters_Report.pdf

[ix] See Adamson v. Minn. Public Library settlement, http://www.aele.org/law/Digests/empl198.html

[x] Websense website, http://www.websense.com/content-filtering/index.php 

[xi] “Internet Misuse Costs Employers,” http://www.technologynewsdaily.com/node/1068 

[xii] Secure Computing website http://www.securecomputing.com/index.cfm?skey=1371 

[xiii] ONI website, http://opennet.net/about-filtering 

[xiv] See filtering effectiveness test page at http://filteringfacts.files.wordpress.com/2007/11/table_of_filter_tests.pdf, particularly Philip Stark Report, eTesting Labs Report, Kaiser Family Foundation study. 

[xv] See filtering effectiveness test page at http://filteringfacts.files.wordpress.com/2007/11/table_of_filter_tests.pdf, particularly Certus Consulting Group Report 

[xvi] Online Policy Group website, http://www.onlinepolicy.org/access/blocking.shtml 

[xvii] EPIC website, http://www.epic.org/free_speech/censorware/ 

[xviii] EPIC website, http://www.epic.org/free_speech/censorware/
