Spam as a Symptom of Electronic Communication Technologies that Ignore Social Requirements

Brian Whitworth (New Jersey Institute of Technology, USA)
Copyright: © 2009 | Pages: 10
DOI: 10.4018/978-1-60566-652-5.ch107

Spam, undesired and usually unsolicited e-mail, has been a growing problem for some time. A 2003 Sunbelt Software poll found that spam (or junk mail) had surpassed viruses as the number-one unwanted network intrusion (Townsend & Taphouse, 2003). Time magazine reports that for major e-mail providers, 40 to 70% of all incoming mail is deleted at the server (Taylor, 2003), and AOL reports that 80% of its inbound e-mail, 1.5 to 1.9 billion messages a day, is spam the company blocks. Spam is the e-mail consumer's number-one complaint (Davidson, 2003). Despite Internet service provider (ISP) filtering, up to 30% of in-box messages are spam. While each of us may take only seconds (or minutes) to deal with such mail, over billions of cases the losses are significant. A Ferris Research report estimates 2003 spam costs for U.S. companies at $10 billion (Bekker, 2003).

While improved filters send more spam to trash cans, ever more spam is sent, consuming an increasing proportion of network resources. Users shielded behind spam filters may notice little change, but the percentage of transmitted Internet e-mail that is spam has grown steadily: it was 8% in 2001, grew from 20% to 40% in six months over 2002 to 2003, and continues to grow (Weiss, 2003). In May 2003, spam e-mail exceeded nonspam for the first time; that is, over 50% of transmitted e-mail is now spam (Vaughan-Nichols, 2003). Informal estimates for 2004 are over 60%, with some as high as 80%. In practical terms, an ISP that needs one server for its customers must buy another just for spam almost no one reads, a cost passed on to users in increased connection fees. Pretransmission filtering could reduce this waste, but it creates another problem: spam false positives, that is, valid e-mail filtered as spam. If you accidentally use spam words, like enlarge, your e-mail may be filtered.
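The false-positive concern can be made concrete with a short base-rate calculation. The model and rates below are illustrative assumptions, not figures from the chapter: suppose a filter blocks all spam, and a small fixed fraction of what it blocks turns out to be valid mail. As the spam share of traffic rises, that same small fraction represents an ever larger share of the valid mail actually sent.

```python
def share_of_valid_mail_lost(spam_fraction, fp_share_of_blocked=0.01):
    """Estimate the share of valid mail lost to false positives.

    Illustrative model (not from the chapter): the filter blocks all
    spam plus enough valid mail that fp_share_of_blocked of its
    blocked messages are false positives.

    spam_fraction: share of all transmitted mail that is spam.
    fp_share_of_blocked: share of blocked mail that is actually valid.
    Returns the fraction of valid mail wrongly blocked.
    """
    valid = 1.0 - spam_fraction
    # Solve v / (spam_fraction + v) = fp_share_of_blocked for v, the
    # wrongly blocked valid mail as a share of all traffic.
    v = fp_share_of_blocked * spam_fraction / (1.0 - fp_share_of_blocked)
    return min(v / valid, 1.0)

for s in (0.50, 0.90, 0.99):
    # The same 1% false-positive rate costs ever more of the valid mail.
    print(f"spam at {s:.0%} of traffic: "
          f"{share_of_valid_mail_lost(s):.1%} of valid mail lost")
```

With these assumed numbers, a filter that looks 99% accurate at 50% spam (about 1% of valid mail lost) destroys nearly all valid mail once spam reaches 99% of traffic, which is the chapter's point that a small false-positive percentage becomes a much higher percentage of valid e-mail that fails.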
Currently, receivers can recover false rejects from their spam filter's quarantine area, but filtering before transmission means the message never arrives at all, so neither sender nor receiver knows there was an error. Imagine if the postal system shredded unwanted mail but lost valid mail in the process: people could lose confidence that the mail would get through. If a communication environment cannot be trusted, confidence in it can collapse. Electronic communication systems thus sit on the horns of a dilemma: reducing spam increases the delivery failure rate, while guaranteeing delivery increases spam rates. Either way, by social failure of confidence or technical failure of capability, spam threatens the transmission system itself (Weinstein, 2003). As the percentage of transmitted spam increases, both problems increase. If spam were 99% of sent mail, a small false-positive percentage would become a much higher percentage of valid e-mail that failed.

The growing spam problem is recognized ambivalently by IT writers who espouse new Bayesian spam filters but note, “The problem with spam is that it is almost impossible to define” (Vaughan-Nichols, 2003, p. 142), or who advocate legal solutions but concede that none have worked so far. The technical community seems to be in a state of denial regarding spam. Despite some successes, transmitted spam is increasing. Moral outrage, spam blockers, spamming the spammers, blacklists and whitelists, and legal responses have slowed but not stopped it. Spam blockers, by hiding the problem from users, may be making it worse, as a Band-Aid covers but does not cure a systemic sore. Asking for a technical tool to stop spam may be asking the wrong question. If spam is a social problem, it may require a social solution, which in cyberspace means technical support for social requirements (Whitworth & Whitworth, 2004).
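The Bayesian filters mentioned above score a message by combining per-word evidence of spamminess learned from past mail. A minimal naive-Bayes sketch with add-one smoothing follows; the training messages, words, and decision threshold are invented for illustration and are not the chapter's method:

```python
import math
from collections import Counter

def train(spam_docs, ham_docs):
    """Count word frequencies in spam and legitimate (ham) training mail."""
    spam_counts = Counter(w for d in spam_docs for w in d.lower().split())
    ham_counts = Counter(w for d in ham_docs for w in d.lower().split())
    return spam_counts, ham_counts, len(spam_docs), len(ham_docs)

def spam_score(message, spam_counts, ham_counts, n_spam, n_ham):
    """Log-odds that a message is spam, via naive Bayes with
    add-one smoothing. A score above 0 suggests spam."""
    score = math.log(n_spam / n_ham)  # prior odds from training mix
    vocab = set(spam_counts) | set(ham_counts)
    for w in message.lower().split():
        # Smoothed per-word probabilities; unseen words get count 0.
        p_w_spam = (spam_counts[w] + 1) / (sum(spam_counts.values()) + len(vocab))
        p_w_ham = (ham_counts[w] + 1) / (sum(ham_counts.values()) + len(vocab))
        score += math.log(p_w_spam / p_w_ham)
    return score

# Toy training data, invented for illustration.
spam = ["enlarge now free offer", "free money offer now"]
ham = ["meeting notes attached", "project schedule attached"]
model = train(spam, ham)
print(spam_score("free enlarge offer", *model))  # positive: looks like spam
print(spam_score("meeting schedule", *model))    # negative: looks legitimate
```

The chapter's caveat applies directly to this sketch: because “spam” resists definition, any such statistical filter trades false negatives against the false positives discussed above, and the threshold choice is a social judgment, not a purely technical one.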
