Watching the Web: An Ontological and Epistemological Critique of Web-Traffic Measurement

Sam Ladner
Copyright: © 2009 |Pages: 15
DOI: 10.4018/978-1-59904-974-8.ch004

Abstract

This chapter aims to improve the rigor and legitimacy of Web-traffic measurement as a social research method. I compare two dominant forms of Web-traffic measurement and discuss the implicit and largely unexamined ontological and epistemological claims of both methods. Like all research methods, Web-traffic measurement has implicit ontological and epistemological assumptions embedded within it. An ontology determines what a researcher is able to discover, irrespective of method, because it provides a frame within which phenomena can be rendered intelligible. I argue that Web-traffic measurement employs an ostensibly quantitative, positivistic ontology and epistemology in hopes of cementing the “scientific” legitimacy these traditions engender. But such claims to “scientific” method are unsubstantiated, thereby limiting the efficacy and adoption rates of log-file analysis in general. I offer recommendations for improving these measurement tools, including more reflexivity and an explicit rejection of truth claims based on positivistic science.
Chapter Preview

The Origination Of Web-Traffic Measurement

Web-traffic measurement was not designed or intended to be a tool for social researchers. It emerged out of the technical need to monitor Web server performance. The World Wide Web, born in 1990 and popularized by the mid-1990s, operates by client computers, connected to the Internet, requesting files from server computers (World Wide Web Consortium, 2007). These server computers were increasingly asked to “serve up” more and more files, making their response time and performance an issue.

Key Terms in this Chapter

Interpretivism: A tradition in social and humanities research that assumes findings are to be interpreted by the researcher. This contrasts with positivism, which assumes the researcher “finds” or simply “observes” findings.

Electronic Commerce Research: All forms of investigation of online selling of goods or services.

Research Methodology: General knowledge approaches to conducting and designing research.

Positivist Epistemology: Also referred to as “positivism,” this is the school of research thought that sees observable evidence as the only form of defensible scientific findings. Positivist epistemology, therefore, assumes that only “facts” derived from the scientific method can make legitimate knowledge claims. It also assumes the researcher is separate from and does not affect the outcomes of research.

Clickstream Tracking: The passive collection of data that computer users generate when they click the mouse on a Web site. A computer user’s “clickstream” is the list of events he or she has initiated by clicking the mouse.

User Experience: Refers to the immersive character of technology use and is typically evoked by designers of technology. The “user experience” is assumed to be architected by interaction designers.

Sociology of Computing: A stream in sociology that researches the interactions between humans and computers as well as the social effects of using computers.

Web Analyst: A job title used by private-sector practitioners, which typically involves analyzing Web-traffic data.

IS Research Methodologies: Refers to the common research methods used by information systems (IS) researchers.
