Web Search Privacy Evaluation Metrics


Copyright: © 2023 |Pages: 17
DOI: 10.4018/978-1-6684-6914-9.ch003

Abstract

Privacy quantification methods measure the knowledge an adversarial search engine has obtained with and without privacy protection mechanisms; in other words, they calculate privacy exposure. Private web search techniques are based on many methods (e.g., proxy services, query modification, query exchange, and others), and this variety has prompted researchers to evaluate their work in different ways. This chapter introduces the metrics used to evaluate user privacy (protection), as well as the metrics used to evaluate the performance of privacy attacks and theoretical evaluation approaches.

Introduction

Web search has been the most dominant online activity over the last two decades due to the sheer abundance of information on the Web (El-Ansari, Beni-Hssane, Saadi, & El Fissaoui, 2021; Khan & Ali, 2013; Khan, Ullah, Khan, Uddin, & Al-Yahya, 2021; Preibusch, 2015). This abundance exceeds human processing abilities and prevents users from finding their desired content (such as information, products, and services) (Khan, 2020; Khan et al., 2021; Ullah, Islam, Khan, Aleem, & Iqbal, 2019). Web search engines provide the most relevant web content to users based on their query, location, history (user profile), and other parameters (Khan et al., 2020; Khan & Islam, 2017; Ullah et al., 2021; Ullah et al., 2022). Web search providers typically claim to offer their service free of cost, profiting instead from the advertisements displayed alongside query results (Khan & Islam, 2017; Khan, Islam, Ullah, Aleem, & Iqbal, 2019; Preibusch, 2015). However, maintaining user profiles raises serious privacy concerns, as these profiles may contain private and sensitive queries (Khan, 2020; Preibusch, 2015; Ullah et al., 2019). A 2016 Eurobarometer survey reported that 82% of European web users say that user activity monitoring tools should only be used with their permission (Monteleone, 2017; Zuiderveen Borgesius, Kruikemeier, Boerman, & Helberger, 2017).

Private web search and private information retrieval are techniques for retrieving the desired information from web search engines or a database without disclosing the user's identity, intentions, or other tracking information (Saint-Jean, 2005). These techniques were proposed to tackle the user privacy infringement problem (Khan et al., 2021). Numerous techniques are available to counter privacy infringement, such as proxy networks (Berthold, Federrath, & Köpsell, 2001; Mokhtar et al., 2017), profile obfuscation techniques (Nissenbaum & Daniel, 2009), query scrambling techniques (Arampatzis, Drosatos, & Efraimidis, 2015; Arampatzis, Efraimidis, & Drosatos, 2013), private information retrieval protocols (Reiter & Rubin, 1998; Romero-Tris, Castella-Roca, & Viejo, 2011; Romero-Tris, Viejo, & Castellà-Roca, 2015; Ullah et al., 2019; Ullah et al., 2021; Ullah, Khan, & Islam, 2016a, 2016b; Ullah et al., 2022; Viejo, Castella-Roca, Bernadó, & Mateo-Sanz, 2012) and others (Chen, Bai, Shou, Chen, & Gao, 2011; Mokhtar, Berthou, Diarra, Quéma, & Shoker, 2013; Mokhtar et al., 2017; Petit, Cerqueus, Mokhtar, Brunie, & Kosch, 2015; Shapira, Elovici, Meshiach, & Kuflik, 2005).

Key Terms in this Chapter

Evaluation Metrics: Evaluation metrics are quantitative measures that are used to assess the performance of a model or algorithm. They are used to determine how well the model is performing in terms of accuracy, precision, recall, F1 score, and other measures. The choice of evaluation metrics depends on the type of problem being solved and the objectives of the model.
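To make the accuracy-oriented measures named above concrete, here is a minimal sketch (illustrative, not from the chapter) of computing precision, recall, and F1 from confusion-matrix counts:

```python
# Illustrative sketch: precision, recall, and F1 from confusion-matrix
# counts (tp = true positives, fp = false positives, fn = false negatives).
def precision_recall_f1(tp: int, fp: int, fn: int):
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Hypothetical counts: 8 correct detections, 2 false alarms, 2 misses.
print(precision_recall_f1(8, 2, 2))  # precision and recall are both 0.8
```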

Privacy risk: Privacy risk measures the likelihood of a privacy breach or the potential harm that can result from a breach. It can be evaluated using metrics such as expected loss or probability of data breach.
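An expected-loss style risk score can be sketched as probability times harm, summed over possible breach events (an illustrative formulation; the chapter does not prescribe this exact form):

```python
def expected_loss(breach_events):
    """breach_events: iterable of (probability, harm) pairs.
    Returns the probability-weighted harm (expected loss)."""
    return sum(p * harm for p, harm in breach_events)

# Hypothetical example: a 1% chance of a large leak (harm 1000)
# and a 10% chance of a minor exposure (harm 50).
print(expected_loss([(0.01, 1000.0), (0.10, 50.0)]))  # -> 15.0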

Privacy Metrics: Privacy metrics are quantitative measures that are used to assess the level of privacy protection in a system or process. They help to evaluate the effectiveness of privacy-preserving mechanisms and to identify areas that need improvement.

Anonymity: Anonymity measures the ability of a system or process to protect the identity of individuals. It can be quantified using metrics such as k-anonymity and l-diversity, which evaluate the degree to which personal information can be linked to specific individuals.
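For example, the k in k-anonymity is the size of the smallest group of records sharing the same quasi-identifier values. A minimal sketch (field names and values are hypothetical):

```python
# Illustrative sketch: a table is k-anonymous if every combination of
# quasi-identifier values is shared by at least k records.
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Return the largest k for which the records are k-anonymous."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values())

records = [
    {"zip": "130**", "age": "20-30", "query": "flu symptoms"},
    {"zip": "130**", "age": "20-30", "query": "car insurance"},
    {"zip": "148**", "age": "30-40", "query": "loan rates"},
    {"zip": "148**", "age": "30-40", "query": "job search"},
]
print(k_anonymity(records, ["zip", "age"]))  # -> 2
```

Note the sensitive attribute (the query) is ignored when grouping; l-diversity additionally requires each group to contain diverse sensitive values.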

Information Leakage: Information leakage measures the amount of sensitive information that is exposed by a system or process. It can be quantified using metrics such as mutual information or entropy, which evaluate the amount of information that is revealed about an individual or a group of individuals.
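The entropy side of this can be sketched directly: higher entropy over the candidate users behind a query means the adversary is less certain, i.e., less information has leaked. A minimal sketch of Shannon entropy over an empirical distribution:

```python
# Illustrative sketch: Shannon entropy (in bits) of an empirical
# distribution; higher entropy = more adversary uncertainty.
import math
from collections import Counter

def shannon_entropy(observations):
    counts = Counter(observations)
    total = len(observations)
    return sum(-(c / total) * math.log2(c / total) for c in counts.values())

# If every query traces to one user, the adversary has no uncertainty;
# a uniform mix over four candidate users gives log2(4) = 2 bits.
print(shannon_entropy(["alice"] * 8))                      # -> 0.0
print(shannon_entropy(["alice", "bob", "carol", "dave"]))  # -> 2.0
```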

Differential Privacy: Differential privacy bounds the degree to which any single record in a dataset can influence the output observed by an attacker. It is typically parameterized by epsilon: smaller values of epsilon require more noise to be added to the output and provide a stronger privacy guarantee.
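The Laplace mechanism is the standard way to achieve epsilon-differential privacy for numeric queries; the sketch below is illustrative (the chapter does not prescribe this implementation):

```python
# Illustrative sketch of the Laplace mechanism: release the true value
# plus Laplace noise of scale sensitivity/epsilon. Smaller epsilon means
# a larger noise scale and a stronger privacy guarantee.
import random

def laplace_mechanism(true_value, sensitivity, epsilon):
    scale = sensitivity / epsilon
    # The difference of two iid exponentials is Laplace-distributed.
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return true_value + noise

# Hypothetical use: privately releasing a query-log count of 100.
# Sensitivity is 1 because one user changes a count by at most 1.
print(laplace_mechanism(100, sensitivity=1.0, epsilon=0.5))
```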

Query Privacy: Query privacy measures the extent to which a search engine protects the queries made by the user. It can be quantified using metrics such as query uniqueness, which evaluates the degree to which queries can be linked to specific users.
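One plausible way to operationalize query uniqueness (an illustrative sketch; exact definitions vary across the literature) is the fraction of distinct queries in a log that were issued by exactly one user:

```python
# Illustrative sketch: queries issued by a single user are easier to
# link back to that user than queries shared by many users.
from collections import defaultdict

def query_uniqueness(query_log):
    """query_log: iterable of (user, query) pairs.
    Returns the fraction of distinct queries tied to a single user."""
    users_per_query = defaultdict(set)
    for user, query in query_log:
        users_per_query[query].add(user)
    unique = sum(1 for users in users_per_query.values() if len(users) == 1)
    return unique / len(users_per_query)

log = [("u1", "weather"), ("u2", "weather"),
       ("u1", "rare disease forum"), ("u2", "news")]
print(query_uniqueness(log))  # 2 of 3 distinct queries are single-user
```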

Private Web Search Metrics: Private web search metrics are quantitative measures used to evaluate the effectiveness of privacy-preserving mechanisms in web search engines. These metrics help to assess the level of privacy protection provided by a search engine and identify areas for improvement.

Result Privacy: Result privacy measures the extent to which a search engine protects the search results. It can be quantified using metrics such as result diversity, which evaluates the degree to which results are personalized to a specific user, and therefore how much they reveal about that user.

Private Information Retrieval: Private Information Retrieval (PIR) is a cryptographic technique that allows a user to retrieve information from a database without revealing which item they are interested in. In traditional information retrieval systems, the user sends a query to the database, which then returns the relevant information. In PIR, by contrast, the query is constructed so that the database learns nothing about which item the user is retrieving.
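A classic way to see how this is possible is the two-server XOR scheme: the user sends each of two non-colluding servers a random-looking bit vector, and the vectors differ only at the wanted index. A toy sketch (illustrative only, for single-word records):

```python
# Toy two-server XOR PIR sketch. Each server sees a uniformly random
# bit vector, so neither server alone learns the queried index.
import secrets

def pir_query(db_size, index):
    """Split the index of interest into two random query shares."""
    q1 = [secrets.randbelow(2) for _ in range(db_size)]
    q2 = list(q1)
    q2[index] ^= 1  # the shares differ only at the wanted index
    return q1, q2

def pir_answer(database, query):
    """Each server XORs together the records its query bits select."""
    ans = 0
    for bit, record in zip(query, database):
        if bit:
            ans ^= record
    return ans

# Records are integers; XORing the two answers cancels everything
# except the record at the queried index.
database = [0x11, 0x22, 0x33, 0x44]
q1, q2 = pir_query(len(database), index=2)
print(hex(pir_answer(database, q1) ^ pir_answer(database, q2)))  # -> 0x33
```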
