So far, there are mostly three types of recommender systems: collaborative filtering (Herlocker et al., 2004), content-based recommendation (Linden et al., 2003), and hybrid recommendation (Burke, 2002). Collaborative filtering is the primary approach: it computes the similarity of users or items from their rating scores and then uses that similarity to recommend items to users. Context-aware recommender systems (CARS) integrate contextual information into the recommendation process, which can yield higher recommendation accuracy.
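The similarity computation at the heart of collaborative filtering can be sketched as follows. This is a minimal illustration with a hypothetical rating matrix and cosine similarity; the user names and ratings are invented for demonstration only.

```python
import math

# Hypothetical user-item rating matrix (0 = unrated); values are illustrative.
ratings = {
    "u1": [5, 3, 0, 1],
    "u2": [4, 0, 0, 1],
    "u3": [1, 1, 5, 4],
}

def cosine_similarity(a, b):
    """Cosine similarity between two rating vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# Similarity of u1 to every other user; items liked by u1's most
# similar neighbours would then be recommended to u1.
sims = {u: cosine_similarity(ratings["u1"], v)
        for u, v in ratings.items() if u != "u1"}
```

Here u2, who rates items much like u1, scores a far higher similarity than u3, so u2's preferences would dominate recommendations for u1.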
Generally, contextual information includes time, location, weather, user information, and so on. Recommendation accuracy can be improved if this contextual information is applied effectively. CARS uses a multidimensional user-item-context scoring utility model, an extension of the traditional two-dimensional user-item rating utility model. Although recommender systems can provide users with personalized content and recommendations, any recommender system poses a potential privacy threat to its users because it aggregates user preference data. The user information collected by a recommender system may be inadvertently leaked by the service provider, or stolen in a hacker attack on the server.
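One common way to realize the user-item-context model is contextual pre-filtering: ratings carry a context tag, and only ratings matching the target context are fed to a traditional two-dimensional recommender. The data and context values below are illustrative assumptions, not from the source.

```python
# Hypothetical user-item-context ratings: (user, item, rating, context).
ratings = [
    ("u1", "coffee", 5, "morning"),
    ("u1", "wine",   4, "evening"),
    ("u2", "coffee", 4, "morning"),
    ("u2", "tea",    2, "evening"),
]

def prefilter(ratings, context):
    """Reduce user-item-context triples to a user-item slice for one context."""
    return [(u, i, r) for (u, i, r, c) in ratings if c == context]

# The "morning" slice can now be handed to any 2-D recommender,
# e.g. the collaborative filtering similarity computation above.
morning = prefilter(ratings, "morning")
```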
Unfortunately, privacy risks and security problems have largely been ignored in the study of recommender systems, even as people pay more and more attention to privacy and become unwilling to disclose their real information. In general, data may be attacked when it is published, stored, or transferred. k-Anonymity (Sweeney, 2002) and l-diversity (Machanavajjhala et al., 2006) are traditional privacy protection technologies. The principle of k-anonymity is to generalize the attribute values of the data so that every record in an anonymized data set is indistinguishable from at least k-1 other records, but it is vulnerable to consistency attacks. To resolve this shortcoming, l-diversity was proposed. Nevertheless, these privacy protection technologies cannot resist attackers who possess background knowledge of the data.
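The k-anonymity condition can be checked mechanically: group records by their quasi-identifier values and require every group to contain at least k records. A minimal sketch, with invented records, where generalizing exact ages and zipcodes into ranges restores 2-anonymity:

```python
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    """True iff every combination of quasi-identifier values occurs >= k times."""
    groups = Counter(tuple(rec[q] for q in quasi_identifiers) for rec in records)
    return all(count >= k for count in groups.values())

# Exact values make each record unique, so the set is not 2-anonymous.
raw = [
    {"age": 46, "zipcode": "201612"},
    {"age": 44, "zipcode": "201612"},
]

# Generalized values place both records in one equivalence class.
generalized = [
    {"age": "40-49", "zipcode": "2016**"},
    {"age": "40-49", "zipcode": "2016**"},
]
```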
Table 1 shows some patients' information from a medical system. Suppose Alice is Bob's neighbor and Bob's record appears in Table 1. As a neighbor who knows his age and zipcode, Alice can easily single out Bob's record and learn his disease. Even if she is not a malicious attacker, this still constitutes a leakage of patient information.
Table 1. Patients' information in a medical system

| No. | Name | Age | Zipcode | Disease |
|-----|-------|-----|---------|---------------|
| 1 | Alice | 46 | 201612 | Heart disease |
| 2 | Tom | 35 | 201621 | Heart disease |
| 3 | Bob | 44 | 201612 | Cancer |
| 4 | Mike | 67 | 101620 | Cancer |
| 5 | Eva | 56 | 201615 | Pneumonia |
| 6 | Jane | 39 | 201612 | Pneumonia |
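The linkage attack on Table 1 can be demonstrated directly. Assuming the published version of the table drops the Name column, an attacker who knows Bob's age and zipcode as background knowledge still singles out his record:

```python
# Table 1 with the direct identifier (Name) removed before publication.
table1 = [
    {"age": 46, "zipcode": "201612", "disease": "Heart disease"},
    {"age": 35, "zipcode": "201621", "disease": "Heart disease"},
    {"age": 44, "zipcode": "201612", "disease": "Cancer"},
    {"age": 67, "zipcode": "101620", "disease": "Cancer"},
    {"age": 56, "zipcode": "201615", "disease": "Pneumonia"},
    {"age": 39, "zipcode": "201612", "disease": "Pneumonia"},
]

def link(records, age, zipcode):
    """Return records matching the attacker's background knowledge."""
    return [r for r in records if r["age"] == age and r["zipcode"] == zipcode]

# Alice knows Bob is 44 years old and lives in zipcode 201612.
matches = link(table1, 44, "201612")
# A single match re-identifies Bob's record and reveals his sensitive attribute.
```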