

Procedia Computer Science 82 (2016) 20–27

1877-0509 Published by Elsevier B.V. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/). Peer-review under responsibility of the Organizing Committee of SDMA2016. doi:10.1016/j.procs.2016.04.004

Available online at www.sciencedirect.com (ScienceDirect)

Symposium on Data Mining Applications, SDMA2016, 30 March 2016, Riyadh, Saudi Arabia

Protecting online social networks profiles by hiding sensitive data attributes

Hatem AbdulKader a, Emad ElAbd b, Waleed Ead c,*
a,b Faculty of Computers and Information, Menoufia University, Egypt
c Faculty of Computers and Information, Beni-Suef University, Egypt

Abstract

Online Social Networks (OSNs) have become a mainstream cultural phenomenon for millions of Internet users. More importantly, OSNs now expose information from multiple social spheres, e.g. personal information or professional activity. We identify two stakeholders in online social networks: the OSN users and the OSN itself. On one hand, OSN users share an astonishing amount of information, ranging from personal to professional. On the other hand, OSN services handle users' information and manage all users' activities in the network, being responsible for the correct functioning of their services and for maintaining a profitable business model. Indirectly, this translates into ensuring that their users continue to happily use their services without becoming victims of malicious actions. We thus classify online social network privacy and security issues into two categories: attacks on users and attacks on the OSN. In this paper we propose a utility-based association rule hiding algorithm for preserving the privacy of user profile data against attacks from OSN users or even OSN applications. Experiments have been conducted on samples of real datasets and show fewer attribute modifications in the released user profile datasets.

Keywords: online social networks; user's profiles; privacy preserving; profiles attacks.

1. Introduction

Online Social Networks (OSNs) have become a mainstream cultural phenomenon for millions of Internet users. Combining user-constructed profiles with communication mechanisms that enable users to be pseudo-permanently "in touch", OSNs leverage users' real-world social relationships and further blend our online and offline lives. In 2015, Facebook had more than 1.5 billion monthly active users and was the second most visited site on the Internet [1]. Twitter, a social micro-blogging platform, claims over 500 million users. In addition, social networking is projected to be the fourth most popular online activity [2].

* Corresponding author.

E-mail address: [email protected]



Companies are mining trends on Facebook and Twitter to create viral content for shares and likes. Employers are checking the Facebook, LinkedIn and Twitter profiles of job candidates. Law enforcement organizations are gleaning evidence from OSNs to solve crimes. Activities on online social platforms change political regimes [2] and swing election results.

More importantly, OSNs now expose information from multiple social spheres. For example, personal information from Facebook aggregated with professional activity from LinkedIn leads to uncomfortably detailed profiles [3].

1.1. Privacy Attacks in Online Social Networks

Privacy attacks in online social networks can be classified based on the stakeholders of the OSN and the forms of attack targeted at them. We identify two stakeholders in online social networks: the OSN users and the OSN itself. On one hand, OSN users share an astonishing amount of information, ranging from personal to professional. The misuse of this information can have significant consequences. On the other hand, OSN services handle users' information and manage all users' activities in the network, being responsible for the correct functioning of their services and for maintaining a profitable business model. Indirectly, this translates into ensuring that their users continue to happily use their services without becoming victims of malicious actions. We classify online social network privacy and security issues into the following attack categories (Fig. 1).

Fig. 1: OSN attack categories.

(1) Attacks on users: these attacks are isolated, targeting a small population of random or specific users. There are several types, based on the attacker: (a) attacks from other users [1, 2]; (b) attacks from social applications [3]; (c) attacks from the OSN [4]; (d) de-anonymization and inference attacks, where OSN services publish social data for others (e.g., researchers, advertisers) to analyze and use for other purposes. (2) Attacks on the OSN: these attacks are aimed at the service provider itself by threatening its core business, such as Sybil attacks [5-7], crawling attacks [8], social spam [9], distributed denial-of-service (DDoS) attacks, and malware attacks [10].

OSN users face multiple risks while using social applications. First, an application might be malicious: it could collect a high volume of user data for unwanted usage. For example, to demonstrate this vulnerability, BBC News developed a malicious application that could collect large amounts of user data in only three hours [11]. Second, application developers can violate developer policies to control user data. Application developers are supposed to abide by a set of rules set by the OSNs, called "developer policies", which are intended to prohibit them from misusing personal information or forwarding it to other parties. However, reported incidents [11] show that applications do violate these policies. For example, the Facebook application "Top Friends" enabled everyone to view the birthday, gender and relationship status of all Top Friends users, even though those users had set this information to private, violating the developer policy that friends' private information must not be accessible.



Finally, third-party social applications can query more data about a user from an OSN, regardless of whether it is needed for proper operation. A study by Felt and Evans [3] of 150 of the top applications on Facebook showed that most of the applications only needed the user name, friends, and their networks. However, 91% of social networking applications accessed data that they do not need for operation.

Data hiding and knowledge hiding are two research directions that investigate how the privacy of raw data, or of information, can be maintained either before or after mining the data. In this paper we present a new algorithm in the knowledge hiding class [12, 13], namely association rule hiding based on the privacy utility of attributes in online social users' profiles. The challenge, therefore, is to protect users and their information from other users. The contributions of this paper are as follows:

- Propose a framework for protecting OSN users and their information from others.
- Propose a novel association rule hiding algorithm based on a privacy weight for each attribute in OSN user profiles.
- Provide an experimental and performance evaluation of the proposed algorithm.

The rest of the paper is organized as follows: Section 2 presents related background. Section 3 describes the proposed framework for hiding sensitive knowledge in user profiles. Section 4 presents the implementation of the proposed algorithm, and Section 5 gives its performance evaluation, followed by the conclusion and references.

2. Related background

Privacy preserving data mining is a research area that investigates the side-effects of data mining methods that originate from the penetration into the privacy of individuals and organizations. We assume that only the sensitive rules are given and propose an algorithm to modify the data in the database. Private information can easily be discovered by tools such as association rule mining; therefore, protecting the confidentiality of sensitive information in a database becomes a critical issue. The problem of finding an optimal sanitization of a database against association rule analysis has been proven to be NP-hard [14]. The research can be divided into hiding sensitive rules [13, 15-17] and hiding sensitive items [18-20]. Vassilios S. Verykios et al. [13] conducted a thorough investigation and presented five algorithms for hiding sensitive association rules. They concluded that none of the proposed algorithms is best across all metrics, including, firstly, the required execution time and, secondly, the side effects produced. Later on, much research focused on these issues. Shyue-Liang Wang [16] proposed algorithms to hide sensitive items instead of sensitive association rules; they need fewer database scans, but the side effects generated are high. Ali Amiri [18] presented heuristic algorithms to hide sensitive items while maximizing data utility at the expense of computational efficiency. Finally, Yi-Hung Wu et al. [17] proposed a heuristic method that hides sensitive association rules with limited side effects. Cryptography or anonymization techniques may also be used to protect user profile data [21]. User profiles may contain data with different levels of privacy or sensitivity according to the user's determination. Unlike these related works, we propose an association rule hiding algorithm that hides sensitive data based on the privacy settings a user assigns to his/her profile data attributes. We do not hide all knowledge extracted by mining techniques, but only that with high privacy utility as set by the user.

3. Proposed approach

In this paper we concentrate on attacks that may use user profiles to discover the identities of users, whether to attack them or to recommend services. We encourage recommending suitable services to users, but with a trade-off against their privacy.

The objective of this paper is to propose a novel association rule hiding (ARH) algorithm to hide the sensitive knowledge that can be discovered from a published user profile. The ARH algorithm is based on the utility/weight of the privacy concern of each user's most sensitive attributes, to prevent breaches of user privacy. The proposed algorithm mainly consists of two steps:

1- Reconstruct profile attributes by setting a privacy level for each attribute.


2- Construct an association rule hiding algorithm based on the utility of the privacy settings. The idea of the proposed technique, or framework, as shown in Fig. 2 and Fig. 3, is to hide highly sensitive profile attributes based on their weight of sensitivity, not only on support and confidence thresholds. A pattern may be frequent and yet, once the profile sensitivity weights are taken into consideration, uninteresting from a privacy perspective; conversely, a less frequent pattern may have a high utility.

Fig.2: The proposed framework.

Algorithm 1: User profile reconstruction
Input: set of users' profiles
Output: weighted user profile attributes
1. for each user's profile do
2.    set a privacy weight for each of the profile's attributes
3. end

Algorithm 2: Profile weighted privacy
Input: users' profiles with privacy-weighted attribute-sets
Output: set of privacy utilities of attribute-sets
1. for each attribute-set do
2.    compute the weighted privacy utility over the overall profile set
   2.1 the weighted privacy utility of an attribute-set in one profile is the sum of the privacy weights of its attributes in that profile
3.    compute the profile weighted privacy utility
   3.1 for each profile containing the attribute-set
      3.1.1 sum all privacy weights that the profile contributes to the attribute-set
4. end

Algorithm 3: Hiding sensitive profile attribute-sets


Input: set of frequent attribute-sets with privacy-weighted utility; threshold of sensitive weighted utility
Output: set of profiles with the sensitive attribute-sets hidden
1. for each frequent attribute-set whose privacy-weighted utility > the threshold of sensitive weighted utility do
2.    modify the profile attribute-sets such that no such highly sensitive frequent attribute-set can be generated
3. end
4. release the sanitized profile sets

Fig. 3: Proposed algorithms.
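For concreteness, the three algorithms can be sketched in Python as follows. This is a minimal sketch under our own assumptions, not the authors' implementation: profiles are modeled as dictionaries mapping attribute ids to user-assigned privacy weights, candidate attribute-sets are enumerated up to a small size, and hiding suppresses the highest-utility attribute of an over-threshold set; the names attribute_set_utility and hide_sensitive_sets are illustrative.

from itertools import combinations

def attribute_set_utility(attrs, profiles):
    # Algorithm 2: sum, over every profile containing all attributes in
    # attrs, the privacy weights those attributes carry in that profile.
    attrs = set(attrs)
    return sum(sum(p[a] for a in attrs) for p in profiles if attrs.issubset(p))

def support(attrs, profiles):
    # Fraction of profiles containing every attribute in attrs.
    attrs = set(attrs)
    return sum(1 for p in profiles if attrs.issubset(p)) / len(profiles)

def hide_sensitive_sets(profiles, minsup, utility_threshold, max_size=4):
    # Algorithm 3 (sketch): while some frequent attribute-set exceeds the
    # sensitivity threshold, suppress its highest-utility attribute from
    # all profiles so that the set can no longer be generated.
    profiles = [dict(p) for p in profiles]  # sanitize a copy; Algorithm 1
                                            # has already attached weights
    while True:
        universe = sorted({a for p in profiles for a in p})
        sensitive = [s for k in range(1, max_size + 1)
                     for s in combinations(universe, k)
                     if support(s, profiles) >= minsup
                     and attribute_set_utility(s, profiles) > utility_threshold]
        if not sensitive:
            return profiles  # safe to release
        worst = max(sensitive, key=lambda s: attribute_set_utility(s, profiles))
        victim = max(worst, key=lambda a: attribute_set_utility([a], profiles))
        for p in profiles:
            p.pop(victim, None)  # suppress the attribute everywhere

The greedy order (most sensitive set first, then its highest-utility attribute) is our reading of the walk-through in Section 4; other orders are possible.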

4. Proposed system in practice

Table 1 contains a sample database of users' profile attributes; each number represents a specific attribute in the profile. We use sample datasets of the Pokec social network [22, 23].

Table 1: Setting profile attributes.

#  | Profile attributes | Weights of attributes
U1 | 3, 5, 1, 2, 4, 6   | 1, 3, 5, 10, 6, 5
U2 | 3, 5, 2, 4         | 3, 3, 8, 6
U3 | 3, 1, 4            | 1, 5, 2
U4 | 3, 5, 1, 7         | 6, 6, 10, 5
U5 | 3, 5, 2, 7         | 2, 3, 4, 2

Each tuple in Table 1 contains a set of a user's profile attributes together with a privacy weight for each attribute; the weight measures the privacy level from the user's perspective. Table 2 shows samples of the frequent attribute-set patterns mined with support minsup = 30 and confidence minconf = 70. Conventional mining of frequent patterns from users' profiles fails to consider the user's request and/or the attribute weights, so we find the set of frequent patterns considering the profile weights. Table 3 shows samples of frequent patterns mined using their utility weight, with a minimum utility of 20; the minimum utility may differ according to the utility engine.

Table 3: Utility of frequent pattern attributes.

#  | Frequent attribute-set | Support | Utility
1  | 1       | 0.6 | 20
2  | 2       | 0.6 | 22
…
22 | 1 2 4 6 | 0.2 | 26
23 | 3 5     | 0.8 | 27
24 | 1 3 5 7 | 0.2 | 27
…
36 | 2 3 4 5 | 0.4 | 40

The utility of a frequent attribute-set in a profile is the sum of its attributes' weights under that user's privacy settings; e.g., for the frequent attribute-set (1, 3, 5) in U1, utility(1, 3, 5) = 5 + 1 + 3 = 9. The utility of an attribute-set in the whole dataset of user profiles is then the sum of its utility over all profiles in which it appears; in our case, for attributes (1, 3, 5), utility = 9 (U1) + 22 (U4) = 31. Note that the attribute-set (1, 3, 5) behind rule R19 in Table 2 does not meet the specified minsup, yet it has a high utility from a privacy standpoint; this is exactly the problem with frequent pattern mining that does not consider the utility or weight of attributes.
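To make the computation concrete, the Table 1 sample can be encoded and checked with the illustrative helpers sketched in Section 3 (attribute_set_utility and support are our names, not the paper's):

profiles = [
    {3: 1, 5: 3, 1: 5, 2: 10, 4: 6, 6: 5},  # U1
    {3: 3, 5: 3, 2: 8, 4: 6},               # U2
    {3: 1, 1: 5, 4: 2},                     # U3
    {3: 6, 5: 6, 1: 10, 7: 5},              # U4
    {3: 2, 5: 3, 2: 4, 7: 2},               # U5
]

# {1, 3, 5} occurs in U1 (5 + 1 + 3 = 9) and U4 (10 + 6 + 6 = 22).
print(attribute_set_utility({1, 3, 5}, profiles))  # -> 31
print(support({1, 3, 5}, profiles))                # -> 0.4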

Our proposed framework hides highly sensitive profile attributes based on their weight of sensitivity, not only on support and confidence thresholds. A pattern may be frequent yet, once the profile sensitivity weights are taken into account, uninteresting from a privacy standpoint; conversely, a less frequent pattern may carry a high utility.

Table 2: Samples of the set of frequent patterns.

Rule# | Rule with support and confidence
R1  | 5 -> 3     Sup=4 Conf=100
R2  | 3 -> 5     Sup=4 Conf=80
…
R18 | 2 4 -> 3   Sup=2 Conf=100
R19 | 1 5 -> 3   Sup=2 Conf=100
R20 | 1 4 -> 3   Sup=2 Conf=100
…


To be precise, the hiding of a frequent attribute-set does not depend on its support alone but also on its utility. In Table 4 we filter the frequent pattern utilities of Table 3 to obtain the high-utility frequent patterns that meet the specified minsup and utility. In this experiment we set the minimum utility window to lie between 30 and 40.

Table 4: High-utility attribute-sets.

# | Frequent pattern | Support | Utility
2 | 2 4     | 0.4 | 30
3 | 2 5     | 0.6 | 31
4 | 1 3 5   | 0.4 | 31
5 | 2 3 4   | 0.4 | 34
6 | 2 4 5   | 0.4 | 36
7 | 2 3 5   | 0.6 | 37
8 | 2 3 4 5 | 0.4 | 40
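The filtering step itself is a one-liner; a hypothetical version over the sampled rows of Table 3, stored as (attribute-set, support, utility) triples:

frequent = [
    (frozenset({1}), 0.6, 20), (frozenset({2}), 0.6, 22),
    (frozenset({1, 2, 4, 6}), 0.2, 26), (frozenset({3, 5}), 0.8, 27),
    (frozenset({1, 3, 5, 7}), 0.2, 27), (frozenset({2, 3, 4, 5}), 0.4, 40),
]  # remaining rows of Table 3 omitted

# Keep sets inside the experiment's utility window [30, 40]; we also apply
# minsup, assuming the paper's minsup = 30 denotes 30% support.
high_utility = [(a, s, u) for (a, s, u) in frequent if s >= 0.30 and 30 <= u <= 40]
print(high_utility)  # of the rows listed, only ({2, 3, 4, 5}, 0.4, 40) remains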

Table 5 shows the set of rules that have a high utility.

Table 5: High-utility rules (30 <= utility <= 40).

Rule# | Rule with support and confidence
R4  | 2 -> 5       Sup=3 Conf=100
R7  | 5 -> 2       Sup=3 Conf=75
…
R21 | 2 3 4 -> 5   Sup=2 Conf=100
R22 | 2 4 5 -> 3   Sup=2 Conf=100
R23 | 3 4 5 -> 2   Sup=2 Conf=100

Notice that rules such as R1, R2, R3 and R5 in Table 2 no longer appear: although they meet the minsup and confidence thresholds, their utility falls below the predefined utility window (30 <= utility <= 40), and the same holds for the other omitted rules. Roughly half of the generated rules turn out to be less sensitive once judged by their utility. Rules R21, R22 and R23 over the attributes (2, 3, 4, 5) in Table 5 have less support than minsup, yet they have the highest utility of all (utility = 40). Such attribute-sets, e.g. 2, 3, 4 or 5, are therefore the more sensitive ones, and we need to hide the rules that lead to them. The support and utility of each discovered sensitive attribute (2, 3, 4, 5) in the whole dataset are given in Table 6.

Table 6: Sensitive attributes' support and utility.

Attribute | Support | Utility
2 | 0.6 | 22
3 | 1.0 | 13
4 | 0.6 | 14
5 | 0.8 | 15
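Table 6 can be reproduced directly from the Table 1 sample with the same illustrative helpers:

for a in (2, 3, 4, 5):
    print(a, support({a}, profiles), attribute_set_utility({a}, profiles))
# 2 0.6 22   (U1: 10, U2: 8, U5: 4)
# 3 1.0 13   (1 + 3 + 1 + 6 + 2; present in every profile)
# 4 0.6 14   (U1: 6, U2: 6, U3: 2)
# 5 0.8 15   (3 + 3 + 6 + 3)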

A question arises: should rule hiding be driven by an attribute's support or by its utility? We start with the highest-utility attribute, attribute 2. Table 7 shows the set of rules obtained after hiding attribute 2 from the dataset.

Table 7: Rules after hiding attribute 2.

Rule# | Rule
1  | 5 -> 3     Sup=4 Conf=100
2  | 3 -> 5     Sup=4 Conf=80
3  | 4 -> 3     Sup=3 Conf=100
4  | 1 -> 3     Sup=3 Conf=100
5  | 7 -> 5     Sup=2 Conf=100
6  | 7 -> 3     Sup=2 Conf=100
7  | 3 7 -> 5   Sup=2 Conf=100
8  | 5 7 -> 3   Sup=2 Conf=100
9  | 4 5 -> 3   Sup=2 Conf=100
10 | 1 5 -> 3   Sup=2 Conf=100
11 | 1 4 -> 3   Sup=2 Conf=100

However, this choice may still leave highly sensitive utility-based rules, e.g. rule 10 over attributes (1, 3, 5), which has a utility of 31. Next we try hiding attribute 5, the attribute with the second-highest utility; the resulting rules are shown in Table 8.

Table 8: Rules after hiding attribute 5.


Rule# | Rule
1 | 4 -> 3     Sup=3 Conf=100
2 | 2 -> 3     Sup=3 Conf=100
3 | 1 -> 3     Sup=3 Conf=100
4 | 7 -> 3     Sup=2 Conf=100
5 | 2 4 -> 3   Sup=2 Conf=100
6 | 1 4 -> 3   Sup=2 Conf=100

This choice, too, may leave a highly sensitive utility-based rule, e.g. rule 5 over attributes (2, 3, 4), which has a utility of 34. If we hide both 3 and 5 from the dataset, no rules are generated at all. If we instead hide the attribute with the highest support among the high-utility attributes, which is attribute 3, we obtain the rules in Table 9 and still find a high-utility frequent attribute-set (2, 4, 5) of utility 36.

Table 9: Rules after hiding attribute 3.

Rule# | Rule
1 | 2 -> 5     Sup=3 Conf=100
2 | 5 -> 2     Sup=3 Conf=75
3 | 7 -> 5     Sup=2 Conf=100
4 | 2 4 -> 5   Sup=2 Conf=100
5 | 4 5 -> 2   Sup=2 Conf=100

Experiments show that hiding the top high-utility attributes, attributes (2, 5), generates a set of rules of low privacy utility; Table 10 lists the rules mined after hiding attributes (2, 5) from the dataset. Such a sanitized dataset can be securely published to a third-party application or recommender service, as shown in Fig. 2, without disclosing any sensitive profile data attribute and without any further high-utility sensitive rules being discoverable.

Table 10: Rules after hiding attributes (2, 5).

Rule# | Rule
1 | 4 -> 3     Sup=3 Conf=100
2 | 1 -> 3     Sup=3 Conf=100
3 | 7 -> 3     Sup=2 Conf=100
4 | 1 4 -> 3   Sup=2 Conf=100
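The final release step amounts to suppressing the chosen attributes from every profile before publication; an illustrative sketch (release and hidden_attrs are our names, not the authors'):

def release(profiles, hidden_attrs):
    # Drop the selected high-utility attributes from every profile and
    # return the sanitized dataset for publication to third parties.
    return [{a: w for a, w in p.items() if a not in hidden_attrs}
            for p in profiles]

sanitized = release(profiles, hidden_attrs={2, 5})
# Re-mining `sanitized` with the same minsup/minconf yields only the
# low-utility rules of Table 10.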

5. Performance evaluation

Fig. 4 presents a comparative analysis of the number of hidden rules with and without the privacy utility weight. The number of rules hidden when using the privacy weight is larger than without it; in addition, the possibility of an adversary recovering a sensitive rule from the non-sensitive ones is zero. This means that the information loss in user profile analysis is smaller, traded off against protecting the users' profiles. Furthermore, Fig. 5 demonstrates that fewer user profile attributes are changed than when hiding sensitive attributes without a privacy weight on the profile attribute-sets. In summary, a pattern may be frequent and yet, once the profile sensitivity weights are considered, uninteresting from a privacy perspective, while a less frequent pattern may have a high utility.

Fig. 4: Comparative analysis of the number of hidden rules. Fig. 5: Number of users' profile attributes changed.

(Figure data: both plots compare the series "with privacy utility" against "without privacy utility" for 1000, 2000 and 3000 user profiles; Fig. 4's y-axis counts hidden rules (0-30), and Fig. 5's y-axis counts modified profile attributes (0-1200).)



Conclusion

In this paper we have proposed a framework for hiding sensitive data attributes in OSN users' profiles. The proposed framework is based on two main steps: first, reconstruct the profile attributes by setting a privacy level for each attribute; second, construct an association rule hiding algorithm based on the utility of the privacy settings. A mining analysis attack can be conducted by other users or by a third-party social network application on users' profile data to discover the relevant patterns of users; the proposed framework protects the sensitive frequent attribute-sets of users' profiles. As future work we will add the users' recommendation preferences as a privacy utility to protect users' profile data.

References

1. Hwang, T., Pearce, I., Nanis, M.: Socialbots: voices from the fronts. Interactions 19(2), 2012, pp. 38-45.
2. Stringhini, G., et al.: Follow the green: growth and dynamics in Twitter follower markets. In: Proceedings of the 2013 Internet Measurement Conference. ACM, 2013.
3. Felt, A., Evans, D.: Privacy protection for social networking APIs. In: Web 2.0 Security and Privacy (W2SP'08), 2008.
4. Fiesler, C., Bruckman, A.: Copyright terms in online creative communities. In: CHI'14 Extended Abstracts on Human Factors in Computing Systems. ACM, 2014.
5. Levine, B., Shields, C., Margolin, N.: A survey of solutions to the sybil attack. Technical Report, University of Massachusetts Amherst, 2006. Available at: https://gnunet.org/node/1432 [last accessed: 17 May 2014].
6. Ratkiewicz, J., et al.: Detecting and tracking political abuse in social media. In: ICWSM, 2011.
7. Wei, W., et al.: SybilDefender: defend against sybil attacks in large social networks. In: INFOCOM 2012 Proceedings. IEEE, 2012.
8. Bonneau, J., Anderson, J., Danezis, G.: Prying data out of a social network. In: International Conference on Advances in Social Network Analysis and Mining (ASONAM'09). IEEE, 2009.
9. Poorgholami, M., et al.: Spam detection in social bookmarking websites. In: 4th IEEE International Conference on Software Engineering and Service Science (ICSESS). IEEE, 2013.
10. Mosharraf, N., Jayasumana, A.P., Ray, I.: A responsive defense mechanism against DDoS attacks. In: Foundations and Practice of Security. Springer, 2014, pp. 347-355.
11. Kayes, I., Iamnitchi, A.: A survey on privacy and security in online social networks. arXiv preprint arXiv:1504.03342, 2015.
12. Aggarwal, C.C.: Privacy-preserving data mining. In: Data Mining. Springer, 2015.
13. Verykios, V.S., et al.: Association rule hiding. IEEE Transactions on Knowledge and Data Engineering 16(4), 2004, pp. 434-447.
14. Weng, C.-C., Chen, S.-T., Lo, H.-C.: A novel algorithm for completely hiding sensitive association rules. In: Eighth International Conference on Intelligent Systems Design and Applications (ISDA'08). IEEE, 2008.
15. Li, J., Shen, H., Topor, R.: Mining the smallest association rule set for predictions. In: Proceedings of the IEEE International Conference on Data Mining (ICDM 2001). IEEE, 2001.
16. Wang, S.-L., et al.: Maintenance of discovered informative rule sets: incremental deletion. In: IEEE International Conference on Systems, Man and Cybernetics. IEEE, 2005.
17. Wu, Y.-H., Chiang, C.-M., Chen, A.L.: Hiding sensitive association rules with limited side effects. IEEE Transactions on Knowledge and Data Engineering 19(1), 2007, pp. 29-42.
18. Amiri, A.: Dare to share: protecting sensitive knowledge with data sanitization. Decision Support Systems 43(1), 2007, pp. 181-191.
19. Wang, S.-L., Jafari, A.: Hiding sensitive predictive association rules. In: IEEE International Conference on Systems, Man and Cybernetics. IEEE, 2005.
20. Wang, S.-L., Parikh, B., Jafari, A.: Hiding informative association rule sets. Expert Systems with Applications 33(2), 2007, pp. 316-323.
21. Ullah, I., et al.: ProfileGuard: privacy preserving obfuscation for mobile user profiles. In: Proceedings of the 13th Workshop on Privacy in the Electronic Society. ACM, 2014.
22. Pokec social network dataset, SNAP: https://snap.stanford.edu/data/soc-pokec.html
23. Pokec: http://pokec.azet.sk/