An information systems security risk assessment model under uncertain environment

Journal Identification = ASOC Article Identification = 911 Date: July 5, 2011 Time: 1:46 am

Applied Soft Computing 11 (2011) 4332–4340

Contents lists available at ScienceDirect

Applied Soft Computing

j ourna l ho me p age: www.elsev ier .com/ l ocate /asoc

An information systems security risk assessment model under uncertain environment

Nan Feng, Minqiang Li ∗

Department of Information Management and Management Science, School of Management, Tianjin University, 92 Weijin Road, Nankai District, Tianjin 300072, PR China

a r t i c l e   i n f o

Article history: Received 21 April 2010; Accepted 13 June 2010; Available online 18 June 2010

a b s t r a c t

Given there is a great deal of uncertainty in the process of information systems security (ISS) risk assessment, the handling of uncertainty is of great significance for the effectiveness of risk assessment. In this paper, we propose an ISS risk assessment model based on the improved evidence theory. Firstly, we establish the ISS index system and quantify index weights, based on which the evidential diagram is constructed. To deal with the uncertain evidence found in the ISS risk assessment, this model provides a new way to define the basic belief assignment in fuzzy measure. Moreover, the model also provides a method of testing the evidential consistency, which can reduce the uncertainty derived from the conflicts of evidence. Finally, the model is further demonstrated and validated via a case study, in which sensitivity analysis is employed to validate the reliability of the proposed model.

Keywords: Information systems security; Risk assessment; Evidence theory; Fuzzy measure; Evidential consistency

1. Introduction

Organizations are increasingly relying on information systems (IS) to enhance business operations, facilitate management decision-making, and deploy business strategies. The dependence has increased in current business environments where a variety of transactions involving trading of goods and services are accomplished electronically [1,2]. Increasing organizational dependence on the IS has led to a corresponding increase in the impact of information systems security (ISS) abuses. Therefore, the ISS is a critical issue that has attracted much attention from both IS researchers and practitioners.

In order to prevent security breaches, businesses use controls (and various countermeasures) to safeguard their assets from various patterns of threats by identifying the IS assets that are vulnerable to threats. But, even in the presence of controls, the assets are often not fully protected from threats because of inherent control weaknesses. Thus, risk assessment is a critical step for ISS risk management [3].

In practice, the ISS risk assessment is quite complex and full of uncertainty as well [4]. The uncertainty existing in the process of assessment has been the primary factor that influences the effectiveness of the ISS risk assessment to a large extent. Therefore, in order to deal with the incompleteness and vagueness of information, the uncertainty must be taken into account in the ISS risk

∗ Corresponding author. Tel.: +86 22 27404796; fax: +86 22 27404796. E-mail address: fengnan [email protected] (M. Li).

1568-4946/$ – see front matter © 2010 Elsevier B.V. All rights reserved. doi:10.1016/j.asoc.2010.06.005


assessment. However, most existing approaches applied to the ISS assessment have some drawbacks in handling uncertainty in the process of assessment.

To address these aforementioned issues, we propose an ISS risk assessment model based on the improved evidence theory. In this paper, the model provides a new way to define the basic belief assignment in fuzzy measure for dealing with the uncertain evidence found in the ISS risk assessment. Moreover, we add a process of testing the evidential consistency to the existing evidence theory method. This process can effectively reduce the uncertainty derived from the conflicts of evidence provided by experts.

The rest of this paper is organized as follows: Section 2 reviews the related work. In the next section, the basic concepts of evidence theory are explained. Then, we discuss the process of developing an ISS risk assessment model in detail in Section 4. The model is further demonstrated and validated in Section 5 via a case study. Finally, we summarize our contributions and present our further research.

2. Related work

The existing approaches for the ISS risk assessment can be grouped into three major categories: the quantitative approaches, the qualitative approaches, and the combination of quantitative and qualitative approaches.

The quantitative approaches consider the IS risk exposure as a function of the probability of a threat and the expected loss due to the vulnerability of the organization to this threat [5,6]. The stochastic dominance (SD) approach [7] focuses on answering the specific question of what contingency plan should be used to prevent losses if a disaster occurs. To achieve this goal, the SD compares the costs associated with various backup and recovery options during the entire disaster recovery process in all areas of the organization. However, it fails to provide guidance on how to assess the failure of multiple controls pertaining to a single threat or how to assess the failure and the impact of a single control on multiple threats. The proposed approach in this paper provides a structure to the ISS risk assessment process by decomposing risk into its subcomponents and identifying relevant controls and their interrelationships. The approach based on neural networks [8] consists of five phases: network parameter initialization, input of the training sample and the expectation output, network self-learning, forward propagation, and back propagation. If the error function value is smaller than the pre-established value, the network learning is stopped; otherwise it returns to the second phase. While this approach has intelligent features such as self-learning and the acquisition of knowledge, which is different from the conventional methods, it is very difficult to get a large number of training samples for network self-learning in the process of the ISS risk assessment. The modular attack trees [9] approach is specified as parametric constraints, which allow quantifying the probability of security breaches that occur due to internal component vulnerabilities as well as vulnerabilities in the component's deployment environment. Based on the attack probabilities and the structure of the modular attack trees, security risks can be estimated for the information system. But this approach has difficulties capturing the uncertainty in the ISS risk environment when dealing with the incompleteness and vagueness of information.

In the qualitative approaches, such as the logic analysis [10] and the Delphi method [11], the probability data is not required and only the estimated potential loss is used. Since the qualitative analysis depends to a great extent on the analyst's experience, both the process and the result of the security risk assessment are relatively subjective [12].

As information systems have become more complex in business, neither quantitative nor qualitative approaches can properly model the assessment process alone. Therefore, comprehensive approaches combining both the quantitative and the qualitative approaches are needed [13,14]. The approach using Bayesian networks (BNs) [15–17] provides an objective and visible support for risk analysis. It consists of three phases: the BN initialization (define the structure and the set of conditional probability distributions), the risk monitoring, and the risk analysis. Using new evidence obtained from the information system, this approach can continually estimate risk probability and identify the sources of risk. The approach based on the fuzzy comprehensive evaluation (FCE) [18–20] is a mathematical method to comprehensively evaluate the ISS risks using fuzzy set theory. Although this approach is good at processing ambiguous information by simulating the characteristics of human judgment, it is not capable of providing the graphical relationships among various ISS risk factors using flow charts or diagrams. The proposed approach in this paper consists of the graphical representation of relevant constructs through an evidential diagram, which can fully capture the complexity of multiple controls dealing with one threat and also that of one control dealing with multiple threats. In addition, both of the above approaches suffer from the uncertainty derived from the conflicts of evidence provided by experts. In this paper, we propose a method of testing the evidential consistency, which can reduce the uncertainty derived from the conflicts of evidence.

In this paper, we utilize the evidence theory to model the uncertainty involved in the process of the ISS risk assessment. In addition to representing uncertainties using the evidence theory, the present approach allows the decision maker to develop an evidential diagram to assess the ISS risk that contains various variables such as the IS assets, the related threats, and the corresponding countermeasures. Next, the decision maker can input his or her judgments about the presence or absence of threats and the impact of countermeasures on the corresponding threats according to belief functions.

Evidence theory has been widely used in a broad range of disciplines, including audit and assurance services [21,22], artificial intelligence and expert systems [23], data mining [24], fault diagnosis of machines [25], design optimization [26], ensembling techniques [27], and image object recognition [28]. In this paper, we make an improvement on the existing evidence theory. In comparison with the above existing works, there are several advantages of the improved evidence theory used in this paper. Firstly, in order to deal with fuzzy evidence involved in the ISS risk assessment, we improve the existing mass function of evidence theory for computing the BBAs in fuzzy form. Next, we add a process of testing the evidential consistency to the existing evidence theory. This process can effectively reduce the uncertainty derived from the conflicts of evidence provided by experts. In Section 5, we compare the assessment results of the existing evidence theory and the improved evidence theory.

3. Evidence theory

In this section, we define the terminology of evidence theory and the notations used in this paper.

3.1. Terminology

The evidence theory, also called the Dempster–Shafer theory, is based on the work of Dempster during the 1960s [29] and Shafer during the 1970s [30]. Since the evidence theory is a generalization of the Bayesian theory of subjective probability, it has often been applied in reasoning under uncertainty [31,32].

Suppose we have a decision problem with n possible elements or states of nature forming a mutually exclusive and collectively exhaustive set. This set is called the frame of discernment, represented by Θ. The power set of Θ, containing all the possible subsets of Θ, is represented as P(Θ).

A basic belief assignment (BBA) is a function from P(Θ) to [0, 1] defined by:

m : P(Θ) → [0, 1], A ↦ m(A),  (1)

where A is an element of P(Θ). In addition, it satisfies the following conditions:

Σ_{A ∈ P(Θ)} m(A) = 1,  (2)

m(∅) = 0.  (3)

Basically, the BBA pertaining to a statement measures the degree of belief directly assigned to the statement based on the evidence.

Given a BBA m, a belief function is defined as:

Bel : P(Θ) → [0, 1], A ↦ Bel(A) = Σ_{B ⊆ A} m(B),  (4)

where B is a subset of A. Bel(A) measures the total belief that the object is in A. In particular, we have Bel(∅) = 0 and Bel(Θ) = 1.


Given a belief function, a plausibility function is defined as:

Pl : P(Θ) → [0, 1], A ↦ Pl(A) = Σ_{A ∩ B ≠ ∅} m(B).  (5)

The plausibility function can also be defined in terms of the belief function as

Pl(A) = 1 − Bel(Aᶜ),  (6)

where Aᶜ is the complement of A. The plausibility of a subset A is the maximum possible belief that could be assigned to A if all future evidence were in support of A. In particular, we have Pl(∅) = 0 and Pl(Θ) = 1.

3.2. Combination of evidence

Dempster's rule [30] is the fundamental rule for combining two or more items of evidence in the belief function framework. For simplicity, let us illustrate Dempster's rule for only two items of evidence. In general, if m1 and m2 are two BBAs representing two independent items of evidence pertaining to Θ, then the combined BBA for a subset A of frame Θ using Dempster's rule is given by

m(A) = K⁻¹ Σ_{B ∩ C = A} m1(B) m2(C),  (7)

where K = 1 − Σ_{B ∩ C = ∅} m1(B) m2(C), which represents the renormalization constant. The second term in K represents the conflict.

4. ISS risk assessment model

The process of developing an ISS risk assessment model consists of four phases: (a) establish the ISS index system and quantify the index weights, (b) construct the evidential diagram, (c) compute the BBAs for the assertions in the evidential diagram, and (d) test the evidential consistency. Each phase is discussed in detail as follows, and the procedure of the model is given in Fig. 1.

4.1. Establish the ISS index system and quantify index weights

The ISS index system is based on the risk analysis, which includes the identification of vulnerabilities and threats, and the analysis of the losses arising from the threats acting on vulnerabilities [33]. Based on the ISS risk analysis for a securities company (see Section 5), we have established the index system (see Table 1).

For quantifying the index weights, six information system experts, two of which are also this company's IT managers, were invited to fill in the questionnaires about the comparison table of factor weights. We then quantified the index weights using the method in Ref. [34]. This method can effectively reduce the uncertainty in the process of quantifying index weights [34].

4.2. Construct the evidential diagram

An evidential diagram consists of assertions, evidence, and their interrelationships. Assertions include the main assertion and subassertions. The main assertion is the highest-level assertion; the subassertions are lower-level assertions. Relationships between assertions (e.g., between the main assertion and subassertions, and between higher-level subassertions and lower-level subassertions) need to be defined using logical relationships such as "and" and "or." Evidence represents the information that supports or negates assertions.

In this paper, the evidential diagram is derived from the ISS index system. Suppose a manager is interested in evaluating the ISS risk involved in the ISS vulnerabilities. The corresponding evidential diagram is given in Fig. 2, which is a part of the evidential diagram for the main assertion "ISS risk" in a securities company. In Fig. 2, the rounded boxes represent assertion nodes, and evidence nodes are represented by rectangular boxes. Numbers in parentheses represent weights. Evidence nodes are connected to the corresponding assertion(s) that they directly

Fig. 1. Model procedure.


Table 1
ISS risk index system and index weights.

First level index: ISS risk. Second level indexes (weights) and their third level indexes (weights):

ISS vulnerabilities (0.262): Hardware defects (0.134); Software defects (0.369); Network vulnerabilities (0.284); Communication protocol vulnerabilities (0.213).
ISS threats (0.246): Deletion or loss of information (0.264); Breach of network resources (0.303); Information abuse (0.229); Information leakage (0.204).
Assets loss (0.206): Tangible assets loss (0.512); Intangible assets loss (0.488).
Capability loss (0.173): Service interruption (0.681); Service delay (0.184); Service weakening (0.135).
Cost of system recovery (0.113): Cost of information recovery (0.338); Cost of service recovery (0.662).


pertain to. For instance, the evidence "E1.1.1 vulnerabilities of hardware protection measures" directly pertains to assertion "A1.1 ISS vulnerabilities" and thus it is connected to that assertion.

4.3. Compute the BBAs for the assertions in the evidential diagram

In this section, we improve the existing mass function of evidence theory for computing the BBAs in fuzzy form.

In practice, the evidence is generally described in fuzzy form in ISS risk assessment [35]. For this reason, we introduce fuzzy sets into the evidence space and define the BBAs in fuzzy measure so that we can further reduce the degree of uncertainty in ISS risk assessment.

We assume that E is an evidence space, E = {e1, e2, . . ., en}, and Θ = {a1, a2, . . ., am}.

Definition 1. Let F̃ be a fuzzy set on E, uF̃ : E → [0, 1], e ↦ uF̃(e). Then uF̃ is called the membership function of F̃, and uF̃(e) is called the membership of e in F̃. Let F(E) be the set composed of the fuzzy subsets of E; then F(E) is called the fuzzy power set of E.

Definition 2. Let F̃1, F̃2 ∈ F(E). Then uF̃1∪F̃2 and uF̃1∩F̃2 are defined as:

uF̃1∪F̃2(e) ≜ uF̃1(e) ∨ uF̃2(e);  (1)

uF̃1∩F̃2(e) ≜ uF̃1(e) ∧ uF̃2(e).  (2)

Definition 3. If the following conditions hold:

(1) E ∈ F(E);
(2) if F̃1, F̃2, . . ., F̃n ∈ F(E), then ⋃_{i=1}^{n} F̃i ∈ F(E),

then F(E) is called a fuzzy additive set.

Fig. 2. Hypothetical evidential diagram for ISS vulnerabilities.

Definition 4. Let P(ei) be a probability density function on E, F(E) a fuzzy additive set on E, and wi the weight of ei. If F̃ ∈ F(E), then the probability P(F̃) can be defined as:

P(F̃) = Σ_{i=1}^{n} uF̃(ei) wi P(ei),  i = 1, 2, . . ., n.  (8)
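Eq. (8) is a membership-weighted expectation, and can be sketched in a few lines; the membership degrees, weights, and probabilities below are hypothetical:

```python
def fuzzy_prob(mu, w, p):
    """Eq. (8): P(F) = sum_i u_F(e_i) * w_i * P(e_i).

    mu: membership degrees u_F(e_i); w: index weights w_i;
    p: probability density values P(e_i). All lists of equal length n.
    """
    return sum(u * wi * pi for u, wi, pi in zip(mu, w, p))

# Hypothetical three-item evidence space:
mu = [1.0, 0.5, 0.0]   # membership of each e_i in the fuzzy subset F
w = [1.0, 1.0, 1.0]    # equal weights, for illustration only
p = [0.2, 0.4, 0.4]    # P(e_i)
prob = fuzzy_prob(mu, w, p)   # 0.2*1.0 + 0.4*0.5 + 0.4*0.0 = 0.4
```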

Definition 5. Set up a mapping Γ : F(E) → Θ. Let Aj be an element of P(Θ). If ∃ F̃k ∈ F(E) s.t. Γ(F̃k) = Aj (j = 1, 2, . . ., 2^m; k = 1, 2, . . ., l), then the mapping Γ[P] : Θ → [0, 1] is defined as:

Γ[P](Aj) = P( ⋃_{F̃k ∈ F(E), Γ(F̃k) = Aj} F̃k ) / M,  if Aj ≠ ∅;
Γ[P](Aj) = 0,  if Aj = ∅,  (9)

where M = Σ_{Aj ∈ P(Θ), Aj ≠ ∅} P( ⋃_{F̃k ∈ F(E), Γ(F̃k) = Aj} F̃k ). Let B̃ = ⋃_{F̃k ∈ F(E), Γ(F̃k) = Aj} F̃k; we then have:

M = Σ_{Aj ∈ P(Θ), Aj ≠ ∅} Σ_{i=1}^{n} uB̃(ei) wi P(ei)
  = Σ_{Aj ∈ P(Θ), Aj ≠ ∅} Σ_{i=1}^{n} max_{Γ(F̃k) = Aj} { uF̃k(ei) } wi P(ei).  (10)

Based on the above definitions, we can propose the following proposition:

Proposition 1. Γ[P](Aj) is a BBA on Θ.


Proof. If Aj = ∅, then Γ[P](∅) = 0.

If Aj ≠ ∅, we have

Σ_{Aj ∈ P(Θ)} Γ[P](Aj) = Σ_{Aj ∈ P(Θ)} P( ⋃_{F̃k ∈ F(E), Γ(F̃k) = Aj} F̃k ) / M
                       = (1/M) Σ_{Aj ∈ P(Θ)} P( ⋃_{F̃k ∈ F(E), Γ(F̃k) = Aj} F̃k ) = 1.

The proposition is proved.

According to the above definitions and Proposition 1, the mass function Γ[P](Aj) can effectively meet the requirement to deal with the situation where there is uncertain evidence in the process of ISS risk assessment.

4.4. Test the evidential consistency

In this section, we add a process of testing the evidential consistency to the existing evidence theory.

In the uncertain reasoning by evidence theory, if an item of evidence is in conflict with the others, the reasoning result would not be sound [36]. To illustrate the conflict of evidence, we give an example as follows.

Assume that the frame Θ is {a, b, c}. If the BBAs for an item of evidence A are m1(a) = 0.99 and m1(b) = 0.01, and the BBAs for an item of evidence B are m2(b) = 0.01 and m2(c) = 0.99, then we have m(a) = m(c) = 0 and m(b) = 1 by combining the evidence. Although the support of A and B for event b is very low, the reasoning result is that event b is true. This is obviously not reasonable. Therefore, testing evidential consistency has important significance for the ISS risk assessment based on evidence theory.

Furthermore, we discuss the process of testing evidential consistency in detail next.

Definition 6. Let SP(Θ) be the space generated by all the subsets of Θ. A BBA is a vector m⃗ of SP(Θ) with coordinates m(Ai) such that

Σ_{i=1}^{2^N} m(Ai) = 1 and m(Ai) ≥ 0, i = 1, . . ., 2^N,  (11)

where Ai ∈ P(Θ).

Assume that m1 and m2 are two BBAs on the same frame of discernment Θ. According to Ref. [32], the distance between m1 and m2 is:

dBPA(m1, m2) = √( (1/2)( ||m⃗1||² + ||m⃗2||² − 2⟨m⃗1, m⃗2⟩ ) ),  (12)

where ⟨m⃗1, m⃗2⟩ is the scalar product defined by

⟨m⃗1, m⃗2⟩ = Σ_{i=1}^{2^N} Σ_{j=1}^{2^N} m1(Ai) m2(Aj) |Ai ∩ Aj| / |Ai ∪ Aj|,  (13)

with Ai, Aj ∈ P(Θ) for i, j = 1, . . ., 2^N. ||m⃗||² is then the square norm of m⃗:

||m⃗||² = ⟨m⃗, m⃗⟩.  (14)


Based on the evidential distance, we can further define the similarity of two BBAs:

S(mi, mj) = 1 − dBPA(mi, mj),  i, j = 1, 2, . . ., n.  (15)

Thus the result can be represented by a similarity matrix:

SM =
[ 1    S12  ···  S1j  ···  S1n
  ⋮    ⋮         ⋮         ⋮
  Si1  Si2  ···  Sij  ···  Sin
  ⋮    ⋮         ⋮         ⋮
  Sn1  Sn2  ···  Snj  ···  1 ].

Furthermore, the support for a BBA mi is:

Sup(mi) = Σ_{j=1, j≠i}^{n} S(mi, mj),  i = 1, 2, . . ., n.  (16)

The support for the BBA mi, i.e. Sup(mi), reflects the degree of support from the other BBAs. Based on it, we have the credibility C(mi):

C(mi) = Sup(mi) / Σ_{i=1}^{n} Sup(mi),  i = 1, 2, . . ., n.  (17)

Obviously, Σ_{i=1}^{n} C(mi) = 1. Therefore, C(mi) can represent the weight of the BBA mi.

In the process of testing evidential consistency in the ISS risk

assessment, a threshold value λ can be set according to the actual situation. If the similarity of every pair of items of evidence is greater than or equal to the threshold value λ, then the existing items of evidence are considered consistent. In contrast, if any similarity is less than λ, we have to adjust the existing items of evidence.
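The threshold test just described reduces to checking every off-diagonal entry of the similarity matrix; a minimal sketch, with a hypothetical matrix and threshold:

```python
def consistent(sm, lam):
    """Evidence is consistent only if every pairwise similarity reaches lambda."""
    n = len(sm)
    return all(sm[i][j] >= lam for i in range(n) for j in range(n) if i != j)

# Illustrative 3x3 similarity matrix (values are hypothetical, not case-study data):
sm = [[1.00, 0.82, 0.70],
      [0.82, 1.00, 0.88],
      [0.70, 0.88, 1.00]]
needs_adjustment = not consistent(sm, 0.85)   # the 0.70 pair fails, so True
```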

For the evidential adjustment, if an item of evidence is supported by the other items of evidence, then it has a higher credibility and we assign a larger weight to it in the evidence combination. In contrast, if an item of evidence is in conflict with the other items of evidence, then its credibility and weight should be smaller. The steps of the evidential adjustment are as follows:

Step 1. Obtain the credibility of the items of evidence: based on Eqs. (16) and (17), we can obtain the credibility of each item of evidence.
Step 2. Compute the weighted average of the BBAs of the items of evidence: treating the credibility as the weight of each item of evidence, we take the weighted average of the BBAs.
Step 3. Combine the weighted average evidence: according to Ref. [37], if there are n items of evidence, we combine the weighted average evidence n − 1 times using Eq. (7).
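The three steps above can be sketched end to end; this self-contained illustration re-implements Eqs. (7) and (12)–(17) compactly, and the three example BBAs are hypothetical:

```python
import math
from itertools import chain, combinations

def dempster(m1, m2):
    """Eq. (7): Dempster's rule for two BBAs (dicts frozenset -> mass)."""
    out, conflict = {}, 0.0
    for b, v1 in m1.items():
        for c, v2 in m2.items():
            if b & c:
                out[b & c] = out.get(b & c, 0.0) + v1 * v2
            else:
                conflict += v1 * v2
    return {a: v / (1.0 - conflict) for a, v in out.items()}

def similarity(m1, m2, frame):
    """Eq. (15), via the evidential distance of Eqs. (12)-(14)."""
    subs = [frozenset(c) for c in chain.from_iterable(
        combinations(sorted(frame), r) for r in range(1, len(frame) + 1))]
    def dot(x, y):
        return sum(x.get(a, 0.0) * y.get(b, 0.0) * len(a & b) / len(a | b)
                   for a in subs for b in subs)
    return 1.0 - math.sqrt(0.5 * (dot(m1, m1) + dot(m2, m2) - 2 * dot(m1, m2)))

def adjust_and_combine(bbas, frame):
    """Steps 1-3 of the evidential adjustment."""
    n = len(bbas)
    sup = [sum(similarity(bbas[i], bbas[j], frame)          # Eq. (16)
               for j in range(n) if j != i) for i in range(n)]
    cred = [s / sum(sup) for s in sup]                       # Eq. (17)
    focals = set().union(*bbas)
    avg = {a: sum(c * m.get(a, 0.0) for c, m in zip(cred, bbas))
           for a in focals}                                  # Step 2
    result = avg
    for _ in range(n - 1):                                   # Step 3
        result = dempster(result, avg)
    return result
```

Combining the weighted-average BBA with itself n − 1 times sharpens the belief toward the focal elements that the credible majority supports.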

5. Model validation via a case study

In order to further validate the ISS risk assessment model, we used it to assess an actual securities company's information systems. In this section, we first demonstrate the model via a case study according to the procedure of Section 4. Then sensitivity analysis is employed to validate the reliability of the proposed model. Finally, the effectiveness of the model is evaluated by comparing the results of risk assessment of the proposed model in this paper, the fuzzy comprehensive evaluation (FCE), the Bayesian networks (BNs), and the existing evidence theory.

5.1. Model implementation

The securities company is a Chinese financial services firm providing a wide range of services in securities trading and sales, corporate finance and investment banking, and asset management. Founded in 1989 as one of the oldest brokerage firms in China, the company is headquartered in Shenzhen and hires 300 professionals in more than 10 branches in major cities all over the country, serving millions of institutional and individual clients.

We invited six information system experts, two of which are also IT managers of the company, to assess the security risk of the company's information systems. As mentioned in Section 4.1, the ISS index system and weights have been established based on the risk analysis for this securities company (see Table 1).

Furthermore, based upon the ISS index system, an evidential diagram (see Fig. 3) for the main assertion "ISS risk" was developed. In Fig. 3, we used the "and" relationship between the main assertion and the subassertions, which implies that the main assertion is true if and only if all subassertions are true.

Fig. 3. Evidential diagram for the main assertion "ISS risk".

According to the evidential diagram, we defined the frame of discernment of the assertions as Θ = {very high risk, high risk, median risk, low risk, very low risk}, where A1 = {very high risk}, A2 = {high risk}, A3 = {median risk}, A4 = {low risk}, and A5 = {very low risk}. With the exception of A1–A5, the other subsets of P(Θ), denoted by U, represent the unknown degree of evidence.

Six experts assessed the strength of evidence, which indicates the level of support that an item of evidence provides. For simplicity, we illustrate the process of reasoning by the strength of an item of evidence provided by one expert.

Strength of evidence is represented in fuzzy form. In this case study, we employed an asymmetric triangular membership function [38] to describe the belief degree of evidence. As shown in Fig. 4, the membership values of the evidence E, E = {e1, e2, . . ., e15}, are provided by an expert. F̃1 to F̃5 are defined as the fuzzy subsets on E, and the level of risk of F̃k is higher than that of F̃k−1. Then, based on Proposition 1, the BBAs for subassertions A1.1 to A1.5 were computed (see Table 2).

Table 2
The BBAs for the subassertions.

Subassertions   m(A1)   m(A2)   m(A3)   m(A4)   m(A5)   m(U)
A1.1            0.107   0.216   0.203   0.215   0.172   0.077
A1.2            0.093   0.177   0.130   0.345   0.208   0.047
A1.3            0.069   0.131   0.169   0.251   0.257   0.123
A1.4            0.132   0.147   0.206   0.331   0.149   0.035
A1.5            0.070   0.131   0.133   0.298   0.332   0.036


Fig. 4. Membership function.

The BBAs for the main assertion "ISS risk" are computed by combining the BBAs of the subassertions based on the structure of Fig. 3. This is done by propagating the BBAs through the network; Shenoy and Shafer [39] discussed this process in detail. The process of propagating BBAs in a network becomes computationally quite complex. However, there are several software packages available [40,41] that facilitate the process. We used the tool for propagating uncertainty in valuation networks [40] to conduct the computation. The BBAs for the main assertion "ISS risk" are m(A1) = 0.049, m(A2) = 0.162, m(A3) = 0.214, m(A4) = 0.316, m(A5) = 0.217, and m(U) = 0.042.

Similarly, we could also obtain the BBAs according to the strength of evidence provided by the other five experts (see Table 3).

Table 3
The BBAs for the main assertion "ISS risk".

Experts        m(A1)   m(A2)   m(A3)   m(A4)   m(A5)   m(U)
Expert 1 (m1)  0.049   0.162   0.214   0.316   0.217   0.042
Expert 2 (m2)  0.039   0.169   0.220   0.323   0.198   0.051
Expert 3 (m3)  0.098   0.104   0.199   0.248   0.254   0.097
Expert 4 (m4)  0.102   0.153   0.296   0.207   0.186   0.056
Expert 5 (m5)  0.065   0.112   0.186   0.298   0.203   0.136
Expert 6 (m6)  0.053   0.142   0.221   0.300   0.204   0.080

Then, we tested the consistency of the above six items of evidence, m1 to m6, as mentioned in Section 4.4. Since there were only six experts participating in the risk assessment, we set a high threshold λ = 0.85. According to Table 3 and Eqs. (12)–(15), we obtained the similarity matrix:

SM =
[ 1      0.816  0.801  0.754  0.832  0.817
  0.816  1      0.853  0.844  0.781  0.776
  0.801  0.853  1      0.696  0.798  0.800
  0.754  0.844  0.696  1      0.821  0.755
  0.832  0.781  0.798  0.821  1      0.829
  0.817  0.776  0.800  0.755  0.829  1     ].

It is obvious that the similarity of some pairs of items of evidence is less than λ. Therefore, we have to adjust the existing items of evidence. The results of the adjustment are as follows:

(1) Based on Eqs. (16) and (17), the credibilities of the items of evidence are: C(m1) = 0.243, C(m2) = 0.216, C(m3) = 0.109, C(m4) = 0.045, C(m5) = 0.186, and C(m6) = 0.201.

(2) The credibility-weighted average of the BBAs of the items of evidence is: mMAE(A1) = 0.058, mMAE(A2) = 0.143, mMAE(A3) = 0.214, mMAE(A4) = 0.311, mMAE(A5) = 0.210, and mMAE(U) = 0.064.

(3) Combining the weighted average evidence five times gives: m(A1) = 0.032, m(A2) = 0.138, m(A3) = 0.223, m(A4) = 0.416, m(A5) = 0.165, and m(U) = 0.026.

Consequently, the results of the ISS risk assessment in this case study are shown in Fig. 5, in which the belief supporting A4, i.e. "ISS risk is low", is 0.416. This suggests that we have the most confidence that the ISS risk is low.

Fig. 5. Results of the ISS risk assessment.

5.2. Sensitivity analysis

In this section, we perform sensitivity analysis to investigate how a change in the strength of evidence affects the result of the ISS risk assessment.

For instance, we decreased the strengths of E1.4.3 and E1.4.1 (see Fig. 3), and then examined the impact of the change of the strength on the beliefs of the main assertion "A1. ISS risk" respectively. The corresponding results are shown in Figs. 6 and 7.

Fig. 6. Impact of the change of E1.4.3 strength on the main assertion.

Fig. 7. Impact of the change of E1.4.1 strength on the main assertion.


Fig. 8. Comparison of the Method1, the Method2, and the Method3.

The results in Figs. 6 and 7 indicate that although the strengths of E1.4.3 and E1.4.1 have been changed, the belief supporting A4 is still larger than the others. Furthermore, by comparing Fig. 6 with Fig. 7, we can also observe that the larger the weight of evidence, the larger the impact on the belief of the main assertion, as shown in Fig. 3 where the weights of E1.4.3 and E1.4.1 are 0.135 and 0.681, respectively.

In addition, we have also performed sensitivity analysis to investigate how the strengths of other items of evidence affect the beliefs on the main assertion. The results showed that small variations in the input strengths of evidence do not significantly impact the beliefs of the main assertion. This implies that the model is robust and reliable with respect to small amounts of measurement error in assessing the strength of evidence.

5.3. Evaluation of the ISS risk assessment model based on the improved evidence theory

Under the same conditions, we have employed the FCE, the BNs, and the existing evidence theory to assess the ISS risk in this case study.

In particular, we use Method1, Method2, Method3, and Method4 to refer to our proposed model, the FCE, the BNs, and the existing evidence theory, respectively.

Firstly, we compared the Method1 with the Method2 and the Method3 (see Fig. 8). The assessment results indicated that the sequences of risk level obtained from the three methods are consistent. Furthermore, we can also observe that the degree of belief of the low risk level is higher in the Method1 than in the Method2 and the Method3, while the degrees of belief of the other levels are lower in the Method1 than in the Method2 and the Method3. Therefore, the Method1 is more effective than the other two methods in the process of the ISS risk assessment.

Fig. 9. Comparison of the Method1 and the Method4.

Secondly, we compared the Method1 with the Method4 (see Fig. 9).

The experiment results show that the improved evidence theory proposed in this paper outperforms the existing evidence theory. Moreover, in Fig. 9, we can also see that the m value of U (the belief mass left on the whole frame of discernment, which represents residual uncertainty) in the Method1 is lower than that in the Method4. Thus, in the process of the ISS risk assessment, there is lower uncertainty in the Method1 than in the Method4.
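In Dempster–Shafer terms, m(U) is the mass that remains on the whole frame U after combination, i.e. belief that cannot be committed to any particular risk level, so a smaller m(U) means less residual uncertainty. A minimal sketch of this reading (the frame and the mass values below are illustrative, not the case-study numbers):

```python
from itertools import product

THETA = frozenset({"A1", "A2", "A3", "A4"})  # frame of discernment U

def combine(m1, m2):
    """Dempster's rule of combination for two BBAs (dict: frozenset -> mass)."""
    out, conflict = {}, 0.0
    for (s1, v1), (s2, v2) in product(m1.items(), m2.items()):
        inter = s1 & s2
        if inter:
            out[inter] = out.get(inter, 0.0) + v1 * v2
        else:
            conflict += v1 * v2
    return {s: v / (1.0 - conflict) for s, v in out.items()}

weak = {frozenset({"A4"}): 0.4, THETA: 0.6}    # weakly committed evidence
strong = {frozenset({"A4"}): 0.7, THETA: 0.3}  # strongly committed evidence

m_weak = combine(weak, weak)
m_strong = combine(strong, strong)

# the smaller m(U), the less residual uncertainty in the combined assessment
print(m_weak[THETA], m_strong[THETA])  # 0.36 vs 0.09 (approximately)
```

Combining the more committed evidence leaves m(U) = 0.09 versus 0.36 for the weak evidence: the method that commits more mass to specific risk levels carries less residual uncertainty, which is the sense in which Method1 is said to be less uncertain than Method4.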

6. Conclusions

In this paper, we proposed an ISS risk assessment model based on the improved evidence theory. This model has several advantages. First, the model is based on evidence theory, which can effectively model the uncertainty involved in the assessment process. Second, to deal with the fuzzy evidence found in the ISS risk assessment, this model provides a new way to define the basic belief assignment in fuzzy measure. Further, this model also provides a method of testing the evidential consistency, which can reduce the uncertainty derived from the conflicts of evidence provided by experts.

In this paper, we also employed sensitivity analysis to validate the reliability of the proposed model. In addition, the effectiveness of the model is evaluated by comparing the risk assessment results of the proposed model with those of the FCE, the BNs, and the existing evidence theory.

Although the proposed model performs with an advantage over existing methods in the uncertain environment, it still requires domain experts' belief inputs at the individual evidence level. Future research will be conducted to explore how to better elicit practitioners' assessments of the strength of the evidence.

Acknowledgements

The research was supported by the National Natural Science Foundation of China (Grant No. 70901054) and the National Science Fund for Distinguished Young Scholars of China (Grant No. 70925005). The authors are very grateful to all anonymous reviewers whose invaluable comments and suggestions substantially helped improve the quality of the paper.

References

[1] A. Kankanhalli, H.H. Teo, B.C.Y. Tan, K.K. Wei, An integrative study of information systems security effectiveness, International Journal of Information Management 23 (2) (2003) 139–154.

[2] M. Karyda, E. Kiountouzis, S. Kokolakis, Information systems security policies: a contextual perspective, Computers and Security 24 (3) (2005) 246–260.

[3] D.W. Straub, R.J. Welke, Coping with systems risk: security planning models for management decision-making, MIS Quarterly 22 (4) (1998) 441–469.

[4] R.L. Winkler, Uncertainty in probabilistic risk assessment, Reliability Engineering and System Safety 54 (2–3) (1996) 127–132.

[5] R.K. Rainer, C.A. Snyder, H.H. Carr, Risk analysis for information technology, Journal of Management Information Systems 8 (1) (1991) 129–147.

[6] L.D. Bodin, L.A. Gordon, M.P. Loeb, Information security and risk management, Communications of the ACM 51 (4) (2008) 64–68.

[7] G.V. Post, J.D. Diltz, A stochastic dominance approach to risk analysis of computer systems, MIS Quarterly 10 (4) (2001) 363–375.

[8] Y. Huanchun, Risk evaluation model on enterprises' complex information system: a study based on the BP neural network, Journal of Software 5 (1) (2010) 99–106.

[9] L. Grunske, D. Joyce, Quantitative risk-based security prediction for component-based systems with explicitly modeled attack profiles, Journal of Systems and Software 81 (8) (2008) 1327–1345.

[10] W.G. de Ru, J.H.P. Eloff, Risk analysis modeling with the use of fuzzy logic, Computers and Security 15 (3) (1996) 239–248.

[11] D. Xu, J. Sha, P. Zhang, B. Wan, Study of switch project construction risk identification, evaluation and tracking based on Delphi method, System Engineering Theory and Practice 20 (12) (2000) 113–118.


[12] D.K. Hardman, P. Ayton, Arguments for qualitative risk assessment: the StAR risk adviser, Expert Systems 14 (1) (2000) 24–36.

[13] S. Alter, S. Sherer, A general, but readily adaptable model of information system risk, Communications of the AIS 14 (1) (2004) 1–28.

[14] H. Salmela, Analysing business losses caused by information systems risk: a business process analysis approach, Journal of Information Technology 23 (3) (2008) 185–202.

[15] C. Fan, Y. Yu, BBN-based software project risk management, Journal of Systems and Software 73 (2) (2004) 193–203.

[16] Y. Hu, J. Chen, H. Jiaxing, L. Mei, X. Kang, Analyzing software system quality risk using Bayesian belief network, in: Proceedings of the 2007 IEEE International Conference on Granular Computing, 2007, pp. 93–96.

[17] E. Lee, Y. Park, J. Shin, Large engineering project risk management using a Bayesian belief network, Expert Systems with Applications 36 (3) (2009) 5880–5887.

[18] T. Zhan, X. Wang, Risk assessment for traffic information security based on fuzzy comprehensive evaluation, in: Proceedings of the 2nd International Conference on Transportation Engineering, 2009, pp. 3809–3814.

[19] T.R. Peltier, Information Security Risk Analysis, second ed., CRC Press, Boca Raton, 2007.

[20] X. Yang, H. Luo, C. Fan, M. Chen, S. Zhou, Analysis of risk evaluation techniques on information system security, Journal of Computer Applications 28 (8) (2008) 1920–1924.

[21] R.P. Srivastava, T. Mock, Evidential reasoning for WebTrust assurance services, Journal of Management Information Systems 16 (3) (2000) 11–32.

[22] R.P. Srivastava, L. Liu, Applications of belief functions in business decisions: a review, Information Systems Frontiers 5 (4) (2003) 359–378.

[23] H. Xu, Y.T. Hsia, P. Smets, A belief-function based decision support system, in: D. Heckerman, A. Mamdani (Eds.), Proceedings of the Ninth Conference on Uncertainty in Artificial Intelligence, Morgan Kaufmann, San Mateo, CA, 1993, pp. 535–542.

[24] D. Zeng, J. Xu, G. Xu, Data fusion for traffic incident detection using D-S evidence theory with probabilistic SVMs, Journal of Computers 3 (10) (2008) 36–43.

[25] X. Yao, J. Fu, Z. Chen, Intelligent fault diagnosis using rough set method and evidence theory for NC machine tools, International Journal of Computer Integrated Manufacturing 22 (5) (2009) 472–482.


[26] Z.P. Mourelatos, J. Zhou, A design optimization method using evidence theory, Journal of Mechanical Design 128 (4) (2006) 901–908.

[27] H. Altincay, Ensembling evidential k-nearest neighbor classifiers through multi-modal perturbation, Applied Soft Computing 7 (3) (2007) 1072–1083.

[28] Z. Deng, B. Li, J. Zhuang, Image object recognition by SVMs and evidence theory, Lecture Notes in Computer Science 3568 (2005) 560–567.

[29] A.P. Dempster, A generalization of Bayesian inference, Journal of the Royal Statistical Society 30 (1968) 205–247.

[30] G. Shafer, A Mathematical Theory of Evidence, Princeton University Press, Princeton, 1976.

[31] G. Shafer, The Dempster–Shafer theory, in: S.C. Shapiro (Ed.), Encyclopedia of Artificial Intelligence, John Wiley and Sons, New York, 1992, pp. 330–331.

[32] A.L. Jousselme, D. Grenier, E. Bosse, A new distance between two bodies of evidence, Information Fusion 2 (1) (2001) 91–101.

[33] R.L. Kumar, S. Park, C. Subramaniam, Understanding the value of countermeasure portfolios in information systems security, Journal of Management Information Systems 25 (2) (2008) 241–279.

[34] Z.S. Xu, A method for priorities of triangular fuzzy number complementary judgment matrices, Fuzzy Systems and Mathematics 16 (1) (2002) 55–60.

[35] L. Zhou, A. Vasconcelos, M. Nunes, Supporting decision making in risk management through an evidence-based information systems project risk checklist, Information Management and Computer Security 16 (2) (2008) 166–186.

[36] J.W. Guan, D.A. Bell, Approximate reasoning and evidence theory, Information Sciences 96 (3–4) (1997) 207–235.

[37] C.K. Murphy, Combining belief functions when evidence conflicts, Decision Support Systems 29 (1) (2000) 1–9.

[38] C.G. Jin, Y. Lin, Z.S. Ji, Application of event tree analysis based on fuzzy sets in risk analysis, Journal of Dalian University of Technology 43 (1) (2003) 97–100.

[39] P.P. Shenoy, G. Shafer, Axioms for probability and belief-function propagation, Uncertainty in Artificial Intelligence 4 (1990) 169–198.

[40] A. Saffiotti, E.P. Umkehrer, A general tool for propagating uncertainty in valuation networks, in: Proceedings of the Seventh National Conference on Artificial Intelligence, 1991, pp. 323–331.

[41] G. Shafer, P.P. Shenoy, R.P. Srivastava, Auditor's assistant: a knowledge engineering tool for audit decisions, in: Proceedings of the 1988 Touche Ross/University of Kansas Symposium on Auditing Problems, 1988, pp. 61–79.

