TRANSCRIPT
Copyright © 2015 Deloitte Development LLC. All rights reserved.
ISACA Emerging Issues in Information Technology Auditing
September 24, 2015
Agenda
Emerging issues in IT audit
4:20 – 4:50  Information produced by the entity (IPE) – Presented by Carrie Flynn
4:50 – 5:05  Workpaper reliance and integrity – Presented by Carrie Flynn
5:05 – 5:25  Spreadsheet management and end user computing – Presented by Rhonda Willert
5:25 – 5:45  Cybersecurity – The role of internal audit – Presented by Rhonda Willert
Information produced by the entity (IPE)
Quality information to support the functioning of internal control
Information produced by the entity
• What is IPE?
• Relevant standards
• COSO framework
• Why is IPE important?

Identify and understand IPE
• Identify and understand IPE
• Example of IPE
• What could go wrong?
• Determine if IPE is sufficient
• Determine type of IPE

Plan and perform tests of IPE
• Source data
• Report logic
• Parameters
• IPE testing elements
• Evaluate IPE
• Sample size considerations
Information produced by the entity (IPE)
Information produced by the entity (IPE) is often provided in the form of a "report," which may be system-generated, manually prepared, or a combination of both.

Examples include:
• Standard "out of the box" or default reports or templates
• Custom-developed reports defined and generated by user-operated tools such as scripts, report writers, programming languages, and queries that are not standard to the application
• End-user applications such as spreadsheets that house and extract relevant information
• Entity-prepared analyses and schedules that are manually prepared by entity personnel from a system or other internal or external sources

Reports may allow for user selection of inputs (parameters), in which the user defines certain criteria to generate the report output.
What is IPE?
Relevant PCAOB Standards
PCAOB AS 15, Audit Evidence, Paragraph 10:
When using information produced by the entity as audit evidence, the auditor should evaluate whether the information is sufficient and appropriate for purposes of the audit by performing procedures to:
• Test the accuracy and completeness of the information, or test the controls over the accuracy and completeness of that information; and
• Evaluate whether the information is sufficiently precise and detailed for purposes of the audit. [PCAOB AS 15.10]
PCAOB AS 5.B9:
To obtain evidence about whether a selected control is effective, the control is tested directly [emphasis added]; the effectiveness of a control cannot be inferred from the absence of misstatements detected by substantive procedures. [PCAOB AS 5.B9]
COSO 2013 framework
In May 2013, the Committee of Sponsoring Organizations of the Treadway Commission (COSO) released an updated version of its Internal Control – Integrated Framework (the "2013 Framework").

The new framework provides additional structure by defining 17 required principles of internal control that must be present and functioning in order for management and auditors to conclude that internal control over financial reporting is effective.
Principle(s) and points of focus relevant to IPE

Principle 13: The organization obtains (or generates) and uses relevant quality information to support the functioning of internal control.

Points of focus:
• Identifies information requirements — A process is in place to identify the information required and expected to support the functioning of the other components of internal control and the achievement of the entity's objectives.
• Captures internal and external sources of data — Information systems capture internal and external sources of data.
• Processes relevant data into information — Information systems process and transform relevant data into information.
• Maintains quality throughout processing — Information systems produce information that is timely, current, accurate, complete, accessible, protected, verifiable, and retained. Information is reviewed to assess its relevance in supporting the internal control components.
• Considers costs and benefits — The nature, quantity, and precision of information communicated are commensurate with and support the achievement of objectives.
Why is IPE important?
Based on data from Audit Analytics for the period from November 15, 2012, through November 14, 2013, including 10-K filings for the calendar year ended December 31, 2012.
[Chart: common categories of material weakness disclosures, including accounting documentation; material and/or numerous auditor/YE adjustments; accounting personnel/competency; restatement or non-reliance of company filings; segregation of duties/design of controls; and non-routine 404 disclosures/restatement]
Identify and understand IPE
The following questions may assist us in understanding and identifying IPE that is relevant to our audit:
• What is the purpose of the IPE?
• Is the IPE used in our testing?
• Do one or more controls rely upon the IPE?
• Where does the source data reside?
• How does the data flow from source data to report?
• Is there report logic involved with generating the report?
• Are the related system(s) subject to general IT controls?
• Is the report generated through a report writer tool?
• Is information downloaded to end user applications such as Excel spreadsheets?
• Does the report include relevant calculations on the source data?
• Does the user enter parameters when the IPE is generated? If so, what are the parameters?
• Have errors been identified in the IPE?
A detailed understanding of the IPE also allows us to design appropriate testing procedures.
Identify and understand IPE
Examples of IPE relevant to general IT controls include:
• System change listing
• Listing of users with privileged access
• Terminated user listing
• Password configuration report
• Periodic access review listing
• Listing of users with data center access
What could go wrong?
Consider the following examples of what may go wrong in GITCs when obtaining an understanding of IPE.

• Not all data is captured: The annual access reauthorization review conducted for an application does not include all of the users with access, due to a synchronization issue with a middleware system.
• Spreadsheet sorting error: The employee termination listing does not contain a complete population of terminated users due to a sorting error when preparing the termination listings from a system extract.
• Inappropriate changes to report: IT management runs a report of changes to an application. An employee in the IT department, who had administrator access to the system, manipulated the report to exclude an unauthorized change he made.
• Incorrect parameters: The user listing used to test controls over system access is not accurate, as the report was generated out of the wrong application system (development instance), since the wrong input value (parameter) was entered.
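A quick completeness reconciliation between the raw extract and the prepared listing is one way to catch problems like the "spreadsheet sorting error" example above. A minimal sketch (all employee IDs, statuses, and field names are hypothetical):

```python
# Hypothetical check for the "spreadsheet sorting error" scenario: reconcile
# the prepared termination listing back to the raw system extract before
# relying on it. Data shown is illustrative only.

system_extract = [  # rows from the raw HR extract: (employee_id, status)
    ("E100", "TERMINATED"),
    ("E101", "ACTIVE"),
    ("E102", "TERMINATED"),
    ("E103", "TERMINATED"),
]

prepared_listing = ["E100", "E102", "E103"]  # listing prepared from the extract

# Terminated employees per the raw extract.
expected_terms = {emp for emp, status in system_extract if status == "TERMINATED"}

# Any employee terminated per the extract but absent from the listing.
dropped = sorted(expected_terms - set(prepared_listing))

print("Terminated per extract:", len(expected_terms))
print("On prepared listing:", len(prepared_listing))
print("Dropped during preparation:", dropped)  # a non-empty list flags an error
```

A record-count tie-out alone can miss offsetting errors, so the set difference above checks the individual IDs rather than just totals.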
Determine if the IPE is sufficient
• Once we identify the IPE, we evaluate whether the IPE is sufficiently precise and detailed for purposes of the audit.
• If IPE is not sufficiently precise or detailed for our purpose, we likely cannot use it as evidence. However, we may work with the entity to determine if the original IPE can be modified to meet our needs or identify other audit evidence to achieve the intended purpose.

For example, management provides a summary of application changes prepared in a Word document, and we determine the information is not sufficiently precise and detailed for our purpose of testing program changes. We may work with management to identify an alternative form of the information (e.g., a program change listing from the source code library management tool) that will be sufficiently precise and detailed for our purposes.
Determine type of IPE to develop testing strategy

There are two types of IPE, each with its own testing strategy:

• IPE we use as audit evidence (to test the control): Test the accuracy and completeness of the information directly and/or identify and test the controls over the accuracy and completeness of the information. Understand the IPE and perform procedures to address each of the following, as applicable:
  – The source data
  – The report logic, which includes how the data is extracted and calculations
  – Relevant parameters that are user entered

• IPE used in a relevant control by management: Determine whether the control validates the accuracy and completeness of the IPE, or is dependent upon the accuracy and completeness of the IPE:
  – Validates: Our evaluation of the design and evidence of operating effectiveness of the control supports this conclusion (no need to identify additional controls).
  – Dependent: Identify and test the controls that address the accuracy and completeness of the IPE.
Plan and perform tests of the IPE
• When testing GITCs, much of the IPE we obtain includes audit evidence we use to test the control.
  – In such instances, we typically obtain evidence about the accuracy and completeness of such information concurrently with the actual procedure applied to the information (i.e., test the information directly). However, we may also reference tests of controls.
• Our procedures to test the IPE may vary depending on the purpose of the IPE, the nature of the IPE, how it is created, the risk associated with the control, and the effectiveness of related GITCs.
Plan and perform tests of the IPE (cont.)
We perform procedures to address each of the following, as applicable:
• The source data
• The report logic, which includes how the data is extracted and any calculations
• Relevant parameters that are user entered

For example, consider the process to generate a typical terminated user listing. We need to determine (1) whether the appropriate parameters were applied, (2) whether the source data in the HR database is accurate and complete, and (3) whether the report logic that generates the terminated user listing is appropriate.
Source data – direct tests
• Source data may include records or documents, whether internal or external, in paper form, electronic form, or other media.
• Examples of direct tests to address the accuracy and completeness of relevant source data include:
  – Select a sample of items from the report and agree back to the source
  – Make a sample selection from the source and agree to relevant information on the report
  – Reconcile report totals to source data totals
  – Observe the generation of the report and perform appropriate procedures to reconcile information to the source

For example, when we obtain a terminated employee listing, we may test source data by making a selection of terminated employees from the source where termination status is maintained and verifying that the terminated employees appear on the terminated employee listing (completeness). We may also make a selection of terminated employees from the terminated employee listing and verify that the termination date and other pertinent details match the source (accuracy).
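The two-way selection test described above can be sketched as a simple CAAT. The employee IDs, dates, and structures below are hypothetical:

```python
# Hypothetical CAAT sketch: two-way reconciliation of a terminated-employee
# listing (the IPE) against the HR source where termination status is kept.
# Employee IDs and dates are illustrative, not from any real system.

hr_source = {       # employee_id -> termination_date per the HR database
    "E100": "2013-02-14",
    "E101": "2013-05-02",
    "E102": "2013-08-30",
}
ipe_listing = {     # employee_id -> termination_date on the report
    "E100": "2013-02-14",
    "E101": "2013-05-02",
    "E102": "2013-08-30",
}

# Completeness: every terminated employee in the source appears on the listing.
missing_from_listing = sorted(set(hr_source) - set(ipe_listing))

# Accuracy: termination dates on the listing agree to the source.
date_mismatches = sorted(
    emp for emp in set(hr_source) & set(ipe_listing)
    if hr_source[emp] != ipe_listing[emp]
)

print("Missing from listing:", missing_from_listing)  # [] supports completeness
print("Date mismatches:", date_mismatches)            # [] supports accuracy
```

In practice this would run over a sample (or the full population, if practical) rather than the three illustrative records shown.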
Source data – control tests
• We may also address the accuracy and completeness of source data through a test of controls.
• Tests of controls over source data may be performed in combination with direct tests when using the IPE as evidence to test the control.
• GITCs are often another relevant reference point for controls over source data when business process controls are dependent upon IPE that is generated from relevant systems.
  – IT specialists and auditors should discuss if it may be appropriate to reference GITCs for certain tests of controls over related source data (in addition to other transactional or monitoring controls in the business).

For example, when we obtain a user access listing, a test of controls over source data may include controls over user access provisioning and de-provisioning. The control to determine only authorized and appropriate individuals have security administrator access may also be a relevant source data control.
Report logic
Report logic includes how the data is extracted and relevant calculations. The configuration of the logic depends on the type of report:

• Standard "out of the box" or default reports: computer code embedded within the system.
• Custom-developed reports defined and generated by user-operated tools such as scripts, report writers, programming languages and queries: the query language or navigation path specified by the user to create the report. In such cases, the report logic and parameters are typically tested together.
• End-user applications such as spreadsheets that house and extract relevant information: calculations/formulas in Excel spreadsheets.

For example, there may be no report logic involved for IPE such as a screenshot of password parameters taken from where the password settings are configured in the system.
Report logic (cont.)
If report logic is automated, we would liken this to an automated control, and hence our control testing approach typically includes a reperformance of the automated report logic, which is similar to a direct test. Example tests for report logic include:

• Manually reperform the extraction by selecting a sample of items from the report and agreeing back to the source and vice versa, and/or reperform control totals. (Direct test; control test)
• Manually reperform the important calculations in the report, addressing significant variations in the calculations. (Direct test; control test)
• For reports without calculations, manually reperform the logic by verifying that the report included important elements/variations (e.g., the user listing included both employees and contractors). (Direct test; control test)
• Use a CAAT (e.g., ACL or a spreadsheet program) to independently re-perform the extraction and calculations/algorithm. (Direct test; control test)
• Inspect the specific programming or query language used to extract the relevant source data and generate the report. (Control test)
• Apply a "benchmarking" strategy, as applicable (refer to PCAOB AS 5.B28-33; U.S. AAM Topic 4161 and IC Q&A 3-19). (Control test)
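The CAAT approach above (independently re-performing the extraction) can be sketched as follows; the source table, system names, and date range are hypothetical:

```python
# Hypothetical CAAT sketch: independently re-perform a report's extraction
# logic against the source data, then diff the result with the system report.

source_rows = [  # (change_id, system, implemented_date) from the source table
    ("C1", "ERA-PROD", "2013-03-10"),
    ("C2", "ERA-DEV",  "2013-04-01"),
    ("C3", "ERA-PROD", "2013-07-22"),
]

def reperform_extraction(rows, system, start, end):
    """Re-implement the report's filter: changes to one system in a period."""
    return sorted(cid for cid, sys_name, date in rows
                  if sys_name == system and start <= date <= end)

system_report_ids = ["C1", "C3"]  # change IDs appearing on the system report

reperformed = reperform_extraction(source_rows, "ERA-PROD",
                                   "2013-01-01", "2013-08-30")
differences = sorted(set(reperformed) ^ set(system_report_ids))
print("Reperformed extraction:", reperformed)
print("Differences vs. report:", differences)  # [] means the logic reconciles
```

ISO-style date strings are used so that plain string comparison orders the dates correctly.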
Parameters
• Parameters may be entered by the user each time the report is generated to define certain criteria when creating a report.
• Example parameters relevant to GITCs include:
  – Date range
  – System instance
  – Type of changes
  – Type of users (employee, contractor, active, inactive, etc.)
• If the report includes user-entered parameters, we obtain an understanding of the parameter values and perform appropriate procedures to determine if they are appropriate. Example tests for parameters include:

• Perform inquiry to determine if the control owner checked the parameters, and reperform the control owner's review of the appropriateness of the parameters by determining if the correct parameters are displayed on the report. (Control test)
• Perform inquiry to determine if the control owner checked the parameters, and observe the input of the parameters when the report is created by the entity. (Control test)
• Consider if the correct parameters are displayed on the report. (Direct test)
• Observe the input of the parameters when the report is created. (Direct test)
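Checking whether the correct parameters are displayed on the report reduces to comparing the parameter block on the report header against the values we expect. The names and values below are hypothetical:

```python
# Hypothetical sketch: compare the parameters displayed on a generated
# report against the values expected for the audit period.

expected_params = {
    "instance": "A12-PROD",    # production instance, not development
    "date_from": "2013-01-01",
    "date_to": "2013-08-30",
    "change_types": "ALL",
}

report_header_params = {       # as displayed on the report header
    "instance": "A12-PROD",
    "date_from": "2013-01-01",
    "date_to": "2013-08-30",
    "change_types": "ALL",
}

# Any parameter whose displayed value differs from the expected value.
exceptions = {
    key: (expected_params[key], report_header_params.get(key))
    for key in expected_params
    if report_header_params.get(key) != expected_params[key]
}
print("Parameter exceptions:", exceptions)  # {} supports appropriateness
```

A mismatch on "instance" here is exactly the "incorrect parameters" risk described earlier, where a listing was generated from the development instance.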
Example #1: Listing of changes (system-generated)
IPE used in testing the control

Scenario
Enterprise is the production code migration tool used for the Era application system. Era is configured such that changes implemented into production are automatically captured within Enterprise (from which this report is generated). Therefore, changes cannot be made to the Era system without being tracked on the system change log, which is a standard report in the Enterprise tool.

GITC IPE test approach
We tested the automated control related to the system configuration that prevents changes from being made outside of the established process using Enterprise. This provides evidence that changes made to the Era system via Enterprise cannot be bypassed; therefore, the standard change listing generated from Enterprise includes a correct and complete population of changes.

Additionally, we inspected the parameters and determined the appropriate system instance (A12 production system) was specified and the date range of 1/1/13 – 8/30/13 was appropriate for our purposes.
Example #2: Listing of changes (ticketing system)
IPE used in testing the control

Scenario
For the various operating system, network, and database technologies, we are unable to obtain a systematic listing of changes throughout the period due to system limitations. Because there is no systematic listing for these technologies, we used the ticketing system as our source for obtaining the population of changes. Production changes made to these technologies are manually entered into the ticketing system.

GITC IPE test approach
We performed each of the following:

1. We observed John Smith generate the listing of changes between 1/1/13 and 9/30/13 from the system. We observed John enter the appropriate date parameters and select "all" for the type of changes to include.
2. We inspected the listing and determined there were 60 changes across the operating system (34), network (11) and database (15), and relevant technologies were included in the listing.

continued on next slide…
Example #2: Listing of changes (ticketing system)
IPE used in testing the control

GITC IPE test approach (cont.)
3. As part of our evaluation of design effectiveness, we obtained an understanding of the frequency of system changes made to the relevant technologies, compared it to the Remedy listing, and concluded that the number of changes is consistent.
4. We inspected the listing and determined that the relevant change types (e.g., patches, upgrades, configurations, reports, code, data) are represented in the listing for each respective technology.
5. Through inquiries and inspection of change control meeting minutes (refer to W/P 4210), we obtained an understanding of changes for the related technologies implemented in the audit period. We made a selection of five changes and cross-referenced to the Remedy listing to determine whether the changes appeared on the listing and the descriptions and dates are consistent with our understanding. These five changes included the following (include specific details).

Note: If the related technology was an application system where evidence of last change date is available, test #5 may be replaced with the following: We obtained a screenshot that contains the last modified dates of production application objects or executables. We made a selection of five changes from the system screenshot and validated that they appeared on the change log correctly and completely.
Example #3: Application user listing
IPE used in testing the control

Scenario
A listing of system administrators was obtained to support our testing of privileged access to the Infinium application. During our evaluation of design effectiveness, we determined that the "System Authorities" permission grants access to perform system administration activities. Through inquiries and inspection of the organizational chart, we also determined there are five people who perform system administration activities.

GITC IPE test approach
We observed David Jones generate the listing of users with the "System Authorities" permission from the production instance of the Infinium application on 6/1/13. We inspected the user listing and determined that the number of users on the report was consistent with the number of users displayed directly in the system.

Through our tests of design effectiveness, we determined there are five individuals who perform system administration activities (refer to W/P 4210 for the names of these individuals). We validated that each of these five individuals appeared on the system listing, as expected.

Note: We tested the appropriateness of the access privileges assigned to these individuals in conjunction with our tests of operating effectiveness of the control.
Example #4: Password configuration report
IPE used in testing the control

Scenario
We obtained the System Values Report from the operating system as part of our testing of password parameters at the operating system layer. During our tests of design effectiveness, which included inquiries and inspection of the password policy, we determined that password expiration, minimum password length, complexity and maximum sign-on attempts are to be established in accordance with standard limits.

GITC IPE test approach
We observed John Doe generate the System Values Report on 5/15/13 from the production instance of the operating system supporting the application, which is relevant to the audit.

The report used as audit evidence was generated directly from the source where the password settings are maintained. We inspected the report and observed that the relevant system settings in the report are consistent with what is displayed directly in the system. This included password expiration, minimum password length, complexity and maximum sign-on attempts.
Example #5: Periodic access review listing
Control is dependent upon IPE

Scenario
Management performs a semi-annual periodic access review of users within the Millennium application as a detective monitoring control to determine if user access is valid and consistent with job responsibilities (i.e., the access reflected in the system is appropriate). The listing is a standard report that is generated from the same system where the data is initiated and maintained.

GITC IPE test approach
We determined that the periodic access review control is dependent upon IPE. We tested the controls over the source data when testing user access provisioning and de-provisioning. We also tested the control related to the appropriateness of security administrators with the ability to maintain the related user access.

We tested the extraction through reperformance by selecting five users from Era and verifying that they appeared on the report, and vice versa for specific details. Further, we reperformed the automated report logic by verifying that employees and contractors were included in the listing and that there were no specific profiles or users excluded from management's review. We performed inquiry to validate that the control owner checked the parameters, and reperformed the review of the parameters by verifying that the parameter values appropriately specify the Era production instance.
IPE testing elements
The following elements are typically considered when testing IPE:

Source data: The information from which the IPE is created. This may include data maintained in the IT system or external to the system (e.g., data maintained in a spreadsheet or manually maintained).

Report logic: The computer code, algorithms, or formulas for transforming, extracting or loading the relevant source data and creating the report. Report logic may include standardized report programs, user-operated tools (e.g., query tools and report writers) or spreadsheets.

Report parameters: Report parameters allow the user to look at only the information that is of interest to them. Common uses of report parameters include defining the report structure, specifying or filtering data used in a report, or connecting related reports (data or output) together. Report parameters may be defined manually by the user or they may be pre-set in the system.
Sample size considerations

The appropriate sample size for testing IPE is a matter of professional judgment based on the nature of the IPE and, consistent with PCAOB AS 15.5, may vary depending upon the specific facts and circumstances.

In determining the sample size, we consider two "dimensions": the number of "instances" of IPE to test, and then the number of items to test within each IPE instance.

Sufficiency is the measure of the quantity of evidence. The quantity of evidence needed is affected by the following:
• Risk of misstatement (in the audit of financial statements) or the risk associated with the control (in the audit of internal control over financial reporting). As the risk increases, the amount of evidence that the auditor should obtain also increases. For example, ordinarily more evidence is needed to respond to significant risks.
• Quality of the evidence obtained. As the quality of the evidence increases, the need for additional evidence decreases. Obtaining more of the same type of evidence, however, cannot compensate for the poor quality of that evidence. [PCAOB AS 15.5]
The number of items to test for each instance of IPE
• The number of items we select to test for each instance of IPE is based on auditor judgment considering factors such as the complexity of the IPE, significance of the IPE to the control and related risk of misstatement, and number of line items on the report.
  – When testing report logic, we test a sample of one of each important calculation, addressing significant variations in the calculations. We also test the extraction by selecting a sample of items from the report and agreeing back to the source and vice versa, and/or consider control totals.
  – Report logic related to GITC IPE does not typically involve calculations. However, the report logic is applicable in extracting the data that appears on the report.

For example, we obtain a user access listing (IPE) to test the appropriateness of application administrator access. There are 18 administrators who appear on the listing. We may determine it is appropriate to select 5 users when performing our direct procedures to test the IPE.
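Selecting the 5 of 18 administrators from the example above in a documented, reproducible way might look like this sketch (the user IDs and seed are hypothetical):

```python
# Hypothetical sketch: draw a reproducible sample of 5 items from an IPE
# instance (a listing of 18 administrators), using a fixed seed so the
# selection can be re-performed and reviewed.
import random

admins = [f"ADMIN{n:02d}" for n in range(1, 19)]  # 18 users on the listing

rng = random.Random(20130930)  # fixed, documented seed
sample = sorted(rng.sample(admins, 5))  # 5 distinct selections, no repeats
print("Selected for testing:", sample)
```

Recording the seed in the workpaper lets a reviewer regenerate the identical selection.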
Workpaper reliance and integrity
Design of controls, test of controls and operating effectiveness
Workpaper reliance and integrity

Design of controls
• Evaluating criteria
• Understanding the design evaluation factors
• Documentation considerations
• Evaluating control activities

Test of controls
• Plan the nature, timing and extent
• Risk associated with the control
• Risk-based approach

Perform tests of operating effectiveness
Design of controls
Why do we evaluate the design of a control?

PCAOB AS 5.42:
The auditor should test the design effectiveness of controls by determining whether the company's controls, if they are operated as prescribed by persons possessing the required authority and competence to perform the control effectively, satisfy the company's control objectives and can effectively prevent or detect errors or fraud that could result in material misstatements in the financial statements.
How do we evaluate the design of a control?
We assess the design for each control selected for testing, considering the characteristics or details of the control:
• Appropriateness of the purpose of the control and its correlation to the risk/assertion
• Appropriateness of the control considering the nature and significance of the risk
• Competence and authority of the person(s) performing the control
• Frequency and consistency with which the control is performed
• Level of aggregation and predictability
• Criteria for investigation and process for follow-up
• Dependency on other controls or information
Understanding the design evaluation factors
Which design factors do you find the most challenging to evaluate?

PCAOB AS 5.43:
Procedures the auditor performs to test design effectiveness include a mix of inquiry of appropriate personnel, observation of the company's operations, and inspection of relevant documentation. Walkthroughs that include these procedures ordinarily are sufficient to evaluate design effectiveness.

How do we test and conclude on design effectiveness?
• Document the assessment and evidence of design effectiveness of the control.
• Conclude on design:
  – Effective: Document the conclusion and its basis, then assess the risk associated with the control.
  – Ineffective: Evaluate deficiencies.
Documentation considerations
• Evaluating the control activities "FRASA"
• Process versus control
• Information produced by the entity
• Characteristics of a common control
Evaluating the control activities "FRASA"

(F)requency: The frequency or timing of occurrence (e.g., on a daily basis, biweekly, prior to the commencement of trading, upon completion of the reconciliation).

(R)esponsible party: The party responsible for conducting the risk-mitigating activity (e.g., the director of trading reviews, the accounting associate compares).

(A)ctivity: The specific risk-mitigating activity. Procedures must have a risk-mitigating impact to be considered a control activity as opposed to a procedure (e.g., custodian cash positions are compared to the cash positions within the accounting system).

(S)ource: The sources of information (if applicable).

(A)ction taken: The action taken with the results of the control activity (e.g., discrepancies are researched and reported to the senior accounting associate for validation and inclusion on the daily portfolio management cash discrepancy report).
Plan the nature, timing, and extent

PCAOB AS 5.44:
The auditor should test the operating effectiveness of a control by determining whether the control is operating as designed and whether the person performing the control possesses the required authority and competence to perform the control effectively.

Risk-based approach
• More focus is needed on planning the nature of our tests of operating effectiveness, especially when the control is supposed to mitigate a high risk.

Tests of operating effectiveness
• Important steps of the control are sometimes missing from planned tests of operating effectiveness.
Risk associated with the control

Risk that a control may not be effective:
• Based on factors in PCAOB AS 5.47 (PCAOB AS 13.31), including inherent risk.
• Affects the planned nature, timing, and extent of tests of operating effectiveness of relevant controls.

Risk Associated With Control (RAWC) applies in PCAOB audits; a higher RAWC increases the extent of testing. RAWC is NOT a concept in AICPA auditing standards.

We conclude on the risk associated with the control (Higher or Not higher), document the conclusion, and plan the nature, timing, and extent (NTE) of tests of operating effectiveness (OE) of controls accordingly.
• If there is a history of deficiencies, we may determine there is a higher risk that the related controls are not effective.
• If there have been changes to the entity’s IT environment that could adversely affect the design and operating effectiveness of the general IT controls, we may assess the risk associated with the affected general IT controls as higher.
• When assessing the risk associated with system change controls (e.g., approval and testing of system changes), we consider if controls over access to implement changes into production are operating effectively. If the related access controls are not operating effectively, we may determine there is a higher risk associated with the system change controls.
Risk associated with the control: Higher
For example, if the entity converted data into a new database as part of a system upgrade, multiple significant accounts and disclosures may be affected. Our consideration of relevant factors may lead us to conclude that the risk associated with the system change controls (including data conversion controls) is Higher.
• There is generally not a higher risk associated with general IT controls that have an automated component (e.g., system password parameters) when the other related general IT controls are operating effectively.
• If user access privileges are reviewed on a quarterly basis, we may determine that the risk associated with the lower-level controls monitored by this control (e.g., approval of the extent of user access privileges for new employees) is Not Higher.
• If the access controls are operating effectively and there are no other relevant factors, we may determine that the risk associated with the system change controls is not higher.
Risk associated with the control: Not Higher
For example: system password parameters are configurable settings that are not changed on a frequent basis. If access to modify the system parameters is appropriately restricted, this influences our determination that there is not a higher level of risk that the system password parameter control may not be effective.
Impact of the risk associated with the control (Higher vs. Not higher) by area:

Nature
• Higher: Increase the persuasiveness of the nature of the audit evidence (e.g., reperformance); consider performing procedures ourselves
• Not higher: Obtain sufficient evidence, usually consisting of inquiry and inspection or observation; consider using the work of others

Timing (when testing as of an interim date)
• Higher: Perform interim testing later in the year to obtain more persuasive evidence closer to the as-of date, and roll forward as required
• Not higher: Perform interim testing during the year and roll forward as required

Extent
• Higher: Perform a high extent of testing; consider whether to judgmentally increase the sample sizes beyond those suggested by the sampling table
• Not higher: Use a normal extent of testing unless a low extent of testing is determined to be appropriate
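The extent guidance above can be sketched as a simple lookup from control frequency and risk associated with the control (RAWC). The frequencies and sample sizes below are hypothetical placeholders, not the sampling table the slide refers to:

```python
# Illustrative sketch only: the sample sizes below are hypothetical
# placeholders, NOT the firm's actual sampling table.

# control frequency -> (sample size when RAWC not higher, when RAWC higher)
SAMPLE_TABLE = {
    "annual": (1, 1),
    "quarterly": (2, 4),
    "monthly": (3, 6),
    "weekly": (5, 10),
    "daily": (25, 40),
}

def planned_sample_size(frequency: str, rawc_higher: bool) -> int:
    """Higher RAWC increases the planned extent of testing."""
    not_higher, higher = SAMPLE_TABLE[frequency]
    return higher if rawc_higher else not_higher
```

In practice the auditor may also judgmentally increase the size beyond the table, as the slide notes.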
Risk-based approach
Why would the testing approach change every year?
• Dynamic regulatory requirements
• Evolving technology environments
• New team members bring fresh perspectives
• Deeper understanding of the processes leads to new risks
Spreadsheet management and end user computing
Let's get practical and tactical
Introduction
• Current state
EUC environment
• Determine a framework
• Governance
• People
• Process
• Technology
Spreadsheet management and end user computing
The appeal and convenience of EUC usage can often result in uncontrolled processes that increase organizational risk.
End user computing (EUC) introduction
Enterprise resource planning (ERP) environment
• Established controls
• Well-defined procedures and ownership
• Auditable

EUC environment
• Uncontrolled change
• Limited security options
• Incomplete version control

NOTE: This decrease in control can lead to organizational losses, including an increased risk of the following:
- Misstated financial statements
- Regulatory and compliance violations
- Negative operational impacts
- Increased fraud risk
Many organizations are aware of the risks associated with the use of EUCs (often documented during audits as the "key spreadsheets") and, as a result, have attempted to implement policies and procedures to control the usage of these EUCs.
Current state
Governance
• The organization cannot agree on what constitutes an EUC (spreadsheet, data repository, database)
• Ownership of the organizational control / use of EUCs is unclear
• The organization cannot agree on a standard use of these EUC technologies

Technology
• Network security is limited and cannot provide granular control of EUCs
• Audit logging is not available on standard EUC solutions
• Inconsistent technologies are used throughout the organization

Process
• Departments have established and implemented inconsistent processes for EUC usage
• The organization does not have a clear way of determining which EUCs should be controlled (each department has its own way of doing things)

People
• There is no clear ownership of the EUCs
• Due to organizational turnover, initial EUC design and intent have been lost
• A lack of EUC usage guidelines and training results in inappropriate usage / changes of EUCs
Organizational challenges
The four cornerstones of an EUC control framework:

Governance
• Defining EUCs
• Establishing policies and procedures
• Defining EUC ownership
• Monitoring and reporting

People
• Defining roles and responsibilities
• Training and awareness

Process
• Defining EUC risk ranking metrics
• Applying risk ranking metrics and determining control scope
• Defining EUC-specific controls
• Applying controls to in-scope EUCs

Technology
• Defining technology requirements (considering the EUC controls)
• Determining a support strategy
• Implementing the technology

Every organization is unique, and a "one-size-fits-all" approach to EUC control is likely inappropriate. To help ensure that the EUC control environment is designed appropriately, management should first select an appropriate framework.
EUC environment – determine a framework
Defining EUCs
• To establish appropriate scoping, management should first define what constitutes an EUC.
• Determine a list of applications currently in use by the user group. Often, management does not consider the full population of potential EUCs, restricting its scope to spreadsheets alone, while users also make use of other types of EUCs, including user-controlled databases and non-approved programming languages (VB scripts, APIs, etc.).
• Once a full population of EUCs has been determined, management should determine which of these EUCs affect the organization (operationally or financially).
• Management should assess the usage of these EUCs and determine whether standard procedures are followed (given the disparate usage, it is likely that usage of these EUCs is not consistent).
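As a sketch of the inventory step described above, a walk over a shared drive can surface candidate EUCs by file type. The extension list is an illustrative assumption, not a complete EUC definition:

```python
# Sketch: build a candidate EUC inventory by scanning a share for
# spreadsheet, database, and script files. The extension list is an
# assumption for illustration; a real scope definition would be broader.
import os

EUC_EXTENSIONS = {".xls", ".xlsx", ".xlsm", ".mdb", ".accdb", ".vbs", ".csv"}

def inventory_eucs(root: str) -> list:
    """Return paths of files whose extension suggests a potential EUC."""
    found = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if os.path.splitext(name)[1].lower() in EUC_EXTENSIONS:
                found.append(os.path.join(dirpath, name))
    return found
```

A scan like this only produces the raw population; the risk ranking and scoping steps in the Process phases still apply.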
Policies and standards
• To drive consistent usage of EUCs, management should develop comprehensive policies and procedures.
• Management should evaluate existing policies and procedures (often developed within business groups) and work to establish and propagate an organization-wide version.
EUC environment – governance
Ownership
• Determine how EUC management will be structured within the organization:
• Centralized – All EUCs are managed by a centralized team. Changes, access, standardization, and reporting are owned by a single team.
• Decentralized – EUCs are managed by individual business units. Changes, access, standardization, and reporting are owned by multiple leaders throughout the organization.
• Hybrid – EUC ownership is split. Standardization, reporting, and compliance are managed by a centralized team; changes and access are managed by multiple leaders throughout the organization.
• Management should evaluate the current ownership model and determine whether it will meet the long-term framework goals.
Monitoring and reporting
• Management should define key risks and metrics for EUCs.
• Management should establish an appropriate reporting mechanism, conducive to supplying meaningful information to spreadsheet stakeholders.
EUC environment – governance, cont’d
Roles and responsibilities
• Leadership and organizational buy-in are key to establishing an effective EUC framework. To increase the likelihood of success, management should identify key stakeholders.
• Once key stakeholders have been defined, they should be assigned roles and responsibilities:• Program sponsor• Central program group• Steering committee• Business unit representative• EUC user administrators
Training and awareness
• Establish a formal training program for each of the key stakeholder roles
• Target the training timeline to be consistent with the framework goals
EUC environment – people
Phase I – Define EUC risk ranking metrics
Phase II – Evaluate EUC population and determine scope
Phase III – Design EUC-specific controls
Phase IV – Apply controls to the in-scope EUCs
Phase V – Ongoing control
EUC environment – process
A phased approach can enable organizational buy-in and streamline the implementation of the process.
EUC environment – process, cont’d
Phase I – Define EUC risk ranking metrics
Very likely, a majority of the EUCs that exist within the organization do not have a significant impact (operationally or financially). Management should consider defining risk criteria to determine whether an EUC should be included in the program.

Common metrics used to determine the overall risk of an EUC are:
- Output materiality
- Throughput amount
- Overall complexity
- Judgment applied in usage
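The four metrics above can feed a simple composite score. The weights, the 1-5 scoring scale, and the scoping threshold below are illustrative assumptions, not prescribed values:

```python
# Sketch: composite EUC risk score from the four metrics named on the
# slide. Weights, the 1-5 scale, and the threshold are assumptions.

WEIGHTS = {
    "output_materiality": 0.4,
    "throughput_amount": 0.2,
    "overall_complexity": 0.2,
    "judgment_in_usage": 0.2,
}

def risk_score(metrics: dict) -> float:
    """Weighted average of 1-5 metric scores."""
    return sum(WEIGHTS[k] * metrics[k] for k in WEIGHTS)

def in_scope(metrics: dict, threshold: float = 3.5) -> bool:
    """Enroll the EUC in the control program when it scores at or above threshold."""
    return risk_score(metrics) >= threshold
```

The same scoring function can then drive the Phase II scoping decision across the full inventory.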
EUC environment – process, cont’d
Phase II – Evaluate EUC population and determine scope
When organizations perform an initial inventory, they are often overwhelmed by the number of EUCs that exist (known to reach upwards of 10,000). Typically, management can exclude a majority of these EUCs with basic judgment.

Once an appropriate population of EUCs is defined, management should apply risk thresholds (using the metrics defined in Phase I) to isolate the high-risk EUCs to enroll in the program.
EUC environment – process, cont’d
Phase III – Design EUC-specific controls
To reduce the risk of EUC errors that may result in misstated financial statements, regulatory and compliance violations, negative operational impacts, and increased fraud, management should design EUC-specific controls.

Common EUC controls may include the following areas:
- Data integrity control
- Change control
- User access and restriction control
- Version control
- Availability control
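One way to sketch the change and version control ideas above is a hash baseline: record a digest for each in-scope EUC, then flag any file whose digest drifts from the baseline. This is an illustration of the concept, not a full control implementation:

```python
# Sketch of one data-integrity / change control: baseline each in-scope
# EUC with a SHA-256 hash, then flag files that changed since baseline.
# Illustrative only; a production control would also track who changed
# what and route flagged files for review.
import hashlib

def file_hash(path: str) -> str:
    """SHA-256 digest of a file's contents."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def changed_since_baseline(paths, baseline: dict) -> list:
    """Return paths whose current hash differs from the recorded baseline."""
    return [p for p in paths if file_hash(p) != baseline.get(p)]
```

Run against the in-scope population on a schedule, this also supports the Phase V ongoing monitoring described later.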
EUC environment – process, cont’d
Phase IV – Apply controls to the in-scope EUCs
Once management has agreed on an appropriate control environment, a roll-out methodology should be designed. Often, this can be achieved through the use of templates (standard structures designed to enable a smooth transition into the control environment).

The application of controls typically has a significant impact on the user group, and this phase of the process is often underestimated. The time to implement the controls is difficult to estimate because implementation time varies per EUC; it can take 20 minutes to 30 hours per EUC, depending on complexity.
EUC environment – process, cont’d
Phase V – Ongoing control
Once management has baselined all high-risk EUCs with the new controls, an effective ongoing monitoring and governance process should be implemented to ensure that user groups continue to adhere to the defined controls.
Phase I – Perform a technical assessment
Phase II – Determine sizing and infrastructure
Phase III – Evaluate security / role design
Phase IV – Establish a realistic roll-out strategy
EUC environment – technology
Throughout the technology selection, implementation, and testing process, the other areas of the framework (Governance, People, and Process) should be considered and revised as necessary.
• Effective management of EUCs requires a comprehensive program for management and control
• The program should comprise elements of governance, people, process, and technology
• There is no cookie-cutter solution
Summary
Cyber risk
• High on the agenda
• Drivers
• Appetite
• Roles and responsibilities
• Cybersecurity framework
• Assessment approach
• Assessment maturity analysis
• Assessment scorecard
• Representative internal audit plan
Cybersecurity
Audit committees and board members are seeing cybersecurity as a top risk, underscored by recent headlines and increased government and regulatory focus
High on the agenda
The Executive Order highlights the focus on an improved cybersecurity framework and the rapid changes of regulatory agency expectations and oversight
Recent US Securities and Exchange Commission (SEC) guidance regarding disclosure obligations relating to cybersecurity risks and incidents:

“Registrants should address cybersecurity risks and cyber incidents in their Management’s Discussion and Analysis of Financial Condition and Results of Operations (MD&A), Risk Factors, Description of Business, Legal Proceedings and Financial Statement Disclosures.” (SEC Division of Corporation Finance, Disclosure Guidance: Topic No. 2 – Cybersecurity)

Ever-growing concerns about cyber attacks affecting the nation’s critical infrastructure prompted the signing of Executive Order (EO) 13636, Improving Critical Infrastructure Cybersecurity.

One of the foundational drivers behind the update and release of the 2013 COSO Framework was the need to address how organizations use and rely on evolving technology for internal control purposes.
Technology becomes more pervasive
• Internet, cloud, mobile, and social are mainstream platforms inherently oriented for sharing
• Employees want continuous, real-time access to their information

Changing business models
• Service models have evolved: outsourcing, offshoring, contracting, and remote workforce

More data to protect
• Increased volume of customers’ personal, account, and credit card data, as well as employees’ personally identifiable information and company trade secrets
• The need to comply with privacy requirements across a wide array of jurisdictions

Threat actors with varying motives
• Hackers to nation states
• Continuously innovating and subverting common controls
• Often beyond the reach of a country’s law enforcement
The forces driving growth and efficiency may create a broad attack surface
Drivers
Appetite

Management should develop an understanding of who might attack, why, and how.

Who might attack?
• Cyber criminals
• Hacktivists (agenda driven)
• Nation states
• Insiders/partners
• Competitors
• Skilled individual hackers

What are they after, and what business risks do I need to mitigate?
• Theft of IP/strategic plans
• Financial fraud
• Reputation damage
• Business disruption
• Destruction of critical infrastructure
• Threats to health and safety

What tactics might they use?
• Spear phishing, drive-by download, etc.
• Software or hardware vulnerabilities
• Third-party compromise
• Multi-channel attacks
• Stolen credentials

Cyber Risk Program and Governance
• Governance and operating model
• Policies and standards
• Management processes and capabilities
• Risk reporting
• Risk awareness and culture

Secure – Are controls in place to guard against known and emerging threats?
• Perimeter defenses
• Vulnerability management
• Asset management
• Identity management
• Secure SDLC
• Data protection

Vigilant – Can we detect malicious or unauthorized activity, including the unknown?
• Threat intelligence
• Security monitoring
• Behavioral analysis
• Risk analytics

Resilient – Can we act and recover quickly to reduce impact?
• Incident response
• Forensics
• Business continuity / disaster recovery
• Crisis management
Roles and responsibilities

Effective risk management is the product of multiple layers of risk defense. Internal Audit should support the board’s need to understand the effectiveness of cybersecurity controls.

1st line of defense – business and IT functions
• Incorporate risk-informed decision making into day-to-day operations and fully integrate risk management into operational processes
• Define risk appetite and escalate risks outside of tolerance
• Mitigate risks, as appropriate

2nd line of defense – information and technology risk management function
• Establish governance and oversight
• Set risk baselines, policies, and standards
• Implement tools and processes
• Monitor and call for action, as appropriate
• Provide oversight, consultation, checks and balances, and enterprise-level policies and standards

3rd line of defense – internal audit
• Independently review program effectiveness
• Provide confirmation to the board on risk management effectiveness
• Meet requirements of SEC disclosure obligations focused on cybersecurity risks

Given recent high-profile cyber attacks and data losses, and the SEC’s and other regulators’ expectations, it is critical for Internal Audit to understand cyber risks and be prepared to address the questions and concerns expressed by the audit committee and the board.
(The framework’s domains are grouped under Secure, Vigilant, and Resilient.)
Data management and protection
• Data classification and inventory
• Breach notification and management
• Data loss prevention
• Data security strategy
• Data encryption and obfuscation
• Records and mobile device management

Secure development life cycle
• Secure build and testing
• Secure coding guidelines
• Application role design/access
• Security design/architecture
• Security/risk requirements

Cybersecurity risk and compliance management
• Compliance monitoring
• Issue and corrective action planning
• Regulatory and exam management
• Risk and compliance assessment and mgmt.
• Integrated requirements and control framework

Threat and vulnerability management
• Incident response and forensics
• Application security testing
• Threat modeling and intelligence
• Security event monitoring and logging
• Penetration testing
• Vulnerability management

Security operations
• Change management
• Configuration management
• Network defense
• Security operations management
• Security architecture

Security awareness and training
• Security training
• Security awareness
• Third-party responsibilities

Crisis management and resiliency
• Recover strategy, plans & procedures
• Testing & exercising
• Business impact analysis
• Business continuity planning
• Disaster recovery planning

Risk analytics
• Information gathering and analysis around:
– User, account, entity
– Events/incidents
– Fraud and anti-money laundering
– Operational loss

Security program and talent management
• Security direction and strategy
• Security budget and finance management
• Policy and standards management
• Exception management
• Talent strategy

Third-party management
• Evaluation and selection
• Contract and service initiation
• Ongoing monitoring
• Service termination

Identity and access management
• Account provisioning
• Privileged user management
• Access certification
• Access management and governance

Information and asset management
• Information and asset classification and inventory
• Information records management
• Physical and environment security controls
• Physical media handling
An assessment of the organization’s cybersecurity should evaluate specific capabilities across multiple domains
Cybersecurity framework
Certain cybersecurity domains may be partially covered by existing IT audits; however, many capabilities have historically not been reviewed by internal audit.
Cybersecurity framework
Examples of existing coverage: SOX (financially relevant systems only); penetration and vulnerability testing; BCP/DRP testing.
An internal audit assessment of cybersecurity should cover all domains and relevant capabilities, and involve subject matter specialists when appropriate
Assessment approach
Phase I: Planning and scoping
Phase II: Understand current state
Phase III: Risk assessment
Phase IV: Gap assessment and recommendations
Key activities and deliverables by phase:
Phase I activities:
• Identify specific internal and external stakeholders: IT, Compliance, Legal, Risk, etc.
• Understand organization mission and objectives
• Identify industry requirements and regulatory landscape
• Perform industry and sector risk profiling (i.e., review industry reports, news, trends, risk vectors)
• Identify in-scope systems and assets
• Identify vendors and third-party involvement
Phase II activities:
• Conduct interviews and workshops to understand the current profile
• Perform walkthroughs of in-scope systems and processes to understand existing controls
• Understand the use of third parties, including reviews of applicable reports
• Review relevant policies and procedures, including security environment, strategic plans, and governance for both internal and external stakeholders
• Review self-assessments
• Review prior audits
Phase III activities:
• Document a list of potential risks across all in-scope capabilities
• Collaborate with subject matter specialists and management to stratify emerging risks and document potential impact
• Evaluate likelihood and impact of risks
• Prioritize risks based upon the organization’s objectives, capabilities, and risk appetite
• Review and validate the risk assessment results with management and identify criticality
Phase IV activities:
• Document capability assessment results and develop an assessment scorecard
• Review assessment results with specific stakeholders
• Identify gaps and evaluate potential severity
• Map to maturity analysis
• Document recommendations
• Develop a multiyear cybersecurity/IT audit plan
Phase I deliverables: assessment objectives and scope; capability assessment scorecard framework
Phase II deliverable: understanding of environment and current state
Phase III deliverables: prioritized risk ranking; capability assessment findings
Phase IV deliverables: maturity analysis; assessment scorecard; remediation recommendations; cybersecurity audit plan
Maintaining and enhancing security capabilities can help mitigate cyber threats and help the organization to arrive at its desired level of maturity
Assessment maturity analysis
Cybersecurity domains assessed:
Cybersecurity risk and compliance mgmt.
Third-party management
Secure development life cycle
Information and asset management
Security program and talent management
Identity and access management
Threat and vulnerability management
Data management and protection
Risk analytics
Crisis management and resiliency
Security operations
Security awareness and training
Current state CMMI maturity* scale: Initial | Managed | Defined | Predictable | Optimized
Maturity analysis
Stage 1: Initial
• Recognized the issue
• Ad hoc / case by case
• Partially achieved goals
• No training, communication, or standardization

Stage 2: Managed
• Process is managed
• Responsibility defined
• Defined procedures with deviations
• Process reviews

Stage 3: Defined
• Defined process
• Communicated procedures
• Performance data collected
• Integrated with other processes
• Compliance oversight

Stage 4: Predictable
• Defined quantitative performance thresholds and control limits
• Constant improvement
• Automation and tools implemented
• Managed to business objectives

Stage 5: Optimized
• Continuously improved
• Improvement objectives defined
• Integrated with IT
• Automated workflow
• Improvements from new technology
*The industry-recognized Capability Maturity Model Integration (CMMI) can be used as the model for the assessment. Each domain consists of specific capabilities, which are assessed and averaged to calculate an overall domain maturity.
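The footnote's averaging can be sketched directly; mapping the averaged score back to a named CMMI stage by rounding is an assumption for illustration:

```python
# Sketch of the footnote's calculation: a domain's maturity is the
# average of its capability scores on the 1-5 CMMI scale. Rounding to
# the nearest named stage is an illustrative assumption.

CMMI_STAGES = {1: "Initial", 2: "Managed", 3: "Defined",
               4: "Predictable", 5: "Optimized"}

def domain_maturity(capability_scores: list) -> float:
    """Average of the domain's capability scores (1-5)."""
    return sum(capability_scores) / len(capability_scores)

def nearest_stage(score: float) -> str:
    """Name of the CMMI stage closest to the averaged score."""
    return CMMI_STAGES[round(score)]
```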
A scorecard can support the overall maturity assessment, with detailed cyber risks for people, process, and technology. Findings should be documented and recommendations identified for all gaps
Assessment scorecard
Threat and vulnerability management – penetration testing

People (Ref. 2.6.4)
• Finding: The organization has some resources within the ISOC that can conduct penetration testing, but not on a routine basis due to operational constraints and the multiple roles those resources are fulfilling.
• Recommendation: The organization may find it of more value and cost benefit to utilize current resources to conduct internal penetration testing on a routine and dedicated basis, since it does have individuals with the necessary skills to perform this duty.

Process (Ref. 2.6.5)
• Finding: The organization has limited capability to conduct penetration testing in a staged environment or against new and emerging threats.
• Recommendation: The organization should expand its penetration testing capability to include more advanced testing and more advanced social engineering, and develop greater control over the frequency of testing.

Technology (Ref. 2.6.6)
• Finding: The organization lacks standard tools to perform its own ad hoc, on-the-spot penetration tests to confirm or support potential vulnerability assessment alerts and/or incident investigation findings.
• Recommendation: Either through agreement with a third-party vendor or through technology acquisition, develop the capability to perform out-of-cycle penetration testing.
Maturity scale: 1: Initial | 2: Managed | 3: Defined | 4: Predictable | 5: Optimized
Capability assessment findings and recommendations
Example ratings: People 4, Process 2, Technology 1
A cybersecurity assessment can drive a risk-based IT internal audit plan. Audit frequency should correspond to the level of risk identified, and applicable regulatory requirements/expectations
Representative internal audit plan
Internal audit area – FY 2015 / FY 2016 / FY 2017 – Notes (representative)
• SOX IT general computer controls – X X X – Annual requirement but only covers financially significant systems and applications
• External penetration and vulnerability testing – X X X – Cover a portion of IP addresses each year
• Internal vulnerability testing – X – Lower risk due to physical access controls
• Business continuity plan / disaster recovery plan – X X – Coordinate with annual 1st and 2nd line of defense testing
• Data protection and information security – X – Lower risk due to …
• Third-party management – X – Lower risk due to …
• Risk analytics – X X X – Annual testing to cycle through risk areas, and continuous monitoring
• Crisis management – X X – Cyber war gaming scenario planned
• Social media – X – Social media policy and awareness program
• Data loss protection (DLP) – X – Shared drive scan for SSN / credit card #
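The DLP shared-drive scan in the last row can be sketched with simple pattern matching. The regexes below are deliberately simplified illustrations; a real DLP scan would add validation (e.g., a Luhn check on card numbers) to reduce false positives:

```python
# Sketch of a DLP scan: flag text that looks like a US SSN or a
# 16-digit card number. Simplified illustrative patterns only.
import re

SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
CARD_RE = re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b")

def scan_text(text: str) -> dict:
    """Return candidate SSN and card-number matches found in the text."""
    return {
        "ssn": SSN_RE.findall(text),
        "card": CARD_RE.findall(text),
    }
```

In a full scan, a function like this would be applied file by file across the share, with hits routed to the data protection team for review.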
As used in this document, “Deloitte” means Deloitte & Touche LLP, a subsidiary of Deloitte LLP. Please see www.deloitte.com/us/about for a detailed description of the legal structure of Deloitte LLP and its subsidiaries. Certain services may not be available to attest clients under the rules and regulations of public accounting.
This presentation contains general information only and Deloitte is not, by means of this presentation, rendering accounting, business, financial, investment, legal, tax, or other professional advice or services. This presentation is not a substitute for such professional advice or services, nor should it be used as a basis for any decision or action that may affect your business. Before making any decision or taking any action that may affect your business, you should consult a qualified professional advisor.
Deloitte shall not be responsible for any loss sustained by any person who relies on this presentation.