© Fraunhofer
SECURITY ASSESSMENTS
Jan Steffan, 27 November 2015
AGENDA
Goals: customers, expectations, conditions, trade-off confidence vs. effort
Approach: different perspectives and models, ideal vs. specification vs. reality
Testing: as a means of gaining information
Results: writing a useful report
Remarks and questions
Motivations for security assessments
Manufacturer wants to make a secure product
Manufacturer’s marketing wants to advertise independently assessed security
Manufacturer is required to provide a security assessment for certain markets (e.g. government)
User is obliged to perform (regular) security testing
User wants to make purchase decision based on security
Independent research, bug bounties, hacking as hobby, idealism
Evil guys
Often the real motivation is to shift responsibility for security to the external party doing the assessment
Different goals of security assessments
Different goals:
Make sure it's secure
Meet formal requirements
Assess quality with regard to security
Find one vulnerability
Confidence vs. effort – Goal-driven approach
Example: A customer wants us to publicly certify the security of an insulin pump (medical device) with wireless interface.
We need to gain a high confidence about its security properties.
Effort mainly depends on the complexity of the system
All potentially security relevant aspects need to be covered, including “hidden” features
Spend sufficient effort to really understand each aspect
FDA: Content of Premarket Submissions for Management of Cybersecurity in Medical Devices
Confidence vs. effort – Time-boxed approach
Example: A company wants to buy one of three cloud storage solutions for its internal use. Security is only one of many aspects to consider.
Time and budget for a security assessment are fixed and limited.
Time-boxed approach (e.g. three systems within one month)
Time should be sufficient to gain some useful insights (e.g. one week)
Setup, familiarization and writing a good report takes time, too!
Goal: Produce the most useful results within a given time and budget
Do “low effort, high impact” tests, e.g. check for CSRF-protection and session management, evaluate security documentation
Do spot-tests to cover different aspects. Don’t drill down (e.g. exploit)
Assess quality
Was security considered? Did the developers know what they were doing?
What kinds of flaws can be found?
Compliance testing: Payment Card Industry Data Security Standard (PCI-DSS)
Security is more than Compliance
• Compliance: Presence of well-defined measures.
• Security: Absence of unexpected flaws.
Three models of the system under test
Intended functionality according to the specification
What it should do in order to be secure
System under test as implemented
Desired situation
ideal = specification = implementation
Threats not covered by security design
Insufficient security mechanisms, e.g. short key length
Unintended side-effects, typical vulnerabilities, e.g. buffer overflows
Missing or incorrectly implemented security measures
Underspecified, but still correct, e.g. nonexistent spec., programmer fixed it
Great! desired = specification = implementation
Anything else? We are in trouble …
What information is available?
Available information? The more that is known, the better.
User manuals
Architecture
Access to host machine
Access to error logs
Source code (white box)
Available privileges and accounts
Firewall, IDS/IPS, WAF disabled?
Types of allowed tests, e.g. social engineering, reverse engineering, decompilation, destructive, DDoS?
Approach: Start with High Impact – Low Effort
Threat Analysis
Design Review
Verification of Implementation
Vulnerability Analysis
(Diagram: impact of findings vs. effort; a complete analysis spans from design to implementation.)
Definition of Scope
The analysis should cover everything a typical user would expect:
All features and components that are enabled in typical usage scenarios
All components that may be relevant for security (exposed interfaces, things affecting assets)
All security claims made by manufacturer
Clearly specified version (difficult with constantly evolving products, e.g. SaaS)
We need to trust some base components: CPU, OS, crypto algorithms, …
Threat Analysis
What security properties do we expect?
Usage scenarios
Stakeholders (roles)
Assets relevant for stakeholders (technical and non-technical, e.g. reputation, intellectual property)
Security objectives for assets
Possible attacker motives and capabilities
Possible generic attack vectors
Example: Threats not considered
Where does the key come from?
Key exchange without a secure channel? -> No protection against man-in-the-middle attacks!
Hard-coded key? -> Bad idea, too.
(Diagram: client and server communicating over an encrypted channel)
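The hard-coded key problem can be illustrated with a toy sketch (the XOR "cipher" and the key value are purely illustrative, not from the slides): since the same key is baked into every shipped device, extracting it once breaks the confidentiality of all traffic everywhere.

```python
import hashlib
from itertools import cycle

HARDCODED_KEY = b"s3cr3t-firmware-key"  # hypothetical: identical in every shipped client

def toy_encrypt(plaintext: bytes, key: bytes) -> bytes:
    # Toy XOR stream derived from the key -- for illustration only, not secure.
    stream = hashlib.sha256(key).digest()
    return bytes(p ^ s for p, s in zip(plaintext, cycle(stream)))

# A "client" encrypts a message with the baked-in key.
ciphertext = toy_encrypt(b"insulin dose: 2 units", HARDCODED_KEY)

# An attacker who extracted the key from any one device decrypts everything:
# XOR twice with the same stream recovers the plaintext.
recovered = toy_encrypt(ciphertext, HARDCODED_KEY)
print(recovered)
```

The point of the sketch: a hard-coded key turns a per-device secret into a global one, so a single firmware dump compromises the whole fleet. A proper design would establish per-session keys over an authenticated channel.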
Design Review
Review of all security mechanisms
Purpose of security mechanism
Analysis and description of details
Expected security properties from threat analysis and state of the art (e.g. a PRNG should meet certain properties)
Comparison with actual security properties
Do security mechanisms complement each other?
Sound concept?
Matches assumptions of usage scenarios?
Life-cycle aspects, e.g. upgrade process, decommissioning
Limitations reasonable and transparent to user?
Example: Assumptions don't match usage scenario
“The hatch of the lunar module was closed.
Enter your password on the touch screen to unlock it again.”
Topic-specific literature on how to do it right
Example: Design requirements for PRNG
Goal: keep internal state confidential
Use an unpredictable initial seed with sufficient entropy
Prevent disclosure of information about the internal state through generated random numbers – use suitable cryptographic functions
Regularly add new entropy
Ferguson, Schneier, Kohno: “Cryptography Engineering” contains a whole chapter on this. There are more pitfalls than you may think!
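A sketch of why the initial seed matters (the boot-time scenario is an assumed example, not from the slides): a non-cryptographic PRNG seeded with a low-entropy value can simply be replayed by an attacker, whereas the OS CSPRNG is seeded and re-seeded from real entropy sources.

```python
import random
import secrets

# Anti-pattern: seeding Python's Mersenne Twister with a guessable value.
boot_time = 1_700_000_000  # hypothetical boot timestamp; attacker can narrow this down
rng = random.Random(boot_time)
token = rng.getrandbits(128)  # looks random, but is fully determined by the seed

# An attacker who can bound the seed to a small window reproduces the token.
for guess in range(boot_time - 5, boot_time + 5):
    if random.Random(guess).getrandbits(128) == token:
        print("seed recovered:", guess)
        break

# Better: draw from the OS CSPRNG, which meets the properties listed above.
secure_token = secrets.randbits(128)
```

The brute-force loop is the whole attack: with ten candidate seeds, "128 bits of randomness" collapses to fewer than 4 bits of effective entropy.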
Verification of implementation
Do security mechanisms work as intended?
Implemented according to design or specification?
Comparison with reference implementation
Systematic testing
Test of corner cases, degenerated parameters, etc.
Review of error handling
Test against clean-room implementation of protocols
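The corner-case testing above can be sketched as follows; the length-prefixed `parse_message` parser is a hypothetical stand-in for the component under test, invented for illustration.

```python
import struct

def parse_message(data: bytes) -> bytes:
    """Parses a 4-byte big-endian length prefix followed by the payload."""
    if len(data) < 4:
        raise ValueError("truncated header")
    (length,) = struct.unpack(">I", data[:4])
    if length != len(data) - 4:
        raise ValueError("length mismatch")
    return data[4:]

# Degenerate inputs probing the error handling, not the happy path.
corner_cases = [
    b"",                               # empty input
    b"\x00",                           # truncated header
    struct.pack(">I", 2**32 - 1),      # huge declared length, no payload
    struct.pack(">I", 0) + b"extra",   # declared length 0, trailing bytes
]
for case in corner_cases:
    try:
        parse_message(case)
        print("accepted:", case)
    except ValueError as e:
        print("rejected:", e)  # clean rejection, no crash or silent acceptance
```

What matters for verification is that every degenerate input produces a controlled error rather than a crash, hang, or silently wrong result.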
Vulnerability analysis
Can security mechanisms be bypassed?
Typical bugs (cross-site scripting, buffer overflows & co.)
Insecure defaults and non-obvious insecure configuration settings
Insufficient documentation of security aspects
Known weaknesses in base components
Focus on exposed interfaces and protocols, components affecting assets
Practical tests are like physics experiments
Perform an experiment
Make observations
Draw conclusions about hidden properties
Develop a (mental) model of inner structure
(Diagram: experiments feed input into the system's interface; observations of the output are compared against the spec to infer the hidden inner structure.)
Where to start?
Exposed interfaces
Paths leading to assets
Typical pitfalls (experience)
Recent issues
Example: CSRF token
Effort to check: a couple of minutes
Confidence: very clear result – secure or vulnerable
Efficiency: low effort, high confidence
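Such a minutes-long check can be sketched as a small heuristic script; the field names it matches (`csrf`, `token`) are an assumption, since real frameworks use many different names.

```python
from html.parser import HTMLParser

class CsrfFieldFinder(HTMLParser):
    """Collects the names of hidden <input> fields in a page."""
    def __init__(self):
        super().__init__()
        self.hidden_fields = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "input" and a.get("type") == "hidden":
            self.hidden_fields.append(a.get("name", ""))

def looks_csrf_protected(form_html: str) -> bool:
    # Heuristic: a state-changing form should carry an unpredictable token field.
    finder = CsrfFieldFinder()
    finder.feed(form_html)
    return any("csrf" in name.lower() or "token" in name.lower()
               for name in finder.hidden_fields)

protected = '<form method="post"><input type="hidden" name="csrf_token" value="..."></form>'
unprotected = '<form method="post"><input type="text" name="amount"></form>'
print(looks_csrf_protected(protected))    # True
print(looks_csrf_protected(unprotected))  # False
```

In a real assessment the tester would also replay the request with the token removed or altered, since the mere presence of a token field does not prove it is verified server-side.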
How would you check a web application for the following issues? What would be the effort and confidence?
SQL-injection?
Session-fixation?
Cross-site-scripting?
Password hashes are stored without salt?
Buffer overflow?
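For the salting question, what correct storage should look like can be sketched with a standard KDF from Python's stdlib (the iteration count here is an illustrative choice, not a recommendation):

```python
import hashlib
import os
from typing import Optional, Tuple

def hash_password(password: str, salt: Optional[bytes] = None) -> Tuple[bytes, bytes]:
    # A random per-user salt makes identical passwords hash differently,
    # defeating precomputed (rainbow-table) attacks and equality leaks.
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify(password: str, salt: bytes, digest: bytes) -> bool:
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000) == digest

s1, h1 = hash_password("hunter2")
s2, h2 = hash_password("hunter2")
print(h1 != h2)             # True: same password, different stored hashes
print(verify("hunter2", s1, h1))  # True
```

Conversely, if two users with the same password have identical hash values in the database dump, the tester knows immediately that no (per-user) salt is in use.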
Automated testing
Very nice if it can be made to work with minimal effort!
Some problems:
Setup effort
Inflexible, sometimes infeasible
Unclear coverage
Observations can be obscured, even misleading
False positives (verification requires time and expertise)
Good for standard checks like missing patches
Not good for new, unknown software, protocols etc.
Tools: e.g. nmap, nikto, Nessus, Nexpose, OpenVAS. Limited: web application scanners.
Manual testing
High effort
Expertise required
More flexible and potentially deeper
Less broad, spot tests
More confidence about coverage and quality of results
Tools: e.g. ZAP and Burp for web, scapy for network protocols, fuzzers (varying degree of automation vs. adaptability)
Selecting tools
Tools are useless if they don't fit the task or if you don't understand their limitations and what they do!
Open source tools are not so bad
Often more flexible and can be fixed, adapted or extended
It is possible to understand what is going on inside
What was a good choice last year may be abandoned now
Commercial tools
Often more advanced (e.g. huge databases of test cases)
Often “dumbed down” - more usable but less useful
Can be really expensive!
Writing your own tools
Often worth the effort
Exactly fits your needs, can grow with the task
You get to understand the issue at hand very well
Example of a purpose-specific tool built with scapy
Your most important tool
Note:
Doing a security assessment is completely useless and a waste of time.
… unless you document the results in a proper report!
Reporting with different target audiences in mind
Contents of a report:
Scope of tests
Executive summary for deciders
Summary of conducted tests and positive results
Detailed description of individual issues for developers
Tabular overview of identified issues as action items
Executive summary
Scope:
When? Which version?
Included/excluded components? Configuration?
Depth and kind of analysis? Available information?
Executive summary for deciders:
Overall result: just some minor issues or hopeless case?
What needs to be done? Summary of issues and impacts
General advice
Documentation of a vulnerability
Short descriptive title, e.g. “Vulnerability 17: Reflected XSS in search function (CWE-79)”
Affected component etc.
Conditions for successful exploit, e.g. victim opens a link provided by the attacker while being logged in
Explanation of the problem, e.g. special characters are not neutralized when returning search results. Demonstration: http://example.org/?search=<script>alert(123);</script>
Technical impact of exploit and judgement of severity, e.g. attacker can execute JavaScript in the victim's session context, take over the session and perform any action in the victim's name
Recommendations for fixing the issue, e.g. use white-listing, see OWASP recommendations
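A minimal sketch of a closely related fix (the slide recommends white-listing; output encoding with `html.escape` is shown here instead because it fits in a few lines and addresses the same reflected-XSS example):

```python
import html

def render_search_results(query: str) -> str:
    # Neutralize HTML special characters before reflecting user input
    # into the page, so injected markup is displayed, not executed.
    return f"<p>Results for: {html.escape(query)}</p>"

payload = "<script>alert(123);</script>"
print(render_search_results(payload))
# <p>Results for: &lt;script&gt;alert(123);&lt;/script&gt;</p>
```

The payload from the demonstration URL above is rendered inert: the browser shows the literal text instead of executing the script.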
Vulnerability metrics
Managers want a quick and easy way to decide which issues need to be fixed (first)
Problem: semantic information gets lost
What does “severity 5 on a scale of 1 to 10” mean?
What is the difference between severity 5 and 6?
It would be preferable to decide based on the potential impact (risk)
It would be desirable to have a metric with clear semantics
e.g. affects single users vs. all users, temporary vs. permanent data loss
Problem: In practice there are too many different factors that can be relevant. This information would still be lost.
Is a temporary effect on all users worse than a permanent one on some users?
Vulnerability classification – pragmatic solution
Give advice instead of ambiguous information
Simple classification:
High: There is no question that this needs to be fixed or compensated as soon as possible.
Medium: There’s no immediate or high risk, but it should be fixed if possible.
Low: no real harm is expected if it is ignored. Still it should be fixed to improve quality.
Legal issues
Security testing can be seen as a crime
Only test with prior written permission of the system owner
Scope (e.g. IP addresses) and timeframe should be clearly defined
There may be multiple parties involved, e.g. hosting company, software manufacturer, user
Risks and liability for potential damage should be clear
Reverse engineering can be seen as a copyright violation
Software license may not allow giving source code to security testers
Testing on a productive system
Sometimes there is no dedicated test system
Problem 1: damage (interruptions, data loss) should be avoided
Testers need to be extra careful, e.g. no automated scanning
Negotiate dangerous tests with system operator
Extra precautions, e.g. backup, testing at night time
Testers should not be made liable for damage
Problem 2: sensitive data may be visible to testers
Non-disclosure-agreement
Approved personnel
Make sure data stays on premises
Problem 3: Activities of other users can be confusing
Open questions
How to achieve high confidence efficiently?
How can we ensure that no potential threats and bugs are missed?
How can we make sure that all relevant aspects and components were covered in an analysis?
How can we do this in reasonable time?
How can we make it easier to identify weaknesses?
How can we make it easier to build secure systems in the first place?