Mobile Application Testing: Challenges and Best Practices
DESCRIPTION
With the rapid rise of mobile devices including smartphones and tablets, many organizations are rolling out mobile apps to extend the reach of their traditional web applications. Although the methodology for mobile application testing is fundamentally the same as that of traditional web and desktop application testing, mobile app testing presents some unique challenges and issues, including coverage of a myriad of mobile devices, usability testing, integration of mobile testing with web interface testing, mobile app performance, and security issues. Jimmy Xu describes these issues and current best practices for solving them. Jimmy introduces the latest technologies and tools in mobile application test automation, mobile application usability, performance, and security testing. In a non-technical and easy-to-understand approach that does not require previous mobile app testing or development experience, Jimmy uses a real-world mobile app project to illustrate the challenges—and solutions—of mobile app testing.
TRANSCRIPT
BW4 Session 6/5/2013 10:15 AM
"Mobile Application Testing"
Presented by:
Jimmy Xu CGI, Inc.
Brought to you by:
340 Corporate Way, Suite 300, Orange Park, FL 32073 888‐268‐8770 ∙ 904‐278‐0524 ∙ [email protected] ∙ www.sqe.com
Jimmy Xu CGI
Jimmy Xu has been in the IT industry for more than sixteen years with various companies including JDA Software/i2, IBM/DWL, and CGI. Jimmy has many years of experience designing, developing, and testing enterprise applications, including web and mobile apps, for government, manufacturing, financial, healthcare, and telecom clients. He has deep technical expertise in Java, Linux, Android, iOS, and other enterprise application platforms and a good mastery of various enterprise software development and testing methodologies. Jimmy holds CISSP, CSSLP, and CSTE certifications; has published an eBook on software security; and has been a conference presenter on software development, testing, security, and performance.
4/18/2013
1
Mobile Application Testing: Challenges & Best Practices
Better Software 2013, Las Vegas, USA
Jimmy Xu, CISSP, CSSLP, CSTE June 5, 2013
Agenda
• Mobile Testing Overview
• Mobile Test Automation
• Mobile User Experience Test
• Mobile Performance Test
• Mobile Security Test
• Questions and Answers
Mobile Testing Overview
Mobile Testing Challenges - Multiplicity
Mobile application testing usually needs to cover a multiplicity of mobile devices with different capabilities, features, and limitations.
Mobile Testing Challenges - Usability
Usability is a greater quality issue for mobile applications than for web or desktop applications, as mobile users demand better user experiences.
Mobile Testing Challenges - Integration
When mobile Applications
are developed as add-ons
to desktop web applications,
requirements, use cases,
and test cases will usually
need to be synched up.
Mobile Testing Challenges - Performance
Mobile applications add additional user load to the enterprise application infrastructure:
• Communication with enterprise servers can be limited in bandwidth and unreliable
• Mobile device hardware resources are usually not as powerful as those of desktop/laptop computers
Mobile Testing Challenges - Security
Mobile applications expose enterprise data to a larger potential attack surface.
Different Types of Mobile Applications
Apps developed to run natively on
mobile devices
Native Apps
Web Apps
Optimized web pages correctly scales content for the device screen and are optimized for mobile browsers
Compatible web pages are compatible with mobile browsers but take no extra steps to optimize the mobile viewing experience
Combines native UI elements with
access to web content within a web
content–viewing area.
Hybrid Apps
Mobile Testing Perspectives
Comprehensive scope of mobile testing:
• App types: native apps, mobile web apps, hybrid apps
• Devices: smartphones, tablets, Internet TVs
• Platforms: Android, Windows 8, iOS, BlackBerry OS
• Test focus areas: performance, security, usability, accessibility, compliance, regression automation
Increase:
• Quality of testing
• Test coverage
• Release confidence
Reduce:
• Time to market
• Testing resources
• Defect resolution time
• Overall testing costs
These outcomes depend on the combination of testing tools, best practices, and the mobile testing team.
Mobile Testing Strategy
• Run functional tests on primary devices
• Run user experience tests only on primary devices, or on all devices
• Automate regression test cases
• Execute regression test cases on other devices
• Execute security and performance tests on simulators / real devices
Achieving Excellence in Mobile Testing
• Best Shore Global Delivery Model: flexible delivery using a “best shore” global delivery network provides risk-balanced delivery, optimized offsite leverage, and agility in meeting demands
• Domain-Focused Solutions: deep domain expertise enables business risk mitigation and improved effectiveness
• Automation Innovation: an innovative automated test service framework with trusted tools vendors results in faster time to market, improved efficiency, increased productivity, and lower costs
• Lifecycle Quality Management: frameworks that embody a Lifecycle Quality Management approach lead to defect prevention, lower costs, and improved quality
Mobile testing services should drive a 20%–40% reduction in costs while improving speed to market, productivity, and quality.
Mobile Test Automation
Mobile Test Automation Options
Manual test on simulators
Manual test on real devices
Cloud-based test
Cloud-based automated test
Manual Test on Simulators
• Simulators do not behave exactly the same as real devices
• Not possible to test SMS, email, and phone call services on real networks
• Test logs and video recording of test sessions possible
• Online peer sharing of test sessions possible
• Inexpensive investment in infrastructure
• Manual execution; expensive to run the same test multiple times
• Suitable for unit tests by developers
Manual Test on Real Devices
• Test your apps on exactly the same real devices and live networks as used in production
• No test logs or video recording of test sessions available
• Very difficult to share test sessions online with peers
• Test infrastructure can be expensive if testing on multiple devices and carrier networks is needed
• Manual execution; expensive to run the same test multiple times
• Suitable for SIT and UAT when the number of device/network combinations is small
Cloud-Based Test
• Remote access to multiple real devices and live networks
• Screenshots, test logs, and video recordings of test sessions available for defect analysis
• Online peer sharing of test sessions promotes offshore/onshore collaboration and agile testing
• Monitoring of real-time device performance and user experience
• Test infrastructure can be expensive depending on the number of devices and carrier networks needed
• Manual execution; expensive to run the same test multiple times
• Suitable for SIT and UAT with a large number of device/network combinations but a small number of test cases
Cloud-Based Automated Test
• Remote access to multiple real devices and live networks
• Screenshots, test logs, and video recordings of test sessions available for defect analysis
• Online peer sharing of test sessions promotes offshore/onshore collaboration and agile testing
• Monitoring of real-time device performance and user experience
• Script once, test automatically on multiple devices
• Test infrastructure can be expensive depending on the number of devices and carrier networks needed
• Suitable for SIT and UAT with a large number of device/network combinations and a large number of test cases
Separation of Actions, Data, Interfaces
• Select a scripting tool
• Select an execution platform
• Separate actions, data, and interfaces
Example actions: 1) enter a value; 2) click a button.

SystemUtil.Run "notepad","","",""
Window("Notepad").WinEditor("Edit").Type "yes"
Window("Notepad").WinEditor("Edit").Type micCtrlDwn + "s" + micCtrlUp
Window("Notepad").Dialog("Save As").WinEdit("File name:").Set "test"
Window("Notepad").Dialog("Save As").WinButton("Save").Click

SystemUtil.Run "http://google.com/","","",""
Browser("Google").Page("Google").WebEdit("q").Set "mba"
Browser("Google").Page("Google").WebEdit("q").Submit
Browser("Google").Page("mba Search").Sync
Browser("Google").Close
Script Once, Test Many
• Search for text and images on the screen, regardless of their location, size, and color
• Record and play back
• Keyword-based scripting
• Port scripts to any device
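The separation of actions, data, and interfaces above can be sketched in tool-neutral form. This is a minimal keyword-driven harness in Python, not taken from any specific tool: locator strings, test data, and action verbs all live in separate structures, so the same script logic can run against multiple devices. All names here are illustrative assumptions.

```python
# Minimal keyword-driven sketch: actions, data, and interface locators
# are kept separate so one script can target many devices.

# Interface layer: logical names -> device-specific locators (illustrative).
INTERFACES = {
    "android": {"search_box": "id=search_input", "go_button": "id=btn_go"},
    "ios":     {"search_box": "name=searchField", "go_button": "name=goBtn"},
}

# Data layer: values fed into the test, kept out of the script logic.
TEST_DATA = {"query": "mba"}

# Action layer: generic verbs a real driver would translate into device calls.
def enter_value(device, locator, value):
    return f"[{device}] enter '{value}' into {locator}"

def click(device, locator):
    return f"[{device}] click {locator}"

def run_search_test(device):
    """Script once: the same steps run on any device in INTERFACES."""
    ui = INTERFACES[device]
    return [
        enter_value(device, ui["search_box"], TEST_DATA["query"]),
        click(device, ui["go_button"]),
    ]

for device in INTERFACES:
    for step in run_search_test(device):
        print(step)
```

Changing a locator or a data value then touches only one layer, which is the maintainability payoff the slide is pointing at.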
Mobile Test Management Automation
• Use ALM for mobile test management automation
• Manage mobile test statements, test cases, and test scripts
• Manage mobile test execution
• Manage the mobile test defect lifecycle
• Integration with the mobile test environment (simulators, real devices, or cloud)
• Integration with the mobile development IDE/SDK
• Real-time traceability, KPIs, metrics, and reporting
Mobile User Experience Testing
Mobile User Experience Guidelines
General guidelines:
• Mobile Web Best Practices (MWBP)
• Web Content Accessibility Guidelines (WCAG)
Device-specific user interface guidelines:
• iOS user interface guidelines
• Android user interface guidelines
• BlackBerry user interface guidelines
App-specific user interface guidelines:
• Domain specific
• Organization specific
Mobile Device Characteristics
• Display screen size
• Keyboard entry limitations
• Pointing device limitations
• Network bandwidth limitations
• Battery life limitations
• Device memory limitations
• Display color limitations
• Color contrast limitations
• File format rendering limitations
• Browser variations
• Differences in support for HTML, HTML5, CSS, JavaScript, Flash, and Java
• People interact with one app at a time
• Single window with no visible components
• Display orientation
• Gestures for user–device interaction
• Voice recognition
• Location services
• Text message, email, and phone call services
“One Web” Principle
One Web: the same services delivered to all users and devices. However:
• Some services and information are more suitable for and targeted at particular user contexts
• Some services have a primarily mobile appeal
• Some services have a primarily desktop appeal
• Some services have a complementary desktop and mobile appeal
Testable Statements – iOS Orientation Change Example
iOS orientation change guideline:
• Think twice before preventing your app from running in all orientations
• If your app runs in only one orientation:
  • Launch your app in your supported orientation
  • Avoid displaying a UI element that tells people to rotate the device
  • Support both variants of an orientation
Testable statements:
• The iOS version of App XYZ must support both landscape and portrait orientations.
• The iOS version of App XYZ must support both variants of the landscape orientation by rotating content 180 degrees.
• App XYZ is not required to rotate its content when the iOS device is held with the Home button on the top.
Test Cases - iOS Orientation Change Example
• Download and install App XYZ from the App Store onto a real iPhone 5 device
• Start App XYZ from the iPhone 5 home screen
• Navigate to different screens of App XYZ
• While on any screen, do the following:
  • Hold the iPhone 5 in landscape with the Home button on the right, and expect App XYZ's content to be displayed in top-down, left-right order
  • Then hold the device in landscape with the Home button on the left, and expect App XYZ's content to be rotated 180 degrees
  • Then hold the device in portrait with the Home button on the bottom, and expect App XYZ's content to be rotated 90 degrees
  • Then hold the device in portrait with the Home button on the top, and expect App XYZ's displayed content not to change
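The expected behavior in the steps above can be encoded as a small test oracle: a function mapping each device position to the content rotation the testable statements require. Automated UI checks can then compare actual rendering against it. This is an illustrative Python sketch of the oracle only; the function and key names are assumptions, not from the slides.

```python
# Expected content behavior for App XYZ per device position, derived from
# the testable statements (names are illustrative).
def expected_content_rotation(home_button_position):
    """Return the expected content rotation in degrees, relative to the
    landscape position with the Home button on the right, or None when
    the app is not required to rotate its content."""
    expectations = {
        "right":  0,     # landscape baseline: top-down, left-right order
        "left":   180,   # other landscape variant: rotate 180 degrees
        "bottom": 90,    # portrait: rotate 90 degrees
        "top":    None,  # Home button on top: no rotation required
    }
    return expectations[home_button_position]

for position in ("right", "left", "bottom", "top"):
    print(position, "->", expected_content_rotation(position))
```

Keeping the oracle separate from the UI-driving code means the same expectations can back both manual checklists and automated runs.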
Mobile Performance Test
Performance Engineering Domains
Models the expected production usage of an application by simulating multiple users accessing the application's services concurrently. It is the most fundamental performance test to understand response times and error rates.
Tests the system’s stability when the load is raised beyond normal usage patterns. This test determines at what load an application fails, and how it fails. Useful for determining headroom for capacity planning
Also known as endurance testing, this determines the ability of an application to perform its required functions under stated conditions for an extended period of time
Most often used to measure an application's throughput with respect to batch or message processing where user response times are not relevant
By comparing to baselines, determines how linear the application's infrastructure can scale to support increased work load
Load
Stress
Reliability
Scalability
Volume
Testing Additional Load from Mobile Apps
• Estimating additional load from mobile apps
• Simulating requests from mobile apps with load test scripts
• Selecting a primary load testing tool
• Custom coding to support HTML5, JavaScript / Ajax, REST, SOAP
• Measuring server response time and throughput
• Monitoring usage of server system resources
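The load-simulation and measurement steps above can be sketched with a minimal thread-based harness. Here the server call is a stub (`fake_request`, an assumption standing in for real HTTP traffic from mobile clients), and the summary mirrors the response-time and error-rate figures a load-testing tool would report.

```python
import statistics
import threading
import time

def fake_request():
    """Stand-in for a mobile app request to an enterprise server."""
    time.sleep(0.01)  # simulated server processing time
    return 200

def simulate_user(n_requests, results, lock):
    """One virtual mobile user issuing sequential requests."""
    for _ in range(n_requests):
        start = time.perf_counter()
        status = fake_request()
        elapsed = time.perf_counter() - start
        with lock:
            results.append((status, elapsed))

def run_load_test(n_users=5, n_requests=4):
    """Run n_users concurrent virtual users; return response-time stats."""
    results, lock = [], threading.Lock()
    threads = [threading.Thread(target=simulate_user,
                                args=(n_requests, results, lock))
               for _ in range(n_users)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    times = [elapsed for _, elapsed in results]
    errors = sum(1 for status, _ in results if status != 200)
    return {
        "requests": len(results),
        "errors": errors,
        "avg_s": statistics.mean(times),
        "max_s": max(times),
    }

print(run_load_test())
```

In practice the stub would be replaced by scripted HTTP/REST/SOAP calls, and the custom coding for HTML5 and Ajax traffic mentioned above lives inside the request function.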
Testing, Tuning & Profiling Mobile Apps
• Measuring mobile user experience
• Identifying bottlenecks in your mobile apps
• Performing root cause analysis
• Improving mobile user experience
• Minimizing mobile app footprint
• Profiling function calls
Monitoring Device System Resources
• Collecting and reporting system usage metrics: CPU busy and idle time, background processes
• Measuring the footprint of mobile apps on mobile devices: memory and disk space usage, battery life and power consumption rate
• Network bandwidth, throughput, and data usage
• Connection interruptions, performance jitter, and degradation of mobile networks
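Collecting and summarizing such usage metrics can be sketched as a small aggregator with a pluggable sampler. On a real device the sampler would read platform APIs; here it is injected so the aggregation logic stands alone. All names and the metric keys are illustrative assumptions.

```python
def collect_metrics(sampler, n_samples):
    """Poll the sampler n_samples times and summarize device resource usage.
    sampler() must return a dict of numeric readings, e.g.
    {"cpu_pct": ..., "mem_mb": ..., "battery_pct": ...} (illustrative keys).
    """
    samples = [sampler() for _ in range(n_samples)]
    summary = {}
    for key in samples[0]:
        values = [s[key] for s in samples]
        summary[key] = {
            "min": min(values),
            "max": max(values),
            "avg": sum(values) / len(values),
        }
    return summary

# Canned readings standing in for real device measurements.
readings = iter([
    {"cpu_pct": 20, "mem_mb": 110, "battery_pct": 90},
    {"cpu_pct": 60, "mem_mb": 130, "battery_pct": 89},
    {"cpu_pct": 40, "mem_mb": 120, "battery_pct": 88},
])
summary = collect_metrics(lambda: next(readings), 3)
print(summary["cpu_pct"])  # {'min': 20, 'max': 60, 'avg': 40.0}
```

The min/max/avg roll-up is what a footprint report would show per metric; battery drain rate can be derived from the first and last battery readings over the sampling window.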
Mobile Security Test
Secure SDLC
Governance: security policies, guidelines, standards, procedures, and metrics created and enforced by organizations.
• Requirements: should be defined according to governance rules for authentication, authorization, non-repudiation, data confidentiality, integrity, accountability, session management, transport security, privacy, etc.
• Planning & Design: should take into consideration network, server, middleware, database, and programming platform vulnerabilities, leveraging techniques such as threat modeling and risk analysis.
• Development: static code analysis should be performed to ensure secure coding guidelines are followed and coding vulnerabilities are minimized.
• Testing: security testing should be performed during SIT to simulate application abuses and ensure any vulnerabilities uncovered are properly addressed as “security bugs.”
• Deployment: the application should be tuned and hardened at all layers of the platform stack to minimize infrastructure software misconfiguration vulnerabilities.
• Operate/Maintain: vulnerability scanning should be regularly performed during the maintenance phase on both the application and infrastructure to ensure no new security risks have been introduced and that the level of security is still intact.
Device-Based Attacks
• Misplaced or lost smartphones/tablets
• Mobile devices not password protected
• Unencrypted credentials, insecure storage, or cached data
• Misconfigured certificate and proxy settings
• Mobile devices may have unauthorized modifications
• Malware apps downloaded from app stores or via jailbreaking
• Security software often not installed to scan for Trojans, spyware, malware, and spam
• Invoking classes, services, and activities from insecure sources
• Out-of-date operating system versions
• Out-of-date software utilities
• Shared mobile app IDs and data
Server & Network-Based Attacks
• Data transmissions via Wi-Fi hotspots not always encrypted
• Bluetooth communications in "open" or "discovery" mode
• NFC offers no protection against eavesdropping
• Internet connections usually not protected by firewalls
• Man-in-the-middle attacks
• Weak authentication schemes
Abuse Cases, Misuse Cases, Attack Scripts
• Misuse cases refer to use cases in which the actor initiates unintentional but potentially harmful actions.
• Abuse cases refer to use cases in which the actor initiates intentionally harmful actions.
• Attack scripts are scripts developed to automate penetration techniques such as dictionary attacks and “fuzzing.”
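A toy version of the "fuzzing" attack-script idea can make the concept concrete: mutate a valid input at random, feed each variant to the code under test, and record any inputs that crash it. The handler here is a deliberately fragile stub; every name is an illustrative assumption, not a real penetration-testing tool.

```python
import random

def fragile_handler(payload: bytes):
    """Stand-in for code under test: crashes on an embedded null byte."""
    if b"\x00" in payload:
        raise ValueError("unexpected null byte")
    return len(payload)

def mutate(seed: bytes, rng):
    """Flip one random byte of the seed input."""
    data = bytearray(seed)
    pos = rng.randrange(len(data))
    data[pos] = rng.randrange(256)
    return bytes(data)

def fuzz(handler, seed: bytes, iterations, rng=None):
    """Run mutated inputs through the handler; collect crashing inputs."""
    rng = rng or random.Random(1234)  # fixed seed for reproducibility
    crashes = []
    for _ in range(iterations):
        candidate = mutate(seed, rng)
        try:
            handler(candidate)
        except Exception:
            crashes.append(candidate)
    return crashes

crashes = fuzz(fragile_handler, b"GET /index.html", 2000)
print(f"{len(crashes)} crashing inputs found")
```

Real fuzzers add coverage feedback and smarter mutation strategies, but the loop structure is the same: mutate, execute, record failures for triage.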
Mobile Security Testing Environment & Tools
• Static source code analysis tools: mobile software development kit (SDK), simulators, decompiling tools
• Profiling tools: debugging tools, system data dumping tools, tracing tools
• Penetration test tools: exploring tools, SQL querying tools, sniffing tools, fuzzing tools, automation tools
• Proxy tools: tools that intercept and manipulate the traffic between mobile devices and servers
Vulnerability Severity
Identified vulnerabilities are flagged with a CVSS severity level:
• Low severity: CVSS base score of 0.0–3.9
• Medium severity: CVSS base score of 4.0–6.9
• High severity: CVSS base score of 7.0–10.0
Assignment of the CVSS score is based on:
• The primary impact on the confidentiality, integrity, and availability of the protected system/resources
• The derivative impact on loss of life and/or property
• The percentage of the impacted area within the total environment
• How easy it is to exploit the vulnerability
• How easy it is to remediate the vulnerability
• How confident the testing team is about the existence of the vulnerability
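The severity bands can be captured in a small helper that flags each finding. The mapping follows the score ranges shown above; the function name and the sample findings are illustrative, not from the slides.

```python
def cvss_severity(base_score: float) -> str:
    """Map a CVSS base score (0.0-10.0) to the severity bands:
    Low 0.0-3.9, Medium 4.0-6.9, High 7.0-10.0."""
    if not 0.0 <= base_score <= 10.0:
        raise ValueError("CVSS base score must be between 0.0 and 10.0")
    if base_score <= 3.9:
        return "Low"
    if base_score <= 6.9:
        return "Medium"
    return "High"

# Hypothetical findings from a mobile security test run.
findings = {
    "verbose error page": 2.0,
    "weak session token": 5.4,
    "hardcoded credential": 8.1,
}
for name, score in findings.items():
    print(f"{name}: CVSS {score} -> {cvss_severity(score)}")
```

Flagging findings this way lets the remediation queue be sorted by band first, then by the exploitability and remediation-cost factors listed above.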
Questions?
Our commitment to you
We approach every engagement with one objective in mind: to help clients succeed.