Agile Mobile Testing Workshop
Post on 02-Jul-2015
AGILE MOBILE TESTING WORKSHOP PUNE AGILE CONFERENCE
JULIAN HARTY Contact me: julianharty@gmail.com Rev: 22 Nov 2014
Creative Commons License How to design your mobile apps by Julian Harty is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License. http://creativecommons.org/licenses/by-sa/3.0/deed.en_US
AGILE TESTING
http://lisacrispin.com/2011/11/08/using-the-agile-testing-quadrants/
TTUF
TIME TO USEFUL FEEDBACK
Information is more valuable when it is timely
CONTINUOUS INTEGRATION (FOR MOBILE APPS)
Raw Ingredients:
• Code
• Code repository (git, svn, …)
• Triggers
• Build tools
• Automated tests
• Run-time environment(s): emulators, simulators, devices
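The ingredients above chain into a pipeline that turns a commit into feedback. A minimal sketch of such a pipeline, with made-up stage names standing in for calls to a real build tool, test runner, and emulator (not any specific CI product):

```python
# Minimal CI pipeline sketch: each stage is a (name, function) pair; the
# function returns True on success. The pipeline stops at the first
# failure and records how long each stage took, since stage timings are
# exactly where the latencies discussed next will show up.
import time

def run_pipeline(stages):
    """Run stages in order; return (all_passed, per-stage seconds)."""
    timings = {}
    for name, fn in stages:
        start = time.monotonic()
        ok = fn()
        timings[name] = time.monotonic() - start
        if not ok:
            return False, timings
    return True, timings

# Illustrative stages -- in a real setup these would shell out to your
# build tool, unit-test runner, and emulator/device deployment.
stages = [
    ("checkout", lambda: True),
    ("build", lambda: True),
    ("unit_tests", lambda: True),
    ("deploy_to_emulator", lambda: True),
]

passed, timings = run_pipeline(stages)
print(passed, sorted(timings))
```

The early-exit on failure is deliberate: a red build should surface in the shortest possible time, not after every remaining stage has run.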
WHERE ARE THE LATENCIES
WORKSHOP
WHERE ARE THE LATENCIES?
• Build times
  • Affects end-to-end unit-test runtime
• Commissioning run-time environment
  • Automated tests
• Deployment
  • Installing the app so it can be tested
• App Store approval
  • Feedback from the market
• Feedback from the field
  • App qualities
  • Failures & defects in use
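Once each stage has been measured, it is worth finding which one dominates, since that is where effort to reduce TTUF pays off first. A sketch with invented figures for illustration:

```python
# Sketch: given measured latencies (seconds) for each feedback stage,
# report the total time-to-useful-feedback (TTUF) and the dominant
# stage. The numbers below are made up for illustration.
latencies = {
    "build": 300,
    "unit tests": 120,
    "commission emulator": 180,
    "deploy app": 60,
    "automated UI tests": 900,
}

total = sum(latencies.values())
worst_stage = max(latencies, key=latencies.get)
print(f"TTUF: {total} s; biggest latency: {worst_stage} "
      f"({latencies[worst_stage] / total:.0%} of total)")
```

With these (hypothetical) numbers, slow UI tests account for more of the wait than the build itself, so shaving build time first would be spending effort in the wrong place.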
INTERACTIVE TESTING
TTUF
TBS
We want to maximize T and minimize B & S:
• T = Testing
• B = Bug reporting
• S = Setup
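The T/B/S split can be made visible with simple session accounting, in the style of a session-based test-management sheet. A sketch with invented session figures:

```python
# Sketch: break a testing session down into T (testing), B (bug
# reporting) and S (setup) shares, so it is obvious whether actual
# testing dominates the session. Minutes below are illustrative.
def tbs_breakdown(testing_min, bug_min, setup_min):
    """Return the fraction of session time spent on T, B and S."""
    total = testing_min + bug_min + setup_min
    return {k: v / total for k, v in
            (("T", testing_min), ("B", bug_min), ("S", setup_min))}

shares = tbs_breakdown(testing_min=45, bug_min=10, setup_min=5)
print({k: f"{v:.0%}" for k, v in shares.items()})
```

A session where S or B rivals T is a signal to apply the "minimize setup" and "minimize bug investigation" tactics that follow.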
MINIMIZE SETUP
• Email or SMS URLs to the phone
• Have a configuration workstation with all the drivers installed
• Create apps on your build server and make them available
MINIMIZE BUG INVESTIGATION
• Screenshot utilities
• Learn how to access, filter and store device logs
• Good quality camera for close-up screenshots
• Write a bug report that will still be valuable when the bug is actually investigated
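Filtering device logs down to the relevant entries is one of the quickest wins here. A sketch using a simplified logcat-style line format (real logcat lines also carry timestamps and PIDs; the tag names are invented):

```python
# Sketch: filter logcat-style lines ("priority/tag: message") down to
# the entries for one tag at or above a minimum priority, so a bug
# report can attach just the relevant slice of the device log.
def filter_log(lines, tag, min_priority="E"):
    order = "VDIWEF"  # verbose, debug, info, warning, error, fatal
    keep = []
    for line in lines:
        prio, _, rest = line.partition("/")
        entry_tag, _, _ = rest.partition(":")
        if (entry_tag.strip() == tag
                and order.index(prio) >= order.index(min_priority)):
            keep.append(line)
    return keep

log = [
    "D/MyApp: opened settings screen",
    "E/MyApp: NullPointerException in SettingsFragment",
    "W/OtherApp: slow network",
]
print(filter_log(log, "MyApp"))
```

On a real device the input would come from `adb logcat -d`; the filtering itself is the part that makes the eventual bug report readable months later.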
MAXIMIZE TESTING Use testing heuristics
• I SLICED UP FUN (Jonathan Kohl)
• COP FLUNG GUN (Moolya)
HEURISTICS & MNEMONICS
ANTIFRAGILE TESTING
http://moolya.com/blogs/2012/04/121/Test-Mobile-applications-with-COP-who-FLUNG-GUN
TEST THIS
Kiwix
USING THIS GUIDE
http://moolya.com/blogs/
WiFi Password: FW4WFAAA
AUTOMATED TESTS
TTUF
REDUCE EFFORT
TESTABILITY
SPENDING MONEY WISELY
WHAT IS TESTABILITY?
The concept of designing & implementing software so it is easier to test:
• Testing can be automated
• Testing can be interactive
SCALES OF TESTABILITY
There are at least 2 dimensions of Testability: • ease of interfacing • transparency into the state & behaviour of the software being tested.
[Diagram: a two-dimensional space with "ease of interfacing" running from easy to challenging on one axis, and "transparency" running from transparent to opaque on the other.]
DESIGNING FOR TESTABILITY: HOOKS
Programmatic hooks:
• To connect test automation easily
• Consider whether to leave them in situ
DESIGNING FOR TESTABILITY: VISIBILITY
“Eyes into the soul of the machine...”
Expose internal data and state:
• Makes some checks easier to confirm, e.g. that error recovery mechanisms cleaned up the app’s internal state
Beware:
• Non-test code might start using the data
• If so, consider formalising the access in an API
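One way to formalise that access is a deliberate, read-only snapshot method, so neither tests nor other code can mutate internals through it. A sketch with invented class and field names:

```python
# Sketch: expose internal state through a deliberate, read-only API
# rather than letting tests (or other code) poke at private fields.
class DownloadManager:
    def __init__(self):
        self._queue = []
        self._failed = []

    def enqueue(self, url):
        self._queue.append(url)

    def fail(self, url):
        self._queue.remove(url)
        self._failed.append(url)

    def debug_snapshot(self):
        """Formalised visibility hook: returns copies, so callers
        cannot mutate the manager's internal state."""
        return {"queued": list(self._queue), "failed": list(self._failed)}

dm = DownloadManager()
dm.enqueue("http://example.com/a")
dm.fail("http://example.com/a")
print(dm.debug_snapshot())
```

Returning copies is the key design choice: it gives checks the transparency they need while keeping the "non-test code starts depending on internals" risk contained behind one named method.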
TESTABILITY: LAYERING OF CODE
Already covered some aspects in the Segmented Design topic.
Ideal: be able to automate the testing of each layer or component independently; testing of the composite software can then focus on the composite aspects.
Beware of emergent behaviour:
• Test the qualities: non-functional testing (NFT)
TESTABILITY: SEPARATION OF CONCERNS
Separate generic and platform-specific code.
Generic code:
• Application logic: what the app does, its functionality
Platform-specific code:
• User interface
• Threading
• Calls to platform-specific APIs
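The split above can be sketched as a small interface that the generic logic talks to, so the logic is testable with a fake platform and no UI or device. All names here are invented for illustration:

```python
# Sketch: generic application logic kept behind a small platform
# interface; a fake platform lets the logic be tested without any UI.
class Platform:
    """Platform-specific surface: UI, threading, native APIs."""
    def show_message(self, text):
        raise NotImplementedError

class FakePlatform(Platform):
    """Test double: records what would have been shown on screen."""
    def __init__(self):
        self.shown = []
    def show_message(self, text):
        self.shown.append(text)

def greet_user(platform, name):
    """Generic application logic: what the app does."""
    platform.show_message(f"Hello, {name}!")

fake = FakePlatform()
greet_user(fake, "tester")
print(fake.shown)
```

The real platform implementation would wrap the native UI and threading APIs; the point is that `greet_user` never needs to know which implementation it is given.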
TESTABILITY: ISOLATE COMPLEX CODE
Try encapsulating & isolating complex code:
• Provide an interface*
• Have excellent automated tests exercise it
• Warn casual developers (and testers) not to tamper with it
• Now the rest of our code is easier to understand & manage
In parallel, consider ways to replace complex code with simpler code.
* e.g. see the Facade design pattern
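A minimal sketch of the Facade pattern mentioned above: complex internals hidden behind one simple entry point, with leading underscores marking the classes casual callers should not touch. All names and the "last writer wins" rule are invented for illustration:

```python
# Sketch of a Facade: the messy sync machinery lives in private helper
# classes; callers see only SyncFacade.sync().
class _ConflictResolver:
    """Complex internals: here reduced to a toy 'last writer wins' rule."""
    def resolve(self, local, remote):
        return remote if remote is not None else local

class _ChangeTracker:
    def __init__(self):
        self.pending = []

class SyncFacade:
    """The only entry point casual developers (and testers) should use."""
    def __init__(self):
        self._resolver = _ConflictResolver()
        self._tracker = _ChangeTracker()

    def sync(self, local, remote):
        merged = self._resolver.resolve(local, remote)
        self._tracker.pending.clear()
        return merged

print(SyncFacade().sync(local="draft-1", remote="draft-2"))
```

The facade is also where the "excellent automated tests" belong: exercising `sync()` covers the complex internals without coupling tests to them.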
BACK TO “VALUE”
BIG PICTURE
FULL LIFECYCLE COSTS
SPENDING WISELY?
FULL LIFECYCLE COSTS
The initial development effort may be dwarfed by maintenance work.
There are trade-offs between reducing the cost of initial development and the cost of maintenance work.
Code that costs more to modify is undesirable; well-designed code & good automated tests can reduce the risk and cost of maintenance work.
Beware of premature aging of your app’s codebase!
WHERE AND WHEN TO SPEND MONEY ON TESTING?
NOVODA: costs 60% more to ‘add’ test automation to Android projects
Who’s willing to sign off on it?
Where and when does the ROI start?
THINGS TO CONSIDER How long do your code bases ‘last’? Who pays for ‘maintenance’? Where is the expertise to maintain the code? Active apps need ongoing nurture & investments even if you’re not changing the functionality
ALTERNATIVES TO TESTING Testing is not the only way to obtain useful feedback. Sometimes it’s not the best way either.
COMPLEMENTING TESTING WITH OTHER INFORMATION SOURCES
• Crowd Sourcing • Log Analysis & Crash Dumps • Analytics • In-app feedback
VISUALIZATION TOOLS
uiautomatorviewer (for Android)
Using visualisation tools to help define the test automation interface
USING MOBILE ANALYTICS
SECTION 7
USING MOBILE ANALYTICS An overview of Mobile Analytics How they can help augment our testing
TOPOLOGY
Overview of Mobile Analytics Each step may be delayed
[Diagram: Mobile Apps sending Analytics data → Data Collector → Filter(s) → Database → Analytics WebServer → Business view]
TYPES OF EVENTS
[Diagram: events flow from the Mobile app, via its Analytics Library, over an Internet connection to the Analytics Collector and on into the Analytics Database. Event types: 1:1 app-initiated events, m:1 app-initiated events, and library-initiated events.]
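The m:1 case is typically batching: the library accepts events one at a time from the app but ships them to the collector in groups. A sketch with invented class and event names:

```python
# Sketch: an analytics library batching app-initiated events, so each
# upload to the collector carries many events (the m:1 case).
class AnalyticsLibrary:
    def __init__(self, batch_size=3):
        self.batch_size = batch_size
        self._buffer = []
        self.uploads = []  # stands in for calls to the remote collector

    def track(self, event):
        """1:1 from the app's point of view; uploads happen m:1."""
        self._buffer.append(event)
        if len(self._buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        """Ship whatever is buffered as one upload (e.g. on app exit)."""
        if self._buffer:
            self.uploads.append(list(self._buffer))
            self._buffer.clear()

lib = AnalyticsLibrary(batch_size=3)
for name in ["open", "tap", "close", "open"]:
    lib.track(name)
print(lib.uploads)  # first three events shipped as one batch
```

Batching is also why "each step may be delayed" matters for testing: an event the app has tracked may not reach the database until the next flush.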
ANALYTICAL QUESTIONS
[Diagram: analytical questions arranged from information to insight, across past, present, and future, with example engineering activities:]
• What’s happened? (Reporting) — information about the past
• What’s happening? (Alerts) — information about the present
• What will happen? (Forecasting) — information about the future
• How and why did it happen? (Factor analysis) — insight into the past
• What is the next best action? (Recommendation) — insight for the present
• What’s the best / worst that can happen? (Modeling / Simulation) — insight into the future
Example engineering activities: benchmarking, testing, trends, defect report extrapolation, software quality models, bottleneck analysis, specification refinement, asset reallocation, failure prediction models.
FISHBONES
• Feasible
• Practical
• Useful