Power Debugging


TRANSCRIPT

Page 1: Power Debugging

Power Debugging
Kapil Vaswani
Researcher, Rigorous Software Engineering
Microsoft Research India

FT54

Sandeep Karanth, RSDE
Advanced Development and Prototyping, Microsoft Research India

Sriram Rajamani, Aditya Nori (Rigorous Software Engineering, MSRI)
Joseph Joy, B. Ashok, Gopal Srinivasa (Advanced Development and Prototyping, MSRI)
Hongkang Liang, Vipindeep Vangala (Windows Sustained Engineering)
Trishul Chilimbi (Runtime Analysis and Design, MSR)
Abhik Roychoudhury (National University of Singapore)
Ben Liblit (University of Wisconsin)

Page 3: Power Debugging

Ask experts

Check bug database

Check version history

Reproduce bug

Trace in a debugger

Debugging Yesterday, Today and Tomorrow
How else can we help diagnose failures?
+ Visual Studio IntelliTrace™
+ Visual Studio Test Elements
+ Visual Studio Test Impact Analysis

Page 4: Power Debugging

Power Debugging

Holmes: statistical debugging tool. Uses large test suites to diagnose failures.

DebugAdvisor: recommendation system for bugs. Mines software repositories for information related to a bug.

Darwin: tool for debugging regressions. Uses a previous, stable version of an application to diagnose failures.

Page 5: Power Debugging

Holmes: Statistical Debugging

Kapil Vaswani, Aditya Nori (Rigorous Software Engineering, MSRI)
Sandeep Karanth (Advanced Development and Prototyping, MSRI)
Trishul Chilimbi (Runtime Analysis and Design, MSR)
Ben Liblit (University of Wisconsin)

Page 6: Power Debugging

Holmes: Where testing meets debugging
> Programs are often put through rigorous testing
> Large test suites
> Many passing tests, some failing tests

> Can test suites help us find the cause of failures?

Page 7: Power Debugging

Statistical Debugging with Holmes
> Collect profiles/coverage data from a large number of (successful and failing) test cases

> Look for code paths that strongly correlate with failure
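As a concrete illustration, the correlation step can be sketched in a few lines of C (a hypothetical scoring function, not the Holmes implementation): each code element is scored by the fraction of the test runs covering it that failed.

```c
/* Sketch of statistical failure correlation (hypothetical helper, not
 * the Holmes implementation). covered[t] is 1 if test t executed the
 * code element; failed[t] is 1 if test t failed; n is the number of
 * tests. Returns the fraction of covering runs that failed, in [0,1]. */
double failure_score(const int *covered, const int *failed, int n)
{
    int fcov = 0, scov = 0;           /* failing / successful covering runs */
    for (int t = 0; t < n; t++) {
        if (covered[t]) {
            if (failed[t]) fcov++;
            else           scov++;
        }
    }
    if (fcov + scov == 0) return 0.0; /* element never covered */
    return (double)fcov / (double)(fcov + scov);
}
```

An element covered by three failing runs and one passing run scores 0.75 and would rank above an element covered evenly by passing and failing runs.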

Page 8: Power Debugging

Debugging with Holmes

Test suite (automated/manual: Visual Studio unit tests, Visual Studio Test Elements) → Code coverage (Holmes path coverage, historical debugging) + Test results (pass/fail) → Holmes statistical analysis → Potential root causes

Page 9: Power Debugging

Holmes: Visual Studio Integration

Sandeep Karanth, RSDE, Advanced Development and Prototyping

Demo

Page 10: Power Debugging

Holmes available for download today!

http://research.microsoft.com/holmes Try it and give us feedback!

announcing

Page 11: Power Debugging

Debugging with Holmes

Test suite (automated/manual: Visual Studio unit tests, Visual Studio Test Elements) → Code coverage + Test results (pass/fail) → Holmes statistical analysis → Potential root causes

Page 12: Power Debugging

Code coverage for Holmes
> Statement/block/arc coverage insufficient
> Path coverage
> Track acyclic, intra-procedural path fragments
> Why paths?
> Paths represent scenarios
> Bugs often occur in complex scenarios
> Profile with low overheads (roughly 10-30%)

[Figure: control-flow graph with nodes a-f illustrating acyclic path fragments]
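Path profiling of this kind is commonly implemented in the Ball-Larus style (the sketch below is illustrative, not the Holmes instrumentation): edges of the control-flow graph are assigned increments so that summing them along any acyclic path yields a unique path ID, which then indexes a counter. For a function with two successive if/else branches there are four acyclic paths:

```c
/* Illustrative Ball-Larus-style path numbering for an assumed toy CFG
 * with two successive if/else branches (not Holmes itself).
 * Increments are placed on the else-edges so that the sum along each
 * of the four acyclic paths is a distinct ID in 0..3. */
int path_id(int first_taken, int second_taken)
{
    int id = 0;
    if (!first_taken)  id += 2;  /* else-edge of the first branch */
    if (!second_taken) id += 1;  /* else-edge of the second branch */
    /* then/then = 0, then/else = 1, else/then = 2, else/else = 3 */
    return id;
}
```

At runtime the instrumented function would execute `count[path_id(...)]++`, which is how path coverage can be gathered at the cost of roughly one addition per branch.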

Page 13: Power Debugging

Statistical Analysis
> Measuring correlation is straightforward
> But there is a pitfall!
> cum hoc ergo propter hoc, a logical fallacy
> Correlation does not imply causality
> Examples
> Exception handling code
> Error recovery routines

Page 14: Power Debugging

Cause and correlation
> An analysis that distinguishes cause from correlation
> Context of the path: the method/loop/try-catch block containing the path
> Look for paths that strongly correlate with failure
> But whose context does not correlate with failure
> Very effective in practice!

Page 15: Power Debugging

Statistical Analysis
> Inputs to analysis
> Path coverage for each test case
> Outcome of each test case
> Compute four statistics for each path p
> Scov(p): number of successful test cases in which p is covered
> Fcov(p): number of failing test cases in which p is covered
> Scontext(p): number of successful tests in which the context of p was covered
> Fcontext(p): number of failing tests in which the context of p was covered

Page 16: Power Debugging

Statistical Analysis

context(p) = Fcontext(p) / (Scontext(p) + Fcontext(p))
increase(p) = Fcov(p) / (Scov(p) + Fcov(p)) - context(p)
recall(p) = Fcov(p) / Ftotal
confidence(p) = mean(increase(p), recall(p))

Context: How much is the context of a path correlated with failure?
Increase: How much more is the path correlated with failure?
Recall: What fraction of all failures occur when this path is covered?
Confidence: Overall measure that combines increase and recall
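The four statistics translate directly into code. A minimal sketch (the function names are assumptions, and the slide's "mean" is taken here to be the harmonic mean, as is common in the statistical-debugging literature):

```c
/* Scores from the slide's formulas. scontext/fcontext count the
 * successful/failing runs covering path p's context, scov/fcov the
 * runs covering p itself, and ftotal the total number of failing runs. */
double context_score(int scontext, int fcontext)
{
    return (double)fcontext / (double)(scontext + fcontext);
}

double increase(int scov, int fcov, int scontext, int fcontext)
{
    return (double)fcov / (double)(scov + fcov)
         - context_score(scontext, fcontext);
}

double recall(int fcov, int ftotal)
{
    return (double)fcov / (double)ftotal;
}

/* Assumption: the slide's "mean" is the harmonic mean of increase
 * and recall, which rewards paths that score well on both. */
double confidence(double inc, double rec)
{
    if (inc <= 0.0 || rec <= 0.0) return 0.0;
    return 2.0 * inc * rec / (inc + rec);
}
```

For example, a path covered in 8 of 10 failing runs and 2 passing runs, whose context is covered equally by passing and failing runs, gets increase 0.8 - 0.5 = 0.3 and recall 0.8.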

Page 17: Power Debugging

Giving Holmes a hand
> Write more test cases!
> Use automated test generation tools like Pex
> How many?
> Typically, 10-20 related failing tests with ~100 passing tests suffice

Page 18: Power Debugging

>> FUTURE
Summary
> Holmes available on http://research.microsoft.com/holmes
> Ships with a tool for measuring path coverage
> Integrates with Visual Studio and Test Elements
> Supports managed code
> Supports automated and manual tests
> Ongoing work
> Support for unmanaged code
> Support for historical debugging traces

> Try it and give us feedback!

Page 19: Power Debugging

Darwin: Automatically Root-causing Regressions
Kapil Vaswani (Rigorous Software Engineering, MSRI)
Abhik Roychoudhury (National University of Singapore)

Page 20: Power Debugging

Regressions
> Changes that break functionality
> Often uncovered by regression testing
> Debug by comparing the buggy version with a previous, correct version
> Doesn't work when there are too many changes
> Doesn't work for unmasking regressions

Page 21: Power Debugging

Debugging by comparing test cases
> Compare trace of the failing test case with a similar, passing test case
> Problem: such test cases don't exist!

[Figure: failing and passing traces diverging at the root cause]

Page 22: Power Debugging

Darwin: Key ideas
> Define notions of similarity between test cases
> Automatically generate a similar, passing test given a failing test case

Page 23: Power Debugging

Test Similarity
> Given
> Two versions of an application, P and P'
> A test T that passes on P and fails on P'
> Similarity
> A test T' is similar to T if T' and T follow the same control-flow path in P but different paths in P'

[Figure: traces of T and T' through the old and new versions, diverging at the root cause]

Page 24: Power Debugging

Test Generation in Darwin
> Formulates the problem of finding similar tests as a constraint-solving problem
> Uses techniques similar to Pex

Page 25: Power Debugging

Darwin at work

Old version:

    void dodash(char delim, char* src, int* i,
                char* dest, int* j, int maxset)
    {
        int k;
        bool junk;
        char escjunk;
        while ((src[*i] != delim) &&
               (src[*i] != ENDSTR)) {
            if (src[*i] == ESCAPE) {
                escjunk = esc(src, i);
                junk = addsrt(escjunk, dest, j, maxset);
            } else {
                ...
            }
        }
    }

New version:

    void dodash(char delim, char* src, int* i,
                char* dest, int* j, int maxset)
    {
        int k;
        bool junk;
        char escjunk;
        while ((src[*i] != delim) &&
               (src[*i] != ENDSTR)) {
            if (src[*i - 1] == ESCAPE) {
                escjunk = esc(src, i);
                junk = addsrt(escjunk, dest, j, maxset);
            } else {
                ...
            }
        }
    }

Failing test case: % [0-9][^9-B][@t][^a-c]
Passing test case generated by Darwin: % [0-9][^9-B][00][^a-c]
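A toy harness (not Darwin itself, and taking ESCAPE to be '@' as suggested by the failing input) makes the divergence visible: the old predicate inspects `src[i]`, the new one `src[i - 1]`, so the two versions follow different control flow exactly when an ESCAPE character sits at one of those positions but not the other.

```c
/* Toy model of the regressed branch (hypothetical; ESCAPE assumed '@').
 * old_branch mirrors the old version's test, new_branch the new one. */
#define ESCAPE '@'

int old_branch(const char *src, int i) { return src[i] == ESCAPE; }
int new_branch(const char *src, int i) { return src[i - 1] == ESCAPE; }

/* 1 if the two versions take different branches at position i,
 * i.e. this input exposes the regression at this point. */
int diverges(const char *src, int i)
{
    return old_branch(src, i) != new_branch(src, i);
}
```

On the fragment "[@t]" the two versions diverge at index 1, while on Darwin's generated "[00]" they agree, which is precisely the similarity criterion from the previous slide.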

Page 26: Power Debugging

>> FUTURE
Current status
> Prototype based on Pex
> Automatically root-caused regressions in large applications
> Web servers (HTML pages)
> Image processing applications (JPEG images)

> Working on VS integration, supporting multi-threading, …

> Watch this space! http://research.microsoft.com/darwin

Page 27: Power Debugging

DebugAdvisor: A recommendation system for bugs

Sriram Rajamani (Rigorous Software Engineering, MSRI)
Joseph Joy, B. Ashok, Gopal Srinivasa (Advanced Development and Prototyping, MSRI)
Hongkang Liang, Vipindeep Vangala (Windows Sustained Engineering)

Page 28: Power Debugging

A Common Scenario

Tester/developer receives bug report
> Has this or a similar bug been looked at or fixed before?
> What do we know about this kind of bug?
> Who should I ask for help?
> Where should I start looking?

Page 29: Power Debugging

What You Know
The customer experiences some deadlocks on a server. The problem is random and may occur from several times a week to once a month. The system looks hung because the global resource 'ObpInitKillMutant' is held by a thread which tries to close a file forever. So all processes having a thread waiting on 'ObpInitKillMutant' stop working. Drivers such as TCP/IP continue to respond normally, but it's impossible to connect to any share.

0: kd> !thread 82807020
ChildEBP RetAddr  Args to Child
80c7a028 00000000 00000000 ntkrnlmp!IopAcquireFileObjectLock+0x58
82a6d7a0 80c7a028 00120089 ntkrnlmp!IopCloseFile+0x79
82a6d7a0 80c7a010 80f6da40 ntkrnlmp!ObpDecrementHandleCount+0x112
00000324 7ffdef01 00000000 ntkrnlmp!NtClose+0x170
00000324 7ffdef01 00000000 ntkrnlmp!KiSystemService+0xc9
00000324 80159796 000000c9 ntkrnlmp!ZwClose+0xb
000000c9 e185f648 00000000 ntkrnlmp!ObDestroyHandleProcedure+0xd
809e3008 801388e4 82a6d926 ntkrnlmp!ExDestroyHandleTable+0x48
00000001 82a6d7a0 7ffde000 ntkrnlmp!ObKillProcess+0x44
00000001 82a6d7a0 82a6d7f0 ntkrnlmp!PspExitProcess+0x54
00000000 f0941f04 0012fa70 ntkrnlmp!PspExitThread+0x447
ffffffff 00000000 00002a60 ntkrnlmp!NtTerminateProcess+0x13c
ffffffff 00000000 00002a60 ntkrnlmp!KiSystemService+0xc9
00000000 00000000 00000000 NTDLL!NtTerminateProcess+0xb

REGISTERS: eax=00000005 ebx=e3185488 ecx=0000083c edx=e2dddc68

Textual description of bug

Stack trace

Processor state

Page 30: Power Debugging

Debug Advisor Search

Page 31: Power Debugging

Similar Bugs

Page 32: Power Debugging

Related Information

Page 33: Power Debugging

Debug Logs

Page 34: Power Debugging

Query processing

[Diagram: query processing pipeline]
Query (stack trace, code snippets, emails) → Feature parsers (stack trace parser, register information parser) → Query engine → Repositories (version control, bug repository, debug logs) → Similar bugs
Bug repository → Relationship builder → Relationship graph → Link analysis → DebugAdvisor report
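One of the feature parsers can be sketched as follows (a simplified, hypothetical stand-in for DebugAdvisor's stack-trace parser, whose internals are not shown here): it pulls `module!function` tokens out of free-form bug text so that stack frames can be matched against frames stored with past bugs.

```c
/* Simplified sketch of a stack-trace feature parser (hypothetical, not
 * DebugAdvisor's actual parser): finds the first module!function token
 * in free-form bug text. */
#include <string.h>
#include <ctype.h>

/* Scans text for a token shaped like module!function and copies it
 * (truncated to ndst-1 chars) into dst. Returns 1 on a match, else 0. */
int first_frame(const char *text, char *dst, int ndst)
{
    const char *bang = strchr(text, '!');
    while (bang) {
        /* walk back over the module name */
        const char *start = bang;
        while (start > text && (isalnum((unsigned char)start[-1]) ||
                                start[-1] == '_'))
            start--;
        /* walk forward over the function name */
        const char *end = bang + 1;
        while (isalnum((unsigned char)*end) || *end == '_')
            end++;
        if (start < bang && end > bang + 1) {
            int len = (int)(end - start);
            if (len >= ndst) len = ndst - 1;
            memcpy(dst, start, len);
            dst[len] = '\0';
            return 1;
        }
        bang = strchr(bang + 1, '!');
    }
    return 0;
}
```

The extracted frames would then feed the query engine as structured features rather than raw text, which is what lets stack traces match even when surrounding prose differs.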

Page 35: Power Debugging

Deployment feedback> Deployed internally for over 6 months

> Used by several developers

Unsolicited feedback (129 users, 628 queries): 77% useful, 23% not useful

Solicited feedback (20 bugs): 75% useful, 25% not useful

Page 36: Power Debugging

>> FUTURE
Summary and Status
> Much better precision and recall compared to full-text search
> Effectiveness depends on access to large repositories
> Evaluating effectiveness with smaller, more representative repositories
> Exploring potential integration with Visual Studio

Page 37: Power Debugging

Summary
> Debugging is getting harder!
> Debugging tools need to evolve
> Actively help diagnose failures
> Three tools that assist/automate debugging
> Exploit by-products of a typical software lifecycle (tests, versions, repositories)
> Holmes available for download; others will follow

Page 38: Power Debugging

Related talks
Code Visualization, UML, and DSLs: Cameron Skinner, Tuesday 4:30 PM
Extending the Microsoft Visual Studio 2010 Code Editor to Visualize Runtime Intelligence: Gabriel Torok & Bill Leach, Wednesday 4:30 PM
A Lap Around Microsoft Visual Studio and Team Foundation Server 2010: Cameron Skinner & Mario Rodriguez, Thursday 10:00 AM
Microsoft Visual Studio Lab Management to the Build Setup Rescue: Vinod Malhotra, Thursday 10:00 AM
Scrum in the Enterprise and Process Customization with Microsoft Visual Studio 2010: Simon Bennett & Stuart Preston, Thursday 1:45 PM
Advanced Diagnostics, IntelliTrace™ and Test Automation: Habib Heydarian, Thursday 1:45 PM
Power Tools for Debugging: Kapil Vaswani & Sandeep Karanth, Thursday 3:00 PM
Automating "Done Done" in the Team Workflows with Microsoft Visual Studio Ultimate and Team Foundation Server 2010: Jamie Cool & Brian Randell, Thursday 3:00 PM

Page 39: Power Debugging

Resources
> Holmes
> Download: http://research.microsoft.com/holmes
> Forum: http://blogs.msdn.com/holmes
> Technical papers
> Holmes: Effective Statistical Debugging via Efficient Path Profiling. Trishul Chilimbi, Ben Liblit, Krishna Mehra, Aditya Nori, and Kapil Vaswani. ICSE 2009.
> Darwin: An Approach for Debugging Evolving Programs. Dawei Qi, Abhik Roychoudhury, Zhenkai Liang, and Kapil Vaswani. FSE 2009.
> DebugAdvisor: A Recommender System for Debugging. B. Ashok, Joseph Joy, Hongkang Liang, Sriram Rajamani, Gopal Srinivasa, and Vipindeep Vangala. FSE 2009.

Page 40: Power Debugging

YOUR FEEDBACK IS IMPORTANT TO US!
Please fill out session evaluation forms online at MicrosoftPDC.com

Page 41: Power Debugging

Learn More On Channel 9
> Expand your PDC experience through Channel 9
> Explore videos, hands-on labs, sample code and demos through the new Channel 9 training courses

channel9.msdn.com/learn
Built by Developers for Developers…

Page 42: Power Debugging

© 2009 Microsoft Corporation. All rights reserved. Microsoft, Windows, Windows Vista and other product names are or may be registered trademarks and/or trademarks in the U.S. and/or other countries. The information herein is for informational purposes only and represents the current view of Microsoft Corporation as of the date of this presentation. Because Microsoft must respond to changing market conditions, it should not be interpreted to be a commitment on the part of Microsoft, and Microsoft cannot guarantee the accuracy of any information provided after the date of this presentation. MICROSOFT MAKES NO WARRANTIES, EXPRESS, IMPLIED OR STATUTORY, AS TO THE INFORMATION IN THIS PRESENTATION.
