
SERENA SOFTWARE

Serena Business Manager 11.1 vs. 11.0.1: Comparative Performance Test Results

Author: RT Tangri, 2016-08-11

Table of Contents

Who Should Read This Paper?
Test Methodology
Runtime Test Architecture
200 Virtual Users Load Test Scenario
80 Virtual Users Load Test Scenario
About the Data Set
Performance Test Results
Additional Information
Reference


Who Should Read This Paper?

This paper is intended for system administrators and others interested in comparing the performance test results of Serena Business Manager (SBM) 11.1 with SBM 11.0.1. It discusses the results of performance tests with 200 virtual users (scaled to 80 virtual users for the SBM Orchestration Engine) in the on-premise version of Serena Work Center. The performance tests were executed on Windows 2008 R2 and the following databases:

• SQL Server 2008 R2

• SQL Server 2014 SP1

• Oracle 11g R2

• Oracle 12c

Test Methodology

Testing was conducted in a private enterprise performance testing lab using HP LoadRunner 12.00. The tests measured application response time, throughput, and system resource usage under a load of 200 virtual users against a large enterprise data set, with a 10-second think time between transactions.

In this test, LoadRunner generated load by creating virtual users to model user activity: one unique virtual user was added every 5 seconds until the desired load level was reached. Each virtual user then performed scripted tasks and sent crafted HTTP requests. LoadRunner used a threading execution model to run multiple instances of unique virtual users concurrently, creating load and concurrency on the SBM application. During this test, the 200 virtual users iterated through the set of use cases (transactions) in the performance suite. The scripts and the data sets used in this testing can be provided for reference upon request.
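As a minimal illustration of this instrumentation, the sketch below shows one use case wrapped as a LoadRunner C Vuser "Transaction" with the 10-second think time used in these tests. The transaction name and URL are illustrative assumptions, not the actual suite scripts.

/* Sketch of one use case as a named LoadRunner Transaction. */
Action()
{
    lr_think_time(10);                    /* 10-second pause between use cases */

    /* Response time is measured per named Transaction. */
    lr_start_transaction("Submit-Form");

    /* Hypothetical Work Center URL; the real scripts replay recorded
       HTTP requests against the SBM Application Engine. */
    web_url("SubmitForm",
            "URL=http://sbm-server/workcenter/tmtrack.dll?SubmitPage",
            "Mode=HTML",
            LAST);

    lr_end_transaction("Submit-Form", LR_AUTO);

    return 0;
}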


Runtime Test Architecture

The performance test was performed in a distributed SBM installation using 5 separate 64-bit servers. The environment included a dedicated Serena License Manager version 2.2.0 installation with 80,000 seat licenses.

• Server 1 – SBM Application Engine Web Server (Win2008R2-IIS 7 or Win2012R2-IIS 8.5) with 8GB memory

• Server 2 – SBM Tomcat7-1: Single Sign-On (SSO), SBM Mail Services, SBM Orchestration Engine with 8GB total and 4GB memory allocated to Tomcat7

• Server 3 – SBM Common Services Tomcat7-2 (Smart Search, Resource Mgmt, Relationship Service, SSF Services, SLS, SBMProxy, AuthInjector) with 8GB total and 4GB memory allocated to Tomcat7

• Server 4 – SQL Server 2008 R2, SQL Server 2014 SP1, or Oracle 11g R2 database with 16GB of total memory

• Server 5 – SBM Logging Services (Mongod64) with 8GB of total memory

• SOAPUI 5.2.0 XML REST Service (http://10.248.50.1:8091/Resource) (8GB memory)

200 Virtual Users Load Test Scenario

Load tests simulated a number of common tasks performed at scale by 200 virtual users. Using a standard defect tracking application that enables a team to track product defects, the tests replicated the life cycle of defects through submittal, approval, and assignment. Notes and attachments were added to each item as it moved through the process. The Notification Server sent e-mails via SMTP to a test SMTP mail server repository. Shared views of activity, backlog, and calendar feeds, and shared dashboard views of multi-view reports, were run against the system.

Tests were performed by a set number of 200 unique virtual users over a 20-minute baseline. The think time between each use case (LoadRunner "Transaction") was 10 seconds, with exceptions for transactions from Serena Work Center: a 60-second think time before indexer transactions and a 90-second think time before notification transactions. The performance workflow consisted of the following actions (a skeleton of this script structure is sketched after the list):

• LoadRunner virtual users ran in multiple threads of execution

• 200 unique virtual users logged in via SSO to Serena Work Center in vuser_init

• Repeatedly iterated the (defect tracking application) use cases

• 200 unique virtual users logged off via SSO from Serena Work Center in vuser_end
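A hedged sketch of that script structure follows, assuming standard LoadRunner C Vuser sections; the SSO endpoint and form fields are assumptions, not the recorded scripts.

/* vuser_init: log in once via SSO before the iterations begin. */
vuser_init()
{
    lr_start_transaction("Web SSO Login");
    /* Hypothetical SSO form post; the real scripts replay the recorded
       SSO handshake for one of the 200 unique test users. */
    web_submit_data("login",
                    "Action=http://sso-server/idp/login",
                    "Method=POST",
                    ITEMDATA,
                    "Name=username", "Value={VuserName}", ENDITEM,
                    "Name=password", "Value={VuserPassword}", ENDITEM,
                    LAST);
    lr_end_transaction("Web SSO Login", LR_AUTO);
    return 0;
}

/* Action: one iteration of the defect tracking use cases listed
   below, each wrapped in its own named Transaction. */
Action()
{
    return 0;
}

/* vuser_end: log out once via SSO after the last iteration. */
vuser_end()
{
    lr_start_transaction("Web SSO Logout");
    web_url("logout", "URL=http://sso-server/idp/logout", LAST);
    lr_end_transaction("Web SSO Logout", LR_AUTO);
    return 0;
}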

SERENA WORK CENTER (SBM APPLICATION ENGINE) USE CASES

Serena Work Center use cases run during virtual user initialization against a large enterprise data set:

• Web SSO Login

• VUser Init Notification Remove All

• Settings Save Preferred Project

• Settings Save Items Per Page 40 And Notification Poll Interval 90 Second

• SWC Home Add Report Widget (Of Multi View Report #1)

• SWC Home Add Activity Widget (My Activity)

Serena Work Center use cases run during virtual user exit against a large enterprise data set:

• SWC Home Delete Report Widget (Of Multi View Report #1)

• SWC Home Delete Activity Widget (My Activity)

• VUser End Notification Remove All

• Web SSO Logout

Serena Work Center use cases iterated by each virtual user against a large enterprise data set:

• Pin Up Defect Tracking (Added in SBM 11.1)

• Select Defect Tracking App (Added in SBM 11.1)

• Submit-Form

• Submit-Form OK (Submit Item)

• Transition Owner In

• Defect Tracking Activities Side Bar

• Defect Tracking My Activity After Owner IN

• Defect Tracking Shared Activity Feed View After Owner IN

• Defect Tracking Calendars Side Bar

• Defect Tracking My Calendar

• Defect Tracking Shared Calendar Feed View After Owner IN

• Transition Owner Out

• Transition Owner Out Ok

• Transition Secondary Owner In

• My Activity After Secondary Owner In

• Shared Activity View All Items I Own Primary and Secondary

• Transition Secondary Owner Out

• Transition Secondary Owner Out Ok

• Transition REST GRID WIDGET (REST Grid Widget is inserted as a Regular Transition in AE Form)

• Transition REST GRID WIDGET OK (calls XML REST Mock Service to exercise AuthInjector component via SBMProxy in Common Services Tomcat 7)

• Transition-SyncTest (makes a synchronous AE Web Services call)

• AddNote


• Add Attachment In Note (27KB JPEG File From Disk)

• AddNote-Ok (Text with HTML5 Tags Includes 27KB JPG + 2KB Note)

• AddAttachment

• AddAttachment-OK (11KB TXT File From Disk)

• Email_Link

• Email_Link SearchUser

• Email_Link-OK (Message with HTML5 Tags)

• Transition Assign To CCB

• Transition Assign To CCB OK

• Transition Assign To Area Owner

• Transition Assign To Area Owner OK

• Social View Of BUG

• BUGID Lucene Text Search

• BUGStar Lucene Text Search

• BUGIDStar Lucene Text Search

• Attachment Lucene Text Search

• TitleDescription Lucene Text Search

• TitleDescriptionStar Lucene Text Search

• Submitter Lucene Text Search

• User Profile Card

• ContactCard AD Client Logging

• Add then Delete From Favorites

• PinUp Defect Tracking (Application) (Removed in SBM 11.1)

• Select Defect Tracking PinUp (Removed in SBM 11.1)

• Defect Tracking Manage Views (Removed in SBM 11.1)

• Defect Tracking Dashboards Side Bar (Added in SBM 11.1)

• Defect Tracking My Dashboard

• Defect Tracking Shared Dashboard View (Of Multi View Report #1)

• Defect Tracking Backlogs Side Bar

• Defect Tracking Shared Backlog Feed View

• Report Center (Removed in SBM 11.1)

• All Reports Tab (Added in SBM 11.1)

• CTOSDS Elapsed Time Duration

• CTOSDS Time in State Duration

• CTOSDS Average Time To State Duration

• CTOSDS Open And Completed Trend

• CTOSDS Entering A State Trend

• CTOSDS State Activity Trend

• Multi-View Report 1

• Static Listing For Performance EG

• Static Listing For Multi-View Performance

• All Issues By Project and State

• All Issues by Issue Type

• Listing Report

• All Open Issues

• My Open Issues


• All Issues by Project and Issue Type

• All Issues by State

• All Issues by Owner

• Notification UI View All

• Notification UI MarkRead and Remove N (For Each Notification Object)

• Each Iteration Notification Remove All

80 Virtual Users Load Test Scenario

The SBM Orchestration Engine load test simulated a repeated, scaling number of synchronous and asynchronous orchestration internal events initiated by virtual users as they submitted items into a Complex Load OE application.

During Submit Item, the orchestration was initiated as a transition that repeatedly updated items with orchestration results by making the following AE Web Service calls (a sketch of one such call appears after the list):

• AEWebServices71.GetItemsByQuery

• AEWebServices71.UpdateItem

• AEWebServices71.UpdateItemWithName
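These are ordinary SOAP calls to the Application Engine web services. The sketch below shows roughly what an UpdateItem request could look like if issued from a LoadRunner C script; the endpoint path, namespace, and envelope fields are assumptions about the aewebservices71 interface, and in the actual tests these calls were made server-side by the orchestration, not by the virtual users.

/* Illustrative only: sketch of an AEWebServices71.UpdateItem SOAP call.
   Endpoint, namespace, and field names are assumptions. */
Action()
{
    web_custom_request("AEWebServices71.UpdateItem",
        "URL=http://sbm-server/gsoap/gsoap_ssl.dll?aewebservices71",
        "Method=POST",
        "EncType=text/xml; charset=utf-8",
        "Body="
          "<soap:Envelope"
          " xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\""
          " xmlns:ae=\"urn:aewebservices71\">"
          "<soap:Body><ae:UpdateItem>"
          "<ae:auth><ae:userId>perfuser</ae:userId></ae:auth>"
          "<ae:item><ae:id>1000:42</ae:id></ae:item>"
          "</ae:UpdateItem></soap:Body></soap:Envelope>",
        LAST);
    return 0;
}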

Tests were performed by a set number of unique virtual users over a 20-minute baseline. The think time between virtual user clicks was 10 seconds.

To test scalability, a set of up to 80 unique virtual users logged in to the SBM User Workspace, submitted an item into the system to initiate the orchestrations, and then logged out of the system.

The performance workflow consisted of the following actions:

• LoadRunner virtual users ran in multiple threads of execution

• 80 unique virtual users logged in via SSO to Serena Work Center in vuser_init

• Repeatedly iterated the use cases (each is a LoadRunner "Transaction")

• 80 unique virtual users logged off via SSO from Serena Work Center in vuser_end

SERENA WORK CENTER (SBM APPLICATION ENGINE, ORCHESTRATION ENGINE) USE CASES

Serena Work Center use cases run during virtual user initialization against a large enterprise data set:

• Web SSO Login

• VUser Init Notification Remove All

• Select LoadOE App

• PinUp LoadOE App

Serena Work Center use cases run during virtual user exit against a large enterprise data set:

• VUser End Notification Remove All

• Web SSO Logout

Serena Work Center use cases iterated by each virtual user against a large enterprise data set:

• LoadOE Submit Form

• LoadOE Complex Submit OK (Initiates Synchronous and Asynchronous Orchestrations as Transition to Submit Form)


About the Data Set

The large enterprise data set that was used for testing is summarized below.

SBM Entity                                                        Count
Workflows (define the process that items follow)                     31
Projects (store process items, such as issues and incidents)      5,302
Users                                                            21,100
SBM groups                                                          138
Folders (store personal and group favorites)                    147,570
Issues (process items that follow a workflow)                 1,130,000
Contacts (similar to address book entries)                       20,053
Resources                                                        24,249
Attachments                                                      36,960

Note: In addition, the database included the following configuration settings:

• processes = 300 (newer versions of Application Engine require more sessions per user to power Work Center)

• READ_COMMITTED_SNAPSHOT = OFF


Performance Test Results

The following graphs summarize the results from the performance tests.

AVERAGE TRANSACTION RESPONSE TIME

The response times for individual transactions are summarized below. Note the following:

• SQL Server 2014 SP1 provides the best performance.

• No significant performance impact was observed when the Rich Text Editor was enabled for Memo/Note Fields in submit and transition forms.

[Graph: Average Transaction Response Time]

ORCHESTRATION ENGINE TRANSACTION RESPONSE TIME

The response times for individual transactions are summarized below. Note that Oracle 11g R2 provides the best performance for Load OE Complex Submit OK.

[Graph: Orchestration Engine Transaction Response Time]

TRANSACTION SUMMARY

The number of transactions iterated by 200 virtual users is summarized below. Note that with changes to Work Center in 11.1, more transactions were needed to accomplish the same UI task.

Comparing transactions iterated between SBM 11.1 and SBM 11.0.1:

• 5% fewer transactions iterated with SQL Server 2008 R2 in SBM 11.1

• 5% fewer transactions iterated with SQL Server 2014 SP1 in SBM 11.1

• 5% fewer transactions iterated with Oracle 11g R2 in SBM 11.1

[Graph: Transaction Summary]

ORCHESTRATION ENGINE TRANSACTION SUMMARY

The number of transactions iterated by 80 virtual users during the Complex Load OE baseline is summarized below.

Comparing transactions iterated between SBM 11.1 and SBM 11.0.1:

• 5% more transactions iterated with SQL Server 2008 R2 in SBM 11.1

• Slightly fewer transactions iterated with SQL Server 2014 SP1 in SBM 11.1

• 5% fewer transactions iterated with Oracle 11g R2 in SBM 11.1

[Graph: Orchestration Engine Transaction Summary]

WEB SERVER CPU USAGE

Web server CPU usage is summarized below. Note that, with 5% fewer transactions executed in SBM 11.1, the 10% improvement in web server CPU usage was the result of Application Engine improvements made in SBM 11.1.

[Graph: Web Server CPU Usage]

ORCHESTRATION ENGINE WEB SERVER CPU USAGE

Orchestration Engine web server CPU usage is summarized below. The improvement of at least 10% in web server CPU usage in SBM 11.1 was the result of Application Engine and ODE performance improvements made in SBM 11.1.

[Graph: Orchestration Engine Web Server CPU Usage]

WEB SERVER MEMORY USAGE

Web server memory usage is summarized below. Note that, with 5% fewer transactions executed in SBM 11.1, the 25% improvement in web server memory usage was the result of Application Engine performance improvements made in SBM 11.1.

[Graph: Web Server Memory Usage]

ORCHESTRATION ENGINE WEB SERVER MEMORY USAGE

Orchestration Engine web server memory usage is summarized below. The 25% improvement in web server memory usage was the result of Application Engine and ODE performance improvements made in SBM 11.1.

[Graph: Orchestration Engine Web Server Memory Usage]

WEB SERVER HANDLE COUNT

Web server handle count is summarized below. Note that, with 5% fewer transactions executed in SBM 11.1, the improvement in the web server handle count (especially with SQL Server 2014 and Oracle) was the result of Application Engine performance improvements made in SBM 11.1.

[Graph: Web Server Handle Count]

ORCHESTRATION ENGINE WEB SERVER HANDLE COUNT

The Orchestration Engine web server handle count is summarized below. The 10% improvement in the Orchestration Engine web server handle count in SBM 11.1 was the result of Application Engine and ODE performance improvements made in SBM 11.1.

[Graph: Orchestration Engine Web Server Handle Count]

WEB SERVER VIRTUAL BYTES

Web server virtual bytes are summarized below. Note that, with 5% fewer transactions executed in SBM 11.1, the 10% improvement in web server virtual bytes was the result of Application Engine performance improvements made in SBM 11.1.

[Graph: Web Server Virtual Bytes]

COMMON SERVICES CPU USAGE

Common Services Tomcat 7 CPU usage is summarized below. Note that Oracle 11g R2 provides the best performance.

The 20% higher Common Services CPU usage in SBM 11.1 was the result of Tomcat 7 Ehcache and Lucene SBM Search cache changes.

The spike in CPU usage at the beginning of the 200 virtual user baseline was the result of the Common Services caches being populated.

[Graph: Common Services CPU Usage]

COMMON SERVICES MEMORY USAGE

Common Services Tomcat 7 memory usage is summarized below. Note that Oracle 11g R2 provides the best performance.

The higher Common Services memory usage was the result of Tomcat 7 Ehcache and Lucene SBM Search cache changes.

[Graph: Common Services Memory Usage]

NOTIFICATION SERVER CPU USAGE

Notification Server CPU usage is summarized below. Note that SQL Server 2014 SP1 provides the best performance.

The 25% improvement in Notification Server CPU usage was the result of Notification Server performance improvements made in SBM 11.1.

The spike in CPU usage at the beginning of the 200 virtual user baseline was the result of the Notification Server caches being populated.

[Graph: Notification Server CPU Usage]

ORCHESTRATION ENGINE ODE CPU USAGE

ODE Tomcat 7 CPU usage is summarized below. Note that SQL Server 2014 SP1 provides the best performance.

[Graph: Orchestration Engine ODE CPU Usage]

NOTIFICATION SERVER MEMORY USAGE

Notification Server Tomcat 7 memory usage is summarized below. Note that Oracle 11g R2 provides the best performance.

The improvement of at least 10% in Notification Server memory usage was the result of Notification Server performance improvements made in SBM 11.1 and the Concurrent Mark Sweep garbage collection policy.

[Graph: Notification Server Memory Usage]

ORCHESTRATION ENGINE ODE MEMORY USAGE

ODE Tomcat 7 memory usage is summarized below. Note that SQL Server 2014 SP1 provides the best performance.

The improvement of at least 25% in ODE memory usage was the result of ODE performance improvements made in SBM 11.1 and the Concurrent Mark Sweep garbage collection policy.

[Graph: Orchestration Engine ODE Memory Usage]

DATABASE SERVER CPU USAGE

Database Server CPU usage is summarized below. Note that SQL Server 2008 R2 provides the best performance.

The lower database CPU usage for the 200 virtual user baseline was the result of the performance improvements made in SBM 11.1.

[Graph: Database Server CPU Usage]

ORCHESTRATION ENGINE DATABASE SERVER CPU USAGE

Orchestration Engine database server CPU usage is summarized below. SQL Server 2014 SP1 provides the best performance.

The improvement of at least 10% in the Orchestration Engine SQL Server or Oracle CPU usage was the result of Application Engine and ODE performance improvements made in SBM 11.1.

[Graph: Orchestration Engine Database Server CPU Usage]

DATABASE SERVER MEMORY USAGE

Database server memory usage is summarized below. Note that SQL Server 2014 SP1 provides the best performance.

The lower database server memory usage for the 200 virtual user baseline was the result of the performance improvements made in SBM 11.1.

Note: The SQL Server database server memory usage was reported low on Windows because SQL Server breaks down memory usage across Windows components.

[Graph: Database Server Memory Usage]

ORCHESTRATION ENGINE DATABASE SERVER MEMORY USAGE

Database server memory usage is summarized below. Note that SQL Server 2008 R2 provides the best performance.

The lower database server memory usage for the 80 virtual user baseline was the result of the performance improvements made in SBM 11.1.

Note: The SQL Server database server memory usage was reported low on Windows because SQL Server breaks down memory usage across Windows components.

[Graph: Orchestration Engine Database Server Memory Usage]

MONGOD64 ON SEPARATE SERVER

Screenshots for Mongod64 CPU usage and Mongod64 memory usage are not included. At the INFO logging level, Mongod64 CPU usage is around 1% and Mongod64 memory usage is up to 700 MB for both SBM 11.1 and SBM 11.0.1 for the 200 virtual user (SBM) and 80 virtual user (Orchestration Engine) performance baselines.

TOMCAT 7 ODE EVENT PROCESSING PERFORMANCE

• In SBM 11.1 with SQL Server 2008 R2, 11,553 events were processed by ODE with SUCCESS status within 30 minutes, 44 seconds.


• In SBM 11.0.1 with SQL Server 2008 R2, 11,494 events were processed by ODE with SUCCESS status within 30 minutes, 40 seconds.

• In SBM 11.1 with SQL Server 2014 SP1, 11,498 events were processed by ODE with SUCCESS status within 30 minutes, 52 seconds.

• In SBM 11.0.1 with SQL Server 2014 SP1, 11,524 events were processed by ODE with SUCCESS status within 30 minutes, 55 seconds.

• In SBM 11.1 with Oracle, 11,541 events were processed by ODE with SUCCESS status within 28 minutes, 33 seconds.

• In SBM 11.0.1 with Oracle, 11,522 events were processed by ODE with SUCCESS status within 30 minutes, 38 seconds.
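To make these runs easier to compare, the counts above can be normalized to events per second. A small sketch of that arithmetic, using only the figures reported in this list:

#include <stdio.h>

/* Normalize the reported ODE SUCCESS event counts to events/second. */
int main(void)
{
    struct run { const char *label; int events, min, sec; };
    const struct run runs[] = {
        { "SBM 11.1,   SQL Server 2008 R2 ", 11553, 30, 44 },
        { "SBM 11.0.1, SQL Server 2008 R2 ", 11494, 30, 40 },
        { "SBM 11.1,   SQL Server 2014 SP1", 11498, 30, 52 },
        { "SBM 11.0.1, SQL Server 2014 SP1", 11524, 30, 55 },
        { "SBM 11.1,   Oracle             ", 11541, 28, 33 },
        { "SBM 11.0.1, Oracle             ", 11522, 30, 38 },
    };
    for (unsigned i = 0; i < sizeof runs / sizeof runs[0]; i++) {
        double secs = runs[i].min * 60 + runs[i].sec;    /* elapsed time */
        printf("%s  %.2f events/s\n", runs[i].label, runs[i].events / secs);
    }
    return 0;  /* e.g. SBM 11.1 on Oracle: 11541 / 1713 s, about 6.7 events/s */
}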

RELATIONSHIP SERVICE PERFORMANCE

Results for one Neo4J instance (not clustered) with a Tomcat 7 3GB heap, a Neo4J 1GB heap, and the default 10 worker threads are summarized below.

• Neo4J Java CPU% peak was 25% during the initial load (the most CPU-intensive period).

• Initial load totaled 1.2 million records (ExportToCSV + ImportToGraphDB), which took 108 minutes to finish.

• Replication/synchronization of new records in the TS_CHANGEACTIONS table was processed at a rate of 2,400 records per minute.

• For 1,137,446 records (and 14,118,424 relationships), Neo4J database disk usage was 3.9GB.

SMART SEARCH INDEXER PERFORMANCE

Results for the Smart Search indexer (Lucene), with a 4 GB Common Services Java heap, are summarized below.

• Initial index or re-index operations are now multi-threaded: 6 worker threads of execution quickly processed all primary tables sequentially.

• After the initial index or re-index finished, the indexer periodically polled the database every 30 seconds (by default) to update the index with new records from the TS_CHANGEACTIONS table, which were then searchable via Serena Work Center (see the sketch after this list).

• Initial index or re-index of 1,102,276 items (with an associated 24,249 resources and 36,960 attachments) in the TTT_ISSUES table took no more than 39 minutes to complete (the indexdir folder size was 230MB).

• The indexer finished the update of 46,357 change items in 17 minutes and 45 seconds.
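The incremental-indexing behavior described above follows a simple poll-and-index pattern. A minimal sketch of that pattern is shown below; it is illustrative only, and the stub functions are hypothetical stand-ins for the database query and the Lucene update, not the Smart Search implementation.

#include <stdio.h>
#include <unistd.h>

/* Hypothetical stubs for the database query and the Lucene update. */
static int fetch_new_change_actions(void) { return 0; }
static void index_change_actions(int n) { printf("indexed %d rows\n", n); }

int main(void)
{
    const unsigned poll_interval = 30;   /* seconds; the default noted above */

    for (;;) {
        /* New rows in TS_CHANGEACTIONS since the last poll. */
        int n = fetch_new_change_actions();
        if (n > 0)
            index_change_actions(n);     /* make the new items searchable */
        sleep(poll_interval);
    }
}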

Additional Information

Note the following performance testing information.

CONFIGURATION NOTES

• For details on preventing file locks by anti-virus software on the Tomcat 7 server file system, see D22079 in the Serena knowledge base.

• You can improve Oracle 12c back-end performance by:

▪ Defining the correct cache size for the Automatic Memory Manager

▪ Enabling Automatic Storage Management by defining a file system and volume manager optimized for Oracle database files

▪ Enabling Oracle Data Partitioning

• If TRACE level logging in Active Diagnostics is enabled for debugging SBM components on a 64-bit system with 8GB RAM, the mongod64 process will compete for memory with SBM Common Services and prevent Tomcat from using the 4GB heap that is allocated to it.


• The default performance settings in SBM Configurator are used for load tests:

▪ Notification Server ForceSend = OFF

▪ Throttling of Orchestrations = OFF

• On the server that hosts SBM Common Services, the indexer settings are changed for load tests in SBM Configurator:

▪ Expected Search Usage = High

▪ Database Polling Interval = 30 seconds

• Serena recommends that you optimize table statistics for large SBM tables with an Oracle database (not required for SQL Server):

exec DBMS_STATS.GATHER_TABLE_STATS (ownname => 'perf', tabname => 'TS_PROJECTANCESTRYDENORM', cascade => true, estimate_percent => 100, granularity => 'ALL', degree => 8);
exec DBMS_STATS.GATHER_TABLE_STATS (ownname => 'perf', tabname => 'TS_TIMEINSTATE', cascade => true, estimate_percent => 100, granularity => 'ALL', degree => 8);
exec DBMS_STATS.GATHER_TABLE_STATS (ownname => 'perf', tabname => 'TS_TIMEZONEGMTOFFSETDENORM', cascade => true, estimate_percent => 100, granularity => 'ALL', degree => 8);
exec DBMS_STATS.GATHER_TABLE_STATS (ownname => 'perf', tabname => 'TS_CALENDARS', cascade => true, estimate_percent => 100, granularity => 'ALL', degree => 8);
exec DBMS_STATS.GATHER_TABLE_STATS (ownname => 'perf', tabname => 'TS_STATES', cascade => true, estimate_percent => 100, granularity => 'ALL', degree => 8);
exec DBMS_STATS.GATHER_TABLE_STATS (ownname => 'perf', tabname => 'TS_CHANGEACTIONS', cascade => true, estimate_percent => 10, granularity => 'ALL', degree => 8);
exec DBMS_STATS.GATHER_TABLE_STATS (ownname => 'perf', tabname => 'TTT_ISSUES', cascade => true, estimate_percent => 10, granularity => 'ALL', degree => 8);

• Tips for configuring the Oracle ODBC client connection to the Oracle server:

▪ The number of sessions used also depends on the type of ODBC connection Application Engine makes as a client to the Oracle server. An ODBC connection as Client10 (SSL) or via Oracle Advanced Security requires an increase in processes/sessions on the Oracle server.

▪ To avoid "server refused the connection" errors in the Application Engine Event Viewer and JDBC connection exceptions in the Tomcat 7 server.log in the performance lab at 200vu load against Work Center, increasing the number of processes to 300 and sessions to 400 in Oracle is sufficient:

sqlplus system/testauto@hpq5
SQL> alter system set processes=300 scope=spfile;

Verify the change:

SQL> show parameters processes;
SQL> show parameters sessions;

SBM 11.1 PERFORMANCE DEFECTS

The following defects have been submitted against the GA build of SBM 11.1:

• DEF286744 – SWC PERF 200vu: Submitter Lucene Keyword Search Performance Has Regressed In SBM 11.1

PERFORMANCE TEST EXCLUSIONS

The following items were excluded from performance testing.

• System report transactions were not included in the 200vu load test.

• New SBM 11.1 feature transactions were not included in this comparison.

Reference

ABOUT SERENA SOFTWARE

Serena Software provides orchestrated application development and release management solutions to the Global 2000. Our 2,500 active enterprise customers, including a majority of the Fortune 100, have made Serena the largest independent Application Lifecycle Management (ALM) vendor and the only one that orchestrates DevOps, the processes that bring together application development and operations.

Headquartered in Silicon Valley, Serena serves enterprise customers across the globe. Serena is a portfolio company of HGGC, a leading middle-market private equity firm.

CONTACT

Web site: http://www.serena.com/company/contact-us.html

Copyright © 2016 Serena Software, Inc. All rights reserved. Serena, TeamTrack, StarTool, PVCS, Comparex, Dimensions, Prototype Composer, Mariner, and ChangeMan are registered trademarks of Serena Software, Inc. The Serena logo and Version Manager are trademarks of Serena Software, Inc. All other products or company names are used for identification purposes only, and may be trademarks of their respective owners.
