Thursday, December 01, 2011

Steps For Performance Testing





Tasks


1. Develop the performance script (record in VU).
2. Customize the performance script and test it with 1 user and 10 iterations.
3. Deploy the performance script on the specified box.
4. Run the performance test as per the scenarios.
5. Generate reports from the Rational tool.
6. Download the reports via PC Anywhere FTP.
7. Create an Excel spreadsheet (report summary).
8. Help developers identify performance issues during the development stage by conducting various tests.

Steps 3 and 8 are done at the US end by Sateesh Balamanickam

 

Script Recording and Customization procedure

 

Tasks 1 and 2


Steps


1.             Run Rational Robot version 2001a.04.00.
2.             Select the UNC project in the Repository and provide the user name and password.
3.             Record the script in VU mode only.
Prerequisites for recording:
a.     The click sequence for the site should be available, with specifications.
b.    View the site manually and verify that there are no broken or missing links or slow-loading pages.
4.             Session naming convention for the VU scripts:
a.     Site Name-version no-ddmmyyyy
5.             Script naming convention at the end of the script recording:
a.     Site Name-## (sequence number)
6.             Also make sure that Generate Script is successful.
7.             The Session Recording options are under Tools -> Session recording options…
8.             If the recording is not successful, split the script into multiple small scripts with minimal functionality each, together covering the full click sequence provided.
9.             While recording, use the Block option and name each block with # plus the block name. (The # will be customized later.)
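The naming conventions in steps 4 and 5 can be sketched as small helpers; the site name, version and date below are placeholder values, not names from the project:

```python
from datetime import date

def session_name(site: str, version: str, day: date) -> str:
    # Session naming convention: Site Name-version no-ddmmyyyy
    return f"{site}-{version}-{day.strftime('%d%m%Y')}"

def script_name(site: str, sequence: int) -> str:
    # Script naming convention: Site Name-## (two-digit sequence number)
    return f"{site}-{sequence:02d}"

print(session_name("MySite", "v1", date(2011, 12, 1)))  # MySite-v1-01122011
print(script_name("MySite", 3))                         # MySite-03
```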

Procedure for the Script customization


Steps


1.             Verify that the SgenRes00# variables have been reused, i.e. there should be no hanging variables. If any are found, delete them and recompile the script to verify the change.
2.             Change the Start_time and Stop_time blocks to a uniform number sequence: replace # with 01 for the first block, 02 for the second, and so on, keeping the same pattern through to the end of the script.
3.             Compile and verify after any major customization change; if an error is found, rectify it before saving the customization.
4.             Customize the transaction names from Home~ to the detailed names. This is done at http_header_recv and http_nrecv.
5.             This makes the reports generated by Rational easier to read.
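Step 2 above (replacing the # placeholder in each Start_time/Stop_time pair with a uniform two-digit sequence) can be sketched roughly like this; the VU block syntax in the demo string is an assumption for illustration only:

```python
def renumber_blocks(script: str) -> str:
    # Replace '#' in each start_time/stop_time pair with a uniform
    # two-digit sequence: 01 for the first block, 02 for the second, ...
    n = 0
    out = []
    for line in script.splitlines():
        if "start_time" in line and "#" in line:
            n += 1  # a new block begins
        if ("start_time" in line or "stop_time" in line) and "#" in line:
            line = line.replace("#", f"{n:02d}")
        out.append(line)
    return "\n".join(out)

# Demo input: hypothetical VU block markers, not a real recorded script.
demo = 'start_time ["Login#"];\nstop_time ["Login#"];\nstart_time ["Search#"];\nstop_time ["Search#"];'
print(renumber_blocks(demo))
```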



Other system requirements: to run 50 virtual users requires 50 * 8 MB of RAM plus 66 MB for the OS (466 MB in total).
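As a quick arithmetic check, the memory requirement above works out as follows:

```python
def ram_required_mb(virtual_users: int) -> int:
    # Each virtual user needs roughly 8 MB of RAM, plus 66 MB for the OS.
    return virtual_users * 8 + 66

print(ram_required_mb(50))  # 466 MB for 50 virtual users
```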

Suite Settings and Runtime  settings

Option settings for the VU script are in the TSS Environment Variables. This dialog comes up when you select Suite -> Edit Settings and then double-click the TSS Environment box.

On the next tab, Average Think Time and Think Maximum should be changed.

On the TSS tab, the listed options should be checked.

On the next screen, also set the timeout.
On the remaining tabs, the options stay unchanged.

The Computer settings screen appears only when multiple computers are selected to run the Agent; Computer settings are then enabled.

These are the run-time settings for the scripts. When a script is run, a dialog pops up in which the Log options and Resource monitoring settings should be changed.
The log naming convention is nn VU # 05 Runs 01, which represents nn virtual users with 5 iterations, running for the first time.
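A small helper for the log naming convention; the values shown are examples:

```python
def log_name(virtual_users: int, iterations: int, run: int) -> str:
    # e.g. 25 virtual users, 5 iterations, first run -> "25 VU 05 Runs 01"
    return f"{virtual_users} VU {iterations:02d} Runs {run:02d}"

print(log_name(25, 5, 1))  # 25 VU 05 Runs 01
```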

For Multiple Scenarios


Sequential has to be checked, not Balanced.


Example scenario: 125 users over 1 hour - 75 AE, 25 Manager, 20 CSR and 5 Admin.
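A quick sanity check that the per-role user counts add up to the scenario total (here 75 + 25 + 20 + 5 = 125):

```python
def check_mix(total_users: int, mix: dict) -> bool:
    # The per-role counts must sum to the total users for the scenario.
    return sum(mix.values()) == total_users

scenario = {"AE": 75, "Manager": 25, "CSR": 20, "Admin": 5}
print(check_mix(125, scenario))  # True
```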

Setting for the Report Generation


Reports Used
  1. Command Status
  2. Response Vs Time
  3. Performance Report

Set the Response type to Timer and the Percentiles to 25, 50, 75, 90, 95 and MAX.
Save the reports using the standard naming convention in the target directory.
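The percentile columns used in the reports (25, 50, 75, 90, 95 and MAX) can be reproduced with a simple nearest-rank percentile; this is a sketch for checking figures by hand, not the tool's exact algorithm:

```python
import math

def percentile(samples, pct):
    # Nearest-rank percentile; pct=100 returns MAX.
    ordered = sorted(samples)
    if pct >= 100:
        return ordered[-1]
    rank = max(1, math.ceil(pct / 100 * len(ordered)))
    return ordered[rank - 1]

times = list(range(1, 101))  # toy response times: 1..100 ms
print([percentile(times, p) for p in (25, 50, 75, 90, 95, 100)])
```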

Steps for Report Generation


  1. Copy the standard template to the required name, using the standard naming convention: Name of application + " Stress Test" + Date.
  2. Prerequisites:
    1. The report is generated from the Performance, Command Status and Response vs. Time reports, in all combinations.
    2. The user specifications and pass criteria must be provided.

User Specification: the Threshold Time set by the application owner.
Rational Delays: the time delays that Rational uses for a particular command.
Specification: the sum of the Threshold Time and the Rational Delay Time.
Threshold Time standards:
Static or light-content pages - 4 secs
Pages that pull data from the database - 9 secs
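The Specification column is then just the threshold plus the Rational delay; the 1.5-second delay below is a hypothetical value for illustration:

```python
THRESHOLD_SECS = {"static": 4, "database": 9}  # from the standards above

def specification(page_type: str, rational_delay: float) -> float:
    # Specification = Threshold Time + Rational Delay Time.
    return THRESHOLD_SECS[page_type] + rational_delay

print(specification("static", 1.5))    # 5.5
print(specification("database", 1.5))  # 10.5
```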

  1. Application Details Tab: This is a summary of all the other tabs in the report.
1.     Name is the name of the application.
2.     Measured Transactions is the total number of command names.
3.     Total Users is the number of users; it varies depending on the application.
4.     Concurrent Users is the last number of virtual users.
5.     Anticipated Peak Usage is 15% of the total virtual users.
6.     User Specification has to be taken from the click streams.
7.     Test Number is the number of tests done on the application.
8.     Virtual Users is the number of users connected to the application.
9.     Runs is the number of iterations the script has been executed for.
10.  Start Time and End Time are available from the test log files.
11.  Availability Yield comes from the User Results page.
12.  Performance Yield comes from the Performance Results page.
13.  Pass/Fail: =IF(H35>'Application Details'!$C$19,"Pass","Fail")
If the Performance Yield (column H) is greater than the Pass Criteria (cell C19), the result is Pass; otherwise it is Fail.
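The spreadsheet formula in item 13 reduces to this check (the cell references carry the yield and the criteria; the numbers below are examples):

```python
def pass_fail(performance_yield: float, pass_criteria: float) -> str:
    # Mirrors =IF(H35>'Application Details'!$C$19,"Pass","Fail")
    return "Pass" if performance_yield > pass_criteria else "Fail"

print(pass_fail(0.97, 0.95))  # Pass
print(pass_fail(0.90, 0.95))  # Fail
```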

  1. User Results Tab:
1.     Conclusion is derived from the overall status.
2.     Availability Yield
a.     Commands is populated from the Command Status Report (bottom line).
b.     Successful is populated from the Command Status Report (bottom line).
c.     Availability Yield is Successful/Commands.
d.    Pass/Fail is decided as per the criteria in the Application Details tab (C18).
3.     Performance Yield
a.     Command Names are the unique transaction names as customized in the script.
b.    Specification is the same as in the application details.
c.     Min, 25%, Median, 75%, Max and Mean come from the Command Status Report.
d.    Runs and Passed (Met Spec) come from the pivot table on the Performance Results tab.
e.     Yield is Passed/Runs.
f.     Pass/Fail is decided as per the criteria in the Application Details tab (C19).
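The two yield calculations on this tab reduce to simple ratios; the counts below are made-up examples:

```python
def availability_yield(successful: int, commands: int) -> float:
    # Availability Yield = Successful / Commands (Command Status Report).
    return successful / commands

def performance_yield(passed: int, runs: int) -> float:
    # Yield = Passed (Met Spec) / Runs (Performance Results pivot table).
    return passed / runs

print(availability_yield(495, 500))  # 0.99
print(performance_yield(47, 50))     # 0.94
```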


  1. Performance Results:
  1. Transaction, Time Stamp and Resp are copied from the Response vs. Time Report.
  2. Resp (sec) is Resp (milliseconds)/1000.
  3. Spec Value is as per the application details.
  4. If a transaction met its specification target, Met Spec is set to 1; otherwise it is set to 0.
  5. Performance Summary
    1. It is a pivot table generated from the performance data (columns B - G).
Generating the pivot table:
a)     Right-click on the table and select Wizard.
b)    The PivotTable Wizard pop-up window comes up. Click Back so that it goes to step 2 of 3.
c)     Select the range, click Next and then Finish.
d)    Right-click and select Refresh Data.
e)     The grand total should match the total number of rows.
            No cell in Met Spec should be N/A.
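The Met Spec column and the row-count consistency check can be sketched as follows; the transactions and response times are made up:

```python
def met_spec(resp_ms: float, spec_secs: float) -> int:
    # Resp (sec) = Resp (ms) / 1000; 1 if the response met its spec, else 0.
    return 1 if resp_ms / 1000 <= spec_secs else 0

# (transaction, response in ms, spec in seconds)
rows = [("Login", 3500, 4), ("Search", 9500, 9), ("Home", 2000, 4)]
flags = [met_spec(ms, spec) for _, ms, spec in rows]
# The pivot table's grand total must equal len(rows), and no flag is N/A.
print(flags, sum(flags), len(rows))
```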
