Adobe’s customers now routinely performance-test applications built on LiveCycle ES in their TEST and STAGING environments. However, test results are not always reported consistently.
Most tests are driven by a load-driver tool such as HP LoadRunner, Borland SilkPerformer, Apache JMeter or The Grinder. Of these, JMeter and The Grinder are free, but lack some capabilities of their commercial peers, such as simultaneously collecting server-side metrics (CPU, disk, memory and network bandwidth utilization). All of these tools report the elapsed time of each transaction, usually in seconds. From the number of such data points in a time interval (usually an hour), you can derive the transaction throughput and report it in transactions per hour.
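As a sketch of that throughput calculation, the snippet below (Python, with hypothetical transaction start times expressed as seconds since the start of the run) buckets transactions into hourly intervals and counts them:

```python
# Derive hourly throughput from per-transaction start times.
# `start_times` holds hypothetical sample data, in seconds from test start.
from collections import Counter

start_times = [10.0, 400.5, 3700.2, 3900.8, 7300.1]

# Bucket each transaction into the hour in which it started.
per_hour = Counter(int(t // 3600) for t in start_times)

for hour in sorted(per_hour):
    print(f"hour {hour}: {per_hour[hour]} transactions/hour")
```

With real load-driver output you would read the timestamps from the tool's results file instead of a hard-coded list.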
Importing this data into Microsoft Excel, OpenOffice.org Calc or Google Docs lets you analyze and chart the results. In Microsoft Excel 2007, you will need to install the “Analysis ToolPak” add-in.
For more details on LiveCycle performance testing methodology, see this DevNet article.
The following are the results of a twelve-hour, one-virtual-user test conducted using the eTech LiveCycle Benchmark, a short-lived orchestration that Adobe’s eTech team uses for benchmarking LiveCycle performance. Because LiveCycle is a J2EE application, tests shorter than two hours are essentially invalid: most JVMs need up to 30 minutes just to reach steady state.
1) Create a Scatter Plot and Eyeball the Results
In Excel, highlight the column containing your elapsed time numbers and choose the menu option Insert->Scatter.
The scatter plot above shows that, apart from some outliers (exceedingly high elapsed times) at the very beginning of the test, transactions during the 12-hour test consistently experienced elapsed times between 36 and 39 seconds.
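If you prefer scripting over Excel, a roughly equivalent scatter plot can be produced with matplotlib. This is a minimal sketch; the sample data and the output file name are hypothetical:

```python
# Render an elapsed-time scatter plot without a display (Agg backend).
import matplotlib
matplotlib.use("Agg")  # must be set before importing pyplot
import matplotlib.pyplot as plt

elapsed = [87.8, 44.0, 37.2, 37.0, 36.9, 37.4, 36.8, 37.1]  # hypothetical

plt.scatter(range(len(elapsed)), elapsed, s=8)
plt.xlabel("Transaction number")
plt.ylabel("Elapsed time (s)")
plt.title("Elapsed time per transaction")
plt.savefig("elapsed_times.png")
```

In practice you would load the full elapsed-time column exported from your load driver rather than a hard-coded list.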
2) Generate Descriptive Statistics
Choose the menu option Data->Data Analysis->Descriptive Statistics. The Input Range should be the column containing your elapsed time numbers. Make sure that the checkboxes for ‘Summary Statistics’ and ‘Confidence Level for Mean’ are checked.
Mean : 37.3269204
Standard Error : 0.061905612
Median : 37.041
Standard Deviation : 2.149827787
Sample Variance : 4.621759516
Kurtosis : 352.1330688
Skewness : 17.45734641
Range : 51.27
Minimum : 36.544
Maximum : 87.814
Sum : 45016.266
Count : 1206
Largest(1) : 87.814
Smallest(1) : 36.544
Confidence Level(95.0%) : 0.12145476
In the above result, the important metrics are Mean, Median, Minimum, Maximum, Count and Confidence Level (95.0%). The confidence level tells me that if I were to repeat the same test under identical circumstances, the interval from (Mean – Confidence Level) to (Mean + Confidence Level), i.e. 37.21 to 37.45 seconds, can be expected to contain the true mean elapsed time with 95% confidence. Note that this is a statement about the mean, not about individual transactions.
If you are comparing the results of two or more test runs, the Mean, Median and Standard Deviation are all useful. If two runs have mean elapsed times that are very close, the Standard Deviation (SD) can break the tie: the run with the lower SD delivered more consistent response times and is therefore better. Likewise, a lower Median is better.
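As a cross-check outside Excel, the key summary statistics above (Mean, Median, SD, Standard Error and the 95% confidence half-width) can be reproduced with Python's standard library. The sample data here is hypothetical, and 1.96 is the large-sample normal approximation to the t critical value that Excel's Descriptive Statistics uses:

```python
# Reproduce the key descriptive statistics with the stdlib only.
import statistics

elapsed = [37.0, 37.4, 36.9, 38.1, 37.2, 36.8, 39.0, 37.5, 37.1, 36.6]

n = len(elapsed)
mean = statistics.mean(elapsed)
median = statistics.median(elapsed)
sd = statistics.stdev(elapsed)   # sample standard deviation
sem = sd / n ** 0.5              # standard error of the mean

# 95% confidence half-width (normal approximation; adequate for
# large samples such as the 1206-point run above).
half_width = 1.96 * sem

print(f"mean={mean:.3f} median={median:.3f} sd={sd:.3f}")
print(f"95% CI for the mean: {mean - half_width:.3f} .. {mean + half_width:.3f}")
```

For small samples, the exact t critical value (Excel's CONFIDENCE.T, or scipy.stats.t.ppf) should be used instead of 1.96.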
3) Calculate Percentiles
Choose the menu option Data->Data Analysis->Rank and Percentile. The Input Range should be the column containing your elapsed time numbers.
Point : Elapsed Time : Rank : Percent
59 : 44.002 : 7 : 99.50%
1000 : 42.833 : 8 : 99.40%
1001 : 42.049 : 9 : 99.30%
6 : 41.922 : 10 : 99.20%
63 : 41.433 : 11 : 99.10%
4 : 40.289 : 12 : 99.00%
58 : 39.75 : 13 : 99.00%
9 : 39.52 : 14 : 98.90%
60 : 39.18 : 15 : 98.80%
5 : 39.161 : 16 : 98.70%
The above result (snipped) tells me that 99% of the transactions in the test experienced an elapsed time of 39.75 seconds or less. This is a key metric if you have to satisfy Service Level Agreements (SLAs) with the business unit that uses the application. If this number is above the SLA, you may need to add members to the cluster or run LiveCycle on more capable hardware.
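The same SLA check can be scripted. The sketch below uses the simple nearest-rank percentile definition, which can differ slightly from Excel's Rank and Percentile output; the sample data is hypothetical:

```python
# Nearest-rank percentile: the smallest value such that at least p% of
# observations are at or below it.
import math

elapsed = [36.5, 36.8, 37.0, 37.1, 37.3, 37.5, 38.0, 39.2, 41.0, 88.0]

def percentile_nearest_rank(data, p):
    """Return the p-th percentile of `data` by the nearest-rank method."""
    ordered = sorted(data)
    rank = math.ceil(p / 100 * len(ordered))  # 1-based nearest rank
    return ordered[rank - 1]

p99 = percentile_nearest_rank(elapsed, 99)
print(f"99th percentile elapsed time: {p99} s")
```

Note how a single outlier dominates the 99th percentile of a small sample; with 1,200+ data points, as in the test above, the high percentiles are far more stable.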