Recording Load Tests
With the universally usable HTTP(S) Proxy Recorder, test scenarios can be recorded using any web browser – Internet Explorer, Firefox, or Safari – and can include AJAX calls and requests made from pop-up windows.
Data traffic for HTTP(S)-based web service client programs can also be recorded.
Simple test scenarios can be created automatically with an Integrated Web Crawler which discovers all the pages comprising a website. Test scenarios can also be created manually by importing pre-defined URL calls from a Self-Written Definition File.
Universal HTTP/S Proxy-Recorder
Note: when using Firefox, an Add-On named "Firefox Recording Extension" makes load test scenario recording even more convenient. The Add-On is displayed in the Firefox toolbar at the top of the web browser window.
Supported Authentication Methods
Apica ProxySniffer supports the following authentication methods during recording and load test execution:

- HTML Form-Based Login
- HTTP Basic Authentication
- HTTP Digest Authentication
- NTLM
- Kerberos
- SSL Authentication using X509 Client Certificates in PKCS#12 format
- SSL Authentication using Smart Cards and HSMs (PKCS#11 security devices)
Automatic Protection from False Positive Measurement Results
To prevent false conclusions from inaccurately executed measurements – such as checking only the HTTP response codes of URL calls – Apica ProxySniffer provides a protective mechanism against "false positive" measurement results. When a test is recorded, this function automatically analyses all URL response data and configures the most useful text-search fragments, which are then used to verify the content of the recorded web pages during load test execution:
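The idea behind this protection can be illustrated with a minimal sketch. The class and method below are illustrative only (they are not ProxySniffer code): they show why verifying response content, and not just the HTTP status code, catches "false positive" results such as a 200 response that actually carries an error page.

```java
// Minimal sketch of content-based response verification (illustrative only,
// not ProxySniffer internals): a response counts as valid only if the HTTP
// status is OK *and* the body contains an expected text fragment.
public class ContentCheck {

    /** Returns true only if the status is 200 and the body contains the fragment. */
    static boolean responseIsValid(int httpStatus, String body, String expectedFragment) {
        if (httpStatus != 200) {
            return false;                       // transport-level failure
        }
        return body.contains(expectedFragment); // content-level verification
    }

    public static void main(String[] args) {
        // A 200 response that actually carries an error page is caught:
        String errorPage = "<html><body>Internal error occurred</body></html>";
        System.out.println(responseIsValid(200, errorPage, "Welcome to the shop")); // false

        String goodPage = "<html><body>Welcome to the shop</body></html>";
        System.out.println(responseIsValid(200, goodPage, "Welcome to the shop"));  // true
    }
}
```

Checking only the status code would have accepted both responses; the text-search fragment is what distinguishes them.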
Integrated Web Crawler
Self-Written Definition Files
Self-Written Definition Files are often used to create load tests for Web Services, for example if no client program is available for recording.
Self-Written Definition File:

-defaultURL http://22.214.171.124
-defaultRequestContentDirectory "D:\XmlRequestData"
-defaultRequestContentType "text/xml"

POST /putDataDo?action=addAddress 200
-requestContentFile requestData1.xml
-responseContentType "text/xml"

POST /putDataDo?action=getAddressList 200
-requestContentFile requestData2.xml
-responseContentType "text/xml"
The values of CGI and HTML form parameters can be extracted and assigned easily by clicking the corresponding extract and assign icons.
CGI and HTML Form Parameters and XML, SOAP, and JSON Data
A variable can be extracted:

- from Input Files, whose data are read at run-time in sequential or randomized order
- from HTML form parameters; for example, hidden form fields
- from CGI parameters contained in hyperlinks, in form actions, or in HTTP redirects
- from HTTP response header fields
- from values of received XML and SOAP data
- from values of received JSON data
- from any text fragments of received HTML and XML data
- from User Input Fields (which are arbitrary configurable load test input parameters)
- from output parameters of Load Test Plug-Ins

A variable can be assigned as follows, irrespective of how it was extracted:

- to the value of an HTML form field
- to the value of a CGI parameter
- to a value of XML and SOAP data
- to any text fragment of an HTTP request (within the HTTP request header as well as within the HTTP request content)
- to the protocol (http/https), the host name, or the TCP/IP port of one or all URL calls
- to the path and the name of an uploaded file
- to the user's think time of one or all web pages
- to the response verification algorithm of a URL call (searched text fragment or size of received data)
- to the number of iterations, and/or the pacing delay, of an inner loop
- to some HTTP request header fields (most request header fields are handled automatically by Proxy Sniffer)
- to an input parameter of a Load Test Plug-In
Dynamically Exchanged Session Parameters
Dynamically exchanged session parameters, such as .NET VIEWSTATE values or order numbers of a shopping basket created at runtime, can often be handled automatically in the GUI with just one mouse click on the icon in the Var Finder menu. If automatic handling is not possible, only the extraction of the variable has to be performed manually; the assignments to all succeeding URLs where the variable should be used can be done implicitly by selecting the corresponding check box in the variable extract dialogue:
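What "extracting" such a session parameter amounts to can be pictured with a small sketch: pulling the value of a hidden form field out of the previous HTML response so that it can be re-sent with the next request. The class, method, and regular expression below are illustrative only, not ProxySniffer internals.

```java
// Illustrative sketch (not ProxySniffer code): extract the value of the
// hidden __VIEWSTATE form field from an HTML response with a regular
// expression, so it can be assigned to the next request.
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class ViewStateExtractor {

    private static final Pattern VIEWSTATE = Pattern.compile(
        "name=\"__VIEWSTATE\"[^>]*value=\"([^\"]*)\"");

    /** Returns the VIEWSTATE value, or null if the field is not present. */
    static String extract(String html) {
        Matcher m = VIEWSTATE.matcher(html);
        return m.find() ? m.group(1) : null;
    }

    public static void main(String[] args) {
        String html = "<input type=\"hidden\" name=\"__VIEWSTATE\" value=\"dDwtMTIz\" />";
        System.out.println(extract(html)); // dDwtMTIz
    }
}
```

In ProxySniffer this extraction and the assignment to all succeeding URLs are performed by the product itself; the sketch only shows the underlying principle.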
"Input Files" (data files) can be used to extract variables from a text file – for example, a username and a password per simulated user, which can then be assigned to a login form. The functionality of input files is generic, which means that variables can be extracted for any purpose.
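Such an input file is typically a simple text file with one record per line. The layout below is an illustrative sketch only – the field names, separator, and comment syntax are assumptions, not a prescribed ProxySniffer format:

```text
# username;password - one record per simulated user (illustrative layout)
alice;secret123
bob;hunter2
carol;passw0rd
```

At run-time each simulated user would read one record, in sequential or randomized order, and the extracted fields would be assigned to the login form.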
User Input Fields
"User Input Fields" are arbitrary global variables whose values are requested each time a load test is started. The following example uses two User Input Fields: the first makes the host name of the URL calls variable, so that the same load test program can be executed against a development system and a test system without the need to record two web surfing sessions. The second makes the User's Think Time of all Web Pages variable.
Measurement Results Per Test Run
Results per URL Call (Overview)
Results per URL Call (Details)
Response Time per Page
In-Depth Measurement of HTTP(S) Response-Streams/Detection of Jerky Video Playback
Proxy Sniffer supports in-depth measurement of HTTP(S) response streams. This feature is especially useful for web sites that contain videos: it allows you to detect whether jerky video playback occurs while a video is viewed, and then to diagnose whether enough network bandwidth is available for all users so that each user can view the video without interruption. Note that Apica ProxySniffer can measure only video streams that are delivered within a single URL response (such as from YouTube). This feature can also be used as a reference for optimizing response data. Colored charts show elapsed times for receiving fragments of user data (in red) and overhead data of the chunked protocol (in blue). Image 1 of 5: Enabling the option "Resp. Throughput Chart per Call" when starting the load test.
Image 2 of 5: Calling the captured data of the HTTP streams.
Image 3 of 5: Measured internal throughput of a video with a preset viewing time of 3 minutes (180,000 milliseconds). The linear flow, and the flow-rate peak at the beginning of receiving the data, indicate that the delivery is made by a special video server which on the one hand prevents network peaks and on the other hand ensures that no jerky video playback occurs.
Image 4 of 5: Throughput measurement of a PDF document which should be received within 30 seconds at a linear network throughput, so that the beginning of the document can already be viewed after a few seconds. The second measured sample does not meet this requirement.
Image 5 of 5: Throughput measurement of an HTML response received from a Web portal server. It is conspicuous that most of the response time is spent in the chunked protocol overhead, whereas the user data (payload) is received in a relatively short time. One explanation could be that the Web page is "calculated" piece by piece by the portal server (page navigation, page main content, page footer), and that some server-internal delay times occurred during the calculations.
Users Waiting for Response
TCP Socket Connect Time
SSL Cache Efficiency
Real-Time Monitoring and Real-Time Error Analyses
The screenshot above shows the real-time overview window of a single Exec Agent job. For cluster jobs, as shown below, a different real-time overview window is displayed. By clicking on the magnifier icon of a cluster member (in the Job ID column), the real-time overview window of the corresponding Exec Agent job can be displayed.
Real-Time Overview Window
Email and SMS Alert Notifications
Email and SMS alert notifications can be sent during the execution of a job – for example, if a predefined response-time threshold is exceeded, or if too many errors (session failures) are measured within a configurable interval. Additionally, informative email and SMS notifications can also be sent when a job cannot be started, when a job starts, when a job crashes (internal error), and when a job has been completed.
PDF reports can be created about Measurement Results per Test Run, about Comparisons of Test Runs, as well as for measured Load Curves. Additionally, comments related to test measurement results can be inserted at various points from within a Preview Function. In this way, it is possible to create fully commented load test reports directly within Proxy Sniffer, without the need for any additional text editing. Furthermore, the PDF reports can also be branded with the logo of your company.
Load Test Plug-In Wizard
Load Test Plug-Ins are self-written enhancements to the Proxy Sniffer product. These Plug-Ins are configured using the "Var Handler" and are called by Proxy Sniffer during load test execution. This is the only functionality where Proxy Sniffer cannot keep its promise of "no programming knowledge required": the core function(s) of a Plug-In must be created by a user experienced in Java programming. Once a Plug-In has been created, however, it can be used in every Load Test Program, and using a Plug-In does not in itself require knowledge of Java programming. In order to simplify the process of creating a Proxy Sniffer Plug-In, a wizard named "Plug-In Template Generator" is available which automatically generates all the Java code necessary for the integration of the Plug-In with the Proxy Sniffer product; thereafter, only the core function(s) of the Plug-In need to be coded by a programmer.
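The division of labor described above can be pictured with a schematic skeleton. Note that the class name, method signature, and parameter-passing convention below are purely illustrative assumptions: the real integration code and interface are generated by the "Plug-In Template Generator", and only the body of the core function is what a programmer typically supplies.

```java
// Schematic sketch only - NOT the actual ProxySniffer plug-in interface.
// The generated template wires a plug-in into the product; the programmer
// fills in the core function. Here the (hypothetical) core function derives
// an output value from an input parameter by stripping an order-number prefix.
public class OrderNumberPlugin {

    /**
     * Hypothetical core function: map the plug-in's input parameter
     * (an order number such as "ORD-12345") to an output value ("12345").
     */
    static String execute(String inputParameter) {
        // --- core function, written by a Java programmer ---
        return inputParameter.replaceFirst("^ORD-", "");
    }

    public static void main(String[] args) {
        System.out.println(execute("ORD-12345")); // 12345
    }
}
```

Once such a core function is coded and packaged, testers can use the plug-in from the GUI (via the "Var Handler") without touching any Java themselves.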
Command Line Support
The following example of a shell script, written on Mac OS X, executes the same load test several times with a different number of users (1, 20, 50, 100, 200 and 500 users) on a load generating cluster:

#!/bin/bash

# set the Java CLASSPATH to prxsniff.jar, to the default directory,
# and to the Proxy Sniffer installation directory
export CLASSPATH=/Applications/ProxySniffer/prxsniff.jar:/Applications/ProxySniffer/iaik_jce_full.jar:.:/Applications/ProxySniffer

# change to directory of load test program
cd /Applications/ProxySniffer/MyTests

# clear all data in the analyse load test menu of the GUI
java PdfReport clear

# loop repeatedly over simulated users
# -------------------------------------------
for users in 1 20 50 100 200 500
do
   # define the load test program and its arguments.
   # Note: if the program is zipped you have to add ".zip" to the program name
   loadTestProgram="Test01"
   loadTestProgramArgs="-u $users -d 180 -t 60 -sdelay 100 -maxloops 0 -sampling 10 -percpage 100 -percurl 100 -maxerrmem 20 -nolog"

   # define the load generating cluster name
   clusterName="clusterA"

   # define the load test result file
   currentDate=`date "+%d%h%y_%H%M%S"`
   loadTestResultFile="`echo $loadTestProgram`_`echo $currentDate`_`echo $users`u.prxres"

   # create the cluster job
   java PrxJob -s transmitClusterJob "$clusterName" $loadTestProgram $loadTestProgramArgs
   prxstat=`cat PRXSTAT`
   if [ $prxstat -lt "0" ]; then
      echo "unable to define job, status = $prxstat"
      exit 1
   fi
   jobId=$prxstat

   # start the load test job on the cluster.
   # Note: if an input file must be split you have to use the -split argument
   # (see application reference manual)
   java PrxJob -s startClusterJob "$clusterName" $jobId
   prxstat=`cat PRXSTAT`
   if [ $prxstat -ne "0" ]; then
      echo "unable to start job, status = $prxstat"
      exit 1
   fi
   echo "$loadTestProgram started with $users users on $clusterName, job ID = $jobId"

   # wait until job is completed
   java PrxJob -s waitForClusterJobCompletion "$clusterName" $jobId
   prxstat=`cat PRXSTAT`
   if [ $prxstat -ne "0" ]; then
      echo "unable to wait for job $jobId, status = $prxstat"
      exit 1
   fi
   echo "job ID = $jobId completed on $clusterName"

   # acquire load test result file
   java PrxJob -s acquireClusterJobResultFile "$clusterName" $jobId "$loadTestResultFile"
   prxstat=`cat PRXSTAT`
   if [ $prxstat -ne "0" ]; then
      echo "acquire of load test result file failed, status = $prxstat"
      exit 1
   fi
   echo "load test result $loadTestResultFile acquired"

   # load result into analyse load test menu of the GUI
   java PdfReport load $loadTestResultFile

# end loop over simulated users
# -----------------------------------
done
Download ZebraTester 5.5-A
For Use in the Apica LoadTest Portal
Your free Apica ZebraTester script tool allows you to create scripts and manage tests using a free local load generation instance, known as an "Exec Agent", for up to 20 minutes per load test. Easily expand loads as your needs require.
Get Started - Installation and Configuration Guide - Browser Recording Extension
Windows Server Edition (64 bit)