Time taken for the request to reach the server + server processing time + time for the response to return to the client. The lower the better.
Number of transactions (requests / responses)
---------------------------------------------
     Unit of time (milliseconds / seconds)

OR

Number of (K)bytes (sent / received)
------------------------------------
 Unit of time (milliseconds / seconds)

The higher the better.
Number of errors
------------------
Number of requests
The lower the better.
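As a concrete illustration, the three metrics above can be computed from a handful of samples. A minimal sketch; the sample data and the 10-second window are made-up numbers:

```python
# Each sample is (elapsed_ms, is_error); both the values and the
# 10-second measurement window are illustrative assumptions.
samples = [(120, False), (250, False), (900, True), (180, False)]
window_s = 10  # duration of the measurement window in seconds

avg_response_ms = sum(t for t, _ in samples) / len(samples)      # lower is better
throughput_tps = len(samples) / window_s                         # higher is better
error_rate = sum(1 for _, err in samples if err) / len(samples)  # lower is better

print(avg_response_ms, throughput_tps, error_rate)
```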
How well a system scales in terms of response time, throughput, and reliability when additional resources are added.
Performance testing should be done against a list of performance requirements, usually given in SLA documents. Some examples:

- The average / max response time should be ____
- The system should be able to support ____ pages per second
- The system should be capable of supporting at least ____ users per hour
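Such SLA checks can be automated against the measured metrics. A hedged sketch; the thresholds below are illustrative only, since the blanks in the SLA examples above are intentionally unfilled in the source:

```python
# Hypothetical SLA thresholds -- illustrative numbers, not from any real SLA.
SLA = {"max_avg_response_ms": 500, "min_pages_per_second": 50}

def meets_sla(avg_response_ms, pages_per_second):
    """Return True when both measured values satisfy the SLA thresholds."""
    return (avg_response_ms <= SLA["max_avg_response_ms"]
            and pages_per_second >= SLA["min_pages_per_second"])

print(meets_sla(320, 75))  # within both thresholds
print(meets_sla(800, 75))  # average response time too high
```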
- Number of users
- Number of requests
- Peak Load test
- Stress test
- Spike test
- Endurance test
http://jmeter.apache.org/download_jmeter.cgi
Add Jmeter's bin directory to the PATH
> jmeter -n -t testfile.jmx -l results.csv -j logfile.log -e -o ./jmeterout

  -n    run in non-GUI mode
  -t    the file containing the test script (testfile.jmx)
  -l    the results file (results.csv)
  -j    the log file (logfile.log)
  -e    generate an HTML report after the run
  -o    the output folder for the HTML report (./jmeterout)
Download any JMeter plugin you want, install it to JMeter's /lib/ext
directory, and restart JMeter.
Plugins worth installing
- Custom thread groups
- 3 basic graphs
- Throughput shaping timer
- Dummy sampler
It's the root of the test; all other components are nested under it.
It's the entry point of the test. It controls the number of threads (users) used to execute the test. We can add multiple thread groups to a test plan, and each acts like an individual test case.
Allows us to define configuration values to be used in the test.
Logic controllers customize the logic of when to send requests.
Samplers make the requests to the server.
Timers introduce delays between requests to simulate real-world scenarios.
Assertions validate that the response sent from the server is as expected.
JMeter collects information about the requests it performs; listeners listen to that information and aggregate and display metrics from it.
Execution is both hierarchical and ordered.

Hierarchical -> some components have higher priority over others, so they are executed first even when they are ordered below other components.
Ordered -> execution happens in the same order as the components appear in the test plan.
Test Plan
|
|-- Thread Group
|
|-- Home
|
|-- Transaction Controller
|
|-- Book Catalog
|
|-- Book Detail
|
|-- Login
In the above Ordered example, the order of execution in the Test Plan will be
Home --> Book Catalog --> Book Detail --> Login
Test Plan
|
|-- Thread Group
|
|-- Home
|
|-- Response Assertion 1
|
|-- Transaction Controller
|
|-- Book Catalog
|
|-- Book Detail
|
|-- Response Assertion 2
In the above Hierarchical example, Response Assertion 1 applies to the Home request, and Response Assertion 2 applies to the Book Catalog and Book Detail requests. So the order of execution will be
Home --> Response Assertion 1 --> Book Catalog --> Response Assertion 2 --> Book Detail --> Response Assertion 2
Configuration Elements --> Pre Processors --> Timers --> Logic Controllers / Samplers --> Post processors --> Assertions --> Listeners
The post-processors, assertions, and listeners execute only if there is a response from the samplers. Similarly, timers and pre- and post-processors execute only if there are samplers they can be applied to.
User Defined Variables are processed first no matter where they are placed. A Config manager element placed in a nested child node overrides the same setting from the parent. Thus configuration default elements are merged, while managers are not.
Test Plan
|
|-- Thread Group
|
|-- Transaction Controller 1
|
|-- Http Request default 1
|
|-- Home
|-- Transaction Controller 2
|
|-- Login
|
|-- User defined variables
|
|-- Http Request Default 2
So the scope will be
- User defined variables ( and it applies to both Transaction controller 1 & 2)
- Http Request Default 1 ( applies to Transaction controller 1)
- Http Request Default 2 ( and it applies to both Transaction controller 1 & 2)
The execution order will be Uniform Random Timer -> Constant Timer -> Home -> Response Assertion -> View Results Tree
The execution order will be
Timer -> Home Request -> Post Processor 1 -> Post Processor 2 -> Catalog Request -> Post Processor 1 -> Post Processor 3 -> Assertion -> View Results Tree
The execution order will be
Timer 1 -> Timer 2 -> Home Request -> Timer 1 -> Timer 2 -> Catalog Request
The execution order will be
Home Request -> Think Time 1 -> Catalog Request -> Think Time 2
So a think timer executes after a request while the normal timer executes before the request.
- Rename the default test plan to Performance Test Plan
- Add Http Request Defaults -> Configuration Defaults: set the server name or IP as localhost and the port as 8080
- Add a Thread Group and rename it User Thread Group
- Add a Transaction Controller and check Generate parent sample
- Add an Http Request under the Transaction Controller, rename it Home Request, and set the path to /
- Add another Http Request under the Transaction Controller, rename it Book Catalog Request, and set the path to /books
- Add a Response Assertion under the Transaction Controller, select Response Code as the Field to Test, click Add to add a pattern to test, and set 200 as the pattern
- Under User Thread Group add another Http Request, rename it Book Detail Request, and set the path to /books/1
- Add a Uniform Random Timer under User Thread Group and set the random delay maximum to 7000 milliseconds
- Add two listeners, View Results Tree and Summary Report, under User Thread Group
- Now save the script as PerformanceTestPlan.jmx
From the root of the project run the commands

docker build -f ./dockerfiles/app.Dockerfile -t app:v1 .
docker run -p 8080:8080 app:v1

Now the app will run, with host port 8080 bound to the app.
jmeter -n -t PerformanceTestPlan.jmx -l results.csv -j logfile.log -e -o ./jmeterout
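As a quick way to sanity-check a run, the results.csv produced above can be summarised with a short Python sketch. The label, elapsed, and success column names assume JMeter's default CSV output configuration; adjust if jmeter.properties changes them:

```python
import csv
from collections import defaultdict

# Summarise a JMeter results file: per-label average response time and
# error rate. Column names (label, elapsed, success) are assumed to
# follow JMeter's default CSV output settings.
def summarise(path):
    stats = defaultdict(lambda: {"count": 0, "elapsed": 0, "errors": 0})
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            s = stats[row["label"]]
            s["count"] += 1
            s["elapsed"] += int(row["elapsed"])
            s["errors"] += 0 if row["success"] == "true" else 1
    return {label: {"avg_ms": s["elapsed"] / s["count"],
                    "error_rate": s["errors"] / s["count"]}
            for label, s in stats.items()}
```

For example, summarise("results.csv") returns a dict keyed by sampler label, which can be compared against the SLA thresholds.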
Configure your browser to proxy through localhost:8888. Now add the Recording
template from File -> Templates. Change the host to record to localhost
and the scheme to http. Click Start to begin recording in the HTTP(S) Test Script Recorder. A Recorder: Transaction Control popup will appear.
Follow the below steps:

- Give the name Home in the popup and visit the home page.
- Give the name Catalog and visit the browse catalog page.
- Give the name Login, visit the login page, and log in to the system using user1 as credentials.
- Give the name BookDetail and browse one book.
- Give the name Review and give a review for that book.
- Now stop the recording.
- Right-click on the Thread Group and validate the test.
- Save as RecordingTestPlan.jmx
To record an HTTPS site, import JMeter's certificate from the bin
folder into the Trusted Authorities in the browser.
Add a CSV Data Set Config and browse to the login.csv file to import the CSV data. Set the variable names as USERNAME,PASSWORD, set Recycle on EOF to false, and set Stop thread on EOF to true. Update the login request parameters to ${USERNAME} and ${PASSWORD}. Change the number of threads to 3 in the Thread Group and run the test. The different user credentials from the CSV are taken and passed to the server.
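For reference, login.csv is assumed to look something like this. Since the variable names are set in the CSV Data Set Config, the file needs no header row; user1 was used during recording, but the other credentials are placeholders:

```
user1,password1
user2,password2
user3,password3
```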
Add a CSS Selector Extractor post-processor under the login POST request. Give the reference name as BOOK_URL, the CSS selector expression as div[class*="single-product"] div a, and the attribute as href. Set the match number to 0 to pick a random element and 9999 as the default value. Now ${BOOK_URL} can be used instead of a hardcoded URL.
Add an Ultimate Thread Group, add a new row, and set Start Threads Count to 100, Initial Delay to 0, Startup Time to 30 seconds, Hold Load for 10 seconds, and Shutdown Time to 1 second. The variables can be adjusted to make a more realistic stress test. The other configurations, samplers, and listeners can be copied from the previous RecordingTestPlan.jmx, and this test script is saved as StressTestThread.jmx.
Reusable part of test script can be extracted into a test fragment and then reused in other tests through module controller or include controller.
Satisfied Count + (Tolerated Count / 2)
---------------------------------------
             Total Samples
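The formula above is the standard Apdex calculation; as a quick sketch, with made-up counts:

```python
# Apdex = (satisfied + tolerated / 2) / total samples
def apdex(satisfied, tolerated, total):
    return (satisfied + tolerated / 2) / total

# e.g. 600 satisfied and 200 tolerated samples out of 1000 (illustrative numbers)
print(apdex(600, 200, 1000))  # → 0.7
```

A score of 1.0 means every sample finished within the satisfied threshold; lower scores mean more tolerated or frustrated samples.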
- Minikube or docker-desktop K8s
- Kubectl
- docker
Run the create_everything.sh
file to set up and run the distributed performance test.
Update the PerformanceTestPlan.jmx file to add an InfluxDB Backend Listener and change its URL to the InfluxDB service URL.
In the Grafana dashboard, import the grafana_template.json file.
After all tests are done, run delete_cluster.sh
to delete the cluster.
Referenced from https://blog.kubernauts.io/load-testing-as-a-service-with-jmeter-on-kubernetes-fc5288bb0c8b.