
Automating JMeter test plans in Jenkins (aka A Jenkins plugin a day... Performance Plugin)

JMeter (http://jmeter.apache.org/) is an Open Source tool written in Java for testing system performance by simulating different types of heavy load on the system under test, using different protocols (HTTP, SOAP, JDBC, LDAP, JMS, etc.). This post is for people who want to understand how to automate the execution of JMeter test plans using Jenkins: I am assuming the reader has good knowledge of JMeter and Maven, so I am omitting some details of their features. The JMeter release I am referring to in this post is 2.12.
Test plans in JMeter can be executed from the command line or through its Java Swing GUI, but running them through Jenkins requires neither: you just need Jenkins (of course), Maven and the Performance Plugin (https://wiki.jenkins-ci.org/display/JENKINS/Performance+Plugin). The JMeter GUI is only needed to create or change a test plan.
Create the test plan.
Let's create a simple test plan first. Suppose it is an HTTP test plan. You can start from Web Test Plan, one of the templates available in JMeter; this template creates an HTTP test plan for the Google home page (www.google.com). Add a parameter for the Number of Threads property of the Thread Group:


Execute the test plan to check that everything is fine and then save it (for this example the chosen name is HTTPExampleTestPlan.jmx). As stated before, this is the only step for which you need to use the JMeter GUI.
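
One common way to parameterise the Number of Threads field (the post does not show the exact expression, so take this as a sketch) is JMeter's __P function, which reads a JMeter property and falls back to a default value when the property is not set. Using the property name defined later in the POM (threadCount), the field would contain:

${__P(threadCount,1)}

With this expression the plan still runs from the GUI with a single thread, while the Maven plugin can override threadCount at build time.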

Create the Maven project.
Create a folder for the Maven project and copy the .jmx test plan file into the src/test/jmeter subfolder. Then create the POM file for the project:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example</groupId>
    <artifactId>jmeter-demo</artifactId>
    <packaging>jar</packaging>
    <version>1.0-SNAPSHOT</version>
    <name>jmeter-demo</name>
    <url>http://maven.apache.org</url>
   
    <build>
        <plugins>
            <plugin>
                <groupId>com.lazerycode.jmeter</groupId>
                <artifactId>jmeter-maven-plugin</artifactId>
                <version>1.4.1</version>
                <configuration>
                    <testResultsTimestamp>false</testResultsTimestamp>
                    <propertiesUser>
                        <threadCount>${performancetest.threadCount}</threadCount>
                    </propertiesUser>
                    <propertiesJMeter>
                        <jmeter.save.saveservice.thread_counts>true</jmeter.save.saveservice.thread_counts>
                    </propertiesJMeter>
                </configuration>
                <executions>
                    <execution>
                        <id>jmeter-tests</id>
                        <phase>verify</phase>
                        <goals>
                            <goal>jmeter</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>


You just need a basic POM file that includes only the JMeter Maven plugin (http://jmeter.lazerycode.com/). The <configuration> section of the plugin holds the JMeter configuration: here you tell Maven that the test plan is parameterised and which properties are parameterised (in our example only the threadCount property). Notice that there is no reference to the test plan name in the POM: any JMX file saved in the src/test/jmeter subfolder is automatically executed at runtime by the JMeter Maven plugin. You can run this project from a shell command to test it before moving to Jenkins:

mvn jmeter:jmeter -Dperformancetest.threadCount=2
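
Since the jmeter goal is bound to the verify phase in the POM above, an equivalent way to trigger the same run through the standard Maven lifecycle should be:

mvn verify -Dperformancetest.threadCount=2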

The results of this test will be created in the target/jmeter/report folder.
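
For reference, after a successful local run the relevant part of the project layout should look roughly like this (file names taken from the example above; the exact set of generated files may vary with the plugin version):

jmeter-demo/
├── pom.xml
├── src/test/jmeter/HTTPExampleTestPlan.jmx
└── target/jmeter/report/
    ├── HTTPExampleTestPlan.jtl
    └── HTTPExampleTestPlan.jtl-report.html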

Install the Performance Plugin.
Install the Performance Plugin from the Jenkins Update Center the usual way. This plugin captures JMeter and JUnit reports and generates charts showing the performance trend. In this post we will focus on the JMeter part only.

Create the Jenkins build job.
From the Jenkins Dashboard create a new freestyle build job and make it parameterised so that it accepts the value for the thread count property of the JMeter test plan:



Add a Maven build step. Set the goals the same way as for running the Maven project from a shell command and set the location of the POM file of the project:
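
The Goals field mirrors the shell command used earlier; assuming the string parameter created above is named THREAD_COUNT (the name used in the example run later in this post), it could look like this:

jmeter:jmeter -Dperformancetest.threadCount=$THREAD_COUNT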



Then add the Performance Plugin as a post-build action and set the location of the report files inside the Maven project:
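
The exact value depends on where the Maven project sits inside the job workspace; assuming it is at the workspace root, an Ant-style pattern along these lines should pick up the JTL results produced by the run:

**/target/jmeter/report/*.jtl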



Finally set the other properties (thresholds) for your test plan execution:



Execute it.
Set the schedule for the build job and execute it. After each build you can see a text summary of the test result in the console output:

...
[info] Completed Test: HTTPExampleTestPlan.jmx
[info]  
[info] Building JMeter Report(s)...
[info]  
[info] Raw results: C:\Guglielmo\Code\JMeter\target\jmeter\report\HTTPExampleTestPlan.jtl
[info] Test report: C:\Guglielmo\Code\JMeter\target\jmeter\report\HTTPExampleTestPlan.jtl-report.html
[INFO]  
[INFO] Test Results:
[INFO]  
[INFO] Tests Run: 1, Failures: 0, Errors: 0
[INFO]  
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 11.731s
[INFO] Finished at: Tue Dec 09 10:55:37 GMT 2014
[INFO] Final Memory: 7M/15M
[INFO] ------------------------------------------------------------------------
Performance: Percentage of errors greater or equal than 0% sets the build as unstable
Performance: Percentage of errors greater or equal than 0% sets the build as failure
... 

and in graphical format as well on the project page of the build job:



Clicking on the Performance Trend link in the left column of the project page, you can see the performance trend across the build job executions, both overall and per test case (in our example they overlap because we have a single test case):



Clicking on the Performance Report link in the left column of each build page, you can see the full, detailed performance report:



Clicking on the test name in the table, you can see the details of the test case for the current run, split by thread (the example shown in the following figure refers to a run with THREAD_COUNT = 4):



By following the simple steps described in this post, you can schedule and execute any type of JMeter test plan, whatever protocol it uses, through Jenkins.
