
WSO2 Identity Server Performance

WSO2 Identity Server performance artifacts are used to continuously test the performance of the Identity Server.

These performance test scripts use Apache JMeter to run the tests with different concurrent user counts against different Identity Server versions.

The deployment is automated using AWS CloudFormation.

Artifacts in the master branch can be used with WSO2 Identity Server version 5.10.0. For previous versions, please use the branches below.

  1. is-5.9.0 for product version 5.9.0
  2. is-5.8.0 for product version 5.8.0 to 5.6.0

About the deployment

At the moment, two deployment patterns are supported:

  1. Single node deployment.
  • Single Node Deployment Diagram
  2. Two node cluster deployment.
  • Two Node Cluster Deployment Diagram

WSO2 Identity Server is set up in an AWS EC2 instance. An AWS RDS instance is used to host the MySQL user store and identity databases.

JMeter version 3.3 is installed on a separate node, which is used to run the test scripts and gather results from the setup.

Run Performance Tests

You can run IS Performance Tests from the source using the following instructions.

Prerequisites

Steps to run performance tests.

  1. Clone this repository.
git clone https://github.com/wso2/performance-is
  2. Check out the master branch for the latest Identity Server version, or the relevant version tag for previous releases.
cd performance-is
git checkout v5.8.0
  3. Build the artifacts using Maven.
mvn clean install
  4. Based on your preferred deployment, navigate to the single-node directory or the two-node-cluster directory.
  5. Run the start-performance.sh script. It will take around 15 hours to complete the test round with default settings. Therefore, you might want to use nohup (an example is shown after the basic command below). Following is the basic command.
./start-performance.sh -k is-perf-test.pem -a ******* -s ******* -c is-perf-cert -n wso2IS.zip -j apache-jmeter-3.3.tgz -- -d 10 -w 2
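
If you run the script with nohup so that the roughly 15-hour run survives an SSH disconnect, the invocation could look like the following sketch (same arguments as above; the log file name is arbitrary).

nohup ./start-performance.sh -k is-perf-test.pem -a ******* -s ******* -c is-perf-cert \
    -n wso2IS.zip -j apache-jmeter-3.3.tgz -- -d 10 -w 2 > perf-test.log 2>&1 &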

See usage:

./start-performance.sh -k <key_file> -a <aws_access_key> -s <aws_access_secret>
   -c <certificate_name> -j <jmeter_setup_path>
   [-n <IS_zip_file_path>]
   [-u <db_username>] [-p <db_password>]
   [-i <wso2_is_instance_type>] [-b <bastion_instance_type>]
   [-w <minimum_stack_creation_wait_time>] [-h]

-k: The Amazon EC2 key file to be used to access the instances.
-a: The AWS access key.
-s: The AWS access secret.
-j: The path to JMeter setup.
-c: The name of the IAM certificate.
-n: The path to the IS server zip file.
-u: The database username. Default: wso2carbon.
-p: The database password. Default: wso2carbon.
-i: The instance type used for IS nodes. Default: c5.xlarge.
-b: The instance type used for the bastion node. Default: c5.xlarge.
-w: The minimum time to wait in minutes before polling for cloudformation stack's CREATE_COMPLETE status.
    Default: 10 minutes.
-h: Display this help and exit.

What does the script do?

  1. Validate the CloudFormation template with the given parameters, using the AWS CLI.
  2. Run the CloudFormation template to create the deployment and wait until the stack creation completes.
  3. Extract the following using the AWS CLI (see the sketch after this list).
    • Bastion node public IP (used as the JMeter client).
    • Private IP of the WSO2 IS instance.
    • RDS instance hostname.
  4. Set up the WSO2 IS server in the instance and create the databases.
  5. Copy required files, such as the key file, performance artifacts, and the JMeter setup, to the bastion node.
  6. SSH into the bastion node and execute the setup-bastion.sh script, which will set up the additional components in the deployment.
  7. SSH into the bastion node and execute the run-performance-test.sh script, which will run the tests and collect the results.
  8. Download the test results from the bastion node.
  9. Create the summary CSV file and MD file.
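
For reference, here is a minimal sketch of steps 1-3 using the AWS CLI (the stack name, template path, and filter below are illustrative placeholders, not the exact values used by start-performance.sh):

# Validate the CloudFormation template (template path is a placeholder).
aws cloudformation validate-template --template-body file://cloudformation-template.yaml

# Create the stack and wait until stack creation completes.
aws cloudformation create-stack --stack-name is-perf-test-stack \
    --template-body file://cloudformation-template.yaml --capabilities CAPABILITY_IAM
aws cloudformation wait stack-create-complete --stack-name is-perf-test-stack

# Extract details of the created resources, e.g. public IPs of the stack's EC2 instances.
aws ec2 describe-instances \
    --filters "Name=tag:aws:cloudformation:stack-name,Values=is-perf-test-stack" \
    --query "Reservations[].Instances[].PublicIpAddress" --output text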

Performance Analysis Graphs

We have added a performance analysis feature to the project, which allows you to generate performance plots based on CSV data files. This feature provides insights into response times for different deployment types and scenarios. By analyzing these performance plots, you can identify performance bottlenecks and make informed optimizations.

To use this feature, we have added the performance_plots.py script to the project. This script reads CSV data files, filters the data based on concurrency ranges, and generates performance plots using the matplotlib library. The generated plots are saved in the 'output' folder.

Additionally, we have updated the README.md file in the performance_analysis directory to provide detailed instructions on how to use the script, customize the settings, and understand the input CSV data format.

To get started with the performance analysis feature, please refer to the performance_analysis/README.md file for instructions and examples.
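
As a minimal sketch of running it (assuming the script lives alongside its README in the performance_analysis directory; any arguments are hypothetical, and the exact usage and CSV format are documented in performance_analysis/README.md):

# Hypothetical invocation; see performance_analysis/README.md for the actual arguments.
cd performance_analysis
python3 performance_plots.py   # reads the CSV data files and saves the plots to the 'output' folder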

Legacy Mode

If you need to run the performance tests in legacy mode, please use the legacy-mode branch. Legacy mode includes the single node and two node cluster deployments with the previous test flows.


Issues

Create Cleanup Scripts

Description:
Separate JMeter scripts to remove the users and applications added for the tests would be useful.

apt-get update gives a lock-frontend issue

Executing on AWS instances gives the error below.

E: Could not get lock /var/lib/dpkg/lock-frontend - open (11: Resource temporarily unavailable)
E: Unable to acquire the dpkg frontend lock (/var/lib/dpkg/lock-frontend), is another process using it?

No other apt install can be executed until the lock is released.

OS: Ubuntu 18.04
AWS instance type: c5.xlarge

This is because of an apt issue with newer Linux versions. [1]

[1] https://itsfoss.com/could-not-get-lock-error/
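
A common workaround (a hedged sketch, not currently part of the scripts; the lock path is the standard Ubuntu one) is to wait until the lock is released before retrying:

# Wait until no other process holds the dpkg frontend lock, then run the update.
while sudo fuser /var/lib/dpkg/lock-frontend >/dev/null 2>&1; do
    echo "Waiting for other apt/dpkg processes to finish..."
    sleep 5
done
sudo apt-get update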

Add prefix to element names in CFN

Description:
At the moment, we cannot run two stacks in parallel due to naming clashes. If we can add a suffix to the element names in the CFN template, we will be able to run multiple stacks at the same time. I'd suggest adding a unique key to all names and replacing that key with a given suffix through the start script.
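
A minimal sketch of that idea (the UNIQUE_KEY placeholder, template file name, and suffix variable below are hypothetical):

# Substitute a hypothetical UNIQUE_KEY placeholder in the template with a per-run suffix.
STACK_SUFFIX="run1"
sed "s/UNIQUE_KEY/${STACK_SUFFIX}/g" cloudformation-template.yaml > "cloudformation-template-${STACK_SUFFIX}.yaml"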

Need to collect DB Metrics

Description

It has been identified that proper metrics from the RDS instance used for a test are really important when analyzing the results. We can use the AWS CloudWatch APIs to pull the data; as a first step, just logging the RDS CPU utilization against time in a text file would be enough.
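
As a hedged sketch of that first step using the AWS CLI (the DB instance identifier, time range, and output file are placeholders):

# Log average RDS CPU utilization (5-minute datapoints) over a test window to a text file.
aws cloudwatch get-metric-statistics \
    --namespace AWS/RDS \
    --metric-name CPUUtilization \
    --dimensions Name=DBInstanceIdentifier,Value=is-perf-test-rds \
    --start-time 2019-10-03T00:00:00Z \
    --end-time 2019-10-03T15:00:00Z \
    --period 300 \
    --statistics Average \
    --output text > rds-cpu-utilization.txt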

Fix Http 200 Assertions

Description:
There are some JMeter scripts whose HTTP 200 assertions were added with a trailing newline. This causes an extra error message in the JTL file, which becomes an issue when summarizing results.

Merge all branches to the master

Description:
At the moment, we have three separate branches with different types of performance testing setups. The proper way to manage these is to have separate directories in the same branch. We can move common scripts and JMeter artifacts to a common folder to avoid code duplication.

Restructure naming format in benchmark results

This is regarding: https://github.com/wso2/performance-is/tree/master/benchmarks

Current filenames are as follows.
single-node-2-core-5.8.0-2019-05-14.md
single-node-4-core-5.8.0-2019-05-09.md
single-node-4-core-5.9.0-2019-10-03.md
2-node-cluster-2-core-5.8.0-2019-05-30.md
2-node-cluster-4-core-5.8.0-2019-05-24.md

We should remove the date, as it does not affect the perf result. As we plan to maintain pre-release results (milestone, alpha, beta, and RC results) as well as post-release results (WUM packs), we need to have the exact release version of the pack in the filename.

As an external user will mostly be looking for performance benchmarks for a particular product version, IMO we can have the product version number as the first identifier, followed by the deployment details. For example:

  • 5.8.0-beta6_single-node_4-core.md
  • 5.8.0-rc2_single-node_2-core.md
  • 5.8.0_two-nodes_2-core.md
  • 5.8.0_two-node_4-core.md
  • 5.9.0_single-node_4-core.md

We need to think about the formatting for the Docker-based deployments as well.
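
For example, mapping two of the existing filenames above to the proposed format would look like this:

mv single-node-4-core-5.9.0-2019-10-03.md 5.9.0_single-node_4-core.md
mv 2-node-cluster-2-core-5.8.0-2019-05-30.md 5.8.0_two-nodes_2-core.md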

Change summary.md format

Description:
The summary.md file needs to be changed to follow the newly discussed format. Instead of one large table of data, a more summarized set of tables will be added.

Update perf benchmarks

We need the following benchmarks added.

  • 5.8.0-single-node-4-core
  • 5.8.0-single-node-2-core
  • 5.9.0-single-node-2-core
  • 5.9.0-2-node-2-core
  • 5.9.0-2-node-4-core
