Traffic Mirroring to Load Test a Web Application

Generating realistic traffic against your application is hard. Successful load testing gives assurance that the application is of high quality and can serve a large number of the organization's users.

First of all, why do we need load testing?

When testing our website, app, or API endpoint under load, we are simulating how it will perform when hundreds, thousands, or millions of users visit it in real life. Our system might behave completely differently for one user (functional testing) than for many (load testing) because of resource limits. So, to understand, analyze, and fix errors, bugs, and bottlenecks before they actually happen, we need to understand the real user load; it is not wise to neglect the real people who will be using our system or product.

KPIs like response time, error rate, memory usage, and CPU utilization might be top-notch when running functional tests. But when scaled to millions of users, with tests running from all over the globe, the application could suddenly behave like an alien one.

Some of the questions that load testing will answer:

  • How much load can our application handle?
  • How much load can our system handle?
  • For a microservice, which instance or container size is ideal for a certain load?

and many more.

Problems

There are a lot of ways to generate load on an application, but it is very hard to generate realistic load with tools like JMeter or similar.

For an application that is already built and running in production, we can load test by mirroring the real traffic to a staging or test server.

Demo

I will be using a very basic Python Flask application.

app.py

#  Project TrafficLoader is developed by Fahad Ahammed on 3/7/20, 11:12 AM.
#
#  Last modified at 3/7/20, 11:10 AM.
#
#  Github: fahadahammed
#  Email: [email protected]
#
#  Copyright (c) 2020. All rights reserved.

from flask import Flask, request
import os

app = Flask(__name__)

ENV = os.getenv('ENV', 'prod')  # two environments: dev and prod; default is prod

APPLICATION_NAME = f"TrafficLoader_Application-{ENV}"
FILE_NAME_TO_SAVE_DATA = f"data-{APPLICATION_NAME}.txt"


def save_to_file(content):
    # Append the content to the environment-specific data file.
    try:
        with open(FILE_NAME_TO_SAVE_DATA, "a") as file:
            file.write(str(content) + "\n")
    except Exception as e:
        return e


@app.route('/', methods=["GET"])
def hello():
    return f"Hello from {APPLICATION_NAME}"


@app.route('/save', methods=["POST"])
def hello_save():
    to_save = request.json  # { "message": "Hello..." }
    save_to_file(content=to_save["message"] + f" From {APPLICATION_NAME}")
    return f"Saving in {APPLICATION_NAME}"


if __name__ == "__main__":
    app.secret_key = b'_5#y__RO__4Q8z\n\xec]/'
    app.config["ENV"] = ENV
    if ENV == "dev":
        app.run(host="0.0.0.0", port=15002, debug=True)
    else:
        app.run(host="0.0.0.0", port=15001)

This application has two endpoints.

  1. "/": A GET endpoint that returns a hello message identifying the application.
  2. "/save": A POST endpoint that takes a JSON body like:

{ "message": "Hello!" }

env

Now this application can run in two environments.

  1. dev
  2. prod

We can set the environment on Ubuntu/macOS via

export ENV=dev

Without any environment assigned, it will by default run in production mode with ENV=prod and port 15001; when the ENV variable is set to "dev", it will use port 15002.

Also, based on that ENV, the application derives a few values that help us tell the environments apart during testing.

  1. Route "/" returns "Hello from TrafficLoader_Application-{ENV}".
  2. Route "/save" appends the message from the request body, tagged with the application environment, to "data-TrafficLoader_Application-{ENV}.txt".
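
Before running it, Flask needs to be installed. Assuming Python 3 with pip available, something like this should do:

$ pip3 install flask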

Run

Production

$ python3 app.py 
* Serving Flask app "app" (lazy loading)
* Environment: prod
* Debug mode: off
* Running on http://0.0.0.0:15001/ (Press CTRL+C to quit)

Request:

$ curl http://0.0.0.0:15001/
Hello from TrafficLoader_Application-prod
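
The "/save" endpoint can be exercised the same way. As a quick sketch, assuming the prod instance above is still running and was started from the current directory:

$ curl -X POST -H "Content-Type: application/json" -d '{"message": "Hello!"}' http://0.0.0.0:15001/save
Saving in TrafficLoader_Application-prod
$ cat data-TrafficLoader_Application-prod.txt
Hello! From TrafficLoader_Application-prod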

Development

$ export ENV=dev
$ python3 app.py
* Serving Flask app "app" (lazy loading)
* Environment: dev
* Debug mode: on
* Running on http://0.0.0.0:15002/ (Press CTRL+C to quit)
* Restarting with stat
* Debugger is active!
* Debugger PIN: 180-761-601

Request:

$ curl http://0.0.0.0:15002/
Hello from TrafficLoader_Application-dev

Nginx

For traffic mirroring, I will be using Nginx.

TrafficLoader.conf

upstream production-server {
    server 127.0.0.1:15001;
}
upstream dev-server {
    server 127.0.0.1:15002;
}
server {
    listen 40000;

    location / {
        mirror /mirror;
        proxy_pass http://production-server/;
    }

    location = /mirror {
        internal;
        proxy_pass http://dev-server$request_uri;
    }
}

Here, I have declared two upstreams, one for the production server and one for the dev server. The server block listens on port 40000, and the default location "/" proxies to the production-server upstream, i.e. port 15001.

For mirroring, I add the "mirror /mirror;" directive inside that location, so every request is duplicated as an internal subrequest to the "/mirror" location.

The "location = /mirror" block is marked internal and proxies the mirrored request, with its original URI ($request_uri), to the dev-server upstream. Nginx discards the mirrored response, so the client only ever sees the production response.
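
To enable this configuration, the file has to be placed where Nginx picks up extra server blocks, and then Nginx reloaded. As a rough sketch, assuming a stock Ubuntu setup where /etc/nginx/nginx.conf includes /etc/nginx/conf.d/*.conf:

$ sudo cp TrafficLoader.conf /etc/nginx/conf.d/TrafficLoader.conf
$ sudo nginx -t
nginx: the configuration file /etc/nginx/nginx.conf syntax is ok
nginx: configuration file /etc/nginx/nginx.conf test is successful
$ sudo systemctl reload nginx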

Request:

$ curl http://127.0.0.1:40000
Hello from TrafficLoader_Application-prod

So, our Nginx proxy is working as expected: the request goes to the prod upstream and is served by the prod server/app.

The same request is also being sent to the dev server, internally.

Visualizing: both a normal GET request and a POST request reach Nginx on port 40000, get proxied to the production app, and are mirrored internally to the dev app.
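
To see the mirroring in action with the POST endpoint, a quick check might look like this, assuming both app instances were started from the same directory so their data files sit side by side:

$ curl -X POST -H "Content-Type: application/json" -d '{"message": "Hello via mirror!"}' http://127.0.0.1:40000/save
Saving in TrafficLoader_Application-prod
$ cat data-TrafficLoader_Application-dev.txt
Hello via mirror! From TrafficLoader_Application-dev

Only the production response is returned to the client; Nginx discards the mirrored response, but the dev instance's data file shows that it received the same request.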

Conclusion

This way we can mirror production traffic to our desired testing server, and the real users will see no side effects.
