Gorgon: A simple task multiplier and analysis tool (e.g. for load testing)
Load testing is very important in my job: I spend a decent amount of time checking how performant some systems are.
There are some good tools out there (I’ve used Tsung extensively, and ab is brilliant for small checks), but I found it difficult to create flows where you make several requests in succession and the input of each call depends on the values returned by previous ones.
Also, load test tools are normally focused on HTTP requests, which is fine most of the time, but sometimes it’s limiting.
So, I got the idea of creating a small framework to take a Python function, replicate it N times, and measure the outcome, without the hassle of dealing manually with processes or threads, or of spreading it out over different machines.
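The core idea can be sketched with the standard library alone (this is a simplified illustration of the concept, not Gorgon’s actual implementation):

```python
import time
from multiprocessing.pool import ThreadPool


def replicate(func, num_operations, num_threads=4):
    # Run func(0), func(1), ..., func(N-1) concurrently,
    # timing how long the whole batch takes
    start = time.time()
    with ThreadPool(num_threads) as pool:
        results = pool.map(func, range(num_operations))
    elapsed = time.time() - start
    return results, elapsed


results, elapsed = replicate(lambda n: 'SUCCESS', 100)
print(len(results))  # 100
```

Gorgon wraps this pattern up, adds processes on top of threads, and aggregates the results for you.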
The source code can be found on GitHub and it can be installed from PyPI. It is compatible with Python 3.4 and Python 2.7.
```shell
pip install gorgon
```

Gorgons were mythological monsters whose hair was made of snakes.
Gorgon
To use Gorgon, just define the function to be repeated. It should be a function with a single parameter, which will receive a unique number. For example:
```python
def operation_http(number):
    # Imports inside your function
    # are required for cluster mode
    import requests
    result = requests.get(get_transaction_id_url)
    unique_id = get_id_from(result)
    result = requests.get(make_transaction(unique_id))
    if process_result(result) == OK:
        return 'SUCCESS'
    return 'FAIL'
```
There’s no need to limit the operation to HTTP requests or other I/O operations:
```python
def operation_hash(number):
    import hashlib
    # This is just an example of a
    # computationally expensive task
    m = hashlib.sha512()
    for _ in range(4000):
        m.update('TEXT {}'.format(number).encode())
    digest = m.hexdigest()
    result = 'SUCCESS'
    if number % 5 == 0:
        result = 'FAIL'
    return result
```
Then, create a Gorgon with that operation and generate one or more runs. Each run will execute the function num_operations times.
```python
from gorgon import Gorgon

NUM_OPS = 4000
test = Gorgon(operation_http)
test.go(num_operations=NUM_OPS, num_processes=1, num_threads=1)
test.go(num_operations=NUM_OPS, num_processes=2, num_threads=1)
test.go(num_operations=NUM_OPS, num_processes=2, num_threads=4)
test.go(num_operations=NUM_OPS, num_processes=4, num_threads=10)
```
You can get the results of the whole suite with small_report (simple aggregated results) or with html_report (graphs).
Printing the result of small_report:

```
Total time: 31s 226ms
Result   16000   512 ops/sec. Avg time: 725ms  Max: 3s 621ms  Min: 2ms
  200    16000   512 ops/sec. Avg time: 725ms  Max: 3s 621ms  Min: 2ms
```
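The throughput figure can be cross-checked from the totals in the report (a quick back-of-the-envelope calculation, not part of Gorgon):

```python
# The suite above ran test.go() four times with 4000 operations each,
# and the report states "Total time: 31s 226ms"
total_ops = 4 * 4000
total_seconds = 31.226
print(round(total_ops / total_seconds))  # 512 ops/sec, matching the report
```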
Example of the graphs: just dump the result of html_report as HTML to a file and take a look at it with a browser (it uses the Google Chart API).
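A minimal sketch of that dump step; the save_report helper and the report.html filename are my own choices, not part of Gorgon’s API:

```python
def save_report(html, path='report.html'):
    # Write the HTML string (as returned by html_report) to disk,
    # ready to be opened in a browser
    with open(path, 'w') as f:
        f.write(html)
    return path


# e.g. save_report(test.html_report())
```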
Cluster
By default, Gorgon uses the local computer to run all the tasks. To distribute the load even more and use several nodes, add machines to the cluster.
```python
NUM_OPS = 4000
test = Gorgon(operation_http)
test.add_to_cluster('node1', 'ssh_user', SSH_KEY)
test.add_to_cluster('node2', 'ssh_user', SSH_KEY,
                    python_interpreter='python3.3')
...
# Run the test now as usual, over the cluster
test.go(num_operations=NUM_OPS, num_processes=1, num_threads=1)
test.go(num_operations=NUM_OPS, num_processes=2, num_threads=1)
test.go(num_operations=NUM_OPS, num_processes=2, num_threads=4)
print(test.small_report())
```
Each node of the cluster should have Gorgon installed over its default Python interpreter, unless the python_interpreter parameter is set. Using the same Python interpreter on all the nodes and on the controller is recommended.
The paramiko module is a dependency in cluster mode for the controller, but not for the nodes.
As a limitation, all the code to be tested needs to be contained in the operation function, including any imports of external modules. Remember to install all the dependencies the code needs on the nodes.
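As an illustration of this constraint, here is a hypothetical operation (operation_json is my own example, not from Gorgon) that keeps everything, imports included, inside the function body:

```python
def operation_json(number):
    # The import lives inside the function, so the whole operation
    # is self-contained and can be shipped to a cluster node
    import json
    payload = json.dumps({'id': number})
    decoded = json.loads(payload)
    return 'SUCCESS' if decoded['id'] == number else 'FAIL'


print(operation_json(7))  # SUCCESS
```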
Available in GitHub
The source code and more info can be found on GitHub, and it can be installed from PyPI. So, if any of this sounds interesting, go there and feel free to use it! Or change it! Or make suggestions!
Happy loadtesting!
Have you looked at multi-mechanize?
No, I didn’t know that, but it looks like an interesting tool. I’ll take a look.
Thanks for letting me know!