Sure, Dockerizing JMeter in server mode for use in Digital Ocean's cloud service would be super easy. But that's no way to learn a DevOps API!

In this talk, I walk through an application written using Spring Boot that executes 99,000,000 requests in about 40 minutes across 100 machines.

A friend and colleague runs a very popular API service called ipify.org.
At times, it's handled 2.5 million requests per second. It simply returns your IP
address as the Internet sees it. This friend is moving his API service from Heroku
to Digital Ocean (often referred to as DO). I thought it'd be a fun exercise
to load test his new infrastructure and learn the Digital Ocean API in the
process.

At under $0.01 per hour for a small virtual machine, DO itself seemed like the
perfect place to do this load testing. How much traffic could I generate
across, say, 100 of these small instances? That question naturally led to
JMeter. Not only does JMeter let you run complex HTTP interaction scripts,
it's also built for distributed processing. You fire up a bunch of server
machines to do the heavy lifting. A single client machine distributes the same
test script to each of the server machines. As each server machine executes
the test script, the client gathers the results into a single raw results
file. Spoiler alert: in under an hour, I was able to run a distributed test
with 19,800 concurrent requests that generated a total of 99,000,000 requests
to ipify.org.

I'm a big fan of Zed Shaw's The Hard Way series
of courses. The idea is that there's value in deep, slow learning of a given topic.
Early on in working with the DO API, it became clear that the easiest
way to accomplish my goal would be to use Docker containers for my JMeter
servers. That wouldn't give me much experience with the DO API, though.
Too easy! So, instead, I wrote a program using Spring Boot that interacts
directly with the API.
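
To give a flavor of what that looks like, here's a minimal sketch (not the
code from the talk) of creating a single droplet through DO's v2 REST API
using Spring's RestTemplate. The droplet name and the region, size, and image
slugs are placeholder values, and the API token is read from an environment
variable.

    import java.util.Map;

    import org.springframework.http.HttpEntity;
    import org.springframework.http.HttpHeaders;
    import org.springframework.http.MediaType;
    import org.springframework.web.client.RestTemplate;

    public class DropletSketch {

        private static final String DROPLETS_URL =
                "https://api.digitalocean.com/v2/droplets";

        public static void main(String[] args) {
            // A DigitalOcean personal access token, supplied via the environment.
            String token = System.getenv("DO_TOKEN");

            HttpHeaders headers = new HttpHeaders();
            headers.setContentType(MediaType.APPLICATION_JSON);
            headers.setBearerAuth(token);

            // One small droplet to act as a JMeter server; the name and the
            // region/size/image slugs are illustrative placeholders.
            Map<String, Object> request = Map.of(
                    "name", "jmeter-server-01",
                    "region", "nyc3",
                    "size", "s-1vcpu-1gb",
                    "image", "ubuntu-22-04-x64");

            RestTemplate rest = new RestTemplate();
            Map<?, ?> response = rest.postForObject(
                    DROPLETS_URL, new HttpEntity<>(request, headers), Map.class);

            // The response echoes back the new droplet's ID, status, and so on.
            System.out.println(response);
        }
    }

The program in the talk does far more than a single call like this, but every
interaction with DO ultimately boils down to authenticated REST requests of
this shape.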
