October 2020 - Browser Bots

Up until now you’ve been load testing at the HTTP protocol layer. Your Loadster scripts consisted of a sequence of HTTP/S requests, chained together, with validators and capturers bolted on when necessary. This approach made testing pretty much any web application possible, but not always easy.

For some web apps, testing at the protocol layer is actually quite difficult, especially when you have to capture parameters from a server response and use them in a subsequent request. Scripting OAuth and SAML flows at the protocol layer can be particularly cumbersome, because these flows require you to capture and regurgitate a lot of special tokens.
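If you’ve never scripted one of these flows by hand, here’s a rough sketch of the capture-and-replay dance a protocol-level script ends up doing. It’s plain TypeScript with fetch, purely for illustration (it isn’t Loadster’s script format, and the URL and field names are made up):

```typescript
// Illustrative only: a hand-rolled protocol-level login flow in plain TypeScript.
// The URL and the csrf_token field name are hypothetical.

async function protocolLevelLogin(): Promise<string> {
  // Step 1: request the login page and capture the hidden token from the
  // response body (the kind of "capture" a protocol script has to do).
  const loginPage = await fetch("https://example.com/login");
  const html = await loginPage.text();
  const tokenMatch = html.match(/name="csrf_token" value="([^"]+)"/);
  if (!tokenMatch) {
    throw new Error("Validation failed: no csrf_token found in response");
  }

  // Step 2: replay the captured token in the next request, exactly as a
  // browser would have done for us automatically.
  const loginResponse = await fetch("https://example.com/login", {
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: new URLSearchParams({
      username: "testuser",
      password: "secret",
      csrf_token: tokenMatch[1],
    }),
  });

  // Step 3: validate the response before moving on to the next request.
  if (loginResponse.status !== 200) {
    throw new Error(`Expected 200, got ${loginResponse.status}`);
  }
  return await loginResponse.text();
}
```

Every hidden token, cookie, and redirect has to be handled this explicitly, which is exactly the busywork that goes away at the browser level.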

The solution? Loadster’s new Browser Bots. Browser Bots make scripting such complicated interactions a whole lot easier, because they control real headless Chromium browsers.

Scripting with Browser Bots consists of user actions like navigate to a URL, click on an element, type text into an input field, and so on. You don’t have to worry about every individual HTTP request, because the browser figures that out for you. Automation at the browser level especially shines for testing complicated web applications with a lot of client-side logic or convoluted authentication flows.
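For comparison, here’s roughly what the same login looks like when a real browser does the work. Again, this is just an illustrative sketch using Playwright against headless Chromium, not Loadster’s Browser Bot actions, and the URL and selectors are hypothetical:

```typescript
// Illustrative only: driving headless Chromium with Playwright.
// The URL and selectors are hypothetical.
import { chromium } from "playwright";

async function browserLevelLogin(): Promise<void> {
  const browser = await chromium.launch({ headless: true });
  const page = await browser.newPage();

  // Navigate, type, and click -- the browser issues every underlying HTTP
  // request for you, including any hidden tokens, cookies, and redirects.
  await page.goto("https://example.com/login");
  await page.fill("#username", "testuser");
  await page.fill("#password", "secret");
  await page.click("button[type=submit]");

  // Wait for the post-login page instead of validating raw HTTP responses.
  await page.waitForSelector("#dashboard");

  await browser.close();
}
```

The script only describes what a user does; the token capturing from the protocol-level sketch above simply disappears.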

What about the old v-users or virtual users? They aren’t going away, but from now on we’ll call them Protocol Bots. Protocol Bots are still ideal for load testing HTTP APIs (REST, GraphQL, etc.) because they give you precise control over each request. They generally work fine for simple static sites too. And testing with Protocol Bots, when practical, is very affordable: they only consume 1/4 as much Loadster Fuel as Browser Bots.

In short, every time you create a script, you’re free to choose the right bots for the job: Protocol Bots when you want to work at the protocol layer, and Browser Bots when you want to automate real browsers.