The tests are executed on a MacBook Pro M1 with 16 GB of RAM. The mock backend service is a single high-performance HTTP server written in Go using the fasthttp framework.
The runtime versions are:
Node.js v19.6.1
Deno v1.30.3
Bun v0.5.6
The test runs for one minute. The total number of completed fetch requests is divided by 60 to get the final ops/second.
To keep the article size limited, I’m comparing hello world & JSON processing cases here. A follow-up article will deal with file upload, download, URL encoded data, and multipart/form-data.
For the hello world case, the server returns a simple hello world string. While parsing the body is not technically part of the fetch API, the ultimate goal for most users is to get the response body (if there is any). For this case, the response body is parsed using the await resp.text() API.
The following is the test code that runs in the three runtimes:
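The original listing isn't reproduced here, so below is a minimal, self-contained sketch of what such a benchmark loop looks like. Assumptions are flagged in the comments: the article's real backend is a separate Go/fasthttp server on localhost:3000, whereas this sketch spins up a throwaway Node.js HTTP server on a dynamic port so it runs anywhere, and the duration is shortened from 60 seconds to 1 second for demonstration.

```javascript
// Sketch of the fetch benchmark loop (not the article's exact code).
// Assumptions: a throwaway in-process server stands in for the article's
// Go/fasthttp backend, and the run is shortened to 1 s instead of 60 s.
import http from "node:http";

const server = http.createServer((req, res) => res.end("hello world"));
await new Promise((resolve) => server.listen(0, resolve));
const url = `http://localhost:${server.address().port}`; // article uses :3000

const DURATION_MS = 1_000; // the article runs for 60_000 ms (one minute)
let completed = 0;
const start = Date.now();
while (Date.now() - start < DURATION_MS) {
  const resp = await fetch(url);
  await resp.text(); // parse the body, as in the article's test
  completed++;
}
console.log(`${Math.round(completed / (DURATION_MS / 1000))} ops/sec`);

server.close();
server.closeAllConnections?.(); // drop keep-alive sockets so the process exits
```

The same loop runs unchanged in Node.js, Deno, and Bun, since all three ship a global fetch; only the measured throughput differs.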
The following chart shows the fetch ops/second for a 1-minute test:
Bun comes out as the winner, performing much faster than both Node.js and Deno. Node.js is the slowest of all: Bun executes about three times as many fetch ops per second as Node.js.
Before moving ahead: one might argue that response body processing shouldn't be part of the test. Let's do a run without it. Here is the code now:
await fetch("http://localhost:3000");
The following chart shows the fetch ops/second for a 1-minute test:
The difference is negligible. Bun still leads by a big margin.
Winner for hello world: Bun
For the JSON processing case, the mock backend server echoes the JSON request body back, which makes it easy to test both sending and receiving JSON data. To keep things simple, I'll run the test for two JSON sizes:
The first JSON contains a simple array of three objects: