Throttling Outgoing Requests in Node.js and .NET Core
I have published two articles recently on a problem I ran into while working on Keep Track of My Games. In order to sync users' Steam collections, I have to call the Steam Web API.

The Steam Web API implements rate limiting, meaning that if you call it too many times too quickly it returns an HTTP 429 Too Many Requests response. According to the terms of use, the rate limit is 100,000 requests per day, which is pretty generous. But if you're thinking of syncing 2,000 users every 15 minutes, that's 2,000 × 96 = 192,000 requests per day, nearly double the limit! So you need a throttling mechanism to defer processing once you reach the limit. In most scenarios like this, public APIs return useful HTTP headers that tell you your current request count, but the Steam API (which is a bit dated) does no such thing.
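Since the Steam API gives you no headers to consult, the simplest fallback when you do hit a 429 is to wait a fixed interval and retry. Here's a rough sketch of that idea; the `withDeferral` helper and its parameters are hypothetical names for illustration, not part of any library mentioned here:

```typescript
// Hypothetical sketch: retry a request that may be rate limited.
// The request function is injected so the deferral logic stays testable.
async function withDeferral<T>(
  request: () => Promise<{ status: number; body?: T }>,
  delayMs = 60_000,
  maxAttempts = 3,
  sleep: (ms: number) => Promise<void> = (ms) =>
    new Promise((resolve) => setTimeout(resolve, ms)),
): Promise<T | undefined> {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    const res = await request();
    if (res.status !== 429) return res.body;
    // No rate-limit headers to consult, so wait a fixed interval and retry.
    if (attempt < maxAttempts) await sleep(delayMs);
  }
  throw new Error("still rate limited after " + maxAttempts + " attempts");
}
```

Blind retries like this are a last resort, though; it's better not to hit the limit in the first place, which is where throttling comes in.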

There are a few ways to rate limit or throttle outgoing requests to an API like this, but most approaches don't work with clustering, i.e. multiple isolated client processes. In-memory approaches like .NET's SemaphoreSlim or the limiter npm package don't cut it, because each process only counts its own requests. You need a backing store to coordinate request counts across the cluster. The bottleneck npm package supports this, but only with Redis as the store. Since I don't use Redis (and I'm using C#), that wasn't an option for me. Instead I turned to RavenDB for the solution, and it's been working out well!

I wrote up two guides on achieving this using RavenDB, one for .NET and one for Node.js, so if you're curious how I solved the problem then check them out!