How To Use Rate Limiting In ASP.NET Core

Thank you to our sponsors who keep this newsletter free to the reader:

Today's issue is sponsored by Rebus Pro. Rebus is a free .NET "service bus", and Rebus Pro is the perfect one-up for serious Rebus users. Use Fleet Manager to get Slack alerts when something fails and retry dead-lettered messages with a click of the mouse.

And by Hasura, an open-source engine that gives instant GraphQL and REST APIs on new or existing SQL Server, enabling teams to ship apps faster.

Rate limiting is a technique to limit the number of requests to a server or an API.

A limit is introduced within a given time period to prevent server overload and protect against abuse.

In ASP.NET Core 7, we have a built-in rate limiter middleware that's easy to integrate into your API.

We're going to cover four rate limiter algorithms:

  • Fixed window
  • Sliding window
  • Token bucket
  • Concurrency

Let's see how we can work with rate limiting.

What Is Rate Limiting?

Rate limiting is about restricting the number of requests to an API, usually within a specific time window or based on other criteria.

This is practical for a few reasons:

  • Prevents overloading of servers or applications
  • Improves security and guards against DDoS attacks
  • Reduces costs by preventing unnecessary resource usage

In a multi-tenant application, each user can be limited to a specific number of API requests.
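
To make that concrete, here's a minimal sketch of per-user limiting with a global limiter, using the configuration API we'll set up in the next section. The partition key (the authenticated user name, with anonymous traffic sharing one bucket) and the 100-requests-per-minute limit are assumptions for illustration:

// Requires: using System.Threading.RateLimiting;
builder.Services.AddRateLimiter(options =>
{
    options.GlobalLimiter = PartitionedRateLimiter.Create<HttpContext, string>(httpContext =>
        RateLimitPartition.GetFixedWindowLimiter(
            // Hypothetical partition key: one limiter per authenticated user,
            // anonymous requests share a single partition.
            partitionKey: httpContext.User.Identity?.Name ?? "anonymous",
            factory: _ => new FixedWindowRateLimiterOptions
            {
                PermitLimit = 100,
                Window = TimeSpan.FromMinutes(1)
            }));
});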

Configuring Rate Limiting

ASP.NET Core 7 introduced built-in rate limiting middleware in the Microsoft.AspNetCore.RateLimiting namespace.

To add rate limiting to your application, you first need to register the rate limiting services:

builder.Services.AddRateLimiter(options =>
{
    options.RejectionStatusCode = StatusCodes.Status429TooManyRequests;

    // We'll talk about adding specific rate limiting policies later.
});

I suggest changing the RejectionStatusCode to 429 (Too Many Requests), because it tells the client exactly what happened. The default value is 503 (Service Unavailable).
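
If you want more control over the rejected response, you can also set the OnRejected callback. Here's a minimal sketch that adds a Retry-After header when the limiter exposes that metadata (the response body text is my own placeholder):

// Requires: using System.Threading.RateLimiting;
builder.Services.AddRateLimiter(options =>
{
    options.RejectionStatusCode = StatusCodes.Status429TooManyRequests;

    options.OnRejected = async (context, cancellationToken) =>
    {
        // Some limiters report how long the client should wait before retrying.
        if (context.Lease.TryGetMetadata(MetadataName.RetryAfter, out var retryAfter))
        {
            context.HttpContext.Response.Headers["Retry-After"] =
                ((int)retryAfter.TotalSeconds).ToString();
        }

        await context.HttpContext.Response.WriteAsync(
            "Too many requests. Please try again later.", cancellationToken);
    };
});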

And you also have to apply the RateLimitingMiddleware:

app.UseRateLimiter();

That's everything you'll need.
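
One caveat: when you attach policies to specific endpoints (with RequireRateLimiting or the EnableRateLimiting attribute, shown later), UseRateLimiter has to be called after UseRouting. A minimal ordering sketch for a controllers app:

app.UseRouting();

// Endpoint-specific rate limiting needs routing information,
// so the rate limiter runs after routing.
app.UseRateLimiter();

app.MapControllers();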

Let's see the rate limiting algorithms we can use.

Fixed Window Limiter

The AddFixedWindowLimiter method configures a fixed window limiter.

The Window value determines the time window.

When a time window expires, a new one starts, and the request limit is reset.

builder.Services.AddRateLimiter(rateLimiterOptions =>
{
    rateLimiterOptions.AddFixedWindowLimiter("fixed", options =>
    {
        // Allow up to 10 requests per 10-second window.
        options.PermitLimit = 10;
        options.Window = TimeSpan.FromSeconds(10);

        // Queue up to 5 extra requests and serve them oldest-first
        // when permits free up; anything beyond that is rejected.
        options.QueueProcessingOrder = QueueProcessingOrder.OldestFirst;
        options.QueueLimit = 5;
    });
});

Sliding Window Limiter

The sliding window algorithm is similar to the fixed window, but it introduces segments in a window.

Here's how it works:

  • Each time window is divided into multiple segments
  • The window slides forward one segment every segment interval
  • The segment interval is (window time) / (segments per window)
  • When a segment expires, the requests taken in that segment are added back to the current segment's available permits

For example, with a 10-second window and 2 segments per window, the window slides every 5 seconds.

The AddSlidingWindowLimiter method configures a sliding window limiter.

builder.Services.AddRateLimiter(rateLimiterOptions =>
{
    rateLimiterOptions.AddSlidingWindowLimiter("sliding", options =>
    {
        options.PermitLimit = 10;
        options.Window = TimeSpan.FromSeconds(10);
        options.SegmentsPerWindow = 2;
        options.QueueProcessingOrder = QueueProcessingOrder.OldestFirst;
        options.QueueLimit = 5;
    });
});

Token Bucket Limiter

The token bucket algorithm is similar to the sliding window, but instead of adding back the requests from the expired segment, a fixed number of tokens are added after each replenishment period.

The total number of tokens can never exceed the token limit.

The AddTokenBucketLimiter method configures a token bucket limiter.

builder.Services.AddRateLimiter(rateLimiterOptions =>
{
    rateLimiterOptions.AddTokenBucketLimiter("token", options =>
    {
        options.TokenLimit = 100;
        options.QueueProcessingOrder = QueueProcessingOrder.OldestFirst;
        options.QueueLimit = 5;
        options.ReplenishmentPeriod = TimeSpan.FromSeconds(10);
        options.TokensPerPeriod = 20;
        options.AutoReplenishment = true;
    });
});

When AutoReplenishment is true, an internal timer fires every ReplenishmentPeriod and adds TokensPerPeriod tokens to the bucket. If you set it to false, you're responsible for calling TryReplenish on the limiter yourself.

Concurrency Limiter

The concurrency limiter is the most straightforward algorithm, and it just limits the number of concurrent requests.

The AddConcurrencyLimiter method configures a concurrency limiter.

builder.Services.AddRateLimiter(rateLimiterOptions =>
{
    rateLimiterOptions.AddConcurrencyLimiter("concurrency", options =>
    {
        options.PermitLimit = 10;
        options.QueueProcessingOrder = QueueProcessingOrder.OldestFirst;
        options.QueueLimit = 5;
    });
});

There's no time component involved in this case. Each request takes one permit when it starts and releases it when it completes, so the main parameters are the number of concurrent requests and the queue size.

Using Rate Limiting In Your API

Now that we have configured our rate limiting policies, let's see how we can use them in our API.

There are slight differences between controllers and minimal API endpoints, so I'll cover them in separate examples.

Controllers

To add rate limiting to a controller, we use the EnableRateLimiting and DisableRateLimiting attributes.

EnableRateLimiting can be applied on the controller or on the individual endpoints.

[ApiController]
[Route("api/[controller]")]
[EnableRateLimiting("fixed")]
public class TransactionsController : ControllerBase
{
    private readonly ISender _sender;

    public TransactionsController(ISender sender)
    {
        _sender = sender;
    }

    [HttpGet]
    [EnableRateLimiting("sliding")]
    public async Task<IActionResult> GetTransactions()
    {
        return Ok(await _sender.Send(new GetTransactionsQuery()));
    }

    [HttpGet("{id}")]
    [DisableRateLimiting]
    public async Task<IActionResult> GetTransactionById(int id)
    {
        return Ok(await _sender.Send(new GetTransactionByIdQuery(id)));
    }
}

In the previous example:

  • All endpoints in the TransactionsController will use a fixed window policy
  • The GetTransactions endpoint will use a sliding window policy
  • The GetTransactionById endpoint won't have any rate limiting applied

Minimal APIs

In a Minimal API endpoint you can configure the rate limit policy by calling RequireRateLimiting and specifying the policy name.

We're using the token bucket policy in this example.

app.MapGet("/transactions", async (ISender sender) =>
{
    return Results.Ok(await sender.Send(new GetTransactionsQuery()));
})
.RequireRateLimiting("token");
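
You can also exclude a Minimal API endpoint, mirroring the DisableRateLimiting attribute on controllers. A small sketch, reusing the GetTransactionByIdQuery from the controller example:

app.MapGet("/transactions/{id}", async (int id, ISender sender) =>
{
    return Results.Ok(await sender.Send(new GetTransactionByIdQuery(id)));
})
.DisableRateLimiting();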

Closing Thoughts

It's great that we can quickly introduce rate limiting in ASP.NET Core.

You can choose from one of the existing rate limiter algorithms:

  • Fixed window
  • Sliding window
  • Token bucket
  • Concurrency

I'm excited to try out rate limiting in my projects.

That's all for today.

Have an awesome Saturday!


Whenever you're ready, there are 4 ways I can help you:

  1. (COMING SOON) REST APIs in ASP.NET Core: You will learn how to build production-ready REST APIs using the latest ASP.NET Core features and best practices. It includes a fully functional UI application that we'll integrate with the REST API. Join the waitlist!
  2. Pragmatic Clean Architecture: Join 3,600+ students in this comprehensive course that will teach you the system I use to ship production-ready applications using Clean Architecture. Learn how to apply the best practices of modern software architecture.
  3. Modular Monolith Architecture: Join 1,600+ engineers in this in-depth course that will transform the way you build modern systems. You will learn the best practices for applying the Modular Monolith architecture in a real-world scenario.
  4. Patreon Community: Join a community of 1,000+ engineers and software architects. You will also unlock access to the source code I use in my YouTube videos, early access to future videos, and exclusive discounts for my courses.
  5. Promote yourself to 60,000+ subscribers by sponsoring this newsletter.

Become a Better .NET Software Engineer

Join 60,000+ engineers who are improving their skills every Saturday morning.