Configuring a response
There are several overloads to configure a response. Here are just a few:
mockHttp
    .When(...)
    .Respond(with => with
        .StatusCode(200)
        .Body("text data")
        .ContentType("text/plain", Encoding.UTF8)
    );
mockHttp
    .When(...)
    .Respond((context, cancellationToken) => new HttpResponseMessage(HttpStatusCode.BadRequest));
mockHttp
    .When(...)
    .Respond(with => with
        .StatusCode(200)
        .JsonBody(new Person { FullName = "John Doe" }) // Requires 'skwas.MockHttp.Json' package
    );
To throw an exception in response to a request:
mockHttp
    .When(...)
    .Throws<InvalidOperationException>();
To simulate network latency, there are a few options.
The simplest is to introduce an artificial delay using the Latency() extension. When the mock handler handles the request, it waits (via Task.Delay()) for the specified amount of time before returning the configured response.
using static MockHttp.NetworkLatency;

mockHttp
    .When(...)
    .Respond(with => with
        .StatusCode(200)
        .Latency(ThreeG)
    );
// Or use one of the other overloads, e.g.:
mockHttp
    .When(...)
    .Respond(with => with
        .StatusCode(200)
        .Latency(Around(TimeSpan.FromMilliseconds(100)))
    );
For convenience, the NetworkLatency helper type includes several functions that add some variance to the latency (OneG, TwoG, ThreeG, FourG and FiveG), but you can of course also specify the delay yourself with TimeSpans, using Around() and Between().
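For instance, Between() could be used to randomize the delay within a range. This is a sketch; the two-TimeSpan overload shown here is assumed from the description above:

using static MockHttp.NetworkLatency;

mockHttp
    .When(...)
    .Respond(with => with
        .StatusCode(200)
        // Assumed overload: a random latency somewhere between 100 and 300 milliseconds.
        .Latency(Between(TimeSpan.FromMilliseconds(100), TimeSpan.FromMilliseconds(300)))
    );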
Alternatively, you can simulate slow network transfer rates. The RateLimitedStream wrapper limits the rate at which bytes can be read from an underlying stream to a specified bit rate.
You can use the stream directly to wrap another stream:
using Stream stream = ...; // A big stream

mock.When(...)
    .Respond(with => with
        .Body(() => new RateLimitedStream(stream, 512_000)) // Rate limited to 512 kbps
    );
Or use the helper extension, which works with any type of content:
byte[] data = ...;

mock.When(...)
    .Respond(with => with
        .Body(data)
        .TransferRate(512_000) // Rate limited to 512 kbps
    );
Tip: you can of course combine latency and transfer rate, as shown below.
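For example, the two builder extensions shown above could be combined on a single response (a sketch; the exact values are illustrative):

byte[] data = ...;

mock.When(...)
    .Respond(with => with
        .StatusCode(200)
        .Body(data)
        .Latency(ThreeG)       // Simulated network latency before the response starts
        .TransferRate(512_000) // Rate limited to 512 kbps
    );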
For more complex response configurations, or for reusability, implement IResponseStrategy.
public class MyResponseStrategy : IResponseStrategy
{
    public Task<HttpResponseMessage> ProduceResponseAsync(MockHttpRequestContext requestContext, CancellationToken cancellationToken)
    {
        // Custom response logic, e.g.:
        return Task.FromResult(new HttpResponseMessage(HttpStatusCode.OK));
    }
}
mockHttp
    .When(...)
    .RespondUsing(new MyResponseStrategy());
An added benefit is that it helps keep the unit tests themselves clean.
Multiple responses can be configured to form a sequence. This is useful when a request is expected to occur multiple times but should yield different responses.
Example use cases:
- testing resilience by tripping circuit breaker/retry logic and only succeeding after the nth request
- scrolling/paginating APIs that return a subset of a larger list of data
The Respond, RespondUsing and Throws response configuration extensions can all be chained to form a sequence.
mockHttp
    .When(...)
    .Respond(with => with.StatusCode(HttpStatusCode.BadGateway))
    .Respond(with => with.ClientTimeout(TimeSpan.FromMilliseconds(500))) // TaskCanceledException after 500 milliseconds
    .Respond(with => with.StatusCode(HttpStatusCode.OK))
    .Throws<HttpRequestException>()
    .Respond(with => with.StatusCode(HttpStatusCode.OK))
    .RespondUsing(new MyResponseStrategy())
    .Respond(with => with.StatusCode(HttpStatusCode.OK));
The last configured response will be repeated if more requests are executed.
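For instance, with a two-response sequence every request after the first receives the last configured response (a sketch using the same builder API as above):

mockHttp
    .When(...)
    .Respond(with => with.StatusCode(HttpStatusCode.NotFound))
    .Respond(with => with.StatusCode(HttpStatusCode.OK));

// 1st request -> 404 Not Found
// 2nd request -> 200 OK
// 3rd request -> 200 OK (the last configured response repeats)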