max•8mo ago

Deno.serve: Is it possible to flush a streaming response?

I've been playing around with this: https://docs.deno.com/examples/http-server-streaming. I'm coming from Go, where I'm used to being able to flush the HTTP response to the socket, which allows lots of control over when bytes are sent over the wire (like so: https://robpike.io/). Is there any mechanism to prompt Deno to write to the socket? Maybe another streaming response type? Thanks!
2 Replies
Deno AI Helper•8mo ago
I wasn't able to find the answer to your question in the official Deno documentation. However, based on the information you've provided, I can offer some general advice.

In Deno, when you're working with streaming responses, you typically use the ReadableStream API to send data in chunks. The data is sent to the client as it's being read from the stream. However, there isn't a direct equivalent to Go's Flush method in Deno's standard library that I can reference from the official documentation.

If you're using the example from the Deno documentation for streaming responses, make sure you're following the pattern of using a ReadableStream to send data in chunks. Here's a simplified example of how you might send chunks of data:
const encoder = new TextEncoder();

Deno.serve({ port: 8000 }, (req) => {
  // A ReadableStream can only be read once, so build a fresh body per request.
  const body = new ReadableStream({
    start(controller) {
      // Response body chunks need to be bytes, so encode the strings.
      controller.enqueue(encoder.encode("First chunk of data,"));
      controller.enqueue(encoder.encode("second chunk of data,"));
      // ...
      controller.close();
    },
  });
  return new Response(body);
});
In this example, each call to controller.enqueue() sends a chunk of data to the client. However, this is not based on the Deno documentation, and I advise you to double-check the accuracy of this advice.

If you need more control over the timing of when bytes are sent over the wire, you might need to look into lower-level APIs or third-party libraries that could offer more granular control over the response stream. Remember to check the Deno third-party modules (https://deno.land/x) for any libraries that might provide the functionality you're looking for.
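As a very rough sketch (again, not taken from the documentation, so please verify it): if you enqueue chunks over time, each enqueued chunk is handed to the socket as soon as the client can read it, which plays roughly the role of Go's Flush. Something in the spirit of the robpike.io page might look like this:

const encoder = new TextEncoder();

Deno.serve(() => {
  let timer;
  const body = new ReadableStream({
    start(controller) {
      // Enqueue a line every second; each chunk is sent as the client reads it.
      timer = setInterval(() => {
        controller.enqueue(encoder.encode(`tick ${new Date().toISOString()}\n`));
      }, 1000);
    },
    cancel() {
      // The client disconnected, so stop producing data.
      clearInterval(timer);
    },
  });
  return new Response(body, {
    headers: { "content-type": "text/plain; charset=utf-8" },
  });
});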
dantheman•8mo ago
I find that Deno will flush the response continuously, as you give data to the response stream. And for emitting a streaming response I sometimes find it easier to yield chunks from an async generator function; here's a basic example: https://dash.deno.com/playground/sour-bobcat-21

I don't know what makes a browser show partial pages like the robpike example. Ah! It looks like Deno Deploy's default compression buffers output into chunks. I got the expected continuous output in Chrome after adding this response header: Cache-Control: no-transform

Hope this helps 🙂
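For reference, a rough sketch of the generator approach I mean (not the exact playground code, and it assumes a reasonably recent Deno that has ReadableStream.from), with the no-transform header added:

const encoder = new TextEncoder();

async function* ticker() {
  for (let i = 1; i <= 10; i++) {
    // Each yielded chunk is written to the response as soon as it is produced.
    yield encoder.encode(`tick ${i}\n`);
    await new Promise((resolve) => setTimeout(resolve, 1000));
  }
}

Deno.serve(() =>
  new Response(ReadableStream.from(ticker()), {
    headers: {
      "content-type": "text/plain; charset=utf-8",
      // Keeps Deno Deploy's compression from buffering the stream into chunks.
      "cache-control": "no-transform",
    },
  })
);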
