Deno / help
Properly continuously read data from a Reader

wulfey · 2/12/2023
What's the proper way to continuously read from a Deno.Reader? I'm trying to parse incoming HTTP2 frames from a Deno.TlsConn, but my current method of reading involves two sets of while loops: one to continuously perform the read operation, and the other to try to read everything currently available from the stream in chunks of 16k. Naturally, the while loops take up approximately one entire core of my processor, which is far from ideal. Can this be improved in any way? Is there a way to know (wait for a promise to resolve or such) when there is data available?
jeff.hykin · 2/12/2023
If you're using an await inside of your loops, it definitely won't be taking up a whole thread just waiting on data. Something like
const returnReader = stream.getReader()
let blocks = []
while (true) {
    const {value, done} = await returnReader.read()
    if (done) {
        break
    }
    blocks.push(value)
}
const string = new TextDecoder().decode(concatUint8Arrays(blocks))
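(Editor's note: the concatUint8Arrays helper referenced in the snippet above only appears later in the thread. A minimal sketch of such a helper, preallocating the result rather than spreading each chunk into a plain array, could look like this; the function name matches the chat, but this exact implementation is illustrative.)

```typescript
// Sketch of a concatUint8Arrays helper: sum the lengths, preallocate
// one result buffer, then copy each chunk in at its offset.
function concatUint8Arrays(arrays: Uint8Array[]): Uint8Array {
    const total = arrays.reduce((sum, a) => sum + a.length, 0);
    const out = new Uint8Array(total);
    let offset = 0;
    for (const a of arrays) {
        out.set(a, offset);
        offset += a.length;
    }
    return out;
}
```

This avoids the `push(...chunk)` spread approach, which can overflow the call stack for large chunks.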
wulfey · 2/12/2023
that is precisely what is going on
jeff.hykin · 2/12/2023
then you're good. The await is effectively triggered by a 'newDataAvailable' event, and it resumes the loop every time that event fires. It loops around, hits the await, and the thread goes to work on other stuff until the next 'newDataAvailable' event
wulfey · 2/12/2023
i see, thank you. though, how can i find out what's causing all that cpu usage?
jeff.hykin · 2/12/2023
That I'm not sure of; I would take a look at Deno profiling: https://deno.land/manual@v1.30.0/references/contributing/profiling
wulfey · 2/12/2023
thank you
would you mind if i asked another thing?
jeff.hykin · 2/12/2023
sure haha
wulfey · 2/12/2023
in the case that the data received happens to be exactly the size of my buffer (currently 16384 bytes), won't the next read in that while loop halt until newDataAvailable? i mean, i could test that myself, but in the case that is correct, how would i get around that?
yes, that is in fact correct actually
jeff.hykin · 2/12/2023
it should hit the break condition (unless I'm misunderstanding the question)
wulfey · 2/12/2023
oh, let me try to clarify
let eof = false;
while (true) {
    const buf = new Uint8Array(bufsize),
        read = await conn.read(buf);

    if (read === null) {
        eof = true;
        break;
    }

    buffers.push(buf.slice(0, read));

    if (read !== bufsize) break;
}
in the case the conn.read call returns bufsize (the read result was exactly bufsize long) but no more data than that is available, the while loop will continue, because it will try to check whether there is more data
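(Editor's note: one common way around this problem, sketched here rather than taken from the chat, is to stop trying to drain "everything available" per iteration and instead parse incrementally: append each read into a growing buffer and consume bytes only once a complete frame is present. Assuming HTTP/2-style framing, where a 9-byte header carries the payload length in its first 3 bytes, the extraction step could look like this; all names are illustrative.)

```typescript
// Sketch: incremental frame extraction from an accumulated byte buffer.
// Assumes HTTP/2-style framing: a 9-byte header whose first 3 bytes are
// the payload length, big-endian (RFC 9113 frame layout).
const HEADER_SIZE = 9;

function extractFrames(buffer: Uint8Array): { frames: Uint8Array[]; rest: Uint8Array } {
    const frames: Uint8Array[] = [];
    let offset = 0;
    while (buffer.length - offset >= HEADER_SIZE) {
        const payloadLen =
            (buffer[offset] << 16) | (buffer[offset + 1] << 8) | buffer[offset + 2];
        const frameEnd = offset + HEADER_SIZE + payloadLen;
        if (buffer.length < frameEnd) break; // incomplete frame: wait for more data
        frames.push(buffer.slice(offset, frameEnd));
        offset = frameEnd;
    }
    // Whatever is left is a partial frame, kept for the next read.
    return { frames, rest: buffer.slice(offset) };
}
```

With this pattern, a read that exactly fills the buffer is harmless: the loop simply awaits the next read, and any half-received frame stays in `rest` until more bytes arrive.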
jeff.hykin · 2/12/2023
I see. One way to sidestep the buffer size issue would be like this:
// simplified from: https://stackoverflow.com/questions/49129643/how-do-i-merge-an-array-of-uint8arrays
const concatUint8Arrays = (arrays) =>
    new Uint8Array(arrays.reduce((acc, curr) => (acc.push(...curr), acc), []))

const streamToString = async (stream) => {
    console.debug(`stream is:`, stream)
    const returnReader = stream.getReader()
    let blocks = []
    while (true) {
        const {value, done} = await returnReader.read()
        if (done) {
            break
        }
        blocks.push(value)
    }
    const string = new TextDecoder().decode(concatUint8Arrays(blocks))
    return string
}
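(Editor's note: a self-contained variant of the same getReader() loop, runnable without a real connection. The in-memory stream and its contents here are made up for demonstration; in Deno, a TlsConn's readable property would supply the stream instead.)

```typescript
// Sketch: the same reader loop against an in-memory ReadableStream.
async function streamToString(stream: ReadableStream<Uint8Array>): Promise<string> {
    const reader = stream.getReader();
    const blocks: Uint8Array[] = [];
    while (true) {
        // read() resolves when a chunk arrives or the stream closes;
        // no busy-waiting happens while awaiting it.
        const { value, done } = await reader.read();
        if (done || value === undefined) break;
        blocks.push(value);
    }
    // Join the chunks into one buffer before decoding.
    let total = 0;
    for (const b of blocks) total += b.length;
    const joined = new Uint8Array(total);
    let offset = 0;
    for (const b of blocks) {
        joined.set(b, offset);
        offset += b.length;
    }
    return new TextDecoder().decode(joined);
}

// Demo stream standing in for conn.readable.
const demo = new ReadableStream<Uint8Array>({
    start(controller) {
        controller.enqueue(new TextEncoder().encode("hello "));
        controller.enqueue(new TextEncoder().encode("world"));
        controller.close();
    },
});
```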
wulfey · 2/12/2023
but the next conn.read call will have no data available, so it would wait for new data to be available
do note that the Reader in question is a Deno.TlsConn
Deno.Reader.read(p: Uint8Array): Promise<number | null>
the only thing it returns is a number, or null on EOF (end of file / connection close)
though, only now i notice
jeff.hykin · 2/12/2023
I meant more that you don't have to specify a buffer size: just put all the non-null results into a regular array, then use concatUint8Arrays() to put them all together
wulfey · 2/12/2023
Deno.TlsConn has a readable property
i see, that method uses the actual "getReader"
jeff.hykin · 2/12/2023
Yeah sorry about that, the streamToString wouldn't directly work in your case
wulfey · 2/12/2023
it in theory should if i just do some replacing there
thank you for your help!!
jeff.hykin · 2/12/2023
no problem! I spent a long time trying to get readable streams working haha, so I'm happy to help someone else. The Deno devs were nice enough to help me when I needed it.
wulfey · 2/12/2023
thank you
have a great day :)
jeff.hykin · 2/12/2023
you too 👍
