Asaf • 8mo ago

Deno write large file

How can I convert this Node.js code to Deno?
import fs from 'node:fs';

function* largeArrayGenerator() {
  for (let i = 0; i < 1e9; i++) {
    yield i;
  }
}

const writeStream = fs.createWriteStream('largeArray-node.txt');

for (const i of largeArrayGenerator()) {
  const data = `${i}\n`;
  if (!writeStream.write(data)) {
    await new Promise(resolve => writeStream.once('drain', resolve));
  }
}
How can I write a large file in Deno and pause writing when the highWaterMark is reached, so I can apply draining?
6 Replies
Deno AI Helper • 8mo ago
In Deno, you can achieve functionality similar to Node's streams by using Deno's built-in writable streams. The documentation provides an example of using writable streams for file operations, which can be adapted to your use case. Here's how you can convert your Node.js code to Deno:
1. Use a generator function to create your large array.
2. Open a file for writing using Deno.open with the create and write options.
3. Get a writer from the file's writable property.
4. Write to the file using the writer's write method, which returns a promise that resolves when the write operation is complete.
5. Use the writer's ready property to wait until it is ready to write again, which is analogous to Node's drain event.
Here's an example based on the documentation:
// Define your generator function
function* largeArrayGenerator() {
  for (let i = 0; i < 1e9; i++) {
    yield i;
  }
}

// Open the file for writing
const file = await Deno.open("largeArray-deno.txt", {
  create: true,
  write: true,
});

// Get a writer from the file's writable stream
const writer = file.writable.getWriter();

// Create the encoder once, outside the loop
const encoder = new TextEncoder();

// Iterate over your large array
for (const i of largeArrayGenerator()) {
  const data = encoder.encode(`${i}\n`);
  // Write to the file
  await writer.write(data);
  // Wait for the writer to be ready before continuing
  await writer.ready;
}

// Close the writer, which also closes the file
await writer.close();
This code does not explicitly check for a highWaterMark or implement a drain mechanism like Node.js does. However, the await writer.ready line ensures that the writer is ready to handle more data, which serves a purpose similar to waiting for a drain event in Node.js.
Deno AI Helper • 8mo ago
Remember to run your Deno script with the --allow-write flag to grant permission for file-writing operations:
deno run --allow-write your_script.js
🦕 If you need more granular control over the backpressure mechanism, you might need to implement additional logic to handle the case where the writable stream's internal buffer is full. However, the example above should suffice for most use cases where you simply want to write to a file and wait as needed for the system to be ready to accept more data.
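A minimal sketch of that more granular control, assuming the same generator and output file as above: writer.desiredSize reports how much room remains in the stream's internal queue, and awaiting writer.ready once it is exhausted is the closest Web Streams analogue of waiting for Node's drain event.

// Sketch: manual backpressure via desiredSize, assuming largeArrayGenerator()
// from above. Individual chunk writes are not awaited; backpressure is
// applied only when the queue is full, and write errors surface on close().
const file = await Deno.open("largeArray-deno.txt", { create: true, write: true });
const writer = file.writable.getWriter();
const encoder = new TextEncoder();

for (const i of largeArrayGenerator()) {
  // desiredSize <= 0 means the internal queue has reached its highWaterMark
  if (writer.desiredSize !== null && writer.desiredSize <= 0) {
    await writer.ready; // comparable to waiting for Node's 'drain' event
  }
  // Enqueue without awaiting each chunk; rejections are reported by close()
  writer.write(encoder.encode(`${i}\n`)).catch(() => {});
}

await writer.close();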
crowlKats • 8mo ago
Two ways I'd do it:
await Deno.writeTextFile(
  "largeArray-node.txt",
  ReadableStream.from(largeArrayGenerator()).pipeThrough(
    new TransformStream({
      transform(chunk, controller) {
        controller.enqueue(`${chunk}\n`);
      },
    }),
  ),
);
or, if you already have an open file:
const file = await Deno.open("largeArray-node.txt", { write: true, create: true });

const readable = ReadableStream.from(largeArrayGenerator()).pipeThrough(
  new TransformStream({
    transform(chunk, controller) {
      controller.enqueue(`${chunk}\n`);
    },
  }),
).pipeThrough(new TextEncoderStream());

await readable.pipeTo(file.writable);
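Both versions handle backpressure for you: pipeTo and writeTextFile pull from the source only as fast as the file sink accepts data. If you want to tune the queue depth yourself, the TransformStream constructor also accepts queuing strategies; a sketch with illustrative highWaterMark values (the numbers are assumptions, not from the thread):

// Sketch: tuning queue depth with queuing strategies; the highWaterMark
// values below are illustrative assumptions.
const readable = ReadableStream.from(largeArrayGenerator()).pipeThrough(
  new TransformStream(
    {
      transform(chunk, controller) {
        controller.enqueue(`${chunk}\n`);
      },
    },
    new CountQueuingStrategy({ highWaterMark: 1 }),    // writable side
    new CountQueuingStrategy({ highWaterMark: 1024 }), // readable side
  ),
).pipeThrough(new TextEncoderStream());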
Asaf (OP) • 8mo ago
I tried both of the methods you recommended, but they are significantly slower compared to the Node version. Is there any way to make them as fast as the Node implementation?
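One plausible explanation for the gap (an assumption, not confirmed in the thread) is that the streaming versions push one tiny chunk per number through the pipeline, while the Node version coalesces writes in its internal buffer. Batching many lines into a single chunk before writing usually narrows the difference; a sketch, with an illustrative batch size:

// Sketch: batch lines into larger chunks to cut per-chunk overhead.
// The batch size of 65,536 lines is an illustrative assumption.
const file = await Deno.open("largeArray-node.txt", { write: true, create: true });
const writer = file.writable.getWriter();
const encoder = new TextEncoder();

let batch: string[] = [];
for (const i of largeArrayGenerator()) {
  batch.push(`${i}\n`);
  if (batch.length === 65_536) {
    await writer.write(encoder.encode(batch.join("")));
    batch = [];
  }
}
// Flush whatever is left over
if (batch.length > 0) {
  await writer.write(encoder.encode(batch.join("")));
}
await writer.close();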
Leokuma • 8mo ago
Sorry for my ignorance, but why is the TransformStream needed in this case?
crowlKats • 8mo ago
the generator generates numbers, so gotta turn them into strings, and since we want to add newlines, we add those as well

hm, i am not entirely sure
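For comparison, the TransformStream can be avoided entirely by doing the number-to-line conversion in a wrapping generator before it becomes a stream (a sketch, not from the thread):

// Sketch: map numbers to lines in a generator instead of a TransformStream.
function* lines() {
  for (const i of largeArrayGenerator()) {
    yield `${i}\n`;
  }
}

// Deno.writeTextFile also accepts a ReadableStream of strings
await Deno.writeTextFile("largeArray-node.txt", ReadableStream.from(lines()));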
