phoenisx · 3y ago

Deno Subprocess piping is slow when piping large data between processes

I could be wrong about this, but this is what I have been seeing on my M1 Mac. The following code snippet pipes the stdout of one process into another, and it works seamlessly for small JSON data:
import { copyN } from "https://deno.land/std/io/util.ts";

try {
  const catProcess = Deno.run({
    cmd: ["cat", "small-stats.json"],
    stdout: "piped",
  });
  const proc = Deno.run({
    cmd: ["grep", "search_text"],
    stdin: "piped",
    stdout: "piped",
  });

  // Copy up to 64 KiB from cat's stdout into grep's stdin.
  await copyN(catProcess.stdout, proc.stdin, 65536);
  console.log("Cat Process Status: ", await catProcess.status());
  proc.stdin.close();

  console.log("Proc Status: ", await proc.status());
  const decoder = new TextDecoder();
  const out = decoder.decode(await proc.output());
  console.log(out);
} catch (e) {
  console.error(e);
}
The program completely freezes when a JSON file of around 1 GB is piped through cat into the second subprocess. Am I doing something wrong here?
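In case it helps narrow this down, here is a minimal sketch of how I would expect a fully streaming pipe to look with the newer Deno.Command API (assuming Deno 1.31+; the file name is just a placeholder and I have not verified this against the 1 GB file). pipeTo() should apply backpressure between the two processes instead of copying a fixed 64 KiB:

const cat = new Deno.Command("cat", {
  args: ["large-stats.json"], // placeholder for the ~1 GB file
  stdout: "piped",
}).spawn();

const grep = new Deno.Command("grep", {
  args: ["search_text"],
  stdin: "piped",
  stdout: "piped",
}).spawn();

// Stream cat's stdout straight into grep's stdin; pipeTo() closes
// grep's stdin once cat's output is exhausted.
await cat.stdout.pipeTo(grep.stdin);
await cat.status; // make sure cat has exited

// Collect whatever grep printed and decode it.
const { stdout } = await grep.output();
console.log(new TextDecoder().decode(stdout));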