How to encrypt and decrypt files from S3 without loading the whole content into memory?
I have this basic example (example in the comments); I wonder if some of it can be used.
3 Replies
I wasn't able to find the answer to your question in the official Deno documentation. However, based on your message, I can provide some general advice.
When dealing with large files and trying to avoid loading the entire content into memory, you would typically use streams. Streams allow you to process data incrementally as it is being read or written, which is ideal for handling large files.
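For example, here is a minimal sketch of that streaming idea, assuming the object is downloaded through a presigned S3 URL (the URL and file name below are placeholders): the response body is a ReadableStream, so it can be piped straight to disk without ever buffering the whole file.

```ts
// Sketch: stream an S3 object to disk via a presigned URL (placeholder URL).
const presignedUrl = "https://example-bucket.s3.amazonaws.com/big-file.bin?signed";

const res = await fetch(presignedUrl);
if (!res.ok || !res.body) {
  throw new Error(`download failed: ${res.status}`);
}

// pipeTo moves one chunk at a time and closes the file when the body ends.
const file = await Deno.open("big-file.bin", { write: true, create: true });
await res.body.pipeTo(file.writable);
```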
In Deno, you can use the reader and writer interfaces along with the copy function from the std/streams module to work with streams. If you have an example in the comments, please ensure that it uses streaming methods to read from and write to S3.
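A short sketch of that copy approach, assuming an older std release where copy still takes a Deno.Reader and a Deno.Writer (the pinned version and file names are placeholders; newer std versions lean on web streams instead):

```ts
import { copy } from "https://deno.land/std@0.177.0/streams/copy.ts";

const src = await Deno.open("input.bin", { read: true });                 // Deno.Reader
const dst = await Deno.open("output.bin", { write: true, create: true }); // Deno.Writer

await copy(src, dst); // moves data in fixed-size chunks, never the whole file
src.close();
dst.close();
```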
Please note that this advice is not based on the Deno documentation, and I would advise you to double-check its accuracy. If you're using a third-party library for S3 interactions, make sure to consult the library's documentation for streaming support.

The Web Crypto API that Deno uses has no support for streams, so I don't think it's possible with basic crypto.subtle. You could break the data into chunks and encrypt each chunk separately, but then you are basically creating your own file format that only your code understands. Chunking can also potentially weaken the encryption if you don't handle IV generation and auth tags properly.
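A minimal sketch of that chunking idea, assuming the key and the input/output streams come from elsewhere; each record is written as [4-byte length][12-byte IV][ciphertext + tag], which is exactly the kind of ad-hoc format only your own decryptor will understand:

```ts
async function encryptChunked(
  input: ReadableStream<Uint8Array>,
  output: WritableStream<Uint8Array>,
  key: CryptoKey, // an AES-GCM key, e.g. from crypto.subtle.generateKey
) {
  const writer = output.getWriter();
  for await (const chunk of input) {
    const iv = crypto.getRandomValues(new Uint8Array(12)); // fresh IV per chunk
    const ciphertext = new Uint8Array(
      await crypto.subtle.encrypt({ name: "AES-GCM", iv }, key, chunk),
    );
    const lengthPrefix = new Uint8Array(4);
    new DataView(lengthPrefix.buffer).setUint32(0, ciphertext.byteLength);
    await writer.write(lengthPrefix); // record length
    await writer.write(iv);           // per-chunk IV
    await writer.write(ciphertext);   // includes the 16-byte GCM auth tag
  }
  await writer.close();
}
```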
I would look for an encryption library that can handle streams. Node's crypto module supports streams, so maybe Deno's Node compatibility is far enough along to cover that.
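If that compatibility is there, it would look roughly like this (a sketch only, not verified against Deno's current node:crypto coverage; the file names and the aes-256-cbc choice are assumptions):

```ts
// Cipheriv is a Transform stream, so it can sit between a source and a sink.
import { createCipheriv, randomBytes } from "node:crypto";
import { createReadStream, createWriteStream } from "node:fs";
import { pipeline } from "node:stream/promises";

const key = randomBytes(32); // in practice, derive and store this key properly
const iv = randomBytes(16);

await pipeline(
  createReadStream("plain.bin"),          // could be an S3 body stream instead
  createCipheriv("aes-256-cbc", key, iv), // encrypts chunk by chunk
  createWriteStream("encrypted.bin"),
);
```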