Bairdy · 2y ago

Is anyone working on LLMs for Deno so I don't have to learn it in disgusting Python?

I'm coming to terms with the reality of having to learn this technology, but not in Python... I mean, very much hopefully not in Python.
5 Replies
Unknown User · 2y ago
Message Not Public
Bairdy (OP) · 2y ago
Amazing! Thanks for the tip!
Snabel · 2y ago
You could use https://github.com/denosaurs/deno_python and that way have access to the whole Python ecosystem of LLMs and ML while in the comfy and familiar confines of Deno. Here's a Falcon-7B example running locally in Deno! Just run

pip install git+https://www.github.com/huggingface/transformers git+https://github.com/huggingface/accelerate bitsandbytes einops torch

and then the following program:
import { python, kw } from "https://deno.land/x/python/mod.ts";

// Import Python modules through deno_python's embedded interpreter.
const torch = python.import("torch");
const transformers = python.import("transformers");
const { AutoTokenizer } = transformers;

const model = "tiiuae/falcon-7b";

const tokenizer = AutoTokenizer.from_pretrained(model);

// Python keyword arguments are passed with the kw tagged template.
const pipeline = transformers.pipeline(
  "text-generation",
  kw`model=${model}`,
  kw`tokenizer=${tokenizer}`,
  kw`torch_dtype=${torch.bfloat16}`,
  kw`trust_remote_code=${true}`,
  kw`device_map=${"auto"}`,
);

const sequences = pipeline(
  "Girafatron is obsessed with giraffes, the most glorious animal on the face of this Earth. Giraftron believes all other animals are irrelevant when compared to the glorious majesty of the giraffe.\nDaniel: Hello, Girafatron!\nGirafatron:",
  kw`max_length=${200}`,
  kw`do_sample=${true}`,
  kw`top_k=${10}`,
  kw`num_return_sequences=${1}`,
  kw`eos_token_id=${tokenizer.eos_token_id}`,
);

for (const sequence of sequences) {
  console.log(`Result: ${sequence.get_attr("generated_text")}`);
}
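To actually launch the script, Deno needs FFI access so deno_python can embed the CPython interpreter. A sketch of the invocation, assuming the script is saved as falcon.ts (the filename and the exact flag set are assumptions; on older Deno versions FFI also required --unstable):

```shell
# Install the Python-side dependencies first (same command as above):
pip install git+https://www.github.com/huggingface/transformers git+https://github.com/huggingface/accelerate bitsandbytes einops torch

# deno_python loads libpython via FFI, so --allow-ffi is required;
# --allow-env lets it locate your Python installation.
deno run --allow-ffi --allow-env --unstable falcon.ts
```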
mattvr · 2y ago
You can also check out my Deno project ShellGPT, which has some easy-to-use wrappers for OpenAI calls: https://github.com/mattvr/ShellGPT/blob/main/lib/ai.ts
ioB · 2y ago
There's always https://deno.land/x/openai@1.4.1 if you need to make OpenAI calls
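And if you'd rather avoid a dependency entirely, Deno's built-in fetch is enough to call the chat completions endpoint directly. A minimal sketch (my own, not taken from either project; the model name is just an example, and chatRequestInit is a hypothetical helper):

```typescript
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

// Build the fetch options for a chat completion request.
// Kept as a pure function so the payload is easy to inspect and test
// without touching the network.
function chatRequestInit(
  apiKey: string,
  messages: ChatMessage[],
  model = "gpt-3.5-turbo",
) {
  return {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Authorization": `Bearer ${apiKey}`,
    },
    body: JSON.stringify({ model, messages }),
  };
}

// Usage (needs a real key and network access):
// const res = await fetch(
//   "https://api.openai.com/v1/chat/completions",
//   chatRequestInit(Deno.env.get("OPENAI_API_KEY")!, [
//     { role: "user", content: "Say hi in five words." },
//   ]),
// );
// const data = await res.json();
// console.log(data.choices[0].message.content);
```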
