Examples:

- add.js
- llama.js
- mnist-inference.js
- mnist-training.js
Install with

```shell
npm install @jsgrad/jsgrad
```
Usage

```javascript
import { Tensor } from '@jsgrad/jsgrad'

console.log(await new Tensor([2, 2, 2]).add(5).tolist())
```
We are prioritizing web runtimes like WebGPU and WASM first, but you can still use any tinygrad runtime through the CLOUD device.
Our CLOUD device is compatible with tinygrad's CLOUD: start a tinygrad CLOUD server, set the jsgrad device to
CLOUD:https://url_to_your_gpu.com
and the computation runs on any tinygrad runtime behind that server.
With CLOUD you could publish your models as a website, and users could bring their own GPU (or buy GPU time) to run them on your site.
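As a sketch, the device string above is just the CLOUD: prefix joined to your server's URL. The helper below is purely illustrative (it is not part of the jsgrad API) and shows one way a site might build that string from a user-supplied server URL:

```javascript
// Illustrative helper (not part of the jsgrad API): build a jsgrad CLOUD
// device string from the URL of a running tinygrad CLOUD server.
function cloudDevice(url) {
  // Strip a trailing slash so the device string stays canonical.
  return `CLOUD:${url.replace(/\/$/, '')}`
}

console.log(cloudDevice('https://url_to_your_gpu.com/'))
// → CLOUD:https://url_to_your_gpu.com
```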