
Fast and Simple
ML for JS

Based on tinygrad, 0 dependencies,
runs in web, Node, Deno and Bun.

Examples: add.js, llama.js, mnist-inference.js, mnist-training.js

Getting started

Install with

npm install @jsgrad/jsgrad

Usage

import { Tensor } from '@jsgrad/jsgrad'

console.log(await new Tensor([2, 2, 2]).add(5).tolist())

Runtimes

We are prioritizing web runtimes like WebGPU and WASM first, but you can still use any tinygrad runtime through the CLOUD device.

WEBGPU, CLANG, JS, WASM, CLOUD, AMD, METAL, CUDA, GPU, DSP, HIP, LLVM, NV, QCOM
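Since jsgrad targets web, Node, Deno, and Bun, a common first step is picking a runtime based on where the code is executing. This is a minimal sketch of that logic; `pickDevice` is a hypothetical helper, not part of the jsgrad API, and the exact way jsgrad selects devices may differ.

```javascript
// Hypothetical helper: pick a jsgrad device string based on the host
// environment. `pickDevice` is illustrative, not a real jsgrad export.
function pickDevice() {
  // Browsers (and some runtimes) expose WebGPU via navigator.gpu.
  if (typeof navigator !== 'undefined' && 'gpu' in navigator) return 'WEBGPU'
  // Node, Deno, and Bun can compile kernels natively.
  if (typeof process !== 'undefined') return 'CLANG'
  // Pure-JS fallback runs anywhere.
  return 'JS'
}

console.log(pickDevice())
```

The same feature-detection pattern extends to the other runtimes in the list above, falling back to CLOUD when no local accelerator is available.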

CLOUD

Our CLOUD device is compatible with tinygrad CLOUD: start a tinygrad CLOUD server, set the jsgrad device to CLOUD:https://url_to_your_gpu.com, and the computation runs on any tinygrad runtime.

With CLOUD you could publish your models as a website, and users could bring their own GPU (or buy GPU time) to run them on your site.
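The device string above follows a CLOUD:&lt;url&gt; format. A small sketch of building and validating such a string; `cloudDevice` is a hypothetical helper for illustration, not part of the jsgrad API.

```javascript
// Build a jsgrad CLOUD device string from a tinygrad CLOUD server URL.
// The CLOUD:<url> format comes from the docs above; `cloudDevice` itself
// is a hypothetical helper, not a jsgrad export.
function cloudDevice(url) {
  if (!/^https?:\/\//.test(url)) throw new Error('expected an http(s) URL')
  return `CLOUD:${url}`
}

console.log(cloudDevice('https://url_to_your_gpu.com'))
// → CLOUD:https://url_to_your_gpu.com
```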

CLOUD docs

Example apps

Chat with Llama 3 1B

MNIST model training

Whisper speech to text