Deep Learning in JavaScript

JS-Torch is a Deep Learning JavaScript library built from scratch that closely follows PyTorch's syntax.

It is implemented from scratch in a well-documented, unit-tested, and interpretable way, so it can help other JavaScript learners get into Machine Learning!

Feel free to try out a Web Demo!

 

1. TL;DR

src/tensor.ts contains a fully functional Tensor object, which can track gradients.

src/layers.ts contains many Deep Learning Layers and functions.

Note: The project’s README contains details on all available operations and layers.

 

2. Running it Yourself

Install & Import

To start off, install the package locally by running npm install js-pytorch in the terminal.
Then, in your JavaScript file, import the package with:

const torch = require("js-pytorch");
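
If your project uses ES modules instead (the build also outputs an ESM bundle, as noted in section 4 below), a namespace import is a reasonable sketch:

// ESM alternative (sketch; check the package's exports):
import * as torch from "js-pytorch";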

 

Create Tensors

To use all of these cool deep learning Tensor Operations, we need to instantiate some Tensors:

// Instantiate Tensors:
let x = torch.randn([8, 4, 5]);
let w = torch.randn([8, 5, 4], /* requires_grad = */ true);
let b = torch.tensor([0.2, 0.5, 0.1, 0.0], /* requires_grad = */ true);

The syntax is the same as PyTorch's:

torch.tensor(Array) receives an array and turns it into a Tensor.

torch.randn(shape) creates a Tensor filled with normally-distributed random numbers, with the provided shape.
The requires_grad argument is set to true if we want to optimize this parameter (by tracking its gradients).
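
As a quick sanity check, you can inspect the Tensors you just created. This is a minimal sketch, assuming the Tensor object exposes shape and data properties (data is also used in the training loop below):

// Inspect the new Tensors (assumes .shape and .data properties):
console.log(x.shape); // expected: [8, 4, 5]
console.log(b.data); // expected: [0.2, 0.5, 0.1, 0.0]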

 

Tensor Operations

Now, let’s run some operations on these Tensors:

// Make calculations:
let out = torch.matmul(x, w);
out = torch.add(out, b);

torch.matmul(x, w) performs matrix multiplication between x and w (just like in PyTorch).

torch.add(out, b) adds both Tensors.

Note: As w has requires_grad set to true, its child out will also have its gradients tracked.
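
To make the shapes concrete: x is [8, 4, 5] and w is [8, 5, 4], so the batched matrix multiplication produces an out of shape [8, 4, 4], and the length-4 b is broadcast across its last dimension. A quick check (again assuming a shape property):

// x [8, 4, 5] @ w [8, 5, 4] -> out [8, 4, 4]; b ([4]) broadcasts:
console.log(out.shape); // expected: [8, 4, 4]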

 

Getting Gradients

// Compute gradients on whole graph:
out.backward();

// Get gradients from specific Tensors:
console.log(w.grad);
console.log(b.grad);

Calling out.backward() computes the gradients of every Tensor that led to it (its parents), relative to out.
In practice, we call .backward() on the loss Tensor, to get the gradients necessary to reduce it.
To access a Tensor's gradients, simply call Tensor.grad.
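
For instance, instead of calling out.backward() directly, we could first reduce out and backpropagate from the result, mirroring how a real loss is handled. This is a minimal sketch, assuming the sum operation from the README's op list is exposed as a Tensor method:

// Sketch: reduce out with sum() (assumed from the README's op list),
// then backpropagate from the reduced Tensor, as we would from a loss:
let reduced = out.sum();
reduced.backward();
console.log(w.grad);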

 

3. Full Example (Neural Network):

In this example, we implement a full Neural Network with three Linear layers and ReLU activations. The syntax for the nn.Module class is identical to PyTorch's.

const torch = require("js-pytorch");
const nn = torch.nn;
const optim = torch.optim;

// Implement Module class:
class NeuralNet extends nn.Module {
  constructor(in_size, hidden_size, out_size) {
    super();
    // Instantiate Neural Network's Layers:
    this.w1 = new nn.Linear(in_size, hidden_size);
    this.relu1 = new nn.ReLU();
    this.w2 = new nn.Linear(hidden_size, hidden_size);
    this.relu2 = new nn.ReLU();
    this.w3 = new nn.Linear(hidden_size, out_size);
  }

  forward(x) {
    let z;
    z = this.w1.forward(x);
    z = this.relu1.forward(z);
    z = this.w2.forward(z);
    z = this.relu2.forward(z);
    z = this.w3.forward(z);
    return z;
  }
}

// Instantiate Model:
let in_size = 16;
let hidden_size = 32;
let out_size = 10;
let batch_size = 16;

let model = new NeuralNet(in_size, hidden_size, out_size);

// Define loss function and optimizer:
let loss_func = new nn.CrossEntropyLoss();
let optimizer = new optim.Adam(model.parameters(), 3e-3);

// Instantiate input and output:
let x = torch.randn([batch_size, in_size]);
let y = torch.randint(0, out_size, [batch_size]);
let loss;

// Training Loop:
for (let i = 0; i < 256; i++) {
  let z = model.forward(x);

  // Get loss:
  loss = loss_func.forward(z, y);

  // Backpropagate the loss using torch.tensor's backward() method:
  loss.backward();

  // Update the weights:
  optimizer.step();

  // Reset the gradients to zero after each training step:
  optimizer.zero_grad();

  // Print current loss:
  console.log(`Iter: ${i} – Loss: ${loss.data}`);
}
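
Once the loop finishes, the trained model can be used for inference with a plain forward pass, using only what the example already defines:

// Inference: forward pass with the trained weights:
let predictions = model.forward(x);
console.log(predictions.data);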

 

4. Building for distribution & DevTools

To build for distribution, run npm run build. CJS and ESM modules and index.d.ts will be output in the dist/ folder.
To check the code with ESLint at any time, run npm run lint.
To improve code formatting with Prettier, run npm run prettier.

5. Conclusion

Hope you enjoyed the package!
New additions, such as GPU support, are coming soon.
