May 01, 2019

Optimizing CUDA Recurrent Neural Networks with TorchScript

This week, we officially released PyTorch 1.1, a large feature update to PyTorch 1.0. One of the new features we’ve added is better support for fast, custom Recurrent Neural Networks (fastrnns) with TorchScript (the PyTorch JIT, https://pytorch.org/docs/stable/jit.html).

May 01, 2019

PyTorch adds new dev tools as it hits production scale

This is a partial re-post of the original blog post on the Facebook AI Blog. The full post can be viewed here.

April 29, 2019

Stochastic Weight Averaging in PyTorch

In this blog post we describe the recently proposed Stochastic Weight Averaging (SWA) technique [1, 2] and its new implementation in torchcontrib. SWA is a simple procedure that improves generalization in deep learning over Stochastic Gradient Descent (SGD) at no additional cost, and it can be used as a drop-in replacement for any other optimizer in PyTorch. SWA has a wide range of applications and features:
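At its core, SWA maintains a running average of the weights visited along the SGD trajectory. A minimal pure-Python sketch of that averaging step is shown below; the function name `swa_update` is hypothetical and is not the torchcontrib API, which instead wraps an existing optimizer:

```python
def swa_update(swa_weights, new_weights, n_averaged):
    """Incorporate one more SGD iterate into the running SWA average.

    swa_weights: current running average of the weights (list of floats)
    new_weights: weights at the latest SGD checkpoint (list of floats)
    n_averaged:  how many checkpoints the running average already contains
    """
    return [
        (swa * n_averaged + w) / (n_averaged + 1)
        for swa, w in zip(swa_weights, new_weights)
    ]


# Averaging two checkpoints element-wise:
avg = swa_update([1.0, 2.0], [3.0, 4.0], n_averaged=1)  # [2.0, 3.0]
```

In the torchcontrib implementation this bookkeeping is handled for you: you wrap your base optimizer and the averaged weights are maintained internally, which is what makes SWA usable as a drop-in replacement.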