
pytorch

#torched

Usefulness: 🔧
Novelty: 💡
Uncertainty: 🤪 🤪 🤪
Incompleteness: 🚧 🚧 🚧

Successor to Lua’s Torch. Evil twin to Google’s TensorFlow. Probably ascendant over TensorFlow for researchers, if not for industrial uses.

They claim to be aiming for fancy features such as reversible learning and what-have-you.

PyTorch has a unique way of building neural networks: using and replaying a tape recorder.

Most frameworks such as TensorFlow, Theano, Caffe and CNTK have a static view of the world. One has to build a neural network, and reuse the same structure again and again. Changing the way the network behaves means that one has to start from scratch. [… Pytorch] allows you to change the way your network behaves arbitrarily with zero lag or overhead.

Of course the overhead is not truly zero; rather they have shifted the overhead baseline down a little. But whatever, it’s comparatively convenient.
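
For concreteness, a minimal sketch (mine, not from the docs) of what define-by-run buys you: ordinary Python control flow, here a data-dependent loop, is recorded on the tape and differentiated through.

import torch

x = torch.randn(3, requires_grad=True)
y = x
# The graph is re-recorded on every run, so data-dependent control flow
# such as this loop needs no special graph-building machinery.
while y.norm() < 10:
    y = y * 2
y.sum().backward()
print(x.grad)  # reflects however many doublings actually ran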

PyTorch is not, however, as convenient for my purposes as tools that avoid the industrial neural-network machinery altogether, since what I usually want is fairly basic autodiff.

jax is a Python autodiff JIT that I currently use, and Julia has many, many options, some of which I also use; overall these are much more efficient for me than either pytorch or TensorFlow.

Getting started

DSP in pytorch

Keunwoo Choi has some beautiful examples, e.g. Inverse STFT, Harmonic Percussive separation.

Today we have torchaudio, or alternatively nnAudio (Cheuk, Agres, and Herremans 2019) (Source), which is similar but has fewer dependencies.
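
For a flavour of the torchaudio route, a minimal sketch (my own; "example.wav" is a placeholder path): load a waveform and compute a spectrogram.

import torchaudio

waveform, sample_rate = torchaudio.load("example.wav")   # (channels, samples)
spectrogram = torchaudio.transforms.Spectrogram(n_fft=1024, hop_length=256)
spec = spectrogram(waveform)                              # (channels, freq bins, frames)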

Custom functions

There is some bad advice in the manual:

nn exports two kinds of interfaces – modules and their functional versions. You can extend it in both ways, but we recommend using modules for all kinds of layers, that hold any parameters or buffers, and recommend using a functional form parameter-less operations like activation functions, pooling, etc.

So, important information is missing.
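
One thing the quoted advice skips over is how to define a new op with a hand-written gradient; that is what torch.autograd.Function is for. A toy sketch (my own example, not from the manual):

import torch

class MyExp(torch.autograd.Function):
    """exp() with an explicit backward pass."""

    @staticmethod
    def forward(ctx, x):
        y = torch.exp(x)
        ctx.save_for_backward(y)   # stash what backward will need
        return y

    @staticmethod
    def backward(ctx, grad_output):
        y, = ctx.saved_tensors
        return grad_output * y     # d/dx exp(x) = exp(x)

x = torch.randn(4, requires_grad=True)
MyExp.apply(x).sum().backward()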

It’s just as well that it’s easy to roll your own recurrent nets, because the default implementations are restrictive.

The default RNN layer is heavily optimised using cuDNN, which is sweet, but you only have a choice of two activation functions (tanh and ReLU), and neither of them is “linear” (i.e. the identity).

Ding Ke made a beautiful and simple RNN implementation.

Custom RNNs used to be horribly slow, but recent PyTorch versions can JIT-compile them via TorchScript.
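
For instance, a cell with a linear (identity) activation, which nn.RNN does not offer, can be hand-rolled and TorchScript-compiled; a rough sketch (my own):

import torch
from torch import nn, Tensor

class LinearRNNCell(nn.Module):
    # h_t = W_x x_t + W_h h_{t-1}: an RNN cell with identity activation.
    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        self.Wx = nn.Linear(input_size, hidden_size)
        self.Wh = nn.Linear(hidden_size, hidden_size)

    def forward(self, x: Tensor, h: Tensor) -> Tensor:
        return self.Wx(x) + self.Wh(h)

cell = torch.jit.script(LinearRNNCell(8, 16))  # TorchScript-compiled cell
h = torch.zeros(1, 16)
for x in torch.randn(5, 1, 8):                 # unroll over 5 time steps
    h = cell(x, h)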

Logging and visualizing training

Visualising graphs

Fiddly. The official way is via ONNX.

conda install -c ezyang onnx pydot   # or: pip install onnx pydot
brew cask install netron             # or: pip install netron
brew install graphviz
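
The export step itself is just torch.onnx.export; a rough sketch (my own, with torchvision’s resnet18 standing in for your model):

import torch
import torchvision

model = torchvision.models.resnet18()
dummy_input = torch.randn(1, 3, 224, 224)        # an input of the shape the model expects
torch.onnx.export(model, dummy_input, "resnet18.onnx")
# then inspect it with e.g.:  netron resnet18.onnx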

Also available, pytorchviz.

pip install git+https://github.com/szagoruyko/pytorchviz

from torchviz import make_dot  # NB: the package is imported as `torchviz`
y = model(x)                   # any forward pass through your model
make_dot(y, params=dict(model.named_parameters()))

Utility libraries, derived software

Pytorch ships with a lot of included functionality, so you don’t necessarily need to wrap it in anything else. Nonetheless, you can for specific uses. NB this list is not up to date.

ODEs

torchdiffeq.
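
Its central API is an odeint whose solutions you can backpropagate through; a minimal sketch (my own example):

import torch
from torchdiffeq import odeint

def f(t, y):
    return -y                          # dy/dt = -y, i.e. y(t) = y0 * exp(-t)

y0 = torch.tensor([1.0], requires_grad=True)
t = torch.linspace(0., 1., 10)
ys = odeint(f, y0, t)                  # solution evaluated at the requested times
ys[-1].sum().backward()                # gradients flow back through the solver to y0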

NLP

Like other deep learning frameworks, pytorch has some basic NLP support; see torchtext (the pytorch/text repository).

flair is a commercially-backed NLP framework.

Visdom

Pump graphs to a visualisation server. Not pytorch-specific, but it seems well integrated: visdom.
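
A minimal sketch (my own), assuming a visdom server is already running via python -m visdom.server:

import numpy as np
import visdom

vis = visdom.Visdom()                  # connects to localhost:8097 by default
vis.line(
    X=np.arange(100),
    Y=np.random.randn(100).cumsum(),
    opts=dict(title="training loss"),
)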

Pyro

pytorch + Bayes = pyro, a competitor to Edward.

Pyro launch announcement:

We believe the critical ideas to solve AI will come from a joint effort among a worldwide community of people pursuing diverse approaches. By open sourcing Pyro, we hope to encourage the scientific world to collaborate on making AI tools more flexible, open, and easy-to-use. We expect the current (alpha!) version of Pyro will be of most interest to probabilistic modelers who want to leverage large data sets and deep networks, PyTorch users who want easy-to-use Bayesian computation, and data scientists ready to explore the ragged edge of new technology.
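
To give the flavour, a rough sketch of a Pyro model (my own toy example, written against recent Pyro rather than the alpha referred to above): latent variables are named sample sites, and observations are conditioned via obs=.

import torch
import pyro
import pyro.distributions as dist

def model(data):
    # latent location with a standard normal prior
    weight = pyro.sample("weight", dist.Normal(0., 1.))
    with pyro.plate("data", len(data)):
        # observations conditioned on the latent
        pyro.sample("obs", dist.Normal(weight, 0.1), obs=data)

model(torch.randn(20) + 0.5)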

pyprob

pyprob: (Le, Baydin, and Wood 2017)

pyprob is a PyTorch-based library for probabilistic programming and inference compilation. The main focus of this library is on coupling existing simulation codebases with probabilistic inference with minimal intervention.

The main advantage of pyprob, compared against other probabilistic programming languages like Pyro, is a fully automatic amortized inference procedure based on importance sampling. pyprob only requires a generative model to be specified. Particularly, pyprob allows for efficient inference using inference compilation which trains a recurrent neural network as a proposal network.

In Pyro such an inference network requires the user to explicitly define the control flow of the network, which is due to Pyro running the inference network and generative model sequentially. However, in pyprob the generative model and inference network run concurrently. Thus, the control flow of the model is directly used to train the inference network. This alleviates the need for manually defining its control flow.

The flagship application seems to be etalumis (Baydin et al. 2019), a probabilistic programming framework with emphasis, AFAICT, on Bayesian inverse problems.

Inferno

inferno is a grab-bag library for torch.

I’m not sold on this one; a whole new library to reduce an already small amount of boilerplate, without adding any non-trivial new capabilities.

TNT

TNT is a reimplementation of a library from the Lua Torch era, one that the current generation of ML users never witnessed. I think it aims to be a semi-official utility library for pytorch, but it’s not especially active.

TNT (imported as torchnet) is a framework for PyTorch which provides a set of abstractions for PyTorch aiming at encouraging code re-use as well as encouraging modular programming. It provides powerful dataloading, logging, and visualization utilities.[…]

For example, TNT provides simple methods to record model performance in the torchnet.meter module and to log them to Visdom (or in the future, TensorboardX) with the torchnet.logging.

TNT docs.
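
As a taste of the meter module mentioned in the quote, a small sketch of tracking a running loss (my own example):

import torchnet as tnt

meter = tnt.meter.AverageValueMeter()   # running mean / std of a scalar
for loss in (0.9, 0.7, 0.4):
    meter.add(loss)
mean, std = meter.value()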

Debugging

Memory leaks

Apparently you use normal Python garbage-collector introspection.

import torch
import gc

# Walk every object the garbage collector knows about and report live tensors.
for obj in gc.get_objects():
    try:
        if torch.is_tensor(obj) or (hasattr(obj, 'data') and torch.is_tensor(obj.data)):
            print(type(obj), obj.size())
    except Exception:
        pass  # some objects raise on attribute access; ignore them

See also usual python debugging.

Refs

Baydin, Atılım Güneş, Lei Shao, Wahid Bhimji, Lukas Heinrich, Lawrence Meadows, Jialin Liu, Andreas Munk, et al. 2019. “Etalumis: Bringing Probabilistic Programming to Scientific Simulators at Scale,” August. http://arxiv.org/abs/1907.03382.

Cheuk, Kin Wai, Kat Agres, and Dorien Herremans. 2019. “nnAudio: A Pytorch Audio Processing Tool Using 1D Convolution Neural Networks.”

Le, Tuan Anh, Atılım Güneş Baydin, and Frank Wood. 2017. “Inference Compilation and Universal Probabilistic Programming.” In Proceedings of the 20th International Conference on Artificial Intelligence and Statistics (AISTATS), 54:1338–48. Proceedings of Machine Learning Research. Fort Lauderdale, FL, USA: PMLR. http://arxiv.org/abs/1610.09900.