
pytorch

#torched

Successor to Lua’s Torch. Evil twin to Google’s TensorFlow.

They claim to be aiming for fancy features such as reversible learning and other such advanced techniques of regularisation.

PyTorch has a unique way of building neural networks: using and replaying a tape recorder.

Most frameworks such as TensorFlow, Theano, Caffe and CNTK have a static view of the world. One has to build a neural network, and reuse the same structure again and again. Changing the way the network behaves means that one has to start from scratch. [… Pytorch] allows you to change the way your network behaves arbitrarily with zero lag or overhead.
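To make that concrete, here is a minimal sketch (the class and variable names are mine): because the graph is rebuilt on every forward pass, plain Python control flow, loops and conditionals included, can depend on the values flowing through the network.

```python
import torch
import torch.nn as nn

class DataDependentNet(nn.Module):
    """Applies its layer a data-dependent number of times."""
    def __init__(self, dim=4):
        super().__init__()
        self.layer = nn.Linear(dim, dim)

    def forward(self, x):
        steps = 0
        # Ordinary Python control flow; the loop length depends on the data.
        while x.norm() > 1.0 and steps < 10:
            x = torch.tanh(self.layer(x))
            steps += 1
        return x

net = DataDependentNet()
out = net(3 * torch.randn(4))
out.sum().backward()  # backprop through however many steps actually ran
```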


Getting started

Custom functions

There is some bad advice in the manual:

nn exports two kinds of interfaces - modules and their functional versions. You can extend it in both ways, but we recommend using modules for all kinds of layers that hold any parameters or buffers, and recommend using a functional form for parameter-less operations like activation functions, pooling, etc.

So, some important information is missing:

  1. If your desired loss is already just a composition of existing functions, you don’t need to define a Function subclass.
  2. The given options are not an either/or choice but two things you use in concert. A better summary would be:
  • If you need a function which is differentiable in a non-trivial way, implement a Function.
  • If you need to bundle a Function with some state or differentiable parameters, additionally wrap it in an nn.Module (see the sketch after this list).
  • Some people claim you can also create custom layers using plain Python functions. However, these don’t work as layers in an nn.Sequential model, so I’m not sure how to take this advice.
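Here is a minimal sketch of that recipe, using the current staticmethod-style Function API (the names Exp and ScaledExp are made up for illustration): the Function carries the custom forward/backward maths, and the Module bundles it with a learnable parameter so it slots into nn.Sequential.

```python
import torch
from torch import nn
from torch.autograd import Function

class Exp(Function):
    """Hand-written exp with an explicit backward pass."""
    @staticmethod
    def forward(ctx, x):
        y = x.exp()
        ctx.save_for_backward(y)  # stash what backward will need
        return y

    @staticmethod
    def backward(ctx, grad_output):
        y, = ctx.saved_tensors
        return grad_output * y  # d/dx exp(x) = exp(x)

class ScaledExp(nn.Module):
    """Bundles the Function with a differentiable parameter."""
    def __init__(self):
        super().__init__()
        self.scale = nn.Parameter(torch.ones(1))

    def forward(self, x):
        return Exp.apply(self.scale * x)

model = nn.Sequential(nn.Linear(3, 3), ScaledExp())
model(torch.randn(2, 3)).sum().backward()
```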

It’s just as well that it’s easy to roll your own recurrent nets, because the default implementations are limiting.

The default RNN layer is heavily optimised using cuDNN, which is sweet, but you get a choice of only two activation functions (tanh and ReLU), and neither of them is linear (i.e. the identity).
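Rolling your own is only a few lines, though you forgo the cuDNN fusion. A minimal sketch (LinearRNN is my name for it) of an Elman-style net with the identity activation that nn.RNN refuses to give you:

```python
import torch
from torch import nn

class LinearRNN(nn.Module):
    """Elman-style RNN with a linear (identity) activation."""
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.hidden_size = hidden_size
        self.cell = nn.Linear(input_size + hidden_size, hidden_size)

    def forward(self, xs, h=None):
        # xs has shape (seq_len, batch, input_size)
        if h is None:
            h = xs.new_zeros(xs.shape[1], self.hidden_size)
        outputs = []
        for x in xs:  # step through time in plain Python
            h = self.cell(torch.cat([x, h], dim=-1))  # no nonlinearity applied
            outputs.append(h)
        return torch.stack(outputs), h

rnn = LinearRNN(5, 8)
ys, h = rnn(torch.randn(7, 2, 5))
```

Swapping in tanh, ReLU, or anything else is a one-line change inside the loop.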

Ding Ke made a beautiful and simple RNN implementation.

Things using pytorch

Pyro

pytorch + Bayes = pyro, an Edwardlib competitor.
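For flavour, a minimal sketch of what that equation means in practice, using the present-day Pyro API rather than the alpha one announced below (the model, guide, and data here are all illustrative): the generative model is an ordinary Python function, and the library handles the variational inference.

```python
import torch
import pyro
import pyro.distributions as dist
from pyro.infer import SVI, Trace_ELBO
from pyro.infer.autoguide import AutoNormal
from pyro.optim import Adam

def model(data):
    # Prior over an unknown mean, Gaussian likelihood for the observations.
    mu = pyro.sample("mu", dist.Normal(0., 1.))
    with pyro.plate("data", len(data)):
        pyro.sample("obs", dist.Normal(mu, 1.), obs=data)

data = torch.randn(100) + 3.0     # synthetic observations
guide = AutoNormal(model)         # automatic variational posterior
svi = SVI(model, guide, Adam({"lr": 0.01}), loss=Trace_ELBO())
for step in range(1000):
    svi.step(data)
```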

Pyro launch announcement:

We believe the critical ideas to solve AI will come from a joint effort among a worldwide community of people pursuing diverse approaches. By open sourcing Pyro, we hope to encourage the scientific world to collaborate on making AI tools more flexible, open, and easy-to-use. We expect the current (alpha!) version of Pyro will be of most interest to probabilistic modelers who want to leverage large data sets and deep networks, PyTorch users who want easy-to-use Bayesian computation, and data scientists ready to explore the ragged edge of new technology.