Piotr Migdał
2 min read · Jan 29, 2019


On the topic of diagrams, I wrote a review of deep learning architecture visualizations: Simple diagrams of convoluted neural networks. While most people don’t define networks visually, such visualizations are often the most effective way to communicate results, whether a full neural network architecture or a single block.

Moreover, in particle physics, people use Feynman diagrams a lot. And they are nothing more or less than a graphical representation of summations and integrations over many variables. Also, as an ex-quantum physicist, I am a big fan of tensor diagrams — exactly in the spirit of Bob Coecke. I recommend Hand-waving and Interpretive Dance: An Introductory Course on Tensor Networks as a practical introduction. You can think about it as the Einstein summation convention with no dummy indices (see Einsum in PyTorch).
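To make the einsum connection concrete, here is a minimal NumPy sketch (an illustration of my own, not from the original post; the same syntax works with torch.einsum). Each line joining two nodes in a tensor diagram is a contracted index, which is exactly the dummy index that einsum notation names explicitly:

```python
import numpy as np

# Two matrices viewed as rank-2 tensors (nodes with two legs each).
A = np.random.rand(3, 4)
B = np.random.rand(4, 5)

# Matrix product: contract the shared index j. In a tensor diagram this
# is a single line connecting the two nodes; j never appears in the result.
C = np.einsum('ij,jk->ik', A, B)
assert np.allclose(C, A @ B)

# Trace: connect a node's two legs to each other (a closed loop).
M = np.random.rand(4, 4)
t = np.einsum('ii->', M)
assert np.isclose(t, np.trace(M))
```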

Tensor diagrams from http://tensornetwork.org/diagrams/. Very useful for Matrix Product States.
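As an illustrative sketch of why diagrams help here (my own example, with assumed bond and physical dimensions), a Matrix Product State is a chain of small tensors whose virtual (bond) indices are contracted away, leaving only the physical indices:

```python
import numpy as np

# A 3-site Matrix Product State: physical dimension 2, bond dimension 2.
# Tensor shapes: (phys, bond), (bond, phys, bond), (bond, phys).
A1 = np.random.rand(2, 2)
A2 = np.random.rand(2, 2, 2)
A3 = np.random.rand(2, 2)

# Contract the bond indices a, b (the lines of the diagram);
# the physical legs i, j, k remain open.
psi = np.einsum('ia,ajb,bk->ijk', A1, A2, A3)
assert psi.shape == (2, 2, 2)
```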

Also, speaking of the visual representation of tensor operations (though not diagrams), I wrote Quantum Game with Photons. It features sortable matrices:

A Faraday rotator matrix from the Quantum Game with Photons encyclopedia.

When it comes to languages, while there are some interesting approaches (e.g. Luna), the only one I actually used was LabVIEW (in an optics laboratory, where it is, or at least was, a mainstream approach). For some reason, even NoFlo (flow-based programming for JavaScript) didn’t gain enough traction.

EDIT:
I created a JS demo of tensor diagrams: https://jsfiddle.net/stared/8huz5gy7/
If you have any comments or feedback, I would be happy to develop it further.
