A hands-on deep learning introduction, built piece by piece.
For an interactive, installation-free version, use Colab: https://colab.research.google.com/github/stared/thinking-in-tensors-writing-in-pytorch/
By Piotr Migdał et al. (Weronika Ormaniec, possibly others)
“Study hard what interests you the most in the most undisciplined, irreverent and original manner possible.” ― Richard Feynman
“Scientists start out doing work that's perfect, in the sense that they're just trying to reproduce work someone else has already done for them. Eventually, they get to the point where they can do original work. Whereas hackers, from the start, are doing original work; it's just very bad. So hackers start original, and get good, and scientists start good, and get original.” ― Paul Graham, Hackers and Painters
This project is supported by Jacek Migdał and Marek Cichy. Join the sponsors - show your ❤️ and support! It will give me the time and energy to work on this project!
This project benefited from a course at the University of Silesia in Katowice, which they let me open-source.
Mathematical concepts behind deep learning, expressed in PyTorch 1.0.
- All math equations as PyTorch code (see the sketch after this list)
- Explicit, minimalistic examples
- Jupyter Notebook for interactivity
- “On the shoulders of giants” - I link and refer to the best materials I know
- Fully open source & open for collaboration (I guess I will go with MIT for code, CC-BY for anything else)
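To make the first point concrete, here is a minimal sketch of writing a formula directly as PyTorch code - in this case the mean squared error, MSE = (1/n) * Σᵢ (yᵢ - ŷᵢ)². The tensors and values are my own toy example, not taken from the notebooks.

```python
import torch

# Mean squared error written the same way as its formula:
# MSE = (1/n) * sum_i (y_i - y_hat_i)^2.
# Toy values, purely for illustration.
y = torch.tensor([1.0, 2.0, 3.0])
y_hat = torch.tensor([1.1, 1.9, 3.2])

mse = (y - y_hat).pow(2).mean()
print(mse)  # tensor(0.0200)
```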
There are quite a few practical introductions to deep learning. I recommend Deep Learning with Python by François Chollet (the creator of Keras). If you want, you can classify small pictures, or extraterrestrial beings, today.
When it comes to the mathematical background, Deep Learning Book by Ian Goodfellow et al. is a great starting point, giving a broad overview. It does, however, require a serious interest in maths - convolutional networks start well after page 300.
I struggled to find something in the middle ground - showing the mathematical foundations of deep learning, step by step, while at the same time translating them into code. The closest example is CS231n: Convolutional Neural Networks for Visual Recognition (which is, IMHO, a masterpiece). Though, I believe that instead of using NumPy we can use PyTorch, giving a smooth transition between mathematical ideas and practical, working code.
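As a minimal sketch of that transition (my own example, not from CS231n): in PyTorch, the derivative of f(x) = x² comes from autograd, whereas in NumPy we would have to derive 2x by hand.

```python
import torch

# f(x) = x^2; autograd computes df/dx for us.
x = torch.tensor(3.0, requires_grad=True)
f = x.pow(2)
f.backward()

print(x.grad)  # tensor(6.) - matches the analytic derivative 2x at x = 3
```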
Of course, there are quite a few awesome posts, notebooks and visualizations. I try to link to the ones that are useful for the reader. In particular, I maintain a collaborative list of Interactive Machine Learning, Deep Learning and Statistics websites.
Crucially, this course is for you, the reader. If you are interested in one topic, let us know! There is nothing more inspiring than eager readers.
- Start with concrete examples first
- First 1D, then more dimensions
- Equations in LaTeX AND PyTorch
x.matmul(y).pow(2).sum()
not: torch.sum(torch.matmul(x, y) ** 2)
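For instance, here is a minimal sketch (arbitrary toy values, just for illustration): the chained version reads left to right, much like the formula (Σᵢ xᵢ yᵢ)², and both versions compute the same number.

```python
import torch

# Arbitrary toy vectors.
x = torch.tensor([1.0, 2.0, 3.0])
y = torch.tensor([1.0, 0.0, -1.0])

chained = x.matmul(y).pow(2).sum()            # reads like the formula
nested = torch.sum(torch.matmul(x, y) ** 2)   # same value, harder to read

print(chained, nested)  # tensor(4.) tensor(4.)
```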
A few links of mine:
- Learning deep learning with Keras - an overview of deep learning (what it is and what to learn beforehand); a post from 2017, but surprisingly up-to-date
- My deep learning framework credo: Keras or PyTorch as your first deep learning framework
- Keras vs. PyTorch: Alien vs. Predator recognition with transfer learning
- My general overview of “how to start data science” (it turns out it is not only for people with a math/physics background; though, I intend to write a separate text for non-STEM backgrounds)
- Quantum Tensors - a JavaScript / TypeScript package for sparse tensor operations on complex numbers, used, for example, for quantum computing, quantum information, and, well, the Quantum Game.
- Simple diagrams of convoluted neural networks - on deep learning architecture visualizations
- I am an independent AI consultant, specializing in hands-on training in deep learning (and machine learning in general). If you are interested in a workshop, let me know via p.migdal.pl!