Listen "Multiple dispatch in __torch_function__"
Episode Synopsis
Python is a single dispatch OO language, but there are some operations, such as binary magic methods, which implement a simple form of multiple dispatch. __torch_function__ (through its NumPy predecessor __array_function__) generalizes this mechanism so that invocations of torch.add with different subclasses work properly. This podcast describes how this mechanism works and how it can be used (in an unconventional way) to build composable subclasses a la JAX in functorch.
Further reading:
This podcast in written form: https://dev-discuss.pytorch.org/t/functorch-levels-as-dynamically-allocated-classes/294
Multiple dispatch resolution rules in the RFC: https://github.com/pytorch/rfcs/blob/master/RFC-0001-torch-function-for-methods.md#process-followed-during-a-functionmethod-call
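For a concrete sense of the mechanism, here is a minimal sketch of a Tensor subclass whose __torch_function__ intercepts torch API calls. This is not the functorch implementation discussed in the episode; the LoggingTensor name and its logging behavior are illustrative assumptions, and it assumes a PyTorch version where Tensor subclasses inherit the default torch.Tensor.__torch_function__.

```python
import torch

class LoggingTensor(torch.Tensor):
    """Hypothetical subclass: log every torch call that involves it."""

    @classmethod
    def __torch_function__(cls, func, types, args=(), kwargs=None):
        if kwargs is None:
            kwargs = {}
        print(f"__torch_function__ called for {func.__name__} with types {types}")
        # Delegate to the default implementation, which re-runs the call
        # without re-entering this handler and rewraps the result.
        return super().__torch_function__(func, types, args, kwargs)

x = torch.ones(2).as_subclass(LoggingTensor)
y = torch.arange(2.0)

# Both calls hit LoggingTensor.__torch_function__: the multiple dispatch
# rules consult the subclass's handler even when it appears as the
# right-hand operand of a binary magic method.
z1 = torch.add(x, y)   # explicit torch.add
z2 = y + x             # binary magic method, subclass on the right
```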
More episodes of the podcast PyTorch Developer Podcast
Compiler collectives (04/08/2024)
TORCH_TRACE and tlparse (29/04/2024)
Higher order operators (21/04/2024)
Inductor - Post-grad FX passes (12/04/2024)
CUDA graph trees (24/03/2024)
Min-cut partitioner (17/03/2024)
AOTInductor (02/03/2024)
Tensor subclasses and PT2 (24/02/2024)
Compiled autograd (19/02/2024)
PT2 extension points (05/02/2024)