Listen "torch.use_deterministic_algorithms"
Episode Synopsis
torch.use_deterministic_algorithms lets you force PyTorch to use deterministic algorithms. It's very useful for debugging!
There are some errors in the recording: the feature is called torch.use_deterministic_algorithms, and there is not actually a capability to warn (this was in an old version of the PR but was taken out); we just error if you hit nondeterministic code.
Docs: https://pytorch.org/docs/stable/generated/torch.use_deterministic_algorithms.html#torch.use_deterministic_algorithms
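A minimal sketch of how the feature is used, following the linked docs (it assumes a CUDA-enabled build for the error case; kthvalue on CUDA is one of the documented nondeterministic ops):

```python
import torch

# Force PyTorch to use deterministic algorithms everywhere. Ops that have
# no deterministic implementation will raise a RuntimeError when called.
torch.use_deterministic_algorithms(True)

x = torch.randn(100)
print(x.sum())  # deterministic op: runs as usual

if torch.cuda.is_available():
    try:
        # kthvalue has no deterministic CUDA implementation, so instead of
        # silently returning nondeterministic results, this errors.
        torch.randn(10, device="cuda").kthvalue(1)
    except RuntimeError as e:
        print("Hit nondeterministic code:", e)
```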
More episodes of the podcast PyTorch Developer Podcast
Compiler collectives (04/08/2024)
TORCH_TRACE and tlparse (29/04/2024)
Higher order operators (21/04/2024)
Inductor - Post-grad FX passes (12/04/2024)
CUDA graph trees (24/03/2024)
Min-cut partitioner (17/03/2024)
AOTInductor (02/03/2024)
Tensor subclasses and PT2 (24/02/2024)
Compiled autograd (19/02/2024)
PT2 extension points (05/02/2024)