Submitted by gahaalt t3_ypkfwq in MachineLearning
SEND_ALL_DOG_PICS t1_ivk1q1f wrote
Why would I use this over torchdynamo? There is a solution built into pytorch for graph capture, which you can then “replay” (in your words) on any data.
gahaalt OP t1_ivkmxtm wrote
Thanks for this question!
Pytorch Symbolic simplifies the definition of neural network models. It does indeed create a graph under the hood to do this; in this graph, every edge is an nn.Module.
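For context, here is a minimal sketch of what that definition style looks like, based on the project's README (the Keras-like `Input(shape=...)` and `SymbolicModel(inputs, outputs)` names are assumed and may differ between versions):

```python
import torch
from torch import nn
from pytorch_symbolic import Input, SymbolicModel  # assumed API from the README

# Symbolic placeholder for the data; shape excludes the batch dimension
inputs = Input(shape=(1, 28, 28))

# Calling nn.Modules on symbolic tensors records them as edges of the graph
x = nn.Flatten()(inputs)
x = nn.Linear(28 * 28, 128)(x)
x = nn.ReLU()(x)
outputs = nn.Linear(128, 10)(x)

# The recorded graph is replayed as a regular nn.Module
model = SymbolicModel(inputs, outputs)
print(model(torch.rand(32, 1, 28, 28)).shape)  # torch.Size([32, 10])
```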
torchdynamo looks great as a tool for optimizing existing models to run faster on the GPU by removing the CPU overhead entirely. Sometimes the improvement is really impressive.

Yes, torchdynamo does some kind of graph capture as well. It even modifies the bytecode to speed up execution. But in the end it is a wrapper around an nn.Module that speeds it up; to speed up a model, you have to define it first.

So the two libraries are actually independent. You can use torchdynamo to speed up models created with Pytorch Symbolic. IMO it is a great combination.
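To illustrate that combination: a SymbolicModel is just an nn.Module, so it can be handed to torchdynamo like any other model. A minimal sketch, assuming PyTorch 2.x, where torchdynamo's graph capture is exposed through torch.compile (at the time of this thread, the standalone torchdynamo.optimize API was used instead):

```python
import torch

# `model` is the SymbolicModel (a plain nn.Module) built in the sketch above
compiled_model = torch.compile(model)  # torchdynamo captures and optimizes the graph

x = torch.rand(32, 1, 28, 28)
out = compiled_model(x)  # first call triggers compilation; later calls reuse it
```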