Rewriting TensorFlow Graphs with the GTT


Photo by Stephen D. Strowes

One of the most interesting things about neural networks for me is that they’re programs you can do meaningful computation on. The most obvious example of that is automatic differentiation, but even after you’ve trained a model there are lots of other interesting transformations you can apply. These can be as simple as trimming parts of the graph that aren’t needed when you’re only running inference, all the way to folding batch normalization nodes into precalculated weights, turning constant sub-expressions into single nodes, or rewriting calculations in eight-bit arithmetic.

Many of these operations have been available as piecemeal Python scripts inside the TensorFlow codebase, but I’ve spent some time rewriting them into what I hope is a much cleaner and easier-to-extend C++ Graph Transform Tool. As well as a set of predefined operations based on what we commonly need ourselves, I’ve tried to create a simple set of matching operators and other utilities to encourage contributors to create and share their own rewriting passes.
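To make that concrete, here’s a rough sketch of applying a handful of those passes programmatically. It assumes the TransformGraph() and ParseTransformParameters() entry points from tensorflow/tools/graph_transforms/transform_graph.h (check the header in your own checkout, since names can shift between versions); the file names and the "input"/"output" tensor names are just placeholders for whatever your model uses.

// Sketch of driving the Graph Transform Tool from C++ rather than the command
// line. TransformGraph() and ParseTransformParameters() are assumed to come
// from tensorflow/tools/graph_transforms/transform_graph.h; the file names and
// the "input"/"output" tensor names are placeholders for your own model.
#include <string>
#include <vector>

#include "tensorflow/core/framework/graph.pb.h"
#include "tensorflow/core/lib/core/status.h"
#include "tensorflow/core/platform/env.h"
#include "tensorflow/tools/graph_transforms/transform_graph.h"

int main() {
  namespace gt = tensorflow::graph_transforms;
  tensorflow::Env* env = tensorflow::Env::Default();

  // Load a frozen GraphDef from disk.
  tensorflow::GraphDef graph_def;
  TF_CHECK_OK(tensorflow::ReadBinaryProto(env, "frozen_graph.pb", &graph_def));

  // The same pass list you'd hand to the command-line wrapper: drop nodes that
  // aren't needed for inference, fold batch norms and constant
  // sub-expressions, and shrink the weights to eight bits.
  gt::TransformParameters transforms;
  TF_CHECK_OK(gt::ParseTransformParameters(
      "strip_unused_nodes fold_batch_norms fold_constants(ignore_errors=true) "
      "quantize_weights",
      &transforms));

  // Rewrite the graph in place, given the model's input and output tensors.
  TF_CHECK_OK(
      gt::TransformGraph({"input"}, {"output"}, transforms, &graph_def));

  // Save the optimized graph back out for deployment.
  TF_CHECK_OK(
      tensorflow::WriteBinaryProto(env, "optimized_graph.pb", graph_def));
  return 0;
}

Each entry in the transforms string is a registered pass, with any parameters in parentheses, so the list maps directly onto the rewrites described above.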

I think there’s a lot of potential for computing on compute graphs, so I’m excited to hear what you can come up with! Do cc me (@petewarden) on GitHub with any issues you encounter.
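If you’d like to experiment with a pass of your own, the extension pattern looks roughly like the sketch below. It assumes the TransformFuncContext type and the REGISTER_GRAPH_TRANSFORM macro from tensorflow/tools/graph_transforms/transform_utils.h, and the clear_device_placement pass itself is just an illustration I’ve made up: it copies the graph and strips each node’s pinned device so the result loads on machines with a different device layout.

// Illustrative custom pass: copy every node but clear its pinned device.
// The TransformFuncContext type and REGISTER_GRAPH_TRANSFORM macro are
// assumed from tensorflow/tools/graph_transforms/transform_utils.h.
#include "tensorflow/core/framework/graph.pb.h"
#include "tensorflow/core/framework/node_def.pb.h"
#include "tensorflow/core/lib/core/status.h"
#include "tensorflow/tools/graph_transforms/transform_utils.h"

namespace tensorflow {
namespace graph_transforms {

Status ClearDevicePlacement(const GraphDef& input_graph_def,
                            const TransformFuncContext& context,
                            GraphDef* output_graph_def) {
  // Any user arguments, e.g. clear_device_placement(foo=bar), are assumed to
  // arrive through context.params; this pass doesn't need any.
  output_graph_def->Clear();
  for (const NodeDef& node : input_graph_def.node()) {
    NodeDef* new_node = output_graph_def->mutable_node()->Add();
    *new_node = node;
    // Dropping the device string leaves placement up to whoever loads the
    // graph, without touching the node's inputs or attributes.
    new_node->clear_device();
  }
  return Status::OK();
}

// Makes the pass available by name alongside the built-in transforms.
REGISTER_GRAPH_TRANSFORM("clear_device_placement", ClearDevicePlacement);

}  // namespace graph_transforms
}  // namespace tensorflow

Once the file is compiled into the tool, the new pass can be requested by name in the same transform list as the predefined ones.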

 

2 responses

  1. Hi Pete,
    Optimizing TF code for mobile is a lot of fun. Recently I’ve been focusing on TensorFlow with OpenCL, and there are two approaches: Codeplay’s and hughperkins’s (NVIDIA code transformed to OpenCL). Codeplay’s solution can’t be applied on ARM platforms because there’s no OpenCL ARM compiler. I have tried hughperkins’s solution, but still have some issues on ARM.
    Do you have any good suggestions?

  2. Hi Pete,

    Your blogs have come in handy quite a few times in the past, so thanks a ton!

    How does the Graph Transform Tool compare to the TOCO tool that converts to the TF Lite format? From what I understand, TOCO also converts operators to be TF Lite compatible. If that’s the case, is the Graph Transform Tool something like a subset of TOCO?
