TfPyTh


Putting TensorFlow back in PyTorch, back in TensorFlow (with differentiable TensorFlow PyTorch adapters).

Do you have a codebase that uses TensorFlow and one that uses PyTorch and want to train a model that uses both end-to-end?

This library makes it possible without having to rewrite either codebase!

It allows you to wrap a TensorFlow graph to make it callable (and differentiable) through PyTorch, and vice-versa, using simple functions.

The only caveat is that tensors have to be copied and routed through the CPU until TensorFlow supports __cuda_array_interface__ (please star the GitHub issue).

Install

pip install tfpyth

Example

import tensorflow as tf
import torch as th
import numpy as np
import tfpyth

session = tf.Session()

def get_torch_function():
    # Build a TensorFlow graph: c = 3 * a + 4 * b^2.
    a = tf.placeholder(tf.float32, name='a')
    b = tf.placeholder(tf.float32, name='b')
    c = 3 * a + 4 * b * b

    # Wrap the TensorFlow graph as a differentiable PyTorch function.
    f = tfpyth.torch_from_tensorflow(session, [a, b], c).apply
    return f

f = get_torch_function()
a = th.tensor(1, dtype=th.float32, requires_grad=True)
b = th.tensor(3, dtype=th.float32, requires_grad=True)
x = f(a, b)

assert x == 39.

# Gradients flow back through the TensorFlow graph into the PyTorch tensors.
x.backward()

assert np.allclose((a.grad, b.grad), (3., 24.))

What it's got

torch_from_tensorflow

Creates a differentiable PyTorch function that evaluates a TensorFlow output tensor given its input placeholders (as in the example above).

eager_tensorflow_from_torch

Creates an eager TensorFlow function from a PyTorch function.
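
A minimal sketch of the eager-mode direction, mirroring the expression from the example above. It assumes TensorFlow 1.x with eager execution enabled and that eager_tensorflow_from_torch takes a function of PyTorch tensors returning a PyTorch tensor; the name pytorch_function is illustrative.

import tensorflow as tf
import tfpyth

tf.enable_eager_execution()

def pytorch_function(a, b):
    # A plain PyTorch expression; gradients are handled by PyTorch autograd.
    return 3 * a + 4 * b * b

# Wrap the PyTorch function so it can be called with eager TensorFlow tensors.
f = tfpyth.eager_tensorflow_from_torch(pytorch_function)

x = f(tf.convert_to_tensor(1.), tf.convert_to_tensor(3.))
assert float(x) == 39.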

tensorflow_from_torch

Creates a TensorFlow op/tensor from a PyTorch function.
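
A minimal graph-mode sketch, again reusing the expression from the example above. It assumes tensorflow_from_torch takes the PyTorch function, a list of input tensors/placeholders, and the output dtype; pytorch_function is an illustrative name.

import numpy as np
import tensorflow as tf
import tfpyth

session = tf.Session()

def pytorch_function(a, b):
    # A plain PyTorch expression; gradients are handled by PyTorch autograd.
    return 3 * a + 4 * b * b

a = tf.placeholder(tf.float32, name='a')
b = tf.placeholder(tf.float32, name='b')

# Create a TensorFlow tensor backed by the PyTorch function.
c = tfpyth.tensorflow_from_torch(pytorch_function, [a, b], tf.float32)
c_gradients = tf.gradients(c, [a, b])

# Evaluate the value and its gradients through the wrapped PyTorch function.
assert np.allclose(session.run([c] + c_gradients, {a: 1., b: 3.}), [39., 3., 24.])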

Future work

  • support JAX
  • support higher-order derivatives
