TensorShort Logo

TensorShort

Official TensorFlow fork with shorter, cleaner, and better naming conventions for functions and methods. TensorShort works on top of TensorFlow:

  • Any existing TensorFlow code works unchanged with TensorShort
  • Every new function and method is a short alias that mirrors its TensorFlow original (see the sketch below)
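
For instance, a script could mix untouched TensorFlow calls with TensorShort aliases. The snippet below is a minimal sketch that assumes a hypothetical tensorshort import exposing the short names; the package name and import style are illustrative, not part of the documented API.

import tensorflow as tf                    # existing TensorFlow code keeps working
from tensorshort import t                  # hypothetical import of the short 't.' namespace

x = tf.constant([[1.0, 2.0], [3.0, 4.0]])  # plain TensorFlow call, untouched
y = t.reshape(x, [-1])                     # TensorShort alias, same result as tf.reshape(x, [-1])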

Here's a lovely example:

Eager execution

Old:

tf.enable_eager_execution()

New:

eager()

Notes:

  1. Prefix functions and methods with t. for the TensorFlow API and k. for the Keras API (a minimal aliasing sketch follows this list)
  2. See the full changelog here
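
One way such aliases could work, purely as a sketch, is a thin compatibility module that re-exports TensorFlow and Keras symbols under the shorter names. The module below is an assumption for illustration, not the actual TensorShort source, and it only covers the names used in this README.

# tensorshort.py -- hypothetical alias layer over TensorFlow 1.x (illustrative only)
import tensorflow as tf
from tensorflow import keras
from types import SimpleNamespace

# top-level shortcut: eager() instead of tf.enable_eager_execution()
eager = tf.enable_eager_execution

# short 't.' namespace mirroring common tf.* calls
t = SimpleNamespace(reshape=tf.reshape, relu=tf.nn.relu)

# short 'k.' namespace mirroring common keras.* calls
k = SimpleNamespace(sequential=keras.Sequential,
                    flatten=keras.layers.Flatten,
                    dense=keras.layers.Dense)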

Here are some code comparisons between the original TensorFlow and TensorShort versions:

Keras API

Old:

model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(128, activation=tf.nn.relu),
    keras.layers.Dense(10, activation=tf.nn.softmax)
])

New:

model = k.sequential([
    k.flatten(input_shape=(28, 28)),
    k.dense(128, activation=relu),
    k.dense(10, activation=softmax)
])
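
Assuming k.sequential simply returns a standard keras.Sequential model, everything downstream stays ordinary Keras. For example, compiling and training would look the same as with the original API (train_images and train_labels stand in for your dataset):

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

model.fit(train_images, train_labels, epochs=5)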

CNN dense and dropout layers (Estimator example)

Old:

  pool2_flat = tf.reshape(pool2, [-1, 7 * 7 * 64])
  dense = tf.layers.dense(inputs=pool2_flat, units=1024, activation=tf.nn.relu)
  dropout = tf.layers.dropout(
      inputs=dense, rate=0.4, training=mode == tf.estimator.ModeKeys.TRAIN)

New:

  pool2_flat = t.reshape(pool2, [-1, 7 * 7 * 64])
  dense = t.dense(input=pool2_flat, units=1024, activation=t.relu)
  
  dropout = t.dropout(
      input=dense, rate=0.4, train=mode == estimator.TRAIN)
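
The renamed keyword arguments (inputs becomes input, training becomes train) and the shortened estimator.TRAIN suggest thin wrappers along these lines; the definitions below are an assumed sketch of such a shim, not the project's actual code.

import tensorflow as tf

# hypothetical wrappers illustrating the shortened names and keyword arguments
def dense(input, units, activation=None):
    return tf.layers.dense(inputs=input, units=units, activation=activation)

def dropout(input, rate=0.5, train=False):
    return tf.layers.dropout(inputs=input, rate=rate, training=train)

class estimator:
    # shortened alias for the Estimator mode constant
    TRAIN = tf.estimator.ModeKeys.TRAIN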

TensorShort is an experimental project for personal use, maintained by Andrew Stepin.