
Releases: patlevin/tfjs-to-tf

TFJS Graph Converter v1.6.3

22 Jan 10:14

Bugfix Release

  • fixes issue #43: compatibility with NumPy 1.24

The release is available on PyPI.

TFJS Graph Converter v1.6.2

20 Aug 09:06

Bugfix Release

  • fixes issue #39
  • fixes unit tests
  • adds support for Apple Silicon Macs by installing the proper dependencies

Notes

Release v1.6.1 introduced a bug wherein some models couldn't be properly converted anymore.
The cause of the problem was an overly complicated and unnecessary naming scheme for new nodes
generated during fused-op splitting.

The offending code has been removed and node names are always kept unique. In addition, unit tests
have been cleaned up and dependencies amended with proper support for Macs with Apple Silicon.

TFJS Graph Converter v1.6.1

01 Jul 22:18

Bugfix Release

  • fixes issue #37

Notes

The converter didn't remap the inputs of fused operations if the fused node had another fused node as input.
The bugfix allows Blazepose models to be successfully converted.
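The remapping described above can be sketched in plain Python. This is an illustrative stand-in, not the converter's actual code, and the node names are hypothetical:

```python
# Hypothetical rename table produced while splitting fused operations:
# old fused-node name -> name of its split replacement node.
renames = {"model/fused_conv": "model/conv2d",
           "model/fused_mm": "model/matmul"}

def remap_inputs(node_inputs, renames):
    """Rewrite a node's input list through the rename table so that
    inputs referring to a replaced fused node point at its successor."""
    return [renames.get(name, name) for name in node_inputs]

# A fused node whose input is itself a replaced fused node must also
# be remapped - the case that was missed before this fix:
fixed = remap_inputs(["model/fused_conv", "model/bias"], renames)
```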

Use pip install tfjs-graph-converter -U to update to the latest version.

TFJS Graph Converter v1.6.0

25 Mar 13:56

Feature Release

This release addresses Issue #36 and allows generating more compatible converted models.
The change includes a slight update of the API. The compat_mode argument has been
changed from a bool to a CompatMode enum.

This allows for future compatibility modes without changing function signatures.

The update is available on PyPI and can
be installed via pip install tfjs-graph-converter==1.6.0

API Changes

The CompatMode enumeration can be used to control the operations included in converted
models. It replaces the compat_mode Boolean switch used in the following functions:

  • load_graph_model_and_signature(..., compat_mode: CompatMode = CompatMode.NONE)
  • load_graph_model(..., compat_mode: CompatMode = CompatMode.NONE)
  • graph_model_to_frozen_graph(..., compat_mode: CompatMode = CompatMode.NONE)
  • graph_model_to_saved_model(..., compat_mode: CompatMode = CompatMode.NONE)
  • graph_models_to_saved_model(..., compat_mode: CompatMode = CompatMode.NONE)

New Compatibility Modes

The CompatMode enumeration currently features the following members:

  • CompatMode.NONE: the default; uses all supported native TF operations (including fused operations)
  • CompatMode.TFJS: harmonises weight types for compatibility with older TFJS versions
  • CompatMode.TFLITE: limits model operations to those natively supported by TFLite
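The members above can be modelled as a plain Python Enum. The following is a minimal sketch of the described interface, not the package's actual definition:

```python
from enum import Enum

class CompatMode(Enum):
    """Compatibility modes as described in the release notes (sketch)."""
    NONE = 0    # default: all supported native TF ops, incl. fused ops
    TFJS = 1    # harmonise weight types for older TFJS versions
    TFLITE = 2  # restrict operations to those natively supported by TFLite
```

An enum like this extends cleanly: a future mode only adds a member, while every existing call site and function signature stays unchanged.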

Changed Command Line Options

The --compat_mode switch has changed to require a mode option. The following options are supported:

  • none: the default; uses CompatMode.NONE for converting
  • tfjs: use CompatMode.TFJS for converting
  • tflite: use CompatMode.TFLITE for converting
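One plausible way such a string option maps onto the enum, sketched with argparse (the enum definition and wiring here are illustrative, not the converter's actual CLI code):

```python
import argparse
from enum import Enum

class CompatMode(Enum):
    NONE = "none"
    TFJS = "tfjs"
    TFLITE = "tflite"

parser = argparse.ArgumentParser()
# --compat_mode now requires one of the supported mode strings;
# omitting it falls back to the default, "none".
parser.add_argument("--compat_mode",
                    choices=[m.value for m in CompatMode],
                    default=CompatMode.NONE.value)

args = parser.parse_args(["--compat_mode", "tflite"])
mode = CompatMode(args.compat_mode)
```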

Notes

Existing programs that don't use the compat_mode argument don't require any changes.
Programs that pass compat_mode=True need to be updated to use compat_mode=CompatMode.TFJS instead.

Many thanks to jvuillaumier for making me aware of the issue
and for the kind words.

TFJS Graph Converter v1.5.0

25 Feb 21:03

Feature Update

Changes

  • added a function enable_cuda() to enable CUDA for software that uses the converter as library
  • added documentation regarding CUDA and this converter
  • fixed Issue #35

Details

A user of this library brought to my attention that they cannot use CUDA hardware acceleration with this package.
This is true, as the converter disables CUDA devices in order to enable converting models that would crash the optimizer
when CUDA is used. This happens if the system has a CUDA-supported device installed that lacks some compute capabilities
or doesn't have enough memory (e.g. VRAM). Since the converter makes use of low-level APIs that don't support device
selection using Python, CUDA devices are disabled via the NVIDIA driver.

This is bad for users who want to use CUDA and the converter at the same time. This release introduces a function which, if
called before any TensorFlow or converter API, re-enables the use of CUDA devices for the current process.

The default behaviour remains unchanged: CUDA stays disabled unless explicitly re-enabled.
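The notes state the actual mechanism works at the driver level; the environment-variable approach below is a simplified, self-contained stand-in that illustrates the ordering constraint (the switch must happen before the first TensorFlow call):

```python
import os

def disable_cuda() -> None:
    """Hide all CUDA devices from frameworks imported afterwards
    (simplified stand-in for the converter's default behaviour)."""
    os.environ["CUDA_VISIBLE_DEVICES"] = "-1"

def enable_cuda() -> None:
    """Re-enable CUDA for the current process; like the library
    function, it must run before the first TensorFlow API call."""
    os.environ.pop("CUDA_VISIBLE_DEVICES", None)
```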

This change does not affect macOS users or ROCm setups (e.g. AMD cards) in any way.

Installation

Via PyPI: pip install -U tfjs-graph-converter

From source directory: pip install .

TFJS Graph Converter v1.4.2

03 Dec 14:16

Minor Patch Release

Changes

Users reported errors when trying to convert Keras models (#27, #28).
This patch addresses these issues by checking the model format prior to parsing and converting it.
If the detected format differs from the expected TFJS graph model format, a (hopefully) more helpful error message is displayed instead of a stack trace, e.g.:

> tfjs_graph_converter keras-model.json /home/user/models/ --output_format=tf_saved_model
> TensorFlow.js Graph Model Converter

Graph model:    keras-model.json
Output:         /home/user/models/
Target format:  tf_saved_model

Converting.... Error: The model is a KERAS layers-model.
This converter only handles GRAPH models.
You can load and convert Keras models directly (using Python):

        import tensorflowjs as tfjs

        model = tfjs.converters.load_keras_model("keras-model.json")
        model.save("/home/user/models")
>
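The format check itself can be approximated by inspecting the model.json manifest. This is a sketch that assumes the standard "format" field of the TFJS manifest, not the converter's actual detection code:

```python
import json

def detect_model_format(model_json_path: str) -> str:
    """Return the format declared in a TFJS model.json manifest,
    e.g. 'graph-model' or 'layers-model' (field name assumed from
    the TFJS manifest layout)."""
    with open(model_json_path) as fh:
        manifest = json.load(fh)
    return manifest.get("format", "unknown")
```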

Installation

Via PyPI: pip install -U tfjs-graph-converter

From source directory: pip install .

TFJS Graph Converter 1.4.1

16 Oct 13:14

Minor Bugfix Release

Bugfixes

  • TensorFlow log message spamming wasn't suppressed as it was supposed to be
  • an enabled CUDA GPU could cause the converter to freeze mid-conversion
  • source code: some type hints were missing or wrong

Installation

Via PyPI: pip install -U tfjs-graph-converter

From source directory: pip install .

TFJS Graph Converter v1.4

05 Oct 10:12

TFJS Graph Model Converter 1.4.0

This release adds a new flag for improved model compatibility with TensorFlow.js for Node.js.
Converted models that have int64 inputs aren't supported by tf.node.
Using the new flag --compat_mode, the converter now changes such input datatypes to be compatible with TFJS ≤v2.4.0.
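Conceptually, the compatibility mode rewrites the model's input datatypes. A minimal, library-free sketch of such a remapping over a hypothetical input signature:

```python
# Hypothetical input signature of a converted model: name -> dtype.
inputs = {"image": "float32", "indices": "int64"}

# tf.node (TFJS <= v2.4.0) cannot feed 64-bit integer tensors, so the
# compatibility mode downcasts them to their 32-bit counterparts.
COMPAT_DTYPES = {"int64": "int32", "uint64": "uint32"}

compat_inputs = {name: COMPAT_DTYPES.get(dtype, dtype)
                 for name, dtype in inputs.items()}
```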

Features

  • New flag --compat_mode / -c for TFJS compatibility

Fixed Issues

  • Fixed issue #23

Installation

Use pip install -U tfjs-graph-converter to install or upgrade to the current version

TFJS Graph Converter 1.3.1

16 Sep 21:51

TFJS Graph Converter 1.3.1

This is a minor patch that fixes an issue with certain models that would prevent them from being converted.

Features

  • Fixed issue #21 by supporting models with mismatching tensor data and graph nodes (e.g. toxicity)

Installation

To update an existing installation you can use

pip install -U tfjs-graph-converter

TFJS Graph Converter v1.3

09 Sep 22:30

Changes

Model support

  • Models with fused depth-wise convolution (e.g. efficientnet) are now supported

Installation

Existing installations can be updated from PyPI:

pip install -U tfjs-graph-converter