I saw another repo very similar to this one, and it had an issue describing the same problem I'm having with WaveNet. I was able to train Tacotron to 100k steps and it saved out the GTA files as well, but when I try to train WaveNet I get the error Conv2DCustomBackpropFilterOp only supports NHWC.
Generated 578 test batches of size 1 in 0.807 sec
2020-03-18 13:38:25.399613: E T:\src\github\tensorflow\tensorflow\core\common_runtime\executor.cc:697] Executor failed to create kernel. Invalid argument: Conv2DCustomBackpropFilterOp only supports NHWC.
[[Node: model/optimizer/gradients/model/inference/conv2d_transpose_1/conv2d_transpose_grad/Conv2DBackpropFilter = Conv2DBackpropFilter[T=DT_FLOAT, _class=["loc:@model/optimizer/clip_by_global_norm/mul_199"], data_format="NCHW", dilations=[1, 1, 1, 1], padding="SAME", strides=[1, 1, 1, 16], use_cudnn_on_gpu=true, _device="/job:localhost/replica:0/task:0/device:CPU:0"](model/optimizer/gradients/model/inference/conv2d_transpose_1/BiasAdd_grad/tuple/control_dependency/_2171, model/optimizer/gradients/model/inference/conv2d_transpose/conv2d_transpose_grad/Shape, model/inference/conv2d_transpose/BiasAdd/_2173)]]
Exiting due to Exception: Conv2DCustomBackpropFilterOp only supports NHWC.
[[Node: model/optimizer/gradients/model/inference/conv2d_transpose_1/conv2d_transpose_grad/Conv2DBackpropFilter = Conv2DBackpropFilter[T=DT_FLOAT, _class=["loc:@model/optimizer/clip_by_global_norm/mul_199"], data_format="NCHW", dilations=[1, 1, 1, 1], padding="SAME", strides=[1, 1, 1, 16], use_cudnn_on_gpu=true, _device="/job:localhost/replica:0/task:0/device:CPU:0"](model/optimizer/gradients/model/inference/conv2d_transpose_1/BiasAdd_grad/tuple/control_dependency/_2171, model/optimizer/gradients/model/inference/conv2d_transpose/conv2d_transpose_grad/Shape, model/inference/conv2d_transpose/BiasAdd/_2173)]]
Caused by op 'model/optimizer/gradients/model/inference/conv2d_transpose_1/conv2d_transpose_grad/Conv2DBackpropFilter', defined at:
File "train.py", line 127, in <module>
main()
File "train.py", line 119, in main
wavenet_train(args, log_dir, hparams, args.wavenet_input)
File "C:\Users\camja\Desktop\gst_tacotron2_wavenet\wavenet_vocoder\train.py", line 244, in wavenet_train
return train(log_dir, args, hparams, input_path)
File "C:\Users\camja\Desktop\gst_tacotron2_wavenet\wavenet_vocoder\train.py", line 167, in train
model, stats = model_train_mode(args, feeder, hparams, global_step)
File "C:\Users\camja\Desktop\gst_tacotron2_wavenet\wavenet_vocoder\train.py", line 119, in model_train_mode
model.add_optimizer(global_step)
File "C:\Users\camja\Desktop\gst_tacotron2_wavenet\wavenet_vocoder\models\wavenet.py", line 365, in add_optimizer
gradients, variables = zip(*optimizer.compute_gradients(self.loss))
File "C:\Users\camja\Anaconda3\envs\taco\lib\site-packages\tensorflow\python\training\optimizer.py", line 514, in compute_gradients
colocate_gradients_with_ops=colocate_gradients_with_ops)
File "C:\Users\camja\Anaconda3\envs\taco\lib\site-packages\tensorflow\python\ops\gradients_impl.py", line 596, in gradients
gate_gradients, aggregation_method, stop_gradients)
File "C:\Users\camja\Anaconda3\envs\taco\lib\site-packages\tensorflow\python\ops\gradients_impl.py", line 779, in _GradientsHelper
lambda: grad_fn(op, *out_grads))
File "C:\Users\camja\Anaconda3\envs\taco\lib\site-packages\tensorflow\python\ops\gradients_impl.py", line 398, in _MaybeCompile
return grad_fn() # Exit early
File "C:\Users\camja\Anaconda3\envs\taco\lib\site-packages\tensorflow\python\ops\gradients_impl.py", line 779, in <lambda>
lambda: grad_fn(op, *out_grads))
File "C:\Users\camja\Anaconda3\envs\taco\lib\site-packages\tensorflow\python\ops\nn_grad.py", line 54, in _Conv2DBackpropInputGrad
data_format=op.get_attr("data_format")),
File "C:\Users\camja\Anaconda3\envs\taco\lib\site-packages\tensorflow\python\ops\gen_nn_ops.py", line 1190, in conv2d_backprop_filter
dilations=dilations, name=name)
File "C:\Users\camja\Anaconda3\envs\taco\lib\site-packages\tensorflow\python\framework\op_def_library.py", line 787, in _apply_op_helper
op_def=op_def)
File "C:\Users\camja\Anaconda3\envs\taco\lib\site-packages\tensorflow\python\util\deprecation.py", line 454, in new_func
return func(*args, **kwargs)
File "C:\Users\camja\Anaconda3\envs\taco\lib\site-packages\tensorflow\python\framework\ops.py", line 3155, in create_op
op_def=op_def)
File "C:\Users\camja\Anaconda3\envs\taco\lib\site-packages\tensorflow\python\framework\ops.py", line 1717, in __init__
self._traceback = tf_stack.extract_stack()
...which was originally created as op 'model/inference/conv2d_transpose_1/conv2d_transpose', defined at:
File "train.py", line 127, in <module>
main()
[elided 2 identical lines from previous traceback]
File "C:\Users\camja\Desktop\gst_tacotron2_wavenet\wavenet_vocoder\train.py", line 167, in train
model, stats = model_train_mode(args, feeder, hparams, global_step)
File "C:\Users\camja\Desktop\gst_tacotron2_wavenet\wavenet_vocoder\train.py", line 117, in model_train_mode
feeder.input_lengths, x=feeder.inputs)
File "C:\Users\camja\Desktop\gst_tacotron2_wavenet\wavenet_vocoder\models\wavenet.py", line 169, in initialize
y_hat = self.step(x, c, g, softmax=False) #softmax is automatically computed inside softmax_cross_entropy if needed
File "C:\Users\camja\Desktop\gst_tacotron2_wavenet\wavenet_vocoder\models\wavenet.py", line 435, in step
c = transposed_conv(c)
File "C:\Users\camja\Desktop\gst_tacotron2_wavenet\wavenet_vocoder\models\modules.py", line 333, in __call__
return self.convt(inputs)
File "C:\Users\camja\Anaconda3\envs\taco\lib\site-packages\tensorflow\python\layers\base.py", line 362, in __call__
outputs = super(Layer, self).__call__(inputs, *args, **kwargs)
File "C:\Users\camja\Anaconda3\envs\taco\lib\site-packages\tensorflow\python\keras\engine\base_layer.py", line 736, in __call__
outputs = self.call(inputs, *args, **kwargs)
File "C:\Users\camja\Anaconda3\envs\taco\lib\site-packages\tensorflow\python\keras\layers\convolutional.py", line 781, in call
data_format=conv_utils.convert_data_format(self.data_format, ndim=4))
File "C:\Users\camja\Anaconda3\envs\taco\lib\site-packages\tensorflow\python\ops\nn_ops.py", line 1254, in conv2d_transpose
name=name)
File "C:\Users\camja\Anaconda3\envs\taco\lib\site-packages\tensorflow\python\ops\gen_nn_ops.py", line 1340, in conv2d_backprop_input
dilations=dilations, name=name)
File "C:\Users\camja\Anaconda3\envs\taco\lib\site-packages\tensorflow\python\framework\op_def_library.py", line 787, in _apply_op_helper
op_def=op_def)
InvalidArgumentError (see above for traceback): Conv2DCustomBackpropFilterOp only supports NHWC.
[[Node: model/optimizer/gradients/model/inference/conv2d_transpose_1/conv2d_transpose_grad/Conv2DBackpropFilter = Conv2DBackpropFilter[T=DT_FLOAT, _class=["loc:@model/optimizer/clip_by_global_norm/mul_199"], data_format="NCHW", dilations=[1, 1, 1, 1], padding="SAME", strides=[1, 1, 1, 16], use_cudnn_on_gpu=true, _device="/job:localhost/replica:0/task:0/device:CPU:0"](model/optimizer/gradients/model/inference/conv2d_transpose_1/BiasAdd_grad/tuple/control_dependency/_2171, model/optimizer/gradients/model/inference/conv2d_transpose/conv2d_transpose_grad/Shape, model/inference/conv2d_transpose/BiasAdd/_2173)]]
I found Rayhane-mamah/Tacotron-2/issues/73#issuecomment-497370684, which has a solution for this, but it relies on code that was added to that repo after this one was forked, so it doesn't work for me.
Is it possible to support Windows with this code?
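For context, the error comes from TensorFlow's CPU gradient kernel for Conv2D, which only implements the channels-last (NHWC) layout, while the graph here builds the transposed convolution with data_format="NCHW". The usual workaround (and, as I understand it, what the linked comment does) is to run the op in NHWC, transposing the tensor's channel axis around it. Below is a minimal NumPy sketch of just that layout swap — the shapes and axis orders are illustrative, not taken from this repo's code:

```python
import numpy as np

# NCHW = (batch, channels, height, width)  -- what the graph currently uses
# NHWC = (batch, height, width, channels)  -- what the CPU kernel supports
x_nchw = np.arange(2 * 3 * 4 * 5, dtype=np.float32).reshape(2, 3, 4, 5)

# NCHW -> NHWC: move the channel axis (1) to the end.
x_nhwc = np.transpose(x_nchw, (0, 2, 3, 1))
assert x_nhwc.shape == (2, 4, 5, 3)

# ... run the NHWC-only op here, e.g. a conv2d_transpose built with
# data_format="NHWC" (or channels_last in the Keras layer) ...

# NHWC -> NCHW: move the channel axis back to position 1.
x_back = np.transpose(x_nhwc, (0, 3, 1, 2))
assert np.array_equal(x_back, x_nchw)  # the round-trip is lossless
```

In the TF graph the same idea would mean either building the transposed-conv layer with channels-last and inserting tf.transpose ops around it, or changing the layer's data_format where it is constructed (in this repo that appears to be the ConvTranspose wrapper in wavenet_vocoder/models/modules.py, per the traceback) — but I haven't confirmed exactly where the upstream fix lands in this fork.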