
[Bug][MXNet] MXNet dot not working with non 2D tensors #11691

Closed
petuca opened this issue Jun 13, 2022 · 5 comments
Labels: frontend:mxnet, type: bug

petuca commented Jun 13, 2022


The dot operator in an MXNet model fails to convert when the input tensors are not 2-D.

For example, here we want to take the dot product of two tensors:

  • data tensor: [3]
  • weight tensor: [3,1]
import mxnet as mx
from mxnet import gluon
import tvm
from tvm import relay

# Shapes that trigger the bug: a 1-D data tensor and a 2-D weight tensor.
shape_myx = (3,)
shape_params = (3, 1)
transpose_b = False

class MyNetHybrid(gluon.HybridBlock):
    def __init__(self, **kwargs):
        super(MyNetHybrid, self).__init__(**kwargs)
        with self.name_scope():
            self.mat_weights = self.params.get('mat_weights', shape=shape_params)

    def hybrid_forward(self, F, x, mat_weights):
        # dot of a rank-1 tensor with a rank-2 tensor is valid in MXNet
        x = F.dot(x, mat_weights, transpose_b=transpose_b)
        return x

mynet = MyNetHybrid()
mynet.initialize()

myx = mx.nd.uniform(shape=shape_myx)

shape_dict = {'data': myx.shape}
mod, params = relay.frontend.from_mxnet(mynet, shape_dict)

with tvm.transform.PassContext(opt_level=3):
    lib = relay.build(mod, target='llvm', params=params)

This bug looks very similar to the ones reported in #10651 and in PR #11174 for ONNX and PyTorch models.

A similar error occurs whenever either the data or the weight tensor has a shape other than 2-D.

Expected behavior

The model should compile with TVM, since it follows the MXNet dot specification and runs correctly in MXNet itself, as the quick check below shows.
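
A minimal sanity check, reusing mynet and myx from the reproduction script above:

# MXNet itself executes the model without complaint.
out = mynet(myx)    # dot of shapes (3,) and (3, 1) -> shape (1,)
print(out.shape)    # (1,)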

Actual behavior

Traceback (most recent call last):

  File "/home/syrmia/anaconda3/envs/tvmenv/lib/python3.7/site-packages/spyder_kernels/py3compat.py", line 356, in compat_exec
    exec(code, globals, locals)

  File "/home/syrmia/Desktop/tvm_tutorial/my_scripts/untitled1.py", line 40, in <module>
    mod, params = relay.frontend.from_mxnet(mynet, shape_dict)

  File "/home/syrmia/tvm/python/tvm/relay/frontend/mxnet.py", line 2975, in from_mxnet
    func = _from_mxnet_impl(sym, shape, dtype, params, mod)

  File "/home/syrmia/tvm/python/tvm/relay/frontend/mxnet.py", line 2884, in _from_mxnet_impl
    res = _convert_map[op_name](*op_params)

  File "/home/syrmia/tvm/python/tvm/relay/frontend/mxnet.py", line 802, in _mx_dot
    raise tvm.error.OpAttributeUnimplemented("Only 2-D arrays are supported.")

AttributeError: module 'tvm.error' has no attribute 'OpAttributeUnimplemented'
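
As a side note, the AttributeError itself looks like a secondary bug: the error class in tvm/error.py appears to be spelled OpAttributeUnImplemented (capital I), so the intended rank check cannot even raise cleanly. A hedged one-line fix, assuming that spelling is correct:

# Hypothetical fix for the misspelled error class (assuming the class in
# tvm/error.py is OpAttributeUnImplemented, not OpAttributeUnimplemented):
raise tvm.error.OpAttributeUnImplemented("Only 2-D arrays are supported.")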

When I comment out the rank-check lines in mxnet.py, I get this error instead:

...
  File "/home/syrmia/tvm/python/tvm/_ffi/_ctypes/packed_func.py", line 81, in cfun
    rv = local_pyfunc(*pyargs)
  File "/home/syrmia/tvm/python/tvm/relay/op/nn/_nn.py", line 112, in alter_op_layout_dense
    return topi.nn.dense_alter_layout(attrs, inputs, tinfos, out_type)
  File "/home/syrmia/anaconda3/envs/tvmenv/lib/python3.7/site-packages/decorator.py", line 232, in fun
    return caller(func, *(extras + args), **kw)
  File "/home/syrmia/tvm/python/tvm/target/generic_func.py", line 286, in dispatch_func
    return dispatch_dict[k](*args, **kwargs)
  File "/home/syrmia/tvm/python/tvm/topi/x86/dense_alter_op.py", line 48, in _alter_dense_layout
    M, K = get_const_tuple(data_tensor.shape)
ValueError: not enough values to unpack (expected 2, got 1)

The ValueError comes from the x86 dense layout pass, which unpacks the data shape as (M, K) and therefore also assumes 2-D inputs.

Steps to reproduce

The code above reproduces the problem.

Potential solution

Replacing the _mx_dot function in mxnet.py with the following:

def _mx_dot(inputs, attrs):
    assert len(inputs) == 2
    a, b = inputs

    rank_a = len(_infer_type(a).checked_type.shape)
    rank_b = len(_infer_type(b).checked_type.shape)

    if rank_a < 1 or rank_b < 1:
        raise tvm.error.OpAttributeInvalid("Unsupported shape of input tensors.")

    transpose_a = attrs.get_bool("transpose_a", False)
    transpose_b = attrs.get_bool("transpose_b", False)

    if transpose_a:
        msg = 'Value {} in attribute "transpose_a" of operator dot is not valid.'
        raise tvm.error.OpAttributeInvalid(msg.format(transpose_a))

    # When performing the dot product we need to properly handle the shape of the result -> out_shape
    if rank_a == 1:
        out_shape = []
        a = _op.expand_dims(a, axis=0)
    else:
        shape_a = list(_infer_type(a).checked_type.shape)
        out_shape = shape_a[:-1]
        # Collapse all leading axes of a into a single batch axis
        a = _op.reshape(a, newshape=(-1, shape_a[-1]))

    if rank_b == 1:
        if not out_shape:
            out_shape = [1]
        b = _op.expand_dims(b, axis=0)
    else:
        # Transpose matrix b if needed
        if transpose_b:
            trans_axes = list(range(rank_b))
            trans_axes = trans_axes[-1:] + trans_axes[:-1]
            b = _op.transpose(b, axes=trans_axes)

        shape_b = list(_infer_type(b).checked_type.shape)
        out_shape += shape_b[1:]

        # An additional transpose is mandatory since _op.nn.dense transposes its second argument by default
        b = _op.transpose(_op.reshape(b, newshape=(shape_b[0], -1)), axes=[1, 0])

    return _op.reshape(_op.nn.dense(a, b), newshape=out_shape)
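
For reference, a rough sketch of how the fix could be exercised across rank combinations (verify_dot is a hypothetical helper written for this issue, not an existing TVM test):

import numpy as np
import mxnet as mx
import tvm
from tvm import relay

def verify_dot(shape_a, shape_b, transpose_b=False):
    # Reference result from MXNet itself
    a = mx.nd.uniform(shape=shape_a)
    b = mx.nd.uniform(shape=shape_b)
    ref = mx.nd.dot(a, b, transpose_b=transpose_b).asnumpy()

    # Convert the same dot through the Relay frontend and compare
    sym = mx.sym.dot(mx.sym.var("a"), mx.sym.var("b"), transpose_b=transpose_b)
    mod, _ = relay.frontend.from_mxnet(sym, {"a": shape_a, "b": shape_b})
    with tvm.transform.PassContext(opt_level=3):
        out = relay.create_executor("graph", mod=mod, target="llvm").evaluate()(
            a.asnumpy(), b.asnumpy()
        )
    np.testing.assert_allclose(out.numpy(), ref, rtol=1e-5, atol=1e-5)

verify_dot((3,), (3, 1))                      # 1-D x 2-D (the case from this issue)
verify_dot((2, 3), (3, 4))                    # classic 2-D x 2-D
verify_dot((2, 3), (4, 3), transpose_b=True)  # 2-D x 2-D with transpose_b
verify_dot((2, 2, 3), (3, 4))                 # 3-D x 2-D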

cc: @masahi @junrushao1994 @kevinthesun @ganler

@petuca petuca changed the title [Bug][MxNet] MXNet dot not working with non 2D tensors [Bug][MXNet] MXNet dot not working with non 2D tensors Jun 13, 2022
petuca commented Jun 13, 2022

Can you confirm that my solution is acceptable?

If it is, I will write tests covering combinations of different tensor ranks and create a PR for this issue.

Additionally, is there a reason transpose_a is not supported in this scenario? If not, I can handle it as well.

If you have better ideas or advice for improving my solution, please let me know.

cc: @masahi @junrushao1994 @kevinthesun @ganler

ganler commented Jun 14, 2022

@petuca Thanks for looking into it. Judging by how this was fixed in the two PRs you linked, I guess the workaround is simply to return the following expression in the true branch of if rank_a == 1:

return _op.squeeze(_op.nn.matmul(_op.expand_dims(inputs_0, axis=0), inputs_1), axis=[0])

PR is welcome or I can help you patch that when I am more available.
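
For context, a minimal sketch of where that workaround might sit inside _mx_dot (inputs_0 and inputs_1 follow the naming in the snippet above; this is an illustration, not the final patch):

rank_a = len(_infer_type(inputs_0).checked_type.shape)
if rank_a == 1:
    # Promote the 1-D operand to 2-D, multiply, then drop the added axis.
    return _op.squeeze(
        _op.nn.matmul(_op.expand_dims(inputs_0, axis=0), inputs_1), axis=[0]
    )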

petuca commented Jun 16, 2022

Hi @ganler,

Thank you for the comment and advice. I looked at your changes and took them as a starting point, but I found a structure with minimal branching that covers all possible combinations of tensor ranks. Eventually, I was able to swap nn.dense for nn.matmul, which shortens the code by a few lines (rough sketch below). I will send the PR tomorrow.
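
For illustration, a rough sketch of the nn.matmul variant of the rank_b >= 2 branch (using a, b, and out_shape from the proposal above; not necessarily the exact code in the PR):

# With _op.nn.matmul the extra transpose becomes unnecessary, since matmul
# takes its second argument in (K, N) layout by default.
shape_b = list(_infer_type(b).checked_type.shape)
out_shape += shape_b[1:]
b = _op.reshape(b, newshape=(shape_b[0], -1))
out = _op.reshape(_op.nn.matmul(a, b), newshape=out_shape)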

@masahi
Is there a specific reason transpose_a is not supported in this scenario? If we could support it, I can implement that too.

@areusch areusch added the needs-triage label Oct 19, 2022
@hpanda-naut hpanda-naut added the frontend:mxnet label and removed the needs-triage label Nov 16, 2022
padreofthegame commented

Looks like this issue can be closed?
@petuca @masahi

petuca commented Jan 12, 2023

Yes, @padreofthegame, this issue has been fixed in PR #11760.
Thank you for the notification.

@petuca petuca closed this as completed Jan 12, 2023