Logical Layout #416
Conversation
force-pushed from 83f59fb to afb5fb3
force-pushed from 3c2f7c1 to 0d47d84
force-pushed from 97da808 to 63c122b
force-pushed from 56bfe08 to 2c8f2df
Hi @vinx13! I just ran the unit test test_tir_transform_lower_logical_layout.py on the latest main branch. But surprisingly it failed... Could you take a look at the test to see what is going wrong 😳?
P.S. Here is the error message:
Traceback (most recent call last):
File "unittest/test_tir_transform_lower_logical_layout.py", line 240, in <module>
test_non_bijective()
File "unittest/test_tir_transform_lower_logical_layout.py", line 230, in test_non_bijective
_check_fail(cache_read_non_bijective)
File "unittest/test_tir_transform_lower_logical_layout.py", line 37, in _check_fail
mod = tvm.tir.transform.LowerLogicalLayout()(mod)
File "/home/rhlai/lrh-tensorir/python/tvm/ir/transform.py", line 161, in __call__
return _ffi_transform_api.RunPass(self, mod)
File "/home/rhlai/lrh-tensorir/python/tvm/_ffi/_ctypes/packed_func.py", line 237, in __call__
raise get_last_ffi_error()
ValueError: Traceback (most recent call last):
[bt] (8) /home/rhlai/lrh-tensorir/build/libtvm.so(tvm::tir::StmtFunctor<tvm::tir::Stmt (tvm::tir::Stmt const&)>::VisitStmt(tvm::tir::Stmt const&)+0xb4) [0x7f0461b62aaa]
[bt] (7) /home/rhlai/lrh-tensorir/build/libtvm.so(tvm::NodeFunctor<tvm::tir::Stmt (tvm::runtime::ObjectRef const&, tvm::tir::StmtFunctor<tvm::tir::Stmt (tvm::tir::Stmt const&)>*)>::operator()(tvm::runtime::ObjectRef const&, tvm::tir::StmtFunctor<tvm::tir::Stmt (tvm::tir::Stmt const&)>*) const+0x148) [0x7f0461b66a10]
[bt] (6) /home/rhlai/lrh-tensorir/build/libtvm.so(tvm::tir::StmtFunctor<tvm::tir::Stmt (tvm::tir::Stmt const&)>::InitVTable()::{lambda(tvm::runtime::ObjectRef const&, tvm::tir::StmtFunctor<tvm::tir::Stmt (tvm::tir::Stmt const&)>*)#14}::_FUN(tvm::runtime::ObjectRef const&, tvm::tir::StmtFunctor<tvm::tir::Stmt (tvm::tir::Stmt const&)>*)+0x40) [0x7f0461b662ba]
[bt] (5) /home/rhlai/lrh-tensorir/build/libtvm.so(tvm::tir::StmtFunctor<tvm::tir::Stmt (tvm::tir::Stmt const&)>::InitVTable()::{lambda(tvm::runtime::ObjectRef const&, tvm::tir::StmtFunctor<tvm::tir::Stmt (tvm::tir::Stmt const&)>*)#14}::operator()(tvm::runtime::ObjectRef const&, tvm::tir::StmtFunctor<tvm::tir::Stmt (tvm::tir::Stmt const&)>*) const+0x59) [0x7f0461b6625b]
[bt] (4) /home/rhlai/lrh-tensorir/build/libtvm.so(tvm::tir::LogicalLayoutMutator::VisitStmt_(tvm::tir::BufferStoreNode const*)+0x8b5) [0x7f0462764fef]
[bt] (3) /home/rhlai/lrh-tensorir/build/libtvm.so(tvm::tir::LogicalLayoutMutator::GetInverseAffineIterMap(tvm::runtime::Array<tvm::PrimExpr, void> const&, std::unordered_set<tvm::tir::Var, tvm::runtime::ObjectPtrHash, tvm::runtime::ObjectPtrEqual, std::allocator<tvm::tir::Var> >&, tvm::runtime::Array<tvm::PrimExpr, void> const&, tvm::runtime::String const&)+0x30a) [0x7f046276666c]
[bt] (2) /home/rhlai/lrh-tensorir/build/libtvm.so(tvm::runtime::detail::LogFatal::~LogFatal()+0x36) [0x7f0461b2ea2a]
[bt] (1) /home/rhlai/lrh-tensorir/build/libtvm.so(tvm::runtime::detail::LogFatal::Entry::Finalize()+0x4a) [0x7f0461b2eb9e]
[bt] (0) /home/rhlai/lrh-tensorir/build/libtvm.so(tvm::runtime::Backtrace[abi:cxx11]()+0x35) [0x7f046338bf13]
File "/home/rhlai/lrh-tensorir/src/tir/transforms/lower_logical_layout.cc", line 295
ValueError: Check failed: iter_map.size() == loop_vars.size() (0 vs. 1) : Logical layout warp.non_bijective should be bijective.
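(For context, the check that fails requires the logical-to-physical index mapping to be invertible. A hypothetical sketch of the distinction; these mappings are illustrative only and are not the actual warp.non_bijective layout from the test:)

```python
# Hypothetical index mappings, for illustration only (not the actual
# "warp.non_bijective" layout definition).

def bijective(i):
    # Every logical index i in [0, 32) maps to a distinct (row, col)
    # pair, so an inverse mapping exists.
    return i // 4, i % 4

def non_bijective(i):
    # Distinct logical indices collapse onto the same physical index
    # (e.g. i = 0 and i = 4 both map to 0), so no inverse exists and
    # the pass raises the "should be bijective" error above.
    return i % 4
```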
Interesting… this is a negative case intended to fail, but pytest didn't catch it. I'll take a look.
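(For reference, a negative case is usually wrapped in pytest.raises so the expected error is consumed rather than propagated. A minimal sketch of what _check_fail could look like; the helper name and pass invocation are taken from the traceback above, but the body is an assumption, not the actual test code:)

```python
import pytest
import tvm


def _check_fail(func):
    # Build an IRModule around the TIR PrimFunc under test.
    mod = tvm.IRModule.from_expr(func)
    # The pass is expected to reject the non-bijective layout;
    # pytest.raises turns that expected ValueError into a passing
    # check instead of letting it propagate as an uncaught error.
    with pytest.raises(ValueError):
        tvm.tir.transform.LowerLogicalLayout()(mod)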
@vinx13 I see. I didn't look into it before. I just noticed that what you caught is …
TODO: