Conversation
|
It somewhat complicates the API compared with numpy's, but I do believe it is okay to be a superset of numpy, so it looks good to me |
|
Not against this change, but I'm curious: why do we need this? |
|
Some MXNet models use "reverse" when reshaping tensors. It would be hard to support them without this argument, since shape inference has to align dimensions from the right.
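
A minimal numpy sketch of the semantics being discussed (this is an illustration, not the TVM implementation; the helper name `reshape_like_mxnet` is made up for this example): `0` copies the matching input dimension and `-1` is inferred, with `reverse=True` matching dimensions from the right instead of the left.

```python
import numpy as np

def reshape_like_mxnet(data, newshape, reverse=False):
    """Resolve special values 0 (copy input dim) and -1 (infer) in newshape.
    With reverse=True, dimensions are matched from the right."""
    in_shape = list(data.shape)
    tgt = list(newshape)
    if reverse:
        # Match from the right by flipping both shapes, then flip back.
        in_shape = in_shape[::-1]
        tgt = tgt[::-1]
    out = []
    for i, dim in enumerate(tgt):
        if dim == 0:
            out.append(in_shape[i])   # copy the corresponding input dimension
        else:
            out.append(dim)           # includes -1, resolved by np.reshape
    if reverse:
        out = out[::-1]
    return data.reshape(out)

x = np.zeros((10, 5, 4))
print(reshape_like_mxnet(x, (-1, 0)).shape)                # (40, 5)
print(reshape_like_mxnet(x, (-1, 0), reverse=True).shape)  # (50, 4)
```

With input shape (10, 5, 4) and target (-1, 0), normal alignment copies the 5 and infers 40, while right alignment copies the 4 and infers 50, which is the behavior those MXNet models rely on.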
|
While this makes things easier for MXNet in certain cases, I feel we should not introduce it into the normal reshape. Instead, we could introduce an experimental operator (_contrib_reverse_reshape, level=10) to support such cases. It would be great to keep the core IR simple and elegant. |
|
@tqchen Sure, I can move this to an experimental operator. |
|
@tqchen reverse_reshape is now implemented as a level-10 op. In C++, I reuse the reshape type-inference code to reduce duplication, but Python and C++ see reshape and reverse_reshape as two different ops.
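
A hedged usage sketch of how the two ops would look from the Relay Python frontend, based on my reading of this discussion (the exact function names and availability may differ in the actual codebase at this revision):

```python
from tvm import relay

# A 10x5x4 input tensor variable.
x = relay.var("x", shape=(10, 5, 4), dtype="float32")

# Normal reshape: special values are matched from the left -> (40, 5).
y_forward = relay.reshape(x, newshape=(-1, 0))

# Reverse reshape: special values are matched from the right -> (50, 4).
y_reverse = relay.reverse_reshape(x, newshape=(-1, 0))

# The two calls lower to distinct operators, so the reverse case can share
# the C++ shape/type inference code with reshape while Python still sees
# two separate ops.
print(relay.Function([x], y_reverse))
```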
|
@icemelon9 please rebase against master, and then you can merge the PR |
|
@vinx13 @junrushao1994 @jroesch @tqchen Thanks for the review. |
|
Thanks @icemelon9 @junrushao1994 @jroesch @vinx13 @tqchen, it is now merged. |
* Enable reverse in reshape
* Fix lint and typo
* Put reverse_reshape into a separate op
* Fix pylint