[Relay/TOPI][Frontend] Add tile and repeat operators in Relay and TOPI #2720
Merged
yzhliu merged 6 commits into apache:master (Mar 11, 2019)
Conversation
Contributor (Author)
@tqchen Why did CI fail at tests/python/contrib/test_sort.py? CI passed after I refreshed the PR.
yzhliu
reviewed
Mar 8, 2019
src/relay/op/tensor/transform.cc
Outdated
    // check dimension match
    CHECK(reps.defined())
        << "repetition array is not defined. data.ndim = " << ndim;
    const int rndim = static_cast<int>(reps.size());
Member
Please keep the type as size_t, and also change the index variable to size_t, e.g. for (size_t i = 0; ...
src/relay/op/tensor/transform.cc
Outdated
    bool RepeatRel(const Array<Type>& types,
                   int num_inputs,
                   const Attrs& attrs,
                   const TypeReporter& reporter) {
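The shape rule a type relation like RepeatRel enforces can be sketched in Python. Note that repeat_out_shape is a hypothetical helper for illustration only, not code from this PR:

```python
# Hedged sketch: the output-shape rule for repeat along a given axis.
# repeat_out_shape is a hypothetical illustration, not code from this PR.
def repeat_out_shape(shape, repeats, axis):
    out = list(shape)
    out[axis] *= repeats  # the repeated axis grows by the repeat count
    return tuple(out)

print(repeat_out_shape((2, 3), 2, axis=1))  # (2, 6)
```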
    def repeat(data, repeats, axis):
        """Repeats elements of an array.
        By default, repeat flattens the input array into 1-D and then repeats the elements.
Member
can we add an example? I guess we can simply copy from numpy's doc.
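For reference, the NumPy behavior such an example could be copied from; this is a sketch using numpy directly, assuming the Relay operator is meant to match these semantics:

```python
import numpy as np

x = np.array([[1, 2], [3, 4]])

# Without an axis, repeat flattens to 1-D and repeats each element.
print(np.repeat(x, 2))          # [1 1 2 2 3 3 4 4]

# With axis=1, elements are repeated along the columns.
print(np.repeat(x, 2, axis=1))  # [[1 1 2 2]
                                #  [3 3 4 4]]
```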
    def tile(data, reps):
        """Repeats the whole array multiple times.
python/tvm/relay/op/transform.py
Outdated
    .. note::
        Each dim size of reps must be a positive integer. If reps has length d,
        the result will have dimension of max(d, a.ndim); If a.ndim < d, a is
        promoted to be d-dimensional by prepending new axes. If a.ndim ? d, reps
Member
Suggested change:
-        promoted to be d-dimensional by prepending new axes. If a.ndim ? d, reps
+        promoted to be d-dimensional by prepending new axes. If data.ndim >= d, reps
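The promotion rule in this note mirrors NumPy's tile. A sketch of the a.ndim < d case (reps longer than the input rank), assuming numpy semantics:

```python
import numpy as np

x = np.array([1, 2])      # a.ndim == 1

# reps has length 2 > a.ndim, so x is first promoted to shape (1, 2).
y = np.tile(x, (2, 3))
print(y.shape)            # (2, 6)
print(y)                  # [[1 2 1 2 1 2]
                          #  [1 2 1 2 1 2]]
```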
python/tvm/relay/op/transform.py
Outdated
        The input data to the operator.

    reps : tuple of int
        The number of times repeating the tensor a.
Member
Suggested change:
-        The number of times repeating the tensor a.
+        The number of times repeating the tensor data.
python/tvm/relay/op/transform.py
Outdated
        Each dim size of reps must be a positive integer. If reps has length d,
        the result will have dimension of max(d, a.ndim); If a.ndim < d, a is
        promoted to be d-dimensional by prepending new axes. If a.ndim ? d, reps
        is promoted to a.ndim by pre-pending 1's to it.
Member
Suggested change:
-        is promoted to a.ndim by pre-pending 1's to it.
+        is promoted to data.ndim by prepending 1's to reps.
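The other direction of this rule (reps shorter than data.ndim), in NumPy terms as a sketch:

```python
import numpy as np

x = np.arange(6).reshape(2, 3)  # data.ndim == 2

# reps=(2,) is promoted to (1, 2): the array is tiled only along the last axis.
y = np.tile(x, 2)
print(y.shape)                  # (2, 6)
```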
yzhliu
reviewed
Mar 9, 2019
Member
yzhliu
left a comment
Looks good to me; just some redundant casts.
src/relay/op/tensor/transform.cc
Outdated
        return false;
    }
    const auto* param = attrs.as<TileAttrs>();
    const size_t ndim = static_cast<size_t>(data->shape.size());
Member
data->shape.size() is already size_t, right? No need to cast.
topi/include/topi/transform.h
Outdated
    std::string tag = kBroadcast) {
    int ndim = static_cast<int>(x->shape.size());
    int rdim = static_cast<int>(reps.size());
    int tdim = (ndim > rdim) ? ndim : rdim;
Member
Array.size() is size_t, no need to cast to int.
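The dimension arithmetic in this TOPI snippet (tdim = max(ndim, rdim), with the shorter of shape/reps promoted by prepending 1's) can be sketched in Python; tile_out_shape is a hypothetical helper for illustration only:

```python
# Hedged sketch of the tile output-shape rule; tile_out_shape is illustrative only.
def tile_out_shape(shape, reps):
    ndim, rdim = len(shape), len(reps)
    # Promote the shorter tuple by prepending 1's so both have max(ndim, rdim) entries.
    # (A negative repeat count on a tuple yields the empty tuple, so this is a no-op
    # for the longer of the two.)
    shape = (1,) * (rdim - ndim) + tuple(shape)
    reps = (1,) * (ndim - rdim) + tuple(reps)
    return tuple(s * r for s, r in zip(shape, reps))

print(tile_out_shape((2, 3), (2,)))    # (2, 6)
print(tile_out_shape((2,), (2, 3)))    # (2, 6)
```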
src/relay/op/tensor/transform.cc
Outdated
    // check dimension match
    CHECK(reps.defined())
        << "repetition array is not defined. data.ndim = " << ndim;
    const size_t rndim = static_cast<size_t>(reps.size());
wweic pushed a commit to neo-ai/tvm that referenced this pull request on Mar 12, 2019:
(apache#2720) * tile and repeat operator added in relay * fix pylint * fix make warnings * comments addressed * fix lint error * comment addressed
tile and repeat are used in both MXNet and NumPy. BlockGrad in MXNet during inference is just like dropout, which we skip.
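The distinction between the two operators, in NumPy terms (repeat acts on individual elements, tile on the whole array), as a quick sketch:

```python
import numpy as np

x = np.array([1, 2, 3])

print(np.repeat(x, 2))  # elements repeated:    [1 1 2 2 3 3]
print(np.tile(x, 2))    # whole array repeated: [1 2 3 1 2 3]
```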