[microNPU] Add support for SIGMOID #9627
Conversation
lhutton1 left a comment
LGTM overall! Just a couple of minor things I noticed :)
| return y |
| class SigmoidRewriter(DFPatternCallback): |
This looks very similar to TanhRewriter, is it worth creating a subclass?
Good idea! I made the change
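For context, a minimal sketch of what the shared rewriter could look like, assuming the constructor parameters quoted later in this review (`params_class`, `activation_type`, `calc_func`). The base-class name `LutActivationRewriter`, the composite-name strings, and the pattern details are illustrative only, not necessarily the code that was merged.

```python
import math

from tvm.relay.dataflow_pattern import DFPatternCallback, wildcard


# Stand-ins for the pattern-table params classes; the real composite names
# come from the ethos-u pattern table and may differ.
class TanhParams:
    composite_name = "ethos-u.tanh"


class SigmoidParams:
    composite_name = "ethos-u.sigmoid"


class LutActivationRewriter(DFPatternCallback):
    """Shared rewrite logic for activations implemented through the NPU lookup table."""

    def __init__(self, params_class, activation_type, calc_func):
        super().__init__(require_type=True, rewrite_once=True)
        # Match the composite function produced by the pattern table for this activation.
        self.pattern = (wildcard().has_attr({"Composite": params_class.composite_name}))(wildcard())
        self.params_class = params_class
        self.activation_type = activation_type
        self.calc_func = calc_func

    def callback(self, pre, post, node_map):
        # The real callback builds a LUT from calc_func and emits an NPU identity
        # operator that applies it; omitted in this sketch.
        raise NotImplementedError


class TanhRewriter(LutActivationRewriter):
    def __init__(self):
        super().__init__(params_class=TanhParams, activation_type="TANH", calc_func=math.tanh)


class SigmoidRewriter(LutActivationRewriter):
    def __init__(self):
        super().__init__(
            params_class=SigmoidParams,
            activation_type="SIGMOID",
            calc_func=lambda x: 1.0 / (1.0 + math.exp(-x)),
        )
```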
| def is_valid(self): |
| """ |
| This function checks whether reshape has compatible attributes with the NPU |
Suggested change:
| This function checks whether reshape has compatible attributes with the NPU |
| This function checks whether sigmoid has compatible attributes with the NPU |
| class Model(tf.Module): |
| @tf.function |
| def tanh_function(self, x): |
Suggested change:
| def tanh_function(self, x): |
| def sigmoid_function(self, x): |
| dtype = "int8" |
| def create_tflite_graph(): |
| tf.config.run_functions_eagerly(True) |
Suggested change: remove the tf.config.run_functions_eagerly(True) line.
We don't need it, do we?
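For reference, a rough sketch of what the test model could look like after these suggestions (the tf.function renamed to `sigmoid_function` and the eager-execution call dropped). The converter and representative-dataset details below follow the usual TFLite int8 quantization flow and are assumptions, not a copy of the final test; `ifm_shape` is a hypothetical input shape.

```python
import numpy as np
import tensorflow as tf

dtype = "int8"
ifm_shape = (1, 8, 8, 3)  # hypothetical input shape for illustration


def create_tflite_graph():
    class Model(tf.Module):
        @tf.function
        def sigmoid_function(self, x):
            return tf.nn.sigmoid(x)

    model = Model()
    concrete_func = model.sigmoid_function.get_concrete_function(
        tf.TensorSpec(ifm_shape, dtype=tf.float32)
    )

    def representative_dataset():
        for _ in range(100):
            yield [np.random.rand(*ifm_shape).astype(np.float32)]

    # Quantize the model to int8 so the NPU LUT path is exercised.
    converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_func])
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    converter.representative_dataset = representative_dataset
    converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
    converter.inference_input_type = tf.int8
    converter.inference_output_type = tf.int8
    return converter.convert()
```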
Force-pushed from a550c44 to 90ac839.
Thanks for the review @lhutton1!
NicolaLancellotti left a comment
Some type hints are missing, could you add them?
| self.activation_type = activation_type |
| self.calc_func = calc_func |
| def callback(self, pre, post, node_map): |
Suggested change:
| def callback(self, pre, post, node_map): |
| def callback(self, pre: tvm.relay.Expr, post: tvm.relay.Expr, node_map: tvm.ir.container.Map): |
| pass |
| def sigmoid_calc_func(x): |
Suggested change:
| def sigmoid_calc_func(x): |
| def sigmoid_calc_func(x: float) -> float: |
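For reference, this function is just the scalar sigmoid used to fill the lookup table; a minimal sketch with the suggested annotations:

```python
import math


def sigmoid_calc_func(x: float) -> float:
    """Scalar sigmoid applied pointwise when populating the lookup table."""
    return 1.0 / (1.0 + math.exp(-x))
```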
| """A class to create an identity operator with the LUT""" | ||
|
|
||
| def __init__(self): | ||
| def __init__(self, params_class, activation_type, calc_func): |
Suggested change:
| def __init__(self, params_class, activation_type, calc_func): |
| def __init__(self, params_class: Type, activation_type: str, calc_func: Callable[[float], float]): |
| def find_tanh_values(ifm_scale, ifm_zp, ofm_scale, ofm_zp): |
| """Method to calculate the values of the tanh lookup table""" |
| def get_lut_from_func(ifm_scale, ifm_zp, ofm_scale, ofm_zp, func): |
Suggested change:
| def get_lut_from_func(ifm_scale, ifm_zp, ofm_scale, ofm_zp, func): |
| def get_lut_from_func(ifm_scale: float, ifm_zp: int, ofm_scale: float, ofm_zp: int, func: Callable[[float], float]) -> list[int]: |
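For context, a sketch of how such a lookup table can be built from the quantization parameters: dequantize each possible int8 input, apply the activation in float, requantize, and clip. The rounding detail and the example scale/zero-point values below are assumptions for illustration, not necessarily the merged code.

```python
import math
from typing import Callable, List


def get_lut_from_func(
    ifm_scale: float,
    ifm_zp: int,
    ofm_scale: float,
    ofm_zp: int,
    func: Callable[[float], float],
) -> List[int]:
    """Build a 256-entry int8 lookup table for a pointwise activation function."""
    lut_values = []
    for x in range(-128, 128):
        x_real = ifm_scale * (x - ifm_zp)                        # dequantize the input value
        out_real = func(x_real)                                  # apply the activation in float
        lut_result = int(round(out_real / ofm_scale)) + ofm_zp   # requantize the output
        lut_values.append(min(127, max(-128, lut_result)))       # clip to the int8 range
    return lut_values


# Example (hypothetical input scale): sigmoid LUT where the output range [0, 1]
# maps to [-128, 127], i.e. scale = 1/256 and zero point = -128.
sigmoid_lut = get_lut_from_func(0.02, 0, 1.0 / 256.0, -128, lambda x: 1.0 / (1.0 + math.exp(-x)))
```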
lhutton1 left a comment
Apologies, something I missed in my first review. Not essential, but given @NicolaLancellotti's review I suppose there's no harm in including this also :)
| return pattern |
| class SigmoidParams: |
Should we also create a subclass for this and TanhParams?
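As an illustration of this suggestion, a skeleton of what a shared params base class might look like; the base-class name `LutActivationParams` and the composite-name strings are hypothetical, and the elided bodies stand for whatever checks TanhParams already performs.

```python
class LutActivationParams:
    """Validity checks shared by activations lowered via the NPU lookup table.

    The composite body is assumed to be quantize(activation(dequantize(x))), so the
    input/output tensor extraction and the dtype/shape checks can all live here.
    """

    def __init__(self, func_body):
        ...  # extract self.ifm / self.ofm from the composite function body

    def is_valid(self) -> bool:
        """This function checks whether the activation has compatible attributes with the NPU."""
        ...  # same checks TanhParams performs today (int8 tensors, supported dimensions, ...)


class TanhParams(LutActivationParams):
    composite_name = "ethos-u.tanh"


class SigmoidParams(LutActivationParams):
    composite_name = "ethos-u.sigmoid"
```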
Add support for SIGMOID activation function using the lookup table mechanism in the NPU.
Force-pushed from 90ac839 to f818ef3.
Thanks for the reviews @NicolaLancellotti and @lhutton1! :)
Thanks all!