A hardware-efficient leaky rectified linear unit (ReLU) activation function with polynomial approximation and shifter implementation is proposed to facilitate the deployment of AI processors in edge ...
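As a rough illustration of the shifter-based idea (a minimal sketch only: the excerpt does not give the paper's polynomial coefficients or shift amounts, so the negative slope of 1/8, realized as an arithmetic right shift by 3, is an assumption, and the polynomial-approximation stage is not reproduced here):

```c
#include <stdint.h>
#include <inttypes.h>
#include <stdio.h>

/* Hedged sketch of a shifter-based leaky ReLU on fixed-point inputs.
 * Replacing the negative-slope multiply with a right shift avoids a
 * hardware multiplier; the slope 1/8 (shift by 3) is illustrative,
 * not the paper's actual coefficient. Note that right-shifting a
 * negative signed value is implementation-defined in C, though it is
 * an arithmetic shift on virtually all common compilers/targets. */
static inline int32_t leaky_relu_shift(int32_t x) {
    /* Positive inputs pass through unchanged; negative inputs are
     * scaled by ~1/8 via a shift instead of a multiplication. */
    return (x >= 0) ? x : (x >> 3);
}

int main(void) {
    int32_t samples[] = {-64, -8, -1, 0, 5, 100};
    size_t n = sizeof(samples) / sizeof(samples[0]);
    for (size_t i = 0; i < n; i++) {
        printf("leaky_relu(%" PRId32 ") = %" PRId32 "\n",
               samples[i], leaky_relu_shift(samples[i]));
    }
    return 0;
}
```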