A hardware-efficient leaky rectified linear unit (leaky ReLU) activation function, built on a polynomial approximation and a shifter-based implementation, is proposed to facilitate the deployment of AI processors in edge ...
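As a rough illustration of the shifter idea only (not the paper's actual design, whose slope, word width, and polynomial coefficients are not given in this excerpt), the C sketch below implements a leaky ReLU whose negative slope is a power of two, so the slope multiplication reduces to an arithmetic right shift. The slope 2^-3, the 32-bit fixed-point format, and all names are assumptions for illustration.

```c
#include <stdint.h>
#include <stdio.h>

/*
 * Minimal sketch of a shifter-based leaky ReLU on signed fixed-point
 * inputs.  A negative slope of 2^-k lets the constant multiplier of a
 * conventional leaky ReLU be replaced by a single right shift, which
 * is far cheaper in hardware.  k = 3 (slope 0.125) is an assumed,
 * illustrative choice, not a value taken from the paper.
 */
static inline int32_t leaky_relu_shift(int32_t x, unsigned k)
{
    /* Positive inputs pass through unchanged; negative inputs are
     * scaled by 2^-k.  Note: right-shifting a negative value is
     * implementation-defined in C; this sketch assumes the common
     * arithmetic-shift behavior. */
    return (x >= 0) ? x : (x >> k);
}

int main(void)
{
    int32_t samples[] = { -64, -8, 0, 8, 64 };
    for (size_t i = 0; i < sizeof samples / sizeof samples[0]; ++i)
        printf("leaky_relu(%4d) = %4d\n",
               (int)samples[i], (int)leaky_relu_shift(samples[i], 3));
    return 0;
}
```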