Activation functions are an important area of deep learning research. Many new activation functions are being developed, including bio-inspired activations, purely mathematical activation functions, and others. Despite such advancements, we usually find ourselves using ReLU and LeakyReLU without considering the alternatives. The following notebooks showcase how easy (or difficult) it is to port an activation function using Custom Layers in Keras and TensorFlow (see the short sketch after the list of activations below).
Link to the main notebook --> Activations.ipynb
- LeakyReLU
- Parametric ReLU (PReLU)
- ELU
- SELU
- Swish
- GELU
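
For context, the core idea is that any of these activations can be wrapped in a small custom layer and dropped into a model like any built-in one. Below is a minimal sketch (not the repository's exact code) using Swish, f(x) = x * sigmoid(x), as an example; the layer and model definitions here are illustrative assumptions.

```python
import tensorflow as tf

# Minimal sketch of a custom activation as a Keras layer (illustrative, not the repo's code).
class Swish(tf.keras.layers.Layer):
    """Swish activation: f(x) = x * sigmoid(x)."""
    def call(self, inputs):
        return inputs * tf.nn.sigmoid(inputs)

# The custom layer slots into a model like any built-in activation.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, input_shape=(784,)),
    Swish(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
```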
```
src
|
|-- Activations.ipynb
|-- utils
    |-- Utils.ipynb
    |-- utils.py
references
|
|-- Ref1
|-- Refn
```
```
git clone https://github.com/Agrover112/ActivationFunctions.git
```