MachineCurve: Machine Learning Explained, Machine Learning Tutorials
Tag: activation functions

Using ReLU, Sigmoid and Tanh with PyTorch, Ignite and Lightning

Chris · 21 January 2021 · Leave a comment
Rectified Linear Unit, Sigmoid and Tanh are three activation functions that play an important role in how neural networks work....
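As a quick sketch of the topic, here is a minimal PyTorch example that places nn.ReLU, nn.Tanh and nn.Sigmoid between linear layers. The layer sizes and the overall module structure are illustrative assumptions, not taken from the article.

import torch
from torch import nn

# A small multilayer perceptron that uses each of the three activations once.
# The layer widths (32, 16, 8, 1) are arbitrary example values.
mlp = nn.Sequential(
    nn.Linear(32, 16),
    nn.ReLU(),       # outputs max(0, x)
    nn.Linear(16, 8),
    nn.Tanh(),       # squashes outputs into (-1, 1)
    nn.Linear(8, 1),
    nn.Sigmoid(),    # squashes outputs into (0, 1)
)

x = torch.randn(4, 32)  # batch of 4 random example inputs
print(mlp(x).shape)     # torch.Size([4, 1])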
Why nonlinear activation functions improve ML performance – with TensorFlow 2.0 example

Chris · 29 October 2020 (updated 20 January 2021) · 6 Comments
Machine Learning is here to stay. More and more organizations are grasping the benefits of the technology, as long as...
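A rough sketch of the idea behind the article's title: stacking Dense layers without an activation collapses into a single linear map, while inserting a nonlinear activation such as ReLU does not. The layer sizes and input shape below are assumptions for illustration.

import tensorflow as tf

# Two stacked layers with no activation: still a single linear map overall.
linear_model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation=None, input_shape=(8,)),
    tf.keras.layers.Dense(1, activation=None),
])

# The same stack with ReLU in between can represent nonlinear relationships.
nonlinear_model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation='relu', input_shape=(8,)),
    tf.keras.layers.Dense(1, activation=None),
])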
Overview of activation functions for neural networks

Chris · 24 January 2020 (updated 23 June 2020) · 5 Comments
The neurons of neural networks perform operations that are linear: they multiply an input vector with a weights vector and...
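The linear operation described in this excerpt, followed by a nonlinear activation, can be sketched in a few lines of NumPy. The vector values and the choice of ReLU as the activation are illustrative assumptions.

import numpy as np

x = np.array([0.5, -1.2, 3.0])   # input vector
w = np.array([0.8,  0.1, -0.4])  # weights vector
b = 0.2                          # bias

z = np.dot(w, x) + b             # the linear operation: w · x + b
a = max(0.0, z)                  # a nonlinear activation applied on top (ReLU)

print(z, a)                      # -0.72 0.0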
How to use FTSwish with Keras?

Chris · 6 January 2020 (updated 7 January 2020) · Leave a comment
Flatten-T Swish is a new (2018) activation function that attempts to find the best of both worlds between traditional ReLU...
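Keras has no built-in FTSwish, but assuming the usual formulation f(x) = T for x < 0 and f(x) = x * sigmoid(x) + T for x >= 0, with T a small negative threshold, a custom activation could look roughly like this (the threshold value -0.20 is an example, not taken from the article):

import tensorflow as tf

def ftswish(x, threshold=-0.20):
    # relu(x) * sigmoid(x) + T equals T for x < 0 and x * sigmoid(x) + T otherwise
    return tf.nn.relu(x) * tf.sigmoid(x) + threshold

# Used like any other activation in a layer:
layer = tf.keras.layers.Dense(16, activation=ftswish)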
What is the FTSwish activation function?

Chris · 3 January 2020 · 1 Comment
Over the last few years, we’ve seen the rise of a wide range of activation functions – such as FTSwish....
How to use ELU with Keras?

Chris · 9 December 2019 · 1 Comment
Exponential Linear Unit is a new type of activation function that attempts to resolve both the defects of traditional ReLU...
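A minimal sketch of the two usual ways to use ELU in Keras: the 'elu' activation string, or the dedicated ELU layer with a configurable alpha. The layer sizes are assumptions; alpha=1.0 is the Keras default.

import tensorflow as tf

# Option 1: ELU as the activation of a Dense layer.
dense_elu = tf.keras.layers.Dense(16, activation='elu')

# Option 2: a separate ELU layer, exposing the alpha parameter explicitly.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16),
    tf.keras.layers.ELU(alpha=1.0),
    tf.keras.layers.Dense(1),
])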
How to use PReLU with Keras?

Chris · 5 December 2019 · 1 Comment
Rectified Linear Unit, or ReLU, is considered to be the standard activation function of choice for today’s neural networks. Even...
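A minimal sketch of PReLU in Keras. Because the slope for negative inputs is learned during training, it is applied as a separate layer rather than an activation string; the layer sizes are illustrative assumptions.

import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16),
    # PReLU learns the negative-input slope as a trainable weight.
    tf.keras.layers.PReLU(),
    tf.keras.layers.Dense(1),
])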
Beyond Swish: the LiSHT activation function

Chris · 17 November 2019 (updated 29 December 2020) · 3 Comments
Deep neural networks perform linear operations to combine weight vectors with input vectors. The values that are the outputs of...
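LiSHT is not built into Keras, but assuming its usual definition f(x) = x * tanh(x), it can be sketched as a custom activation:

import tensorflow as tf

def lisht(x):
    # Scale each input by its own tanh; the product is always non-negative.
    return x * tf.tanh(x)

layer = tf.keras.layers.Dense(16, activation=lisht)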
Using Leaky ReLU with Keras

Chris · 12 November 2019 (updated 21 April 2020) · 2 Comments
Even though the traditional ReLU activation function is used quite often, it may sometimes not produce a converging model. This...
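A minimal sketch of Leaky ReLU in Keras, applied as a separate layer after a Dense layer. The slope value 0.1 is an illustrative assumption; the Keras default for the layer is 0.3.

import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16),
    # Keep a small positive slope for negative inputs instead of a hard zero.
    tf.keras.layers.LeakyReLU(alpha=0.1),
    tf.keras.layers.Dense(1),
])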
Leaky ReLU: improving traditional ReLU

Chris · 15 October 2019 (updated 22 October 2019) · 6 Comments
The Leaky ReLU is a type of activation function that comes up in many machine learning blogs every now and then....
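The difference with standard ReLU can be sketched in plain NumPy, assuming the usual definition where negative inputs are scaled by a small factor alpha instead of being zeroed out (alpha = 0.1 is an example value):

import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.1):
    # Negative inputs keep a small slope (alpha) instead of becoming zero.
    return np.where(x > 0, x, alpha * x)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(x))        # [0.   0.   0.   1.5]
print(leaky_relu(x))  # [-0.2  -0.05  0.    1.5 ]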
