Choosing Between ReLU and Tanh for Neural Networks
Imagine you are a data scientist at Amazon tasked with developing a neural network to classify images of chair types, such as "Office Chair" and "Dining Chair." Which activation function would you use in the hidden layers of your network: ReLU or Tanh? Discuss the advantages of your chosen function over the other.
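To ground the comparison, here is a minimal NumPy sketch (an illustration, not part of the original problem) contrasting the two activations and their gradients. The key difference it demonstrates: ReLU's gradient stays at 1 for all positive inputs, while tanh's gradient shrinks toward 0 as |x| grows, which contributes to vanishing gradients in deep hidden layers.

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x); output is unbounded above, zero for negative inputs
    return np.maximum(0.0, x)

def relu_grad(x):
    # Gradient is exactly 1 for x > 0, so it does not shrink as |x| grows
    return (x > 0).astype(float)

def tanh_grad(x):
    # d/dx tanh(x) = 1 - tanh(x)^2; approaches 0 for large |x| (saturation)
    return 1.0 - np.tanh(x) ** 2

x = np.array([-5.0, -1.0, 0.5, 5.0])

# ReLU passes positive values through unchanged and zeros out the rest
print(relu(x))

# For a large pre-activation like x = 5, tanh's gradient is nearly zero,
# while ReLU's gradient is still 1 -- the usual argument for ReLU in
# hidden layers of deep networks
print(relu_grad(x))
print(tanh_grad(x))
```

Tanh remains useful when zero-centered outputs in [-1, 1] matter, but for deep hidden layers the non-saturating gradient (plus cheaper computation) is the standard case for ReLU.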