Learn what distinguishes GELU from other activation functions we use in neural networks.
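As a quick taste of that article's topic: GELU weights its input by the standard normal CDF, x · Φ(x), so unlike ReLU it is smooth and lets small negative values through. A minimal sketch, using only the standard library, of the exact form alongside the widely used tanh approximation (the 0.044715 constant comes from that approximation):

```python
import math

def gelu(x):
    # Exact GELU: x * Phi(x), where Phi is the standard normal CDF
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def gelu_tanh(x):
    # Common tanh-based approximation of GELU
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3)))

def relu(x):
    # ReLU for comparison: hard zero for all negative inputs
    return max(0.0, x)
```

For example, gelu(-1.0) is a small negative number while relu(-1.0) is exactly zero, which is one of the distinctions the article explores.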
Thomas Carr is a data scientist with 7 years of experience in machine learning and artificial intelligence. He holds a BSc from University College Cork, an MSc from the University of Edinburgh, and a PhD from Aston University. His PhD work focused on deep reinforcement learning and domain adaptation in that context, further exploring generalisation and extrapolation arising from domain adaptation. He has also worked across a range of other modelling problems, including tree-based methods, deep visual recommender systems, and large language models.
Here's what I've written (so far):
Baeldung on Computer Science
- Deep Learning (4)
- Machine Learning (3)
- Artificial Intelligence (1)
- Algorithms (1)
Explore non-linearities – a key component of neural networks.
Explore many practical and theoretical considerations when dealing with AI ethics.
Explore a deterministic policy that maps each state to a single action and a stochastic policy that maps each state to a probability distribution over actions.
Learn the difference between episodes and epochs.
Explore the Slime Mould Algorithm.
Learn about maximum likelihood estimation.