Learn how self-organizing maps (SOMs) work.
I'm a PhD candidate at the University of Parma, Italy. I've been doing research in machine learning since 2014, focusing mainly on NLP models and applications. Besides doing research, I like learning languages.
Here's what I've written (so far):
Baeldung on Computer Science
- Machine Learning (9)
- Deep Learning (4)
- Algorithms (3)
- Core Concepts (2)
- Artificial Intelligence (2)
- Searching (1)
- Math and Logic (1)
- Graph Traversal (1)
Learn what PAC (probably approximately correct) means.
Explore the hinge and logistic loss functions.
Explore the differences between Hidden Markov Models and Conditional Random Fields.
Explore the difference between recurrent and recursive neural networks in natural language processing.
Learn about the two types of inductive biases in traditional machine learning and deep learning.
Learn about Bayesian Networks (BNs).
Explore vector representations of sentences using token representations.
Explore some of the fastest algorithms that we can use to generate prime numbers up to a given number.
Learn about the Beam Search algorithm.
Explore the Hill Climbing and Best First Search (BeFS) algorithms and compare their characteristics.
Learn about evolutionary algorithms and how they compare to other techniques in artificial intelligence.
Explore the Skip-gram model for training word vectors and learn about how negative sampling is used for this purpose.
Learn about the difference between using a hard margin and a soft margin in SVMs.
Learn the difference between instance and batch normalization.
Understand how the big-O and little-o notations differ and what it means to be asymptotically tight.
Learn about heuristic functions, their benefits and pitfalls, and some of the examples where we can use them.