Learn what PAC (probably approximately correct) means.
Baeldung Author
Akbar Karimi
I'm a PhD candidate at the University of Parma, Italy. I've been doing research in machine learning since 2014, focusing mainly on NLP models and applications. Besides doing research, I like learning languages.
Here's what I've written (so far):
Baeldung on Computer Science
- Machine Learning (9)
- Deep Learning (4)
- Algorithms (3)
- Core Concepts (2)
- Searching (1)
- Math and Logic (1)
- Graph Traversal (1)
- Artificial Intelligence (1)
Differences Between Hinge Loss and Logistic Loss
Filed under Machine Learning
Explore the hinge and logistic loss functions.
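For a quick side-by-side of the two functions, here is a minimal NumPy sketch (not code from the article) for a label y ∈ {-1, +1} and a raw model score; the function names are only illustrative.

```python
import numpy as np

def hinge_loss(y, score):
    """Hinge loss for labels in {-1, +1} and raw (unsquashed) model scores."""
    return np.maximum(0.0, 1.0 - y * score)

def logistic_loss(y, score):
    """Logistic (log) loss for labels in {-1, +1} and raw model scores."""
    return np.log1p(np.exp(-y * score))

# A point classified correctly and well past the margin:
print(hinge_loss(1, 2.5))     # 0.0 -- hinge stops penalizing beyond the margin
print(logistic_loss(1, 2.5))  # small but non-zero -- logistic never reaches zero
```

The example highlights the key contrast: hinge loss is exactly zero outside the margin, while logistic loss only approaches zero asymptotically.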
Hidden Markov Models vs. Conditional Random Fields
Filed under Artificial Intelligence
Explore the differences between Hidden Markov Models and Conditional Random Fields.
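As a rough reminder of where the two models differ, the standard textbook forms (not quoted from the article) are: an HMM is generative and factorizes the joint probability of observations x and labels y, while a linear-chain CRF is discriminative and models the conditional probability directly, with P(y_1 | y_0) read as the initial state distribution.

```latex
\text{HMM:}\quad P(x, y) = \prod_{t=1}^{T} P(y_t \mid y_{t-1})\, P(x_t \mid y_t)
\qquad
\text{CRF:}\quad P(y \mid x) = \frac{1}{Z(x)} \exp\Big( \sum_{t=1}^{T} \sum_{k} w_k\, f_k(y_{t-1}, y_t, x, t) \Big)
```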
Recurrent vs. Recursive Neural Networks in Natural Language Processing
Filed under Deep Learning, Machine Learning
Explore the difference between recurrent and recursive neural networks in natural language processing.
What Is Inductive Bias in Machine Learning?
Filed under Deep Learning, Machine Learning
Learn about the two types of inductive biases in traditional machine learning and deep learning.
Bayesian Networks
Filed under Machine Learning
Learn about Bayesian networks (BNs).
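The defining property is worth recalling here: a Bayesian network over a directed acyclic graph factorizes the joint distribution into one conditional per node given its parents (the standard form, not taken from the article):

```latex
P(X_1, \dots, X_n) = \prod_{i=1}^{n} P\big(X_i \mid \mathrm{Pa}(X_i)\big)
```

where Pa(X_i) denotes the parents of X_i in the graph.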
How to Get Vector for A Sentence From Word2vec of Tokens
Filed under Machine Learning
Explore vector representations of sentences using token representations.
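A common baseline for this is simply averaging the token vectors. The sketch below assumes a dict-like `word_vectors` mapping tokens to NumPy arrays as a stand-in for a trained word2vec model; both the name and the toy vectors are hypothetical.

```python
import numpy as np

def sentence_vector(tokens, word_vectors, dim=300):
    """Average the word2vec vectors of the in-vocabulary tokens."""
    vectors = [word_vectors[t] for t in tokens if t in word_vectors]
    if not vectors:
        return np.zeros(dim)  # all tokens out of vocabulary: fall back to zeros
    return np.mean(vectors, axis=0)

# Toy 3-dimensional "vocabulary" for illustration only
word_vectors = {"deep": np.array([0.1, 0.2, 0.3]),
                "learning": np.array([0.0, 0.5, 0.1])}
print(sentence_vector(["deep", "learning", "rocks"], word_vectors, dim=3))
```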
Fastest Algorithm to Find Prime Numbers
Filed under Math and Logic
Explore some of the fastest algorithms that we can use to generate prime numbers up to a given number.
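As one concrete reference point, the sieve of Eratosthenes is among the classic fast methods for this task; here is a minimal Python sketch of it (not the article's code).

```python
def primes_up_to(n):
    """Return all primes <= n using the sieve of Eratosthenes."""
    if n < 2:
        return []
    is_prime = [True] * (n + 1)
    is_prime[0] = is_prime[1] = False
    for p in range(2, int(n ** 0.5) + 1):
        if is_prime[p]:
            # Every multiple of p from p*p upward is composite
            for multiple in range(p * p, n + 1, p):
                is_prime[multiple] = False
    return [i for i, prime in enumerate(is_prime) if prime]

print(primes_up_to(30))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```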
Beam Search Algorithm
Filed under Graph Traversal
Learn about the Beam Search algorithm.
Hill Climbing Search vs. Best First Search
Filed under Searching
Explore the Hill Climbing and Best First Search (BeFS) algorithms and compare their characteristics.
An Overview of Evolutionary Algorithms
Filed under Algorithms
Learn about evolutionary algorithms, which have performed well compared to other techniques in artificial intelligence.
NLP’s word2vec: Negative Sampling Explained
Filed under Machine Learning
Explore the Skip-gram model for training word vectors and learn how negative sampling is used for this purpose.
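For reference, the quantity maximized for one (word, context) pair under skip-gram with negative sampling is usually written as follows (the standard form from the word2vec literature, not quoted from the article), with k negative contexts drawn from a noise distribution P_n:

```latex
\log \sigma\big({v'_c}^{\top} v_w\big) \;+\; \sum_{i=1}^{k} \mathbb{E}_{c_i \sim P_n}\Big[ \log \sigma\big(-{v'_{c_i}}^{\top} v_w\big) \Big]
```

where v_w is the input vector of the center word and v'_c the output vector of a context word.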
Using a Hard Margin vs. Soft Margin in SVM
Filed under Deep Learning, Machine Learning
Learn about the difference between using a hard margin and a soft margin in SVM.
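To make the contrast concrete, the soft-margin primal adds slack variables ξ_i and a penalty C to the hard-margin objective (standard textbook form, not the article's notation); forcing every ξ_i = 0 recovers the hard margin, and a very large C approximates it.

```latex
\min_{w,\, b,\, \xi}\ \frac{1}{2}\lVert w \rVert^2 + C \sum_{i=1}^{n} \xi_i
\quad \text{s.t.}\quad y_i\big(w^{\top} x_i + b\big) \ge 1 - \xi_i,\ \ \xi_i \ge 0
```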
Instance vs Batch Normalization
Filed under Deep Learning, Machine Learning
Learn the difference between instance and batch normalization.
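The practical difference comes down to which axes the statistics are computed over. The NumPy sketch below (learnable scale and shift omitted) assumes feature maps of shape (N, C, H, W); it's an illustration, not code from the article.

```python
import numpy as np

def batch_norm(x, eps=1e-5):
    """One mean/variance per channel, computed over the batch and spatial axes."""
    mean = x.mean(axis=(0, 2, 3), keepdims=True)
    var = x.var(axis=(0, 2, 3), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

def instance_norm(x, eps=1e-5):
    """One mean/variance per sample and channel, over the spatial axes only."""
    mean = x.mean(axis=(2, 3), keepdims=True)
    var = x.var(axis=(2, 3), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

x = np.random.randn(8, 3, 16, 16)  # toy batch: 8 samples, 3 channels, 16x16 maps
print(batch_norm(x).shape, instance_norm(x).shape)
```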
Difference between Big-O and Little-o Notations
Filed under Algorithms, Core Concepts
Understand how the big-O and little-o notations differ and what it means to be asymptotically tight.
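As a quick reminder, the standard definitions (not quoted from the article) differ only in the quantifier on the constant:

```latex
f(n) = O(g(n)) \iff \exists\, c > 0\ \exists\, n_0\ \forall n \ge n_0 :\ |f(n)| \le c\,|g(n)| \\
f(n) = o(g(n)) \iff \forall\, c > 0\ \exists\, n_0\ \forall n \ge n_0 :\ |f(n)| \le c\,|g(n)|
```

A big-O bound only has to hold for some constant, while a little-o bound must hold for every positive constant, which is why a little-o bound can never be asymptotically tight.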
What Is a Heuristic Function?
Filed under Algorithms, Core Concepts
Learn about heuristic functions, their benefits and pitfalls, and some of the examples where we can use them.
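To make "heuristic function" concrete, here is a tiny illustrative example: the Manhattan-distance heuristic commonly used for 4-connected grid pathfinding (e.g., with A*); the coordinates are made up for the demo.

```python
def manhattan_distance(node, goal):
    """Estimate of the remaining cost on a 4-connected grid.
    It never overestimates, so it is an admissible heuristic."""
    (x1, y1), (x2, y2) = node, goal
    return abs(x1 - x2) + abs(y1 - y2)

print(manhattan_distance((0, 0), (3, 4)))  # 7
```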