Learn about three techniques for improving the performance of ML models: boosting, bagging, and stacking.
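For a quick sense of how these three ensemble techniques differ in practice, here's a minimal sketch using scikit-learn; the synthetic dataset, base estimators, and hyperparameters are illustrative assumptions rather than examples from the article itself.

```python
# Minimal sketch of bagging, boosting, and stacking with scikit-learn.
# Dataset, estimator choices, and hyperparameters are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.ensemble import (
    BaggingClassifier,
    AdaBoostClassifier,
    StackingClassifier,
)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Toy binary classification problem.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

# Bagging: fit many estimators on bootstrap samples and aggregate their votes.
bagging = BaggingClassifier(n_estimators=50, random_state=42)

# Boosting: fit estimators sequentially, focusing on previously misclassified examples.
boosting = AdaBoostClassifier(n_estimators=50, random_state=42)

# Stacking: combine base estimators with a meta-learner trained on their predictions.
stacking = StackingClassifier(
    estimators=[("bag", bagging), ("boost", boosting)],
    final_estimator=LogisticRegression(),
)

for name, model in [("bagging", bagging), ("boosting", boosting), ("stacking", stacking)]:
    score = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {score:.3f}")
```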
Baeldung Author
Emmanuella Budu
Emmanuella is a research student in Machine Learning and Healthcare, working on generating and assessing synthetic data.
Here's what I've written (so far):
Baeldung on Computer Science
Differences Between Servers and Desktops
Filed under Networking
Learn more about the differences between servers and desktops.
What’s a Non-trainable Parameter?
Filed under Deep Learning
Learn how to work with non-trainable parameters.
What Is a Peer in a Computer Network?
Filed under Networking
Learn about peers in a computer network.
Deprecated vs. Depreciated vs. Obsolete in Software Development
Filed under Programming
Learn the difference between Deprecated, Depreciated, and Obsolete.
What Is the Difference Between Antivirus and Firewalls?
Filed under Security
A quick and practical comparison between antiviruses and firewalls.
Application Server vs. Web Server
Filed under Networking
Learn more about the difference between Application Servers and Web Servers.
Differences Between CPU and GPU
Filed under Core Concepts
Review the differences between a Central Processing Unit (CPU) and a Graphics Processing Unit (GPU).
Differences Between Bias and Error
Filed under Deep Learning, Machine Learning
Learn the differences between bias and error.
Online Learning vs. Offline Learning
Filed under Deep Learning, Machine Learning
Learn the difference between online learning and offline learning.
Random Forest vs. Extremely Randomized Trees
Filed under Machine Learning
Learn the difference between Random Forest and Extremely Randomized Trees.
What Does Pre-training a Neural Network Mean?
Filed under Deep Learning, Machine Learning
Learn how to use pre-trained neural networks.