
Deep Learning

Deep learning is a branch of machine learning that uses multi-layer neural networks. Learn about various techniques for training, optimizing, and applying these networks.

  • Neural Networks (33)
  • Training (9)
  • Natural Language Processing (8)
  • Generative Adversarial Networks (4)
  • Reinforcement Learning (4)
  • SVM (3)
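Before diving into the articles below, here is a minimal sketch of what "multi-layer" means in practice: a tiny two-layer fully connected network implemented with plain NumPy. The layer sizes, weight initialization, and ReLU activation are illustrative choices, not taken from any specific article.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Element-wise rectified linear unit
    return np.maximum(0.0, x)

# Hypothetical 4 -> 8 -> 3 network: input dim 4, one hidden layer of 8, 3 outputs
W1 = rng.normal(scale=0.1, size=(4, 8))
b1 = np.zeros(8)
W2 = rng.normal(scale=0.1, size=(8, 3))
b2 = np.zeros(3)

def forward(x):
    h = relu(x @ W1 + b1)  # hidden layer: affine transform + nonlinearity
    return h @ W2 + b2     # output layer: raw logits

x = rng.normal(size=(2, 4))  # a batch of 2 input vectors
logits = forward(x)
print(logits.shape)  # (2, 3): one 3-dimensional output per input
```

Stacking more such layers (and training the weights with backpropagation, covered in several articles below) is what makes a network "deep".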

>> What is End-to-End Deep Learning?

>> How do Siamese Networks Work in Image Recognition?

>> Deep Neural Networks: Padding

>> Single Shot Detectors (SSDs)

>> What Is Maxout in a Neural Network?

>> Recurrent Neural Networks

>> Differences Between Luong Attention and Bahdanau Attention

>> What are Embedding Layers in Neural Networks?

>> Why Use a Surrogate Loss

>> Generative Adversarial Networks: Discriminator’s Loss and Generator’s Loss

>> What Does Backbone Mean in Neural Networks?

>> What Are Channels in Convolutional Networks?

>> Differences Between Bias and Error

>> Instance Segmentation Vs. Semantic Segmentation

>> Online Learning Vs. Offline Learning

>> Data Augmentation

>> Random Sample Consensus Explained

>> What Does Pre-training a Neural Network Mean?

>> Neural Networks: What Is Weight Decay Loss?

>> Neural Networks: Difference Between Conv and FC Layers

>> What Exactly is an N-Gram?

>> Multi-layer Perceptron Vs. Deep Neural Network

>> Model-free vs. Model-based Reinforcement Learning

>> 0-1 Loss Function Explained

>> Differences Between Backpropagation and Feedforward Networks

>> Cross-Validation: K-Fold vs. Leave-One-Out

>> Off-policy vs. On-policy Reinforcement Learning

>> Bias Update in Neural Network Backpropagation

>> Comparing Naïve Bayes and SVM for Text Classification

>> Recurrent vs. Recursive Neural Networks in Natural Language Processing

>> What Are “Bottlenecks” in Neural Networks?

>> Convolutional Neural Network vs. Regular Neural Network

>> Mean Average Precision in Object Detection

>> Hidden Layers in a Neural Network

>> Real-Life Examples of Supervised Learning and Unsupervised Learning

>> Real-World Uses for Genetic Algorithms

>> The Reparameterization Trick in Variational Autoencoders

>> What Is Inductive Bias in Machine Learning?

>> Latent Space in Deep Learning

>> Autoencoders Explained

>> Activation Functions: Sigmoid vs Tanh

>> An Introduction to Contrastive Learning

>> Intuition Behind Kernels in Machine Learning

>> Algorithms for Image Comparison

>> Image Processing: Occlusions

>> Applications of Generative Models

>> Calculate the Output Size of a Convolutional Layer

>> An Introduction to Generative Adversarial Networks

>> Linearly Separable Data in Neural Networks

>> The Effects of The Depth and Number of Trees in a Random Forest

>> Using GANs for Data Augmentation

>> Relation Between Learning Rate and Batch Size

>> Word2vec Word Embedding Operations: Add, Concatenate or Average Word Vectors?

>> Outlier Detection and Handling

>> Feature Selection and Reduction for Text Classification

>> Why Mini-Batch Size Is Better Than One Single “Batch” With All Training Data

>> How to Create a Smart Chatbot?

>> How to Calculate Receptive Field Size in CNN

>> k-Nearest Neighbors and High Dimensional Data

>> State Machines: Components, Representations, Applications


>> Using a Hard Margin vs. Soft Margin in SVM

>> Value Iteration vs. Policy Iteration in Reinforcement Learning

>> How To Convert a Text Sequence to a Vector

>> Instance vs Batch Normalization

>> Trade-offs Between Accuracy and the Number of Support Vectors in SVMs

>> Open Source Neural Network Libraries

>> Semantic Similarity of Two Phrases

>> Word Embeddings: CBOW vs Skip-Gram

>> Ugly Duckling Theorem

>> Encoder-Decoder Models for Natural Language Processing

>> Solving the K-Armed Bandit Problem

>> Epoch in Neural Networks

>> Random Initialization of Weights in a Neural Network

>> Batch Normalization in Convolutional Neural Networks

>> Advantages and Disadvantages of Neural Networks Against SVMs

>> Neural Network Architecture: Criteria for Choosing the Number and Size of Hidden Layers

>> Training Data for Sentiment Analysis

>> F-1 Score for Multi-Class Classification

>> What is a Policy in Reinforcement Learning?

>> What is the Difference Between Gradient Descent and Gradient Ascent?

>> Normalizing Inputs for an Artificial Neural Network

>> Introduction to Convolutional Neural Networks

>> Bias in Neural Networks

>> Understanding Dimensions in CNNs
