Mini-Batch: A mini-batch is a small subset of the training dataset used for a single parameter update (iteration) during the training of a machine learning model, as an alternative to processing the full dataset in one batch or a single example at a time.
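A minimal sketch of mini-batch iteration, assuming a NumPy array dataset; the function name `iterate_minibatches` and variables `X`, `y`, and `batch_size` are illustrative, not from any particular library:

```python
import numpy as np

def iterate_minibatches(X, y, batch_size, rng):
    """Yield shuffled (X_batch, y_batch) pairs covering the dataset once."""
    indices = rng.permutation(len(X))            # reshuffle once per epoch
    for start in range(0, len(X), batch_size):
        batch_idx = indices[start:start + batch_size]
        yield X[batch_idx], y[batch_idx]

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))                  # 1000 samples, 20 features
y = rng.integers(0, 2, size=1000)
for X_batch, y_batch in iterate_minibatches(X, y, batch_size=32, rng=rng):
    pass  # one parameter update per mini-batch would happen here
```

Shuffling before slicing ensures each epoch visits the examples in a fresh order, which keeps the per-batch gradient estimates closer to unbiased samples of the full-dataset gradient.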
Stochastic Gradient Descent: Stochastic Gradient Descent (SGD) is an optimization algorithm that updates the model parameters using the gradient computed on a single data point or a mini-batch rather than the entire dataset, trading a noisier gradient estimate for much cheaper and more frequent updates.
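A minimal sketch of one mini-batch SGD step for linear regression with mean-squared error; the names `sgd_step`, `w`, `b`, and `lr` are illustrative assumptions, not a reference implementation:

```python
import numpy as np

def sgd_step(w, b, X_batch, y_batch, lr=0.01):
    """One parameter update from the gradient on a single mini-batch."""
    preds = X_batch @ w + b
    error = preds - y_batch
    grad_w = 2 * X_batch.T @ error / len(X_batch)  # dMSE/dw on this batch only
    grad_b = 2 * error.mean()                      # dMSE/db on this batch only
    return w - lr * grad_w, b - lr * grad_b        # step against the gradient

rng = np.random.default_rng(0)
X, y = rng.normal(size=(32, 5)), rng.normal(size=32)
w, b = np.zeros(5), 0.0
w, b = sgd_step(w, b, X, y)  # one noisy but cheap update
```

Each call uses only the current batch, so an epoch over the data produces many small updates instead of one expensive full-dataset step.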
Batch Normalization: Batch Normalization is a technique used in deep learning to improve the stability and performance of neural networks by normalizing the inputs to each layer with the mean and standard deviation of the current batch, then rescaling the result with learned scale and shift parameters.
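A minimal sketch of the batch-normalization forward pass at training time; `gamma` and `beta` are the usual learned scale and shift parameters, and `eps` is a small numerical-stability constant (all names here are illustrative):

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """Normalize each feature using the current batch's statistics."""
    mean = x.mean(axis=0)                  # per-feature mean over the batch
    var = x.var(axis=0)                    # per-feature variance over the batch
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta            # learned rescale and shift

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=3.0, size=(32, 4))  # batch of 32, 4 features
out = batch_norm_forward(x, gamma=np.ones(4), beta=np.zeros(4))
print(out.mean(axis=0).round(3), out.std(axis=0).round(3))  # ~0 and ~1
```

At inference time, implementations typically substitute running averages of the mean and variance collected during training, since a single test example has no meaningful batch statistics.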