Convolutional Neural Networks (CNNs) form the backbone of the current AI revolution and are used in a multitude of classification and regression problems. This lecture overviews the transition from multilayer perceptrons to deep architectures. The following topics are presented in detail: tensors and mathematical formulations, convolutional layers, fully connected layers, and pooling. Neural image features and their relation to human vision are discussed. Various types of convolutions are presented: atrous (dilated) convolution, 1×1 convolution, and separable convolutions. Training of convolutional NNs is detailed, including initialization, data augmentation, batch normalization, dropout, and regularization. Various CNN architectures are presented: AlexNet/ZFNet, Network-in-Network, Inception, ResNets, DenseNet, FractalNet, Squeeze-and-Excitation networks, and Siamese networks. Deployment on embedded systems and lightweight deep learning are also covered.
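To make the core building blocks concrete, the following is a minimal NumPy sketch (not the lecture's own code) of the two operations named above: a single-channel convolutional layer, implemented as the valid cross-correlation that CNN layers actually compute, and non-overlapping max pooling. The input, kernel, and sizes are illustrative choices.

```python
import numpy as np

def conv2d(x, w):
    """Valid 2-D cross-correlation (the 'convolution' used in CNN layers)."""
    H, W = x.shape
    kH, kW = w.shape
    out = np.empty((H - kH + 1, W - kW + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # slide the kernel over the input and take a weighted sum
            out[i, j] = np.sum(x[i:i + kH, j:j + kW] * w)
    return out

def max_pool2d(x, size=2):
    """Non-overlapping max pooling with a size-by-size window."""
    H, W = x.shape
    x = x[:H - H % size, :W - W % size]          # crop to a multiple of size
    x = x.reshape(H // size, size, W // size, size)
    return x.max(axis=(1, 3))                    # max over each window

x = np.arange(16, dtype=float).reshape(4, 4)     # toy 4x4 input "image"
w = np.ones((3, 3)) / 9.0                        # 3x3 averaging kernel
y = conv2d(x, w)                                 # 2x2 feature map
p = max_pool2d(x)                                # 2x2 pooled map
print(y.shape, p.shape)                          # -> (2, 2) (2, 2)
```

A real layer adds a bias, a nonlinearity, and sums over input channels, but the sliding-window structure is exactly this.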
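The appeal of separable convolutions for lightweight deep learning can be seen from a parameter count. The arithmetic below (an illustrative sketch; the channel sizes are arbitrary) compares a standard k×k layer with a depthwise-separable one, which factors it into a per-channel k×k depthwise step followed by a 1×1 pointwise step.

```python
def standard_params(k, c_in, c_out):
    # one k x k filter per (input channel, output channel) pair
    return k * k * c_in * c_out

def separable_params(k, c_in, c_out):
    # depthwise: one k x k filter per input channel
    # pointwise: a 1x1 convolution mixing c_in channels into c_out
    return k * k * c_in + c_in * c_out

k, c_in, c_out = 3, 64, 128
s = standard_params(k, c_in, c_out)    # 73728
d = separable_params(k, c_in, c_out)   # 8768
print(s, d, round(s / d, 1))           # roughly 8.4x fewer parameters
```

This factorization is the basis of lightweight architectures deployed on embedded systems.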