Abstract
This paper addresses the identification of age and gender from facial images using a deep learning model, namely InceptionV3. Following transfer learning principles, we retrained the final layer of an InceptionV3 network pre-trained on ImageNet classification to better distinguish age and gender in images from the UTKFace dataset, which supplies the diverse collection of facial images labeled with age and gender needed for training and testing. Preprocessing includes resizing and normalizing the images, and data augmentation comprising rotations, width and height shifts, and horizontal flips increases the variability of the training samples and improves model robustness. The age labels were further quantized evenly into 20 groups so that age estimation could be framed as a structured classification task. Age and gender classification were accomplished jointly with a single multi-output CNN: gender classification uses binary cross-entropy and age estimation uses categorical cross-entropy. Training was done with the Adam optimizer, with the learning rate adjusted dynamically to aid convergence.
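The following is a minimal sketch of the model described above, assuming a Keras/TensorFlow implementation; the input size, dense layer width, and function name `build_model` are illustrative choices, not taken from the paper. It shows a shared InceptionV3 trunk pre-trained on ImageNet with two output heads, compiled with the two losses and the Adam optimizer mentioned in the abstract.

```python
# Minimal sketch (assumed Keras/TensorFlow implementation; layer sizes are illustrative).
import tensorflow as tf
from tensorflow.keras import layers, Model
from tensorflow.keras.applications import InceptionV3

def build_model(num_age_bins=20):
    # InceptionV3 backbone pre-trained on ImageNet, used as a frozen feature extractor.
    base = InceptionV3(weights="imagenet", include_top=False, pooling="avg",
                       input_shape=(224, 224, 3))
    base.trainable = False  # transfer learning: only the new top layers are retrained

    x = layers.Dense(256, activation="relu")(base.output)
    # Two outputs from one shared trunk: gender (binary) and age bin (20-way).
    gender_out = layers.Dense(1, activation="sigmoid", name="gender")(x)
    age_out = layers.Dense(num_age_bins, activation="softmax", name="age")(x)

    model = Model(inputs=base.input, outputs=[gender_out, age_out])
    model.compile(
        optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
        loss={"gender": "binary_crossentropy", "age": "categorical_crossentropy"},
        metrics={"gender": "accuracy", "age": "accuracy"},
    )
    return model
```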