Grant: RFBR-Sirius
Grant period: 2019–2021
Project instructor: A. Frolov
Description: Over the last decade, neural networks have made significant progress on many machine learning problems. Despite this success, there is still no comprehensive understanding of the internal organization of deep neural networks (DNNs), and they are often likened to mysterious “black boxes”. A major step toward such an understanding is the information-theoretic approach proposed by Tishby and Zaslavsky in 2015, which describes a neural network as a Markov chain of successive representations of the input and studies the evolution of the mutual information between them. In this view, the training of a neural network splits into a fitting phase and a compression phase. However, the theoretically predicted evolution of mutual information in the compression phase often does not agree with experimental data. In the proposed project, we consider a possible way to describe this phenomenon: we examine various approximations of mutual information and add additive Gaussian noise to the neurons of each layer so that the conditions for applying information-theoretic inequalities hold. Under this model, the evolution of mutual information in a neural network can be viewed as information transmission over a noisy channel. In addition, we will develop methods for reducing the number of active neurons in each layer based on the analysis of changes in mutual information between layers, and apply them to constructing neural networks for clustering and multi-class classification problems.
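To make the noisy-channel view concrete, the following minimal sketch (an illustrative assumption, not the project's actual code) treats a single hidden layer with additive Gaussian noise as a channel from the input X to the representation T and estimates I(X; T) with a simple binning (plug-in) estimator. The toy network, the noise level sigma, and the bin count n_bins are all illustrative choices.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(0)

def entropy(symbols):
    """Empirical (plug-in) entropy in bits of a list of hashable symbols."""
    counts = np.array(list(Counter(symbols).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def mutual_information(xs, ts):
    """Plug-in estimate I(X;T) = H(X) + H(T) - H(X,T)."""
    return entropy(xs) + entropy(ts) - entropy(list(zip(xs, ts)))

# Toy input: n samples drawn uniformly from {-1,+1}^4, so H(X) <= 4 bits.
n, d_in, d_hid = 4000, 4, 3
X = rng.choice([-1.0, 1.0], size=(n, d_in))

# One hidden layer with additive Gaussian noise: T = tanh(XW) + N(0, sigma^2).
sigma = 0.2
W = rng.standard_normal((d_in, d_hid)) / np.sqrt(d_in)
T = np.tanh(X @ W) + sigma * rng.standard_normal((n, d_hid))

# Discretize the continuous activations into equal-width bins so that the
# empirical joint distribution of (X, T) can be tabulated.
n_bins = 10
edges = np.linspace(T.min(), T.max(), n_bins + 1)
T_ids = np.digitize(T, edges)

x_sym = [tuple(row) for row in X]
t_sym = [tuple(row) for row in T_ids]
print(f"I(X;T) ~ {mutual_information(x_sym, t_sym):.2f} bits (H(X) <= {d_in} bits)")
```

With small sigma the estimate approaches H(X) (here at most 4 bits), while larger noise compresses the representation and lowers I(X; T). Without the noise, I(X; T) between continuous variables would be infinite for a deterministic layer, which is precisely why additive Gaussian noise is introduced before applying information-theoretic inequalities.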
2020
Master classes:
Summer school “Modern methods of information theory, optimization and control”:
We were responsible for the Information theory module (37 students).
Instructors: Frolov A.A., Kabatyansky G.A., Ivanov F.I., Andreev K.V., Marshakov E.A., Kruglik S.A.
Lectures:
Student projects: