Call for Papers: Special Issue "Application of Entropy to Computer Vision and Medical Imaging"
Shannon entropy was originally introduced to quantify the minimum number of bits needed to encode a signal without loss of information; it represents the asymptotic limit of the compression ratio achieved by the Huffman algorithm. Moreover, Shannon entropy is linked to the amount of disorder in random signals. Since Shannon’s work, generalizations of entropy (Rényi, Havrda–Charvát) as well as various applications have emerged. In statistics, as well as in machine learning, different entropies have been used to model uncertainty in data and in parameter estimation, and can also be used to evaluate the amount of information contained in data. From entropies, one can define divergences, which serve as “distances” between probability distributions. In deep learning, these entropies are commonly used as loss functions for probabilistic neural networks.
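To make the quantities above concrete, here is a minimal, self-contained Python sketch of Shannon entropy, the Rényi generalization, and the Kullback–Leibler divergence for discrete distributions (the function names are illustrative, not from any particular library):

```python
import math

def shannon_entropy(p):
    """Shannon entropy in bits of a discrete distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (alpha > 0, alpha != 1);
    it recovers Shannon entropy in the limit alpha -> 1."""
    return math.log2(sum(pi ** alpha for pi in p)) / (1 - alpha)

def kl_divergence(p, q):
    """Kullback–Leibler divergence D(p || q) in bits, a standard
    'distance' between probability distributions (not symmetric)."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# A fair coin is maximally disordered: its entropy is exactly 1 bit.
print(shannon_entropy([0.5, 0.5]))        # → 1.0
# The divergence is zero only when the two distributions coincide.
print(kl_divergence([0.5, 0.5], [0.9, 0.1]))
```

For a uniform distribution all Rényi orders agree with the Shannon value, which is one quick sanity check on such implementations.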
This Special Issue is devoted to applications of probabilistic neural networks for computer vision and medical image analysis.
This Special Issue will accept unpublished original papers and comprehensive reviews focused on (but not restricted to) the following research areas:
- Modeling new loss functions in neural networks;
- Use of entropies and information measures for uncertainty quantification in neural networks;
- Choice of relevant entropy depending on the data and the task;
- Influence of activation functions on the choice of entropy;
- Axioms behind the choice of entropy;
- Entropy measures for the evaluation of image quality;
- Applications to medical image analysis and computer vision.
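As background for the loss-function topics listed above, a minimal sketch of cross-entropy used as a classification loss is shown below (the function name and the clipping constant `eps` are illustrative assumptions, not part of the call):

```python
import math

def cross_entropy_loss(targets, predictions, eps=1e-12):
    """Cross-entropy H(p, q) = -sum p log q between target probabilities
    (e.g. a one-hot label) and predicted class probabilities.
    Predictions are clipped at eps to avoid log(0)."""
    return -sum(t * math.log(max(q, eps))
                for t, q in zip(targets, predictions))

# Three-class example: the true class is the second one.
loss = cross_entropy_loss([0, 1, 0], [0.1, 0.7, 0.2])
print(loss)  # equals -ln(0.7), about 0.357
```

The loss vanishes exactly when the predicted distribution matches the target, which is why minimizing cross-entropy drives a probabilistic network toward the labels.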
(c) GdR 720 ISIS - CNRS - 2011-2022.