## Entropy-based closure for probabilistic learning on manifolds

### Abstract

In a recent paper, the authors proposed a general methodology for probabilistic learning on manifolds. The method was used to generate numerical samples that are statistically consistent with an existing dataset, construed as a realization of a non-Gaussian random vector. The manifold structure is learned using diffusion maps, and statistical sample generation is accomplished with a projected Itô stochastic differential equation. This probabilistic learning approach has been extended to polynomial chaos representations of databases on manifolds and to probabilistic nonconvex constrained optimization with a fixed budget of function evaluations. The methodology introduces an isotropic-diffusion kernel with a hyperparameter that is currently chosen more or less arbitrarily. In this paper, we propose a selection criterion for identifying an optimal value of this hyperparameter, based on an entropy argument. The result is a comprehensive, closed, probabilistic model for characterizing datasets with hidden constraints. Applications are presented for several databases.
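To make the bandwidth-selection idea concrete, the sketch below builds an isotropic Gaussian diffusion kernel on a toy dataset and scans a grid of bandwidth values, scoring each with a leave-one-out log-likelihood of the induced kernel density estimate. This is a minimal, illustrative stand-in criterion for data-driven bandwidth selection, not the paper's exact entropy argument; the kernel scaling, grid, and dataset are all assumptions for demonstration.

```python
import numpy as np

def loo_log_likelihood(X, eps):
    """Leave-one-out log-likelihood of a Gaussian kernel density estimate
    whose bandwidth parameter eps enters as exp(-||x_i - x_j||^2 / (4*eps)),
    i.e. kernel variance 2*eps per coordinate (an illustrative choice)."""
    n, d = X.shape
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-d2 / (4.0 * eps))
    np.fill_diagonal(K, 0.0)              # leave-one-out: drop the self term
    norm = (n - 1) * (4.0 * np.pi * eps) ** (d / 2)
    p = K.sum(axis=1) / norm              # LOO density estimate at each sample
    return np.log(p).mean()

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2))         # toy dataset: 200 samples in R^2
eps_grid = np.geomspace(1e-3, 10.0, 40)   # candidate bandwidth values
scores = [loo_log_likelihood(X, e) for e in eps_grid]
eps_opt = eps_grid[int(np.argmax(scores))]
print(f"selected bandwidth: {eps_opt:.4f}")
```

Too small a bandwidth leaves each sample nearly isolated and too large a one washes out the manifold structure, so the score is maximized at an interior value of the grid; any data-driven criterion of this shape removes the arbitrariness the abstract points to.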
