Publication

A Hybrid Gradient Method to Designing Bayesian Experiments for Implicit Models...

by Jiaxin Zhang, Sirui Bi, Guannan Zhang
Publication Type
Conference Paper
Journal Name
Conference on Neural Information Processing Systems
Book Title
Machine Learning and the Physical Sciences: Workshop at the 34th Conference on Neural Information Processing Systems (NeurIPS)
Publication Date
Page Numbers
1 to 7
Publisher Location
Canada
Conference Name
Conference on Neural Information Processing Systems (NeurIPS)
Conference Location
Virtual, Tennessee, United States of America
Conference Sponsor
Conference on Neural Information Processing Systems
Conference Date
-

Bayesian experimental design (BED) aims to design an experiment that maximizes the information gathered from the collected data. The optimal design is usually obtained by maximizing the mutual information (MI) between the data and the model parameters. When an analytical expression for the MI is unavailable, e.g., for implicit models with intractable data distributions, a neural network-based lower bound of the MI was recently proposed, and a gradient ascent method was used to maximize that lower bound [1]. However, the approach in [1] requires a pathwise (reparameterized) sampling path to compute the gradient of the MI lower bound with respect to the design variables, and such a sampling path is usually inaccessible for implicit models. In this work, we propose a hybrid gradient approach that leverages recent advances in variational MI estimators and evolution strategies (ES) combined with black-box stochastic gradient ascent (SGA) to maximize the MI lower bound. This allows the design process to be carried out through a unified, scalable procedure for implicit models without sampling-path gradients. Several experiments demonstrate that our approach significantly improves the scalability of BED for implicit models in high-dimensional design spaces.
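To illustrate the black-box optimization ingredient described above, the sketch below shows how an evolution-strategies (ES) gradient estimate can drive stochastic gradient ascent on an objective whose internals are not differentiable. This is a minimal, hypothetical sketch: the toy quadratic objective stands in for the neural MI lower bound, and the function and parameter names (`es_gradient`, `maximize_black_box`, `sigma`, `n_pairs`) are illustrative, not from the paper.

```python
import numpy as np

def es_gradient(f, d, sigma=0.1, n_pairs=16, rng=None):
    """Antithetic evolution-strategies estimate of grad f(d).

    Only evaluations of f are needed, so no sampling-path
    (reparameterization) gradient through f is required.
    """
    rng = np.random.default_rng() if rng is None else rng
    grad = np.zeros_like(d)
    for _ in range(n_pairs):
        eps = rng.standard_normal(d.shape)
        # Central-difference-style smoothed gradient along direction eps.
        grad += (f(d + sigma * eps) - f(d - sigma * eps)) / (2.0 * sigma) * eps
    return grad / n_pairs

def maximize_black_box(f, d0, lr=0.05, steps=200, seed=0):
    """Black-box SGA: ascend f using ES gradient estimates."""
    rng = np.random.default_rng(seed)
    d = np.array(d0, dtype=float)
    for _ in range(steps):
        d += lr * es_gradient(f, d, rng=rng)
    return d

# Toy stand-in objective (concave quadratic, maximum at d = [1, -2]),
# playing the role of the MI lower bound over a 2-D design.
objective = lambda d: -np.sum((d - np.array([1.0, -2.0])) ** 2)
d_star = maximize_black_box(objective, [0.0, 0.0])
```

In the paper's setting, `objective` would be the variational MI lower bound evaluated by running the implicit model at design `d`; the ES estimator only queries the objective, which is what makes the procedure applicable when pathwise gradients are unavailable.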