My Research Interests
Scalable Bayesian Computation
Bayesian methods provide a powerful paradigm for reasoning about uncertainty and making data-driven inferences in science. However, conventional techniques such as Markov chain Monte Carlo often struggle to scale to the large, complex models required in many fields. Challenges arise in tasks such as Bayesian parameter estimation and model comparison, where exploring the posterior distribution becomes intractable for high-dimensional or non-differentiable models. Even state-of-the-art methods like Hamiltonian Monte Carlo face difficulties in large-scale scientific applications, since they rely on gradient information that is often unavailable or expensive to compute. My research focuses on developing scalable Bayesian computation techniques that circumvent these challenges. I aim to design novel methods that do not require gradient information yet can still efficiently explore complex, high-dimensional posterior distributions. This will enable full Bayesian analyses for demanding problems in the physical sciences, where parameter spaces are vast and models contain non-differentiable components. By advancing scalable Bayesian inference, my work will provide scientists with more robust tools for reasoning under uncertainty and extracting insights from complex data.
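As a toy illustration of gradient-free posterior exploration, the sketch below implements a minimal univariate slice sampler with stepping-out and shrinkage (in the style of Neal, 2003). The target density, tuning width, and function names are illustrative choices for this example, not part of any specific method described above; note that only point evaluations of the log-posterior are required, never its gradient.

```python
import math
import random

def log_post(x):
    # Hypothetical 1-D target: a standard normal log-density (up to a constant).
    return -0.5 * x * x

def slice_sample(logp, x0, n_samples, w=1.0, seed=42):
    """Univariate slice sampler with stepping-out and shrinkage.

    Only point evaluations of logp are needed -- no gradients.
    """
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n_samples):
        # Draw an auxiliary level under the log-density at the current point.
        log_y = logp(x) + math.log(rng.random())
        # Step out: grow an interval of width w until it brackets the slice.
        left = x - w * rng.random()
        right = left + w
        while logp(left) > log_y:
            left -= w
        while logp(right) > log_y:
            right += w
        # Shrink: propose uniformly, narrowing the interval on each rejection.
        while True:
            x_new = left + (right - left) * rng.random()
            if logp(x_new) > log_y:
                x = x_new
                break
            if x_new < x:
                left = x_new
            else:
                right = x_new
        samples.append(x)
    return samples
```

Ensemble variants of this idea adapt the width w automatically and move a population of walkers along directions defined by the other walkers, which is what makes the approach practical for correlated, high-dimensional posteriors.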
Machine Learning Accelerated Discovery
Machine Learning Accelerated Scientific Discovery is an emerging field with enormous potential to transform the way physical scientists analyze data. By leveraging machine learning techniques, we can develop models that accurately approximate complex physical processes and simulations at a fraction of the computational cost. My research focuses on developing such methods for applications in astronomy and physics, where large datasets and costly simulations often limit the pace of discovery. My approach is to train neural network emulators that reproduce physical models with high fidelity, delivering insights from data that would otherwise require orders of magnitude more traditional computing power. By developing more efficient machine learning surrogates, I aim to greatly expand the scope and scale of scientific analysis in fields limited by computational resources. This has the potential to dramatically accelerate discovery and unlock new frontiers of understanding in physics and astronomy.
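A minimal sketch of the emulator idea is shown below, with a toy stand-in for an expensive simulator, a one-hidden-layer network trained by plain full-batch gradient descent, and hyperparameters chosen purely for illustration; a real emulator would use a deep network and a proper training framework.

```python
import numpy as np

def expensive_simulator(theta):
    # Hypothetical stand-in for a costly physical model.
    return np.sin(3.0 * theta) + 0.5 * theta

# Generate a modest training set by running the "simulator" offline.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(256, 1))
y = expensive_simulator(X)

# One-hidden-layer neural network trained by full-batch gradient descent.
W1 = rng.normal(0.0, 1.0, (1, 32)); b1 = np.zeros(32)
W2 = rng.normal(0.0, 0.1, (32, 1)); b2 = np.zeros(1)
lr = 0.05
losses = []
for step in range(3000):
    H = np.tanh(X @ W1 + b1)      # hidden activations
    pred = H @ W2 + b2            # emulator prediction
    err = pred - y
    losses.append(float(np.mean(err ** 2)))
    # Backpropagation written out by hand.
    g_pred = 2.0 * err / len(X)
    g_W2 = H.T @ g_pred; g_b2 = g_pred.sum(axis=0)
    g_H = (g_pred @ W2.T) * (1.0 - H ** 2)
    g_W1 = X.T @ g_H; g_b1 = g_H.sum(axis=0)
    W2 -= lr * g_W2; b2 -= lr * g_b2
    W1 -= lr * g_W1; b1 -= lr * g_b1
```

Once trained, evaluating the emulator costs two small matrix products, which is what makes such surrogates orders of magnitude cheaper than rerunning the underlying simulation inside an analysis loop.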
Open Source Scientific Software
In addition to developing new statistical methods, I believe in the importance of releasing efficient, well-documented open source software implementations, so that the broader scientific community can readily adopt advanced techniques in its own work. I have contributed several widely used tools in this vein. These include "zeus", my Python implementation of Ensemble Slice Sampling, which enables efficient Bayesian inference on challenging posteriors and has become a standard tool in astronomy for analyzing highly correlated models. Another is "pocoMC", a Python implementation of Preconditioned Monte Carlo, a method I developed for sampling multimodal distributions and estimating Bayes factors; it provides astronomers with an accessible means of Bayesian model comparison. Finally, "hankl" offers a fast Hankel transform implementation that greatly accelerates computations involving Bessel integrals, which are ubiquitous in astrophysics and cosmology. All three tools are now popular in the astronomy community and have been used in peer-reviewed studies.
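To illustrate the kind of computation a fast Hankel transform code accelerates, the sketch below evaluates an order-zero Hankel transform by brute-force quadrature; the test function, grid sizes, and truncation point are illustrative choices. FFTLog-style implementations such as hankl compute the transform on a logarithmic grid in O(N log N), rather than paying this O(N) quadrature cost at every output point.

```python
import math
import numpy as np

def j0(x):
    # Bessel function J0 via its integral form J0(x) = (1/pi) * int_0^pi cos(x sin t) dt,
    # evaluated with a midpoint rule (accurate enough for this illustration).
    t = (np.arange(1000) + 0.5) * np.pi / 1000
    x = np.asarray(x, dtype=float)
    return np.cos(x[..., None] * np.sin(t)).mean(axis=-1)

def hankel0(f, r, k_max=10.0, n=2000):
    # Order-0 Hankel transform F(r) = int_0^inf f(k) J0(k r) k dk,
    # truncated at k_max and evaluated by brute-force midpoint quadrature.
    dk = k_max / n
    k = (np.arange(n) + 0.5) * dk
    return float(np.sum(f(k) * j0(r * k) * k) * dk)

# Sanity check: a unit Gaussian is its own order-0 Hankel transform,
# int_0^inf exp(-k^2/2) J0(k r) k dk = exp(-r^2/2).
F = hankel0(lambda k: np.exp(-0.5 * k ** 2), 1.0)
```

The Gaussian self-transform identity above gives a convenient correctness check for any Hankel transform implementation before applying it to real power spectra or correlation functions.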