
Learning Bayesian Statistics

Alexandre Andorra

Available episodes

5 of 159
  • #136 Bayesian Inference at Scale: Unveiling INLA, with Haavard Rue & Janet van Niekerk
    Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
    Intro to Bayes Course (first 2 lessons free)
    Advanced Regression Course (first 2 lessons free)
    Our theme music is « Good Bayesian », by Baba Brinkman (feat. MC Lars and Mega Ran). Check out his awesome work!
    Visit our Patreon page to unlock exclusive Bayesian swag ;)
    Takeaways:
    • INLA is a fast, deterministic method for Bayesian inference.
    • INLA is particularly useful for large datasets and complex models.
    • The R-INLA package is widely used for implementing the INLA methodology.
    • INLA has been applied in various fields, including epidemiology and air quality control.
    • Computational challenges in INLA are minimal compared to MCMC methods.
    • The Smart Gradient method enhances the efficiency of INLA.
    • INLA can handle various likelihoods, not just Gaussian.
    • SPDEs allow for more efficient computations in spatial modeling.
    • The new INLA methodology scales better for large datasets, especially in medical imaging.
    • Priors in Bayesian models can significantly impact the results and should be chosen carefully.
    • Penalized complexity priors (PC priors) help prevent overfitting in models.
    • Understanding the underlying mathematics of priors is crucial for effective modeling.
    • The integration of GPUs in computational methods is a key future direction for INLA.
    • The development of new sparse solvers is essential for handling larger models efficiently.
    Chapters:
    06:06 Understanding INLA: A Comparison with MCMC
    08:46 Applications of INLA in Real-World Scenarios
    11:58 Latent Gaussian Models and Their Importance
    15:12 Impactful Applications of INLA in Health and Environment
    18:09 Computational Challenges and Solutions in INLA
    21:06 Stochastic Partial Differential Equations in Spatial Modeling
    23:55 Future Directions and Innovations in INLA
    39:51 Exploring Stochastic Differential Equations
    43:02 Advancements in INLA Methodology
    50:40 Getting Started with INLA
    56:25 Understanding Priors in Bayesian Models
    Thank you to my Patrons for making this episode possible!
    Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad...
    --------  
    1:17:37
  • BITESIZE | Understanding Simulation-Based Calibration, with Teemu Säilynoja
    Get 10% off Hugo's "Building LLM Applications for Data Scientists and Software Engineers" online course!
    Today's clip is from episode 135 of the podcast, with Teemu Säilynoja.
    Alex and Teemu discuss the importance of simulation-based calibration (SBC). They explore the practical implementation of SBC in probabilistic programming languages, the challenges faced in developing SBC methods, and the significance of both prior and posterior SBC in ensuring model reliability. The discussion emphasizes the need for careful model implementation and inference algorithms to achieve accurate calibration.
    Get the full conversation here.
    Intro to Bayes Course (first 2 lessons free)
    Advanced Regression Course (first 2 lessons free)
    Our theme music is « Good Bayesian », by Baba Brinkman (feat. MC Lars and Mega Ran). Check out his awesome work!
    Visit our Patreon page to unlock exclusive Bayesian swag ;)
    Transcript
    This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
    --------  
    21:14
  • #135 Bayesian Calibration and Model Checking, with Teemu Säilynoja
    Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
    Intro to Bayes Course (first 2 lessons free)
    Advanced Regression Course (first 2 lessons free)
    Our theme music is « Good Bayesian », by Baba Brinkman (feat. MC Lars and Mega Ran). Check out his awesome work!
    Visit our Patreon page to unlock exclusive Bayesian swag ;)
    Takeaways:
    • Teemu focuses on calibration assessments and predictive checking in Bayesian workflows.
    • Simulation-based calibration (SBC) checks model implementation.
    • SBC involves drawing realizations from the prior and generating prior predictive data.
    • Visual predictive checking is crucial for assessing model predictions.
    • Prior predictive checks should be done before looking at data.
    • Posterior SBC focuses on the area of parameter space most relevant to the data.
    • Challenges in SBC include inference time.
    • Visualizations complement numerical metrics in Bayesian modeling.
    • Amortized Bayesian inference benefits from SBC for quick posterior checks.
    • The calibration of Bayesian models is more intuitive than that of Frequentist models.
    • Choosing the right visualization depends on data characteristics.
    • Using multiple visualization methods can reveal different insights.
    • Visualizations should be viewed as models of the data.
    • Goodness-of-fit tests can enhance visualization accuracy.
    • Uncertainty visualization is crucial but often overlooked.
    Chapters:
    09:53 Understanding Simulation-Based Calibration (SBC)
    15:03 Practical Applications of SBC in Bayesian Modeling
    22:19 Challenges in Developing Posterior SBC
    29:41 The Role of SBC in Amortized Bayesian Inference
    33:47 The Importance of Visual Predictive Checking
    36:50 Predictive Checking and Model Fitting
    38:08 The Importance of Visual Checks
    40:54 Choosing Visualization Types
    49:06 Visualizations as Models
    55:02 Uncertainty Visualization in Bayesian Modeling
    01:00:05 Future Trends in Probabilistic Modeling
    Thank you to my Patrons for making this episode possible!
    Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand...
    --------  
    1:12:13
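    The SBC procedure summarized in the takeaways above — draw a parameter from the prior, simulate prior predictive data, fit the model, and check where the true parameter ranks among posterior draws — can be sketched in a few lines. This is a hypothetical conjugate-normal toy example for illustration, not code from the episode; the exact posterior stands in for a real inference algorithm:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Toy model: theta ~ N(0, 1), y_i | theta ~ N(theta, 1)
    n_sims, n_obs, n_draws = 1000, 10, 99
    ranks = []
    for _ in range(n_sims):
        theta = rng.normal(0.0, 1.0)                # realization from the prior
        y = rng.normal(theta, 1.0, size=n_obs)      # prior predictive data
        # Exact conjugate posterior (replaces a real sampler here)
        post_var = 1.0 / (1.0 + n_obs)
        post_mean = post_var * y.sum()
        post_draws = rng.normal(post_mean, np.sqrt(post_var), size=n_draws)
        # Rank of the true theta among the posterior draws (0..n_draws)
        ranks.append(int((post_draws < theta).sum()))

    # If model and inference are correctly implemented, the ranks are
    # uniform on {0, ..., n_draws}; a histogram makes deviations visible.
    hist, _ = np.histogram(ranks, bins=10, range=(0, n_draws + 1))
    print(hist)  # each bin should hold roughly n_sims / 10 counts
    ```

    A miscalibrated sampler (e.g. one that underestimates posterior variance) would pile ranks into the extreme bins instead of spreading them evenly.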
  • Live Show Announcement | Come Meet Me in London!
    ICYMI, I'll be in London next week, for a live episode of the Learning Bayesian Statistics podcast 🍾 Come say hi on June 24 at Imperial College London! We'll be talking about uncertainty quantification — not just in theory, but in the messy, practical reality of building models that are supposed to work in the real world.
    🎟️ Get your tickets!
    Some of the questions we'll unpack:
    🔍 Why is it so hard to model uncertainty reliably?
    ⚠️ How do overconfident models break things in production?
    🧠 What tools and frameworks help today?
    🔄 What do we need to rethink if we want robust ML over the next decade?
    Joining me on stage: the brilliant Mélodie Monod, Yingzhen Li and François-Xavier Briol -- researchers doing cutting-edge work on these questions, across Bayesian methods, statistical learning, and real-world ML deployment.
    A huge thank you to Oliver Ratmann for setting this up!
    📍 Imperial-X, White City Campus (Room LRT 608)
    🗓️ June 24, 11:30–13:00
    🎙️ Doors open at 11:30 — we start at noon sharp
    Come say hi, ask hard questions, and be part of the recording.
    🎟️ Get your tickets!
    Intro to Bayes Course (first 2 lessons free)
    Advanced Regression Course (first 2 lessons free)
    Our theme music is « Good Bayesian », by Baba Brinkman (feat. MC Lars and Mega Ran). Check out his awesome work!
    Visit our Patreon page to unlock exclusive Bayesian swag ;)
    Thank you to my Patrons for making this episode possible!
    Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh,...
    --------  
    3:04
  • BITESIZE | Exploring Dynamic Regression Models, with David Kohns
    Today's clip is from episode 134 of the podcast, with David Kohns.
    Alex and David discuss the future of probabilistic programming, focusing on advancements in time series modeling, model selection, and the integration of AI in prior elicitation. The discussion highlights the importance of setting appropriate priors, the challenges of computational workflows, and the potential of normalizing flows to enhance Bayesian inference.
    Get the full discussion here.
    Intro to Bayes Course (first 2 lessons free)
    Advanced Regression Course (first 2 lessons free)
    Our theme music is « Good Bayesian », by Baba Brinkman (feat. MC Lars and Mega Ran). Check out his awesome work!
    Visit our Patreon page to unlock exclusive Bayesian swag ;)
    Transcript
    This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
    --------  
    14:34


About Learning Bayesian Statistics

Are you a researcher or data scientist / analyst / ninja? Do you want to learn Bayesian inference, stay up to date, or simply understand what Bayesian inference is? Then this podcast is for you! You'll hear from researchers and practitioners of all fields about how they use Bayesian statistics, and how in turn YOU can apply these methods in your own modeling workflow.

When I started learning Bayesian methods, I really wished there were a podcast out there that could introduce me to the methods, the projects and the people who make all that possible. So I created "Learning Bayesian Statistics", where you'll get to hear how Bayesian statistics are used to detect dark matter in outer space, forecast elections, or understand how diseases spread and can ultimately be stopped.

But this show is not only about successes -- it's also about failures, because that's how we learn best. So you'll often hear the guests talking about what *didn't* work in their projects, why, and how they overcame these challenges. Because, in the end, we're all lifelong learners!

My name is Alex Andorra, by the way, and I live in Estonia. By day, I'm a data scientist and modeler at the PyMC Labs consultancy (https://www.pymc-labs.io/). By night, I don't (yet) fight crime, but I'm an open-source enthusiast and core contributor to the Python packages PyMC (https://docs.pymc.io/) and ArviZ (https://arviz-devs.github.io/arviz/). I also love election forecasting (https://www.pollsposition.com/) and, most importantly, Nutella. But I don't like talking about it -- I prefer eating it.

So, whether you want to learn Bayesian statistics or hear about the latest libraries, books and applications, this podcast is for you -- just subscribe! You can also support the show and unlock exclusive Bayesian swag on Patreon (https://www.patreon.com/learnbayesstats)!
Podcast website
