Machine Learning

  • February 5, 2024
  • HDSIComm

The Synergy between Machine Learning and the Natural Sciences | Max Welling

Abstract: Traditionally, machine learning has been heavily influenced by neuroscience (hence the name "artificial neural networks") and physics (e.g., MCMC, belief propagation, and diffusion-based generative AI). We have recently witnessed that the flow of information has also reversed, with new tools developed in the ML community impacting physics, chemistry, and biology. Examples include faster DFT, force-field-accelerated MD simulations, PDE neural surrogate models, generating drug-like molecules, and many more. In this talk I will review the exciting opportunities for further cross-fertilization between these fields, ranging from faster (classical) DFT calculations and enhanced transition path sampling to traveling waves in artificial neural networks.
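To give a concrete flavor of the physics-to-ML direction mentioned in the abstract, here is a minimal sketch of score-based (diffusion-style) sampling via Langevin dynamics on a toy 1-D Gaussian mixture. The closed-form `score` function stands in for the neural network a real diffusion model would learn; the distribution, step size, and function names are illustrative assumptions, not material from the talk.

```python
# Toy sketch of score-based (diffusion-style) sampling, illustrating the
# Langevin-dynamics connection between physics and generative modeling.
# The target is a 1-D Gaussian mixture whose score (gradient of the
# log-density) is known in closed form, standing in for a learned network.
import numpy as np

rng = np.random.default_rng(0)

def score(x, mu=(-2.0, 2.0), sigma=0.7):
    """Closed-form score d/dx log p(x) of a two-component Gaussian mixture."""
    comps = np.stack([np.exp(-0.5 * ((x - m) / sigma) ** 2) for m in mu])
    weights = comps / comps.sum(axis=0)          # per-component responsibilities
    grads = np.stack([-(x - m) / sigma**2 for m in mu])
    return (weights * grads).sum(axis=0)

# Unadjusted Langevin dynamics: x <- x + (eps/2) * score(x) + sqrt(eps) * noise.
x = rng.normal(size=5000)                        # start from pure noise
eps = 1e-2
for _ in range(2000):
    x = x + 0.5 * eps * score(x) + np.sqrt(eps) * rng.normal(size=x.shape)

print(np.round(np.quantile(x, [0.25, 0.5, 0.75]), 2))  # samples gather near the modes at +/-2
```

Running the snippet shows the samples drifting from pure noise toward the two modes, the same drift-plus-noise update that statistical physics knows as overdamped Langevin dynamics.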

  • January 5, 2024
  • HDSIComm

Uncertainty Quantification for Interpretable Machine Learning | Lili Zheng

Interpretable machine learning has been widely deployed for scientific discoveries and decision-making, but its reliability hinges on uncertainty quantification (UQ). In this talk, I will discuss UQ in two challenging scenarios motivated by scientific and societal applications: selective inference for large-scale graph learning and UQ for model-agnostic machine learning interpretations. Specifically, the first part concerns graphical model inference when only irregular, patchwise observations are available, a common setting in neuroscience, healthcare, genomics, and econometrics. To filter out low-confidence edges due to the irregular measurements, I will present a novel inference method that quantifies the uneven edgewise uncertainty levels over the graph as well as an FDR control procedure; this is achieved by carefully disentangling the dependencies across the graph and consequently yields more reliable graph selection. In the second part, I will discuss the computational and statistical challenges associated with UQ for feature importance of any machine learning model. I will take inspiration from recent advances in conformal inference and utilize an ensemble framework to address these challenges. This leads to an almost computationally free, assumption-light, and statistically powerful inference approach for occlusion-based feature importance. For both parts of the talk, I will highlight the potential applications of my research in science and society as well as how it contributes to more reliable and trustworthy data science.
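As a rough illustration of what occlusion-based feature importance with uncertainty quantification looks like in the simplest setting, the numpy sketch below scores each feature by the increase in held-out loss when it is replaced by its training mean and attaches a plain CLT-style confidence interval. The data, model, and interval construction are generic assumptions for illustration; the ensemble and conformal machinery described in the abstract is deliberately not reproduced.

```python
# Minimal sketch of occlusion-based feature importance with a naive
# confidence interval on synthetic data. Illustrative only.
import numpy as np

rng = np.random.default_rng(1)
n, d = 2000, 5
X = rng.normal(size=(n, d))
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=1.0, size=n)

# Train/validation split and an ordinary least-squares "model".
X_tr, X_val, y_tr, y_val = X[:1000], X[1000:], y[:1000], y[1000:]
beta, *_ = np.linalg.lstsq(X_tr, y_tr, rcond=None)

def predict(A):
    return A @ beta

base_err = (y_val - predict(X_val)) ** 2              # per-sample squared error

for j in range(d):
    X_occ = X_val.copy()
    X_occ[:, j] = X_tr[:, j].mean()                   # "occlude" feature j
    occ_err = (y_val - predict(X_occ)) ** 2
    diff = occ_err - base_err                         # importance = loss increase
    mean, se = diff.mean(), diff.std(ddof=1) / np.sqrt(len(diff))
    print(f"feature {j}: importance {mean:+.3f}  approx. 95% CI "
          f"[{mean - 1.96 * se:+.3f}, {mean + 1.96 * se:+.3f}]")
```

On this synthetic data the interval for feature 0 sits well above zero, feature 1 is borderline, and the noise features straddle zero, which is exactly the kind of statement UQ is meant to make rigorous.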

  • October 2, 2023
  • HDSIComm

The Uneasy Relation Between Deep Learning and Statistics

Deep learning uses the language and tools of statistics and classical machine learning, including empirical and population losses and the optimization of a hypothesis on a training set. But it uses these tools in regimes where they should not be applicable: the optimization task is non-convex, models are often large enough to overfit, and the training and deployment tasks can radically differ. In this talk, I will survey the relation between deep learning and statistics. In particular, we will discuss recent works supporting the emerging intuition that deep learning is closer in some aspects to human learning than to classical statistics. Rather than estimating quantities from samples, deep neural nets develop broadly applicable representations and skills through their training. The talk will not assume background knowledge in artificial intelligence or deep learning.
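One concrete instance of the mismatch described above is an interpolating model whose empirical loss is exactly zero while its population loss is not. The short numpy sketch below uses 1-nearest-neighbor regression as the interpolator; it is an illustrative toy under assumed data, not an example taken from the talk.

```python
# Empirical vs. population loss for an interpolating model
# (1-nearest-neighbor regression). Illustrative only.
import numpy as np

rng = np.random.default_rng(2)
f = np.sin                                         # "population" target function

x_tr = rng.uniform(0, 2 * np.pi, size=30)
y_tr = f(x_tr) + rng.normal(scale=0.3, size=30)    # noisy training labels

def predict_1nn(x):
    """1-NN regression: interpolates the training set exactly."""
    idx = np.abs(x[:, None] - x_tr[None, :]).argmin(axis=1)
    return y_tr[idx]

x_te = rng.uniform(0, 2 * np.pi, size=10000)
y_te = f(x_te) + rng.normal(scale=0.3, size=10000)

emp_loss = np.mean((predict_1nn(x_tr) - y_tr) ** 2)   # exactly 0: the model interpolates
pop_loss = np.mean((predict_1nn(x_te) - y_te) ** 2)   # Monte Carlo estimate of population loss
print(f"empirical loss {emp_loss:.3f}, estimated population loss {pop_loss:.3f}")
```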

  • June 5, 2023
  • HDSIComm

Deep Latent Variable Models for Compression and Natural Science | Stephan Mandt

Latent variable models have been an integral part of probabilistic machine learning, ranging from simple mixture models to variational autoencoders to powerful diffusion probabilistic models at the center of recent media attention. Perhaps less well-appreciated is the intimate connection between latent variable models and data compression, and the potential of these models for advancing natural science. This talk will explore these topics.
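The connection between latent variable models and compression can be made concrete in a toy linear-Gaussian model, where the negative ELBO measured in bits upper-bounds the ideal code length -log2 p(x) and becomes tight at the exact posterior. The sketch below is a generic illustration under assumed parameters and makes no claim about the specific models discussed in the talk.

```python
# Toy linear-Gaussian latent variable model: the negative ELBO in bits
# upper-bounds the ideal code length -log2 p(x). Illustrative only.
import numpy as np

w, sigma = 1.5, 0.8                 # p(z) = N(0,1), p(x|z) = N(w z, sigma^2)
x = 2.0                             # a single "data point" to encode

def log_normal(v, mean, var):
    return -0.5 * (np.log(2 * np.pi * var) + (v - mean) ** 2 / var)

# Exact marginal p(x) = N(0, w^2 + sigma^2) gives the ideal code length.
log_px = log_normal(x, 0.0, w**2 + sigma**2)

def elbo(q_mean, q_var, n_samples=200000, seed=3):
    """Monte Carlo ELBO = E_q[log p(x,z) - log q(z|x)] in nats."""
    rng = np.random.default_rng(seed)
    z = q_mean + np.sqrt(q_var) * rng.normal(size=n_samples)
    return np.mean(log_normal(x, w * z, sigma**2) + log_normal(z, 0.0, 1.0)
                   - log_normal(z, q_mean, q_var))

# The exact posterior p(z|x) is Gaussian; with it the bound is tight.
post_var = 1.0 / (1.0 + w**2 / sigma**2)
post_mean = post_var * w * x / sigma**2
for name, (m, v) in {"crude q": (0.0, 1.0), "exact posterior": (post_mean, post_var)}.items():
    print(f"{name:16s} code length {-elbo(m, v) / np.log(2):6.3f} bits "
          f"(ideal {-log_px / np.log(2):6.3f} bits)")
```

The crude posterior pays extra bits equal to its KL divergence from the true posterior, which is the quantity a learned encoder in a VAE-style model tries to shrink.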

  • May 12, 2023
  • HDSIComm

Earth & Ocean Image Processing Made Easy with MATLAB – Lunch and Learn

Join us for a lunch-n-learn technical seminar from our MathWorks team on image processing with MATLAB! MathWorks is looking to create a connection point for conversations about the landscape of computational languages – and the broader impact that software has in academia and industry today. This session explores the basics of pixel-level image processing and high-level machine learning models in MATLAB for images.

  • April 24, 2023
  • Kaleigh O'Merry

Leveraging Simulators for ML Inference in Particle Physics

Abstract: The field of research investigating machine-learning (ML) methods that can exploit a physical model of the world through simulators is rapidly growing, particularly for applications in particle physics. While these methods have shown considerable promise in phenomenological studies, they are also known to be susceptible to inaccuracies in the simulators used to train them. In this work, we design a novel analysis strategy that uses the concept of simulation-based inference for a crucial Higgs boson measurement, where traditional methods are rendered sub-optimal due to quantum interference between Higgs and non-Higgs processes. Our work develops uncertainty quantification methods that account for the impact of inaccuracies in the simulators, uncertainties in the ML predictions themselves, and novel strategies to test the coverage of these quoted uncertainties. These new ML methods leverage the vast computational resources that have recently become available to perform scientific measurements in a way that was not feasible before. In addition, this talk briefly discusses certain ML-bias-mitigation methods developed in particle physics and their potential wider applications.
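For readers new to simulation-based inference, its core device can be illustrated with the classifier-based density-ratio trick: train a classifier to distinguish samples from two simulators and read the likelihood ratio off its output. The sketch below uses toy 1-D Gaussians in place of detector-level simulators and a hand-rolled logistic regression; it is a generic illustration, not the analysis strategy described in the abstract.

```python
# Density-ratio trick behind much of simulation-based inference:
# a classifier trained to separate two simulated datasets yields an
# estimate of the likelihood ratio p1(x)/p0(x). Illustrative toy.
import numpy as np

rng = np.random.default_rng(4)
n = 20000
x0 = rng.normal(0.0, 1.0, size=n)        # "background-like" simulator
x1 = rng.normal(0.5, 1.0, size=n)        # "signal-like" simulator

# Logistic regression on (x, 1), trained by plain gradient descent.
X = np.concatenate([x0, x1])
y = np.concatenate([np.zeros(n), np.ones(n)])
feats = np.stack([X, np.ones_like(X)], axis=1)
w = np.zeros(2)
for _ in range(2000):
    s = 1.0 / (1.0 + np.exp(-feats @ w))
    w -= 0.1 * feats.T @ (s - y) / len(y)     # gradient of mean cross-entropy

def ratio_hat(x):
    """Estimated p1(x)/p0(x) via the classifier output s/(1-s)."""
    s = 1.0 / (1.0 + np.exp(-(w[0] * x + w[1])))
    return s / (1.0 - s)

x_test = np.array([-1.0, 0.0, 1.0])
true_ratio = np.exp(0.5 * x_test - 0.125)     # exact N(0.5,1)/N(0,1) ratio
print(np.round(ratio_hat(x_test), 3), np.round(true_ratio, 3))
```

In a real measurement the two "simulators" are full detector simulations under different parameter hypotheses, and the estimated ratio feeds a likelihood-based fit, which is why simulator inaccuracies propagate directly into the inference and need the uncertainty treatment the abstract describes.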
