include("./includes/mlsshead.php"); ?>
April 25 is a joint tutorial day of AISTATS and the Machine Learning Summer School (MLSS).
Roderick Murray-Smith,
University of Glasgow
Machine learning and human-computer interaction
The opportunities for interaction with computer systems are rapidly expanding beyond traditional input and output paradigms: full-body motion sensors, brain-computer interfaces, 3D displays and touch panels are now commonplace commercial items. The profusion of new sensing devices for human input, and the new display channels becoming available, offers the potential to create more involving, expressive and efficient interactions in a much wider range of contexts. Making sense of these complex sources of human intention requires appropriate mathematical methods: modelling and analysing interaction means transforming streams of data from complex sensors into estimates of human intention. This tutorial will focus on the use of inference and dynamical modelling in human-computer interaction. The combination of modern statistical inference and real-time closed-loop modelling offers rich possibilities for building interactive systems, but there is a significant gap between the techniques commonly used in HCI and the mathematical tools available in other fields of computing science. This tutorial aims to illustrate how to bring these mathematical tools to bear on interaction problems, and will cover basic theory and example applications from mobile interaction, interaction with large music collections and full-body interaction.
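As a toy illustration of the kind of dynamical modelling meant here (the constant-velocity model, noise levels and synthetic data below are my own assumptions, not material from the tutorial), this Python sketch runs a Kalman filter over a noisy one-dimensional sensor stream, such as a finger or cursor position, to recover a smoothed estimate of position and velocity:

import numpy as np

# Illustrative sketch: a constant-velocity Kalman filter turning a noisy 1-D
# sensor stream into estimates of position and velocity, a simple example of
# dynamical modelling in a closed interaction loop. Parameters are assumed.
dt = 0.01                                   # 100 Hz sensor
A = np.array([[1.0, dt], [0.0, 1.0]])       # state transition: position, velocity
H = np.array([[1.0, 0.0]])                  # we only observe position
Q = np.diag([1e-5, 1e-3])                   # process noise (assumed)
R = np.array([[0.05]])                      # measurement noise (assumed)

x = np.zeros(2)                             # state estimate [position, velocity]
P = np.eye(2)                               # estimate covariance

def kalman_step(x, P, z):
    # Predict the next state, then correct it with the new measurement z.
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + (K @ (z - H @ x_pred)).ravel()
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new

rng = np.random.default_rng(0)
true_pos = np.sin(np.linspace(0, 2 * np.pi, 200))       # synthetic movement
for z in true_pos + rng.normal(0, 0.2, 200):            # noisy sensor readings
    x, P = kalman_step(x, P, np.array([z]))
print("final estimate: pos=%.3f vel=%.3f" % (x[0], x[1]))

In a closed interaction loop, the velocity estimate could drive feedback or predict where the user is heading; the same structure extends to richer state spaces and sensors.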
Christian P. Robert,
Ceremade - Université Paris-Dauphine
Approximate Bayesian computation (ABC): methodology and applications
ABC methods appeared in 1999 to solve complex genetics problems where the likelihood of the model was impossible to compute. They are now a standard tool in the statistical genetics community, but have also been applied to many other problems where likelihood computation is an issue, including dynamic models in signal processing and financial data analysis. However, these methods suffer to some degree from calibration difficulties that make them rather volatile in their implementation, and thus suspect to users of more traditional Monte Carlo methods. Nonetheless, ABC techniques have several claims to validity: first, they are connected with econometric methods like indirect inference; second, they can be expressed in terms of various non-parametric estimators of the likelihood or of the posterior density and follow standard convergence patterns; finally, they can be seen as regular Bayesian inference over noisy data. The tutorial covers these validation steps and also details different implementations of ABC algorithms and the calibration of their parameters.
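For concreteness, here is a minimal rejection-ABC sketch in Python (the toy Gaussian model, uniform prior, sample-mean summary statistic and tolerance are illustrative assumptions, not the tutorial's own example): draws from the prior are kept only when the summary of data simulated under them falls close to the observed summary.

import numpy as np

# Minimal rejection-ABC sketch under assumed toy settings: infer the mean of a
# Gaussian with known sd from an observed sample, using the sample mean as the
# summary statistic and a tolerance eps. All names and values are illustrative.
rng = np.random.default_rng(1)

y_obs = rng.normal(2.0, 1.0, size=100)   # "observed" data (true mean = 2.0)
s_obs = y_obs.mean()                     # summary statistic of the observed data

def simulate(theta, n=100):
    """Simulate data from the model for a proposed parameter value."""
    return rng.normal(theta, 1.0, size=n)

n_sims, eps = 100_000, 0.05
accepted = []
for _ in range(n_sims):
    theta = rng.uniform(-5, 5)           # draw from the prior
    s_sim = simulate(theta).mean()       # summary of the simulated data
    if abs(s_sim - s_obs) < eps:         # accept if summaries are close
        accepted.append(theta)

accepted = np.array(accepted)
print(f"accepted {accepted.size} draws; posterior mean ~ {accepted.mean():.3f}")

Shrinking the tolerance eps makes the accepted sample a better approximation of the posterior but lowers the acceptance rate, one of the calibration trade-offs the abstract alludes to.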
Håvard Rue,
Norwegian University of Science and Technology
Bayesian computing with INLA
In this lecture, I will discuss approximate Bayesian inference for the class of latent Gaussian models (LGMs). LGMs are perhaps the most commonly used class of models in statistical applications. The class includes, among others, most (generalised) linear models, (generalised) additive models, smoothing spline models, state space models, semiparametric regression, spatial and spatiotemporal models, log-Gaussian Cox processes, and geostatistical and geoadditive models. The concept of an LGM is extremely useful when doing inference, as we can treat all the models listed above in a unified way, using the same algorithms and software tools. Our approach to (approximate) Bayesian inference is to use integrated nested Laplace approximations (INLA). Using this tool, we can directly compute very accurate approximations to the posterior marginals. Another advantage of the approach is its generality, which makes it possible to perform Bayesian analysis in an automatic, streamlined way, and to compute model comparison criteria and various predictive measures so that models can be compared and the model under study can be challenged. I will discuss the background needed to understand LGMs and INLA, and end by illustrating INLA on some examples in R. Please visit www.r-inla.org to download the package and for further information.
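To give a flavour of the Gaussian (Laplace) approximation at the core of INLA (a hand-rolled sketch under assumed toy settings, written in Python rather than with the r-inla package, and not the lecture's own material), the fragment below fits a latent Gaussian field to Poisson counts by Newton iteration and reads approximate posterior marginal standard deviations off the resulting Gaussian:

import numpy as np

# Illustrative sketch, not the INLA package: a Gaussian (Laplace) approximation
# to the posterior of a latent Gaussian field under a Poisson likelihood,
# the building block that INLA iterates and refines. Settings are assumed.
rng = np.random.default_rng(0)

n = 50
Q_prior = 2.0 * np.eye(n)          # assumed prior precision of the latent field
x_true = rng.normal(0.0, 0.7, n)   # latent log-intensities (synthetic)
y = rng.poisson(np.exp(x_true))    # observed counts

# Newton iterations to find the posterior mode of p(x | y)
x = np.zeros(n)
for _ in range(50):
    lam = np.exp(x)
    grad = (y - lam) - Q_prior @ x         # gradient of the log posterior
    H = Q_prior + np.diag(lam)             # negative Hessian = posterior precision
    step = np.linalg.solve(H, grad)
    x = x + step
    if np.max(np.abs(step)) < 1e-8:
        break

# Gaussian approximation: mean = mode, covariance = H^{-1}; approximate marginal
# posterior standard deviations are the square roots of the diagonal of H^{-1}.
post_sd = np.sqrt(np.diag(np.linalg.inv(Q_prior + np.diag(np.exp(x)))))
print("posterior mode (first 5):", np.round(x[:5], 3))
print("posterior sd   (first 5):", np.round(post_sd[:5], 3))

INLA goes considerably further, nesting such Laplace approximations over the hyperparameters and correcting the marginals, but this mode-plus-curvature step is the basic ingredient.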