Gagnon, Philippe

Abstract

Model selection and parameter estimation are two main aspects of statistical analysis. This thesis discusses these aspects from a Bayesian point of view through three papers. The first deals with a computational procedure, the reversible jump algorithm, that allows models to be selected and parameters to be estimated simultaneously. Because this sampler is difficult to tune in practice, we aim to provide users with guidelines for an optimal implementation, where an optimally tuned sampler is one that generates Markov chains that explore their state space optimally. This goal is achieved through the optimisation of a stochastic process corresponding to the limit (in distribution) of the sequence of stochastic processes generated by the algorithm. In the second paper, a strategy leading to robust estimation of the parameters of a linear regression model in the presence of outliers is presented. The strategy is to make assumptions that are better adapted to the possible presence of outliers than the traditional assumption of normally distributed errors.
This normality assumption is replaced by the assumption of a super heavy-tailed error distribution. Robustness, defined as the convergence of the posterior distribution of the parameters (based on the whole sample) towards the posterior arising from the non-outliers only, is guaranteed when the number of outliers does not exceed a given threshold. Finally, the results of the first two papers are combined to introduce a Bayesian robust principal component regression approach that involves model selection in the prediction process. The characteristics of this approach help increase the accuracy of the resulting predictions.
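The trans-dimensional sampling idea behind the first paper can be illustrated with a toy sketch, which is not code from the thesis: a reversible jump sampler choosing between a fixed model M0 (mean zero) and a one-parameter model M1 (unknown mean mu with a normal prior). The data, the prior scale `TAU`, the proposal distributions, and equal model priors are all assumptions made for this illustration; the dimension-changing (birth/death) moves use a Jacobian of 1.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data clearly away from zero, so model 1 should dominate the posterior.
y = rng.normal(2.0, 1.0, size=30)
ybar = float(np.mean(y))

TAU = 10.0  # assumed prior standard deviation of mu under model 1


def log_lik(mu):
    """Gaussian log-likelihood with unit variance (up to a constant)."""
    return float(np.sum(-0.5 * (y - mu) ** 2))


def log_prior_mu(mu):
    """N(0, TAU^2) prior on mu under model 1."""
    return -0.5 * (mu / TAU) ** 2 - np.log(TAU * np.sqrt(2.0 * np.pi))


def log_q(mu):
    """Birth proposal density N(ybar, 1) evaluated at mu."""
    return -0.5 * (mu - ybar) ** 2 - 0.5 * np.log(2.0 * np.pi)


n_iter = 20000
k, mu = 0, 0.0  # start in model 0 (mu fixed at 0)
count_m1 = 0
for _ in range(n_iter):
    if k == 0:
        # Birth move: jump from M0 to M1 by drawing mu from the proposal.
        mu_prop = rng.normal(ybar, 1.0)
        log_alpha = (log_lik(mu_prop) + log_prior_mu(mu_prop)
                     - log_lik(0.0) - log_q(mu_prop))
        if np.log(rng.uniform()) < log_alpha:
            k, mu = 1, mu_prop
    else:
        # Death move: jump back to M0 (exact reverse of the birth move).
        log_alpha = (log_lik(0.0) + log_q(mu)
                     - log_lik(mu) - log_prior_mu(mu))
        if np.log(rng.uniform()) < log_alpha:
            k, mu = 0, 0.0
        else:
            # Within-model random-walk update for mu.
            mu_prop = mu + rng.normal(0.0, 0.5)
            log_a = (log_lik(mu_prop) + log_prior_mu(mu_prop)
                     - log_lik(mu) - log_prior_mu(mu))
            if np.log(rng.uniform()) < log_a:
                mu = mu_prop
    count_m1 += (k == 1)

prob_m1 = count_m1 / n_iter
print(prob_m1)  # estimated posterior probability of model 1
```

The tuning questions studied in the thesis (e.g., proposal scales such as the 0.5 used in the random-walk step above) are exactly what this sketch leaves arbitrary.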
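The robustness idea from the second paper can be made concrete with a minimal sketch, not the thesis's actual model: a location parameter with a flat prior, where a normal likelihood is contrasted with a Cauchy likelihood (heavy-tailed, though lighter-tailed than the super heavy-tailed families considered in the thesis). The data, the grid approximation, and the unit scales are assumptions made for this illustration.

```python
import numpy as np


def posterior_mean(data, log_lik, grid):
    """Grid approximation of the posterior mean of mu under a flat prior."""
    log_post = np.array([log_lik(data, mu) for mu in grid])
    log_post -= log_post.max()  # stabilise before exponentiating
    w = np.exp(log_post)
    w /= w.sum()
    return float(np.sum(w * grid))


def normal_loglik(data, mu, sigma=1.0):
    return float(np.sum(-0.5 * ((data - mu) / sigma) ** 2))


def cauchy_loglik(data, mu, scale=1.0):
    return float(np.sum(-np.log(1.0 + ((data - mu) / scale) ** 2)))


# Five observations near 10 and one gross outlier at 50 (illustrative data).
data = np.array([9.8, 10.1, 10.0, 9.9, 10.2, 50.0])
grid = np.linspace(0.0, 60.0, 6001)

m_normal = posterior_mean(data, normal_loglik, grid)
m_cauchy = posterior_mean(data, cauchy_loglik, grid)
print(m_normal, m_cauchy)
```

Under the normal likelihood the posterior mean sits at the sample mean (15.0), dragged well away from the bulk of the data by the single outlier; under the heavy-tailed likelihood it stays near 10, mirroring in miniature the convergence-to-the-non-outlier-posterior property established in the thesis.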