Statistics on Batı Şengül
http://www.batisengul.co.uk/tags/statistics/
Recent content in Statistics on Batı Şengül

Introduction To Hamiltonian Monte Carlo
http://www.batisengul.co.uk/post/2021-07-02-intro-to-hmc/
Fri, 02 Jul 2021

One thing that has been occupying my head in the past couple of weeks is HMC and how it can be used in a large-data, large-model context. HMC stands for Hamiltonian Monte Carlo, and it is the de facto Bayesian sampling method due to its speed. Before getting into big datasets and big models, let me motivate this problem a little bit.
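To make the name concrete before diving in, here is a minimal sketch of HMC itself (my own toy, not code from the post): leapfrog integration of Hamiltonian dynamics followed by a Metropolis accept/reject step. The standard-normal target, step size, and trajectory length are illustrative choices only.

```python
import numpy as np

def hmc_sample(grad_U, U, q0, n_samples, eps=0.2, L=10, seed=0):
    """Minimal 1-D Hamiltonian Monte Carlo with leapfrog integration."""
    rng = np.random.default_rng(seed)
    q = q0
    samples = []
    for _ in range(n_samples):
        p = rng.standard_normal()              # resample momentum
        q_new, p_new = q, p
        # Leapfrog integration of Hamilton's equations
        p_new -= 0.5 * eps * grad_U(q_new)
        for i in range(L):
            q_new += eps * p_new
            if i < L - 1:
                p_new -= eps * grad_U(q_new)
        p_new -= 0.5 * eps * grad_U(q_new)
        # Metropolis correction using the total energy H = U + K
        dH = (U(q_new) + 0.5 * p_new**2) - (U(q) + 0.5 * p**2)
        if rng.random() < np.exp(-dH):
            q = q_new
        samples.append(q)
    return np.array(samples)

# Target: standard normal, so U(q) = q^2 / 2 and grad U = q
draws = hmc_sample(grad_U=lambda q: q, U=lambda q: 0.5 * q**2,
                   q0=0.0, n_samples=5000)
print(draws.mean(), draws.std())  # should be close to 0 and 1
```

The speed the post mentions comes from the gradient-guided trajectories: each proposal moves far across the target while keeping the acceptance rate high.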
If you are new to Bayesian modelling, I have a little primer on the topic, so I will assume for the most part you are familiar with basic Bayesianism.

Variational inference, the art of approximate sampling
http://www.batisengul.co.uk/post/variational-inference-the-art-of-approximate-sampling/
Sat, 21 Jul 2018

In the spirit of looking at fancy-word topics, this post is about variational inference. Suppose you granted me one superpower and I chose the ability to sample from any distribution in a fast and accurate way. Now, you might think that’s a crappy superpower, but it basically enables me to fit any model I want and provide uncertainty estimates.
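The idea can be sketched in a few lines (a toy of my own, not the post’s code): approximate a fixed Gaussian target with a variational Gaussian \(q\) by maximizing the ELBO with the reparameterization trick. The target \(N(3, 2^2)\), learning rate, and batch size are all made up for illustration.

```python
import numpy as np

# Toy variational inference: approximate the target p(x) = N(3, 2^2)
# with q(x) = N(mu, sigma^2) by stochastic gradient ascent on the ELBO,
# using the reparameterization trick x = mu + sigma * eps.
rng = np.random.default_rng(0)

def grad_log_p(x):                    # d/dx log N(x; 3, 2^2)
    return -(x - 3.0) / 4.0

mu, log_sigma = 0.0, 0.0              # variational parameters
lr = 0.05
for _ in range(3000):
    eps = rng.standard_normal(128)
    sigma = np.exp(log_sigma)
    x = mu + sigma * eps              # reparameterized samples from q
    g = grad_log_p(x)
    # ELBO gradients: E_q[grad log p] for mu; the pathwise term plus
    # the entropy gradient (d/dlog_sigma of log sigma = 1) for log_sigma
    mu += lr * g.mean()
    log_sigma += lr * (np.mean(g * sigma * eps) + 1.0)

print(mu, np.exp(log_sigma))  # should approach the target's 3 and 2
```

With a Gaussian family and a Gaussian target the fit is exact in the limit; for real posteriors \(q\) is only an approximation, which is the trade-off the post explores.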
To make the problem concrete, let’s suppose you are trying to sample from a distribution \(p(x)\).

Spike and slab: Bayesian linear regression with variable selection
http://www.batisengul.co.uk/post/spike-and-slab-bayesian-linear-regression-with-variable-selection/
Wed, 20 Jun 2018

Spike and slab is a Bayesian model for simultaneously picking features and doing linear regression. Spike and slab is a shrinkage method, much like ridge and lasso regression, in the sense that it shrinks the “weak” beta values from the regression towards zero. Don’t worry if you have never heard of any of those terms; we will explore all of them using Stan. If you don’t know anything about Bayesian statistics, you can read my introductory post before reading this one.

Bayesian analysis of Premier League football
http://www.batisengul.co.uk/post/bayesian-analysis-of-premier-league-football/
Mon, 02 Apr 2018

In this post we are going to look at some football statistics. In particular, we will examine English football, the Premier League, using Bayesian statistics with Stan. If you have no idea what Bayesian statistics is, you can read my introductory post on it. Otherwise this post shouldn’t be a difficult read.
All right, let’s get to it. First, we need some data. I will use all the matches from the Premier League seasons 16/17 and 17/18 (which is still ongoing at the time of writing).

Summer Olympics: the countries that beat the expectations
http://www.batisengul.co.uk/post/summer-olympics-the-countries-that-beat-the-expectations/
Mon, 19 Mar 2018

In this post we take a look at the summer Olympics and try to see which countries performed substantially differently than was expected of them. We will look at the Olympics from 1964 through to 2008. For each year, we will run a predictive model, trying to predict the number of medals a country wins using selected datasets that are available before each of the Olympics. We will see that this model performs well out of sample, and its predictions will serve as our expectations.

Causal impact and Bayesian structural time series
http://www.batisengul.co.uk/post/causal-impact-and-bayesian-structural-time-series/
Sat, 03 Feb 2018

Causal impact is a tool for estimating the effect of a one-time action. As an example (whose data we will actually look at), consider the BP oil spill in 2010. Let’s say you want to evaluate the impact that this had on BP stocks. Typically with questions like this, we would like to be able to collect multiple samples from a control group and a test group. As this is not possible here, we have to try something else.

Bayes of our lives: a gentle introduction to Bayesian statistics
http://www.batisengul.co.uk/post/bayes-of-our-lives-a-gentle-introduction-to-bayesian-statistics/
Tue, 07 Nov 2017

Bayesian statistics is an interpretation of statistics. It helps explain frequentist methods and can give much more information. Even if you have never formally learnt Bayesian statistics, I guarantee you have encountered it in some way.
Bayes, it’s everywhere
In this post, we will only consider a linear model: \(y = \beta x + \epsilon\), where \(\epsilon\) is a standard normal. Suppose we have gathered some data \((Y=\{y_i\}_{i=1}^n, X=\{\{x_{k,i}\}_{k=1}^p\}_{i=1}^n)\), consisting of \(p\) predictors and \(n\) observations, and we wish to fit a linear model.

Analysis of calving of JH Dorrington Farm Part III
http://www.batisengul.co.uk/post/analysis-of-calving-of-jh-dorrington-farm-part-iii/
Tue, 10 Oct 2017

Drum roll, please. This is the long-awaited third and final part of the analysis from JH Dorrington Farm. If you have not already, read the first part and second part.
Picking up where I left off: almost all of our models fit pretty well except for CART, so in what follows I will ignore the CART model. That leaves us with the linear regression models and MARS. MARS essentially builds a piecewise linear model using hinges.

Analysis of calving of JH Dorrington Farm Part II
http://www.batisengul.co.uk/post/analysis-of-calving-of-jh-dorrington-farm-part-ii/
Fri, 22 Sep 2017

This is the second part of the analysis of the data from JH Dorrington Farm. You might want to read the first part before reading this one.
Before we put on our science hats, let us outline what we will do. Previously we split the data into training and test sets 80/20. We will fit all of our models and calibrate them on the training set. Decisions about keeping or dropping predictors, transforming predictors, and which model to choose will be left to the test set.

Analysis of calving of JH Dorrington Farm Part I
http://www.batisengul.co.uk/post/analysis-of-calving-of-jh-dorrington-farm-part-i/
Tue, 19 Sep 2017

Here I will analyse a real-life problem. My friend Chris at JH Dorrington Farm has kindly provided me with the data and allowed me to make this post. This will run to several parts as I explore the data and try to fit various models.
I’m going to stop milking this introduction and get right to it.
My friend Chris has been collecting various forms of data about his cows.

Correlation in linear regression
http://www.batisengul.co.uk/post/correlation-in-linear-regression/
Sun, 03 Sep 2017

If you have a dataset with a large number of predictors, you might use some basic models to try to eliminate predictors that don’t show a significant relationship to the response variable. In such cases it is important to look at the correlation between the predictors. How important? Let’s find out.
Let us consider a very simple example here with two predictors and one response variable.
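As a preview of what such an example can show (with made-up numbers of my own, not the post’s data), a small simulation demonstrates why correlation matters: the OLS coefficient estimates for two highly correlated predictors are far noisier than for two independent ones.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitted_slopes(rho, n=100, n_sims=300):
    """Sampling distribution of the OLS estimate of the first slope
    when the two predictors have correlation rho and the truth is
    y = x1 + x2 + noise."""
    estimates = []
    for _ in range(n_sims):
        x1 = rng.standard_normal(n)
        x2 = rho * x1 + np.sqrt(1 - rho**2) * rng.standard_normal(n)
        y = x1 + x2 + rng.standard_normal(n)
        X = np.column_stack([np.ones(n), x1, x2])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        estimates.append(beta[1])
    return np.array(estimates)

# Theory: Var(beta_hat_1) is inflated by 1 / (1 - rho^2),
# so rho = 0.95 inflates the variance roughly tenfold.
print(fitted_slopes(rho=0.0).std())
print(fitted_slopes(rho=0.95).std())   # noticeably larger
```

The factor \(1/(1-\rho^2)\) is the variance inflation factor for two predictors, which is why correlated predictors make it hard to tell which one carries the signal.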