2015 BFY II Abstract Detail Page
Abstract Title:
Maximum Likelihood is Least Squares (and not just for Gaussian random variables)
Abstract:
One often hears that a weighted least-squares fit produces maximum likelihood parameters only if the measured y-values are governed by Gaussian distributions. Here we show that iteratively reweighted least squares (IRLS) also produces maximum likelihood parameters for y-values governed by Poisson, binomial, and exponential distributions, whose standard deviation (or variance) depends on the distribution mean.[1] The mathematics behind this assertion is demonstrated and shows a surprising appearance of the variance of the distribution involved. The IRLS algorithm starts with initial estimates for the fitting parameters, from which an initial set of fitted y-values is calculated. The variances are then determined from the assumed distribution type, using the fitted (not measured) y-values for the distribution mean. Minimizing the chi-square while keeping the variances fixed generates an improved fit and new variances. IRLS uses each prior fit to calculate the variances for the next, and the iterations continue until the fit is self-consistent. Keeping the standard deviations fixed during each least-squares fit, and ultimately using their values as calculated at the best fit, guarantees that the final fit parameters are maximum likelihood estimates.
[1] A. Charnes, E. L. Frome, and P. L. Yu, "The Equivalence of Generalized Least Squares and Maximum Likelihood Estimates in the Exponential Family," J. Amer. Stat. Assoc. 71 (1976) 169-171
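To illustrate the iteration described above, here is a minimal Python sketch of IRLS for Poisson-distributed counts fit to a straight-line model. The function name, the linear model, and the simulated data are illustrative assumptions and not taken from the abstract or the poster; the sketch only shows the general pattern of holding the variances fixed during each weighted fit and recomputing them from the fitted values.

import numpy as np

def irls_poisson(x, y, a0=1.0, b0=1.0, tol=1e-8, max_iter=100):
    """Sketch of IRLS for Poisson counts and the model y = a + b*x (illustrative)."""
    a, b = a0, b0
    for _ in range(max_iter):
        # Fitted y-values from the current parameter estimates.
        y_fit = a + b * x
        # For a Poisson distribution the variance equals the mean, so the
        # weights come from the *fitted* (not measured) y-values.
        w = 1.0 / np.clip(y_fit, 1e-12, None)
        # Weighted least squares (chi-square minimization) with the weights
        # held fixed: solve the normal equations A^T W A p = A^T W y.
        A = np.vstack([np.ones_like(x), x]).T
        a_new, b_new = np.linalg.solve(A.T @ (A * w[:, None]), A.T @ (w * y))
        if abs(a_new - a) < tol and abs(b_new - b) < tol:
            return a_new, b_new  # self-consistent: weights match the best fit
        a, b = a_new, b_new
    return a, b

# Example with simulated Poisson counts scattered about a line.
rng = np.random.default_rng(0)
x = np.linspace(1.0, 10.0, 20)
y = rng.poisson(2.0 + 3.0 * x).astype(float)
print(irls_poisson(x, y))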
Abstract Type:
Poster
Author/Organizer Information
Primary Contact:
Robert DeSerio, Department of Physics, University of Florida, Gainesville, FL 32611, Phone: (352) 392-1690
Presentation Documents
Contributed Poster:
Download the Contributed Poster