Linear regression is one of the most widely used statistical methods available today, applied by data analysts and students in almost every discipline. However, the standard ordinary least squares (OLS) method makes several strong assumptions about the data that often do not hold in real-world data sets, which can cause numerous problems for the least squares model. One of the most common issues is a model overfitting the data. Ridge regression and the LASSO are two methods used to build a better, more accurate model. I discuss how overfitting arises in least squares models and the reasoning for using ridge regression and the LASSO, include an analysis of real-world example data, and compare these methods with OLS and with each other to assess the benefits and drawbacks of each method.
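The comparison described above can be sketched in a few lines. This is a minimal illustration (assuming scikit-learn, with made-up synthetic data and arbitrary penalty strengths, not the paper's actual data or tuning) of the setting where OLS overfits: many correlated or irrelevant predictors relative to the number of observations.

```python
# Hedged sketch: OLS vs. ridge regression vs. LASSO on synthetic data
# where only a few of many predictors truly matter, so OLS overfits.
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge, Lasso
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n, p = 60, 40                        # few observations relative to predictors
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:5] = 2.0                       # only 5 predictors are truly relevant
y = X @ beta + rng.normal(scale=1.0, size=n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

# alpha values here are arbitrary; in practice they are chosen by cross-validation
for name, model in [("OLS", LinearRegression()),
                    ("Ridge", Ridge(alpha=1.0)),
                    ("LASSO", Lasso(alpha=0.1))]:
    model.fit(X_tr, y_tr)
    mse = mean_squared_error(y_te, model.predict(X_te))
    print(f"{name:6s} test MSE = {mse:.2f}")
```

With the training set smaller than the number of predictors, OLS interpolates the training data and its test error is much larger than that of the two penalized fits; the LASSO additionally sets most irrelevant coefficients exactly to zero.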
Under a partly linear model, we study a family of robust estimates for the regression parameter and the regression function when some of the predictors take values on a Riemannian manifold. We obtain the consistency and asymptotic normality of the proposed estimators. Simulations and an application to a real dataset show the good performance of our proposal under small samples and contamination.
This is a set of notes for the first two chapters of an Abstract Algebra course, following the Hungerford textbook table of contents.
One notable feature is a pair of commands that allow one to show only the definitions, only the examples, and so on, together with another command that formats the examples for making handouts.
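One common way to implement such show/hide toggles in LaTeX is with the standard `comment` package; the sketch below is only an illustration of that technique, and the environment names (`defn`, `exmpl`) are hypothetical, not necessarily the commands these notes define.

```latex
\usepackage{comment}

% Swap include/exclude to control which content types appear in the output.
\includecomment{defn}    % show definitions
\excludecomment{exmpl}   % hide examples (e.g., for a definitions-only handout)

% Usage in the body of the notes:
\begin{defn}
  A \emph{group} is a set $G$ with an associative binary operation,
  an identity element, and inverses.
\end{defn}

\begin{exmpl}
  $(\mathbb{Z}, +)$ is a group under addition, with identity $0$.
\end{exmpl}
```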