Below are R packages that I developed solely for study purposes, so they will not be maintained. The code is well commented to support step-by-step understanding of the algorithms, and the packages can be used for teaching or self-study.
- varngc: Penalized regressions for vector autoregressive (VAR) models. The loss functions include penalized least squares and penalized log-likelihood, and the penalties include lasso and group lasso. The package is parallelizable. (A conceptual sketch of a lasso-penalized VAR fit is given after this list.)
- t.regression: Contains regression models from UWaterloo's STAT 844 course instructed by Professor Liang Kun, adapted from the lecture notes of Professor Wayne Oldford. The package includes robust regressions, splines, locally weighted regressions, generalized additive cubic splines, random forests, tree boosting, gradient boosting, and XGBoost. (A short smoother example is given after this list.)
- t.compinf: Contains functions that closely follow the STAT 840 course material provided by Professor Martin Lysy at the University of Waterloo. The package covers computational inference topics including profile likelihood, the bootstrap, the EM algorithm, and Markov chain Monte Carlo, and also contains models such as generalized linear models, heteroskedastic linear regression, elastic net regression, forward stagewise regression, least angle regression, probit regression, gamma regression, quantile regression, and Gaussian finite mixture models. (A small bootstrap example is given after this list.)
- t.classification: Contains functions that closely follow the STAT 841 course material provided by Professor Ali Ghodsi at the University of Waterloo. The functions cover classification methods including linear and quadratic discriminant analysis, Fisher's discriminant analysis, naive Bayes, principal component analysis, logistic regression, multinomial regression, feed-forward neural networks, radial basis function networks, support vector machines, decision trees, and AdaBoost. (A small classification example is given after this list.)
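Since the varngc interface itself is not documented here, the sketch below only illustrates the underlying idea: equation-by-equation lasso-penalized least squares for a VAR(1), fitted with the glmnet package. The data and variable names are illustrative assumptions, not varngc's API.

```r
# Minimal sketch: lasso-penalized least squares for a VAR(1), fitted
# equation by equation with glmnet. This is NOT the varngc interface,
# just the idea it implements.
library(glmnet)

set.seed(1)
n <- 200; k <- 3                          # 200 time points, 3 series (toy data)
Y <- matrix(rnorm(n * k), n, k)

X <- Y[-n, , drop = FALSE]                # lagged values y_{t-1}
Z <- Y[-1, , drop = FALSE]                # current values y_t

# One lasso regression per equation of the VAR (alpha = 1 gives the lasso).
coefs <- lapply(seq_len(k), function(j) {
  fit <- cv.glmnet(X, Z[, j], alpha = 1)          # lambda chosen by cross-validation
  drop(as.matrix(coef(fit, s = "lambda.min")))    # intercept + k lag coefficients
})
A_hat <- do.call(rbind, coefs)                    # k x (k + 1) coefficient matrix
```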
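As a concrete illustration of two of the smoothers listed under t.regression, the sketch below fits a locally weighted regression and a cubic smoothing spline with base R (`loess` and `smooth.spline`); it does not call t.regression itself, whose function names are not documented here.

```r
# Minimal sketch: locally weighted regression and a cubic smoothing spline
# on simulated data, using base R rather than t.regression's own functions.
set.seed(2)
x <- sort(runif(100, 0, 10))
y <- sin(x) + rnorm(100, sd = 0.3)

lo_fit <- loess(y ~ x, span = 0.4)      # locally weighted regression
sp_fit <- smooth.spline(x, y)           # cubic smoothing spline (df chosen by GCV)

plot(x, y, pch = 16, col = "grey60")
lines(x, predict(lo_fit), col = "red", lwd = 2)
lines(sp_fit, col = "blue", lwd = 2)
legend("topright", c("loess", "smoothing spline"),
       col = c("red", "blue"), lwd = 2)
```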
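To illustrate one of the computational inference topics covered by t.compinf (the bootstrap), the sketch below computes a nonparametric percentile bootstrap confidence interval for a regression slope in base R. This is a generic illustration, not t.compinf's actual interface.

```r
# Minimal sketch: nonparametric bootstrap for the slope of a simple linear
# regression, resampling cases with replacement.
set.seed(3)
n <- 100
x <- rnorm(n)
y <- 1 + 2 * x + rnorm(n)

B <- 2000
boot_slopes <- replicate(B, {
  idx <- sample.int(n, replace = TRUE)     # resample observations with replacement
  coef(lm(y[idx] ~ x[idx]))[2]             # refit and keep the slope estimate
})

quantile(boot_slopes, c(0.025, 0.975))     # percentile bootstrap 95% CI for the slope
```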
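Finally, as an illustration of two of the classifiers listed under t.classification, the sketch below fits linear discriminant analysis with `MASS::lda` and a two-class logistic regression with `glm` on the iris data. The function names come from MASS and base R, not from t.classification.

```r
# Minimal sketch: linear discriminant analysis and logistic regression on iris,
# using MASS and base R rather than t.classification's own functions.
library(MASS)

lda_fit  <- lda(Species ~ ., data = iris)       # linear discriminant analysis
lda_pred <- predict(lda_fit)$class
mean(lda_pred == iris$Species)                  # training accuracy

# Logistic regression for a two-class subset (versicolor vs. virginica).
iris2   <- droplevels(subset(iris, Species != "setosa"))
glm_fit <- glm(Species ~ Sepal.Length + Sepal.Width,
               data = iris2, family = binomial)
head(predict(glm_fit, type = "response"))       # fitted P(virginica)
```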