Helpful Resources for ISYE 6501: Intro to Analytics Modeling — Georgia Tech’s OMSCS

Anika Neela
5 min read · Dec 10, 2022

This is a deep dive into all the homework resources that helped me secure an A in the ISYE 6501: Intro to Analytics Modeling course. If you have not yet seen my review and other helpful tips for this course, I strongly recommend giving that article a read first. It should help you gauge whether this course is a good fit for your semester.

Weekly Homework

This course has weekly homework due. Yes, even during exam weeks. That being said, they do give you an opportunity to drop your lowest two grades on the weekly homework.

Below are the resources that became super helpful for each week's homework, and I hope they do the same for you. Don't let that stop you from exploring other resources: what was helpful for me might be something you already knew, and vice versa. The list is curated to help you on those weeks that feel too overwhelming, because life happens.

Photo by Sigmund from Unsplash.com

Week 1 — Classification

Covered concepts of support vector machine (SVM) and k-nearest neighbor (KNN) algorithms. I found that working through an R Programming Tutorial truly helped me learn the ins and outs of not just the language but also the RStudio IDE.
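
To give you a flavor of what that week's models look like in code, here is a minimal sketch using R's built-in iris data (not the homework data set) and the kernlab and kknn packages, which you would need to install first:

```r
# Minimal sketch: SVM and KNN on the built-in iris data (not the homework data).
library(kernlab)   # ksvm
library(kknn)      # kknn

# SVM with a linear kernel; C controls the soft-margin penalty
svm_fit <- ksvm(Species ~ ., data = iris, type = "C-svc",
                kernel = "vanilladot", C = 100, scaled = TRUE)
mean(predict(svm_fit, iris) == iris$Species)   # training accuracy

# KNN with k = 5 nearest neighbors (here predicting back on the training data)
knn_fit <- kknn(Species ~ ., train = iris, test = iris, k = 5, scale = TRUE)
mean(fitted(knn_fit) == iris$Species)
```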

Week 2 — Validation & Clustering

Covered concepts of supervised and unsupervised learning algorithms and cross-validation. I personally loved the concept of cross-validation and tried implementing it both with a for-loop and with a built-in library. Don't forget to explore in this course!
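
If you want to see what the for-loop version can look like, here is a minimal sketch of k-fold cross-validation on the built-in iris data (the fold count and k value are arbitrary, and this is not the homework data):

```r
# Minimal sketch of k-fold cross-validation written as a plain for-loop.
library(class)   # knn
set.seed(42)

k_folds <- 5
folds <- sample(rep(1:k_folds, length.out = nrow(iris)))  # random fold labels
acc <- numeric(k_folds)

for (i in 1:k_folds) {
  train <- iris[folds != i, ]
  test  <- iris[folds == i, ]
  pred  <- knn(train[, 1:4], test[, 1:4], cl = train$Species, k = 5)
  acc[i] <- mean(pred == test$Species)
}
mean(acc)   # average held-out accuracy across folds
# packages such as caret or kknn also provide built-in cross-validation helpers
```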

Week 3 — Basic Data Preparation & Change Detection

Covered concepts of CUSUM, outliers, and outlier detection using box-and-whisker plots.
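
As a rough illustration (on simulated data with arbitrary threshold values, not the homework's temperature data), a box-and-whisker check and a hand-rolled CUSUM might look like this:

```r
# Minimal sketch: boxplot-based outlier check and a simple one-sided CUSUM.
set.seed(1)
x <- c(rnorm(50, mean = 10), rnorm(20, mean = 13))   # mean shifts after point 50

boxplot(x)             # box-and-whisker view of the data
boxplot.stats(x)$out   # points flagged beyond the whiskers

mu <- mean(x[1:50])    # baseline mean
C  <- 0.5              # slack value (arbitrary here)
S  <- numeric(length(x))
for (t in 2:length(x)) {
  S[t] <- max(0, S[t - 1] + (x[t] - mu - C))   # accumulate evidence of an increase
}
which(S > 5)[1]        # first index where the CUSUM crosses threshold T = 5
```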

Week 4 — Time Series

Covered concepts of ARIMA, GARCH and Exponential Smoothing.
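
A minimal sketch of exponential smoothing and ARIMA, using base R on the built-in AirPassengers series (not the homework data; GARCH would need an extra package such as rugarch), could look like this:

```r
# Minimal sketch: Holt-Winters exponential smoothing and a seasonal ARIMA fit.
hw <- HoltWinters(AirPassengers)    # triple exponential smoothing
hw$alpha; hw$beta; hw$gamma         # fitted smoothing parameters

fit <- arima(log(AirPassengers), order = c(1, 1, 1),
             seasonal = list(order = c(0, 1, 1), period = 12))
predict(fit, n.ahead = 12)$pred     # 12-month-ahead forecast (on the log scale)
```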

Week 5 — Basic Regression

Covered concepts of the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC). These are crucial to understand in order to be successful in this course.
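
To make the idea concrete, here is a tiny example comparing two regressions on the built-in mtcars data by AIC and BIC (not the homework's crime data):

```r
# Minimal sketch: comparing two linear models by information criteria.
m1 <- lm(mpg ~ wt, data = mtcars)
m2 <- lm(mpg ~ wt + hp + cyl, data = mtcars)

AIC(m1); AIC(m2)   # lower is better; AIC penalizes each extra parameter by 2
BIC(m1); BIC(m2)   # BIC penalizes by log(n), so it favors smaller models
```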

Week 6 — Advanced Data Preparation

Covered concepts of Box-Cox transformation and how PCA can be used to reduce dimensionality.
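
A minimal sketch of both techniques on the built-in mtcars data (using the MASS package for Box-Cox, and not the homework data) might look like this:

```r
# Minimal sketch: Box-Cox transformation and principal component analysis.
library(MASS)   # boxcox

bc <- boxcox(lm(mpg ~ wt + hp, data = mtcars), plotit = FALSE)
bc$x[which.max(bc$y)]     # lambda that maximizes the profile log-likelihood

pca <- prcomp(mtcars[, -1], scale. = TRUE)   # PCA on scaled predictors
summary(pca)                                 # proportion of variance per component
head(pca$x[, 1:3])                           # first three principal component scores
```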

Week 7 — Advanced Regression

Covered concepts of Tree-Based Models.
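
As an illustration, here is a minimal sketch fitting a single regression tree and a random forest on the built-in mtcars data, assuming the rpart and randomForest packages are installed:

```r
# Minimal sketch: a CART-style regression tree and a random forest.
library(rpart)
library(randomForest)

tree <- rpart(mpg ~ ., data = mtcars)   # single regression tree
printcp(tree)                           # complexity table used for pruning

set.seed(1)
rf <- randomForest(mpg ~ ., data = mtcars, ntree = 500, importance = TRUE)
importance(rf)                          # variable importance measures
```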

Week 8 — Variable Selection

Covered concepts of both greedy and non-greedy variable selection methods. Greedy methods include forward selection, backward elimination, and stepwise regression. Non-greedy methods include lasso, elastic net, and ridge regression.
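
Here is a minimal sketch of both flavors, using base R's step() for the greedy search and the glmnet package for the penalized methods, again on the built-in mtcars data rather than the homework data:

```r
# Minimal sketch: stepwise selection and lasso / ridge / elastic net.
library(glmnet)

full <- lm(mpg ~ ., data = mtcars)
step(full, direction = "both", trace = 0)   # greedy stepwise search by AIC

x <- as.matrix(mtcars[, -1])
y <- mtcars$mpg
cv.glmnet(x, y, alpha = 1)     # lasso (alpha = 1)
cv.glmnet(x, y, alpha = 0)     # ridge (alpha = 0)
cv.glmnet(x, y, alpha = 0.5)   # elastic net blends the two penalties
```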

Week 9 — Design of Experiments (DoE) & Probability-based Models

Concepts covered for DoE were A/B testing, factorial design, and multi-armed bandits. Concepts covered for probability-based models were the Poisson, Weibull, Exponential, Geometric, and Binomial distributions.
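
Base R already ships density, distribution, and random-number functions for all of these distributions; as a quick reference (the parameter values below are arbitrary):

```r
# Minimal sketch: the week's probability distributions via base R's d/p/r functions.
dpois(3, lambda = 2)                  # P(X = 3) for a Poisson process with rate 2
pexp(1, rate = 0.5)                   # P(time <= 1) for an Exponential(0.5)
rweibull(5, shape = 1.5, scale = 2)   # five simulated Weibull lifetimes
dgeom(2, prob = 0.3)                  # P(2 failures before the first success)
dbinom(4, size = 10, prob = 0.5)      # P(4 successes out of 10 trials)
```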

Week 10 — Missing Data & Optimization

Covered concepts of data imputation and optimization problem types (convex, non-convex, etc.).
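
As a toy illustration (the numbers are made up and unrelated to the homework's data sets), here is simple mean imputation plus a tiny linear program solved with the lpSolve package:

```r
# Minimal sketch: mean imputation and a small linear program.
library(lpSolve)

x <- c(4, NA, 7, 5, NA, 6)
x[is.na(x)] <- mean(x, na.rm = TRUE)   # mean imputation; regression or perturbation are alternatives

# maximize 3a + 2b subject to a + b <= 4 and a <= 2 (a, b >= 0 by default)
sol <- lp("max", objective.in = c(3, 2),
          const.mat = rbind(c(1, 1), c(1, 0)),
          const.dir = c("<=", "<="), const.rhs = c(4, 2))
sol$solution   # optimal values of a and b
sol$objval     # optimal objective value
```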

Week 11 — Optimization & Advanced Models

Covered concepts of stochastic optimization and other advanced models such as natural language processing (NLP), survival models, gradient boosting, etc.
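
One accessible way to play with stochastic optimization is base R's optim() with its simulated-annealing method; here is a minimal sketch on a made-up bumpy function:

```r
# Minimal sketch: simulated annealing (one flavor of stochastic optimization).
set.seed(7)
bumpy <- function(z) (z - 2)^2 + 3 * sin(5 * z)    # function with many local minima
optim(par = 0, fn = bumpy, method = "SANN")$par    # stochastic search for a good minimum
```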

Remaining Weeks

Weeks 12–14 were case study analyses, which required you to pull together your knowledge from all the prior weeks, and Week 15 was a course summary.

Thank you for reading this article. If you enjoyed my content, please consider following me to show your support for my work.
