Garrett Mayock's About Me

Everything about me


What's all this?

Well, this is my personal website! Now, I'm going to talk about work too, because work is part of who I am, but I don't want anyone to get the idea that this is my resume (although you can check out my CV here). Yes, technically speaking, this is a bunch of code I put together so I could learn to program - and, by being able to program, make more money at a job - but this is still my personal site. I'm going to speak colloquially and include a few jokes here and there.

Job stuff

Hello! I'm Garrett and I'm trying to alter my career trajectory a bit. I am learning applied data science to become a customer-facing data science resource.

In my career I have worked as a business analyst, because I'm personable, detail-oriented, and talented at understanding problems and implementing solutions. As I progressed in my career, however, I noticed that there always seemed to be opportunities to do things a better way - if I knew how to code.

My skillset lets me succeed in three important phases of data science consulting projects:
  1. Understanding the business problem, data, and the environment(s) which surround it
  2. Brainstorming and modeling potential solutions
  3. Working with data scientists and developers to translate those plans into solutions
Nevertheless, getting from phase 2 to phase 3 always presents a challenge, because the person building the solution isn't always the one who best understands the client's business problem, their business, or the meaning behind their data.

It is in this challenge that I see an opportunity. The better I understand the code behind what's built, the faster the iterative communication between client and coder becomes. Furthermore, I can start prototyping in phase 2 - instead of handing data scientists boxes and lines on paper, I can hand them working code frameworks.

Why does that matter? Sources say 80% of a data scientist's time is spent finding, cleaning, and organizing data. If I as a business analyst can knock that down by, say, a quarter - from 80% to 60% - that literally doubles the amount of time the data scientist has for actual analysis (from 20% of their time to 40%).
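A quick back-of-the-envelope check of that arithmetic (the percentages are the illustrative ones above, not measured figures):

```python
# Illustrative split: 80% of time on data prep, 20% on analysis.
prep = 80                       # percent of time spent on data prep
analysis = 100 - prep           # percent left for actual analysis (20)

# Knock prep down by a quarter: 80 -> 60.
new_prep = prep - prep // 4     # 60
new_analysis = 100 - new_prep   # 40

print(new_analysis / analysis)  # -> 2.0: analysis time doubles
```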

So why the website? Don't I know no data scientist uses HTML or PHP? Well, yeah. But if I want to show off Python code I've written, for example, making it publicly available is going to make it a lot easier to spark interest from prospective employers. They just have to visit this site and - boom - proof I can code. Besides, starting off by building my own website serves two other purposes:
  1. I get to learn to code in one of the most ubiquitous tech stacks out there (WAMP/LAMP)
  2. I get to prove I can
So that's why this site exists. Feel free to browse my blog, my portfolios of work, or continue reading.


Here is a list of my accomplishments as a customer-facing business analyst within the data science industry, grouped by standpoint: product, data, consulting, and leadership.

Data science tech I've touched

So, here's a list of technologies I've touched while coding this website and studying data science, and a bit about how they're used:
Language: How I've used it
HTML5: This entire website has been hand-coded in HTML5.
CSS: The styling of this website is hand-coded in CSS.
PHP: The login system was built in PHP. Sessions are persisted, and various features are visible or hidden depending on whether a visitor is logged in. I've also added dynamic meta image tag capabilities using PHP.
JavaScript: JavaScript handles a few interactions, calls the QuillJS rich text editor I use for posting blogs, and highlights the code in my blog posts.
SQL: Technically MySQL. I use it to register new users in the database, verify user passwords at login, save and surface blog posts, etc.
LaTeX: Briefly touched while studying Python.
Python: Python is the core of my data science efforts. See the next table for more details.

Python module/tool: How I've used it
Anaconda Navigator: When I installed Python, the common recommendation for newbies was to install Anaconda Navigator to kick-start things. I've only used conda to install one module (PyPDF2) which didn't come bundled (as of Mar 6, '19). I don't know all of its features, which is probably a sign that it does its job of making things easier well. Most of the work I've done which required installing modules (graphviz, for example) has been done in Google Colab or Kaggle's Jupyter notebooks.
Jupyter Notebooks (local, Google Colab, & Kaggle Notebooks): I've spun up Jupyter Notebooks locally as well as using hosted notebooks. Specifically, I've taken a data science bootcamp from Lambda School where the coursework was all done in Google Colab, and I've used Kaggle's notebooks via their website.
pandas: I've used pandas for various data structure and analytical tasks. I started with loading data, dealing with null values, and encoding categorical data with get_dummies(). Recently (as of Mar 6, '19) I've been using read_excel() often and doing slightly more complex tasks such as merging dataframes.
matplotlib, matplotlib.pyplot, mpl_toolkits.mplot3d: Lots of plots, mostly 2D with some 3D. Primarily line plots, scatter plots, and bar charts.
numpy: Used for linear algebra and matrix manipulation, including manually calculating linear regression. I've also used polyfit() and lstsq() to fit regression models.
sklearn: I've used scikit-learn for a variety of regression and classification models, as well as their metrics. I've also used it for train/test splits, cross-validation with KFold and GridSearchCV, feature scaling, feature selection, and calculating metrics such as error, accuracy, and recall. The models I've used from sklearn include ensemble classifiers such as random forest, extra trees, AdaBoost, and gradient boosting, as well as linear and logistic regression, nearest neighbors, naive Bayes, and support vector machines.
xgboost: I used XGBClassifier as one of the classifiers I ran on the Titanic data.
graphviz, dtreeviz: I used these to visualize the decision trees from sklearn.
SciPy: I used this to calculate critical values for chi-squared tests, linear regression, and curve fitting. I've also used SciPy to optimize parameters, e.g. for exponential smoothing models or Holt's model of demand forecasting.
seaborn: Pretty data visualization - better looking than pyplot, as far as I've seen. It also has useful training datasets which can be easily called, like load_dataset('diamonds').
statistics: As the name implies, quickly calculate statistics.
random: Pseudo-random number generation. Used to create example data as well as bootstrap samples using choices().
statsmodels: Used several times for ordinary least squares (OLS) linear regression. I like the .summary() output it produces, so at this point (as of Mar 6, '19) it's become my primary package for OLS.
itertools: Building blocks for iteration. I've used takewhile() and cycle().
functools: I've used reduce().
collections: Specialized container datatypes. I first used it to create a function that returns all modes of a multimodal dataset, specifically because statistics.mode() doesn't return the modes of multimodal datasets. I've also used it to count key-value pairs for a handful of functions on the Titanic dataset.
glob: I've only used glob a couple of times to access files and directories on my computer.
os: I've only used os a couple of times to interface with my operating system - splitting file names and reading os.environ.
logging: I used this once to create a log tracking what was happening in some really inefficient code I wrote, then used the log results to fix it.
sys: I used this once alongside logging while fixing that same inefficient code.
time: This module provides various time-related functions. I used it to measure how well I'd optimized inefficient code.
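The multimodal-mode workaround described in the collections row above could be sketched like this (the function name is made up for illustration; my original code isn't shown here):

```python
from collections import Counter

def all_modes(data):
    """Return every mode of a dataset, sorted.

    statistics.mode() raises on multimodal data in older Python
    versions (and in 3.8+ silently returns just the first mode);
    Counter makes returning all of them easy.
    """
    counts = Counter(data)
    top = max(counts.values())
    return sorted(value for value, count in counts.items() if count == top)

print(all_modes([1, 2, 2, 3, 3, 4]))  # -> [2, 3]
```

Python 3.8 later added statistics.multimode(), which covers the same gap.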
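And the bootstrap sampling mentioned under random might look roughly like this (toy data, invented for illustration):

```python
import random
import statistics

random.seed(0)  # reproducible example

data = [4, 8, 15, 16, 23, 42]  # made-up sample

# One bootstrap sample: draw len(data) values WITH replacement.
sample = random.choices(data, k=len(data))

# Many resamples give a distribution of a statistic (here, the mean),
# which is the basis for bootstrap confidence intervals.
boot_means = [statistics.mean(random.choices(data, k=len(data)))
              for _ in range(1000)]
print(min(boot_means), max(boot_means))
```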
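For completeness, the itertools and functools pieces named above (takewhile(), cycle(), and reduce()) in one stdlib-only snippet - the data is invented:

```python
from functools import reduce
from itertools import cycle, islice, takewhile

# takewhile(): keep leading values until the condition first fails.
readings = [3, 5, 7, 12, 6, 2]
under_ten = list(takewhile(lambda x: x < 10, readings))  # [3, 5, 7]

# cycle(): repeat a sequence indefinitely (bounded here with islice).
labels = list(islice(cycle(["a", "b"]), 5))  # ['a', 'b', 'a', 'b', 'a']

# reduce(): fold a sequence into a single value.
total = reduce(lambda acc, x: acc + x, readings, 0)  # 35

print(under_ten, labels, total)
```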

Program: How I've used it
Visual Studio Code: This entire website has been hand-coded in Visual Studio Code. I also use the Live Server extension to automatically refresh pages on save, and the Python extension to run Python files in VS Code's terminal.
WAMPServer: Once I started coding in PHP, I realized I needed to set up an environment that could run server-side code. Hey, the more you learn, right? Anyway, WAMPServer makes it a piece of cake, from installation to running to editing config files.
cPanel: cPanel is my hosting service's control panel. I upload files here, set up my domain's email addresses, and have edited an ini file or two to get the site running how I wanted.
Apache: Apache HTTP Server is my server software. I've only made a couple of edits to its config.
phpMyAdmin: I use phpMyAdmin to manage my database(s) - user access, database table creation, and verifying that my MySQL queries are working.
Adobe XD: I sketched my original website mockup in XD. It also helped me create the SVG (scalable vector graphic) you see in the bottom right of each page.
Notepad++: I briefly used Notepad++ before discovering the joys of Visual Studio Code.
Chrome, Brave: I tested this website in these browsers, using the inspect functionality to check code, read logged JavaScript, and find and troubleshoot errors.

That's it for now. I'll be updating this list as I go. For my most recent plans, please check my blog.

Not job stuff

Hi, I'm Garrett. I'm a decent human being most of the time, probably.

Outside of learning to code, I like acting, improv, and stand-up - both observing and participating. I love performing. Karaoke is a blast.

I also love physical activities. Working out, playing games, snowboarding, jogging around unknown cities and exploring.

I stay away from a lot of social media. I did make a Twitter, but only to test my website's meta tags. I used to have a Facebook, but man ... boy oh boy. Facebook. Toxic. No thank you. That kind of stuff can't be healthy. I do have an Instagram. I know there's a big bunch of insta-famous people who get a few thousand fake followers and then use Instagram's positive feedback mechanism to perpetuate their delusions of grandeur, but I don't mind that, because I don't follow them and out of sight = out of mind.

I used to smoke, but I quit over 10 months ago as of writing (I quit Apr 9, 2018). Quitting has been one of the greatest decisions of my life. For those of you struggling with nicotine addiction, I highly recommend Allen Carr's "The Easy Way to Stop Smoking". Spoiler alert: there is no easy way, but that's okay once you're in the right mindset.

My greatest moment came in grade school when I ad-libbed a line in a school play about green pizza. It's all been downhill from there ;-)

I speak German more or less fluently. I've always had a knack for languages. At one point in high school I was studying four foreign languages at once: German, Spanish, French, and Japanese. I can read and speak Spanish and French to some extent (Spanish more than French) because I have a firm understanding of the grammar, but it's been so long that my vocabulary is incredibly limited. About the only thing I remember how to say in Japanese is korewa nihongoga hanashimasen!!!! and even that is probably said wrong.

Oh, and I have the best dog. The best. The greatest dog to ever be, there's no other dog like it, there's never been. It's the greatest dog. All the other dogs are worse. Sad! <<Trump Hands>>

Naw but really doe:

a SERIOUSLY good boye

Look at the good boye!

contact me