
AI Learning for my Cloudy friends

Putting together a list of resources to share for ramping up on AI and Machine Learning. At least initially, this will focus on things that my friends, who also have a vested interest in cloud computing (containers, k8s, microservices), will care about.

General AI/ML Introduction

TBD

               +---------------------------------+
               |               AI                |
               |   +-------------------------+   |
               |   |           ML            |   |
               |   |   +-----------------+   |   |
               |   |   |  Deep Learning  |   |   |
               |   |   |   +---------+   |   |   |
               |   |   |   |  LLMs   |   |   |   |
               |   |   |   +---------+   |   |   |
               |   |   +-----------------+   |   |
               |   +-------------------------+   |
               +---------------------------------+
  • AI (Artificial Intelligence): The overarching field that includes all techniques enabling machines to mimic human intelligence.
  • ML (Machine Learning): A subset of AI, involving algorithms and statistical models that enable machines to improve at tasks through experience.
  • Deep Learning: A subset of ML, utilizing neural networks with many layers to model complex patterns in data.
  • LLMs (Large Language Models): Nested within Deep Learning (and therefore within ML), as they are advanced models trained on vast amounts of text data using deep learning techniques.

Everything shown is part of AI: Machine Learning and Deep Learning are nested subsets within AI, and LLMs are a specific type of model within Deep Learning and Machine Learning.

Key Terminology

Some common terms and definitions to be familiar with. A couple of short code sketches after the list tie several of these together.

  1. Training: The process of teaching a machine learning model by feeding it data and adjusting its parameters to minimize error.

  2. Dataset: A collection of data used for training and evaluating a machine learning model. Datasets are typically divided into training, validation, and test sets.

  3. Feature: An individual measurable property or characteristic used as input to a machine learning model. Features are derived from the dataset and represent the information used by the model to make predictions.

  4. Preprocessing: The steps taken to clean, transform, and prepare raw data before feeding it into a machine learning model for training or inference.

  5. Hyperparameters: Settings or configurations that are set before the learning process begins and control the behavior of the learning algorithm (e.g., learning rate, number of trees in a random forest).

  6. Model: The mathematical representation of a machine learning algorithm that has been trained on data to make predictions or decisions.

  7. Inference: The process of using a trained model to make predictions or decisions based on new data.

  8. Model Deployment: The process of making a trained machine learning model available for use in a production environment.

  9. Endpoint: A URL where a deployed model can be accessed to make predictions via API calls. Some endpoints follow standardized formats, like those provided by OpenAI.

  10. GPU (Graphics Processing Unit): A specialized processor designed to accelerate graphics rendering and to perform parallel processing. In machine learning, GPUs speed up both training and inference by handling many computations simultaneously. CUDA, a parallel computing platform and API created by NVIDIA, lets developers leverage GPUs for general-purpose processing, significantly boosting the performance of machine learning tasks.
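
To tie several of these terms together, here's a minimal PyTorch sketch. The dataset is synthetic, and the model, hyperparameter values, and names are made up for illustration; treat it as a sketch of the workflow, not a recommended recipe. It builds a dataset of features, sets hyperparameters, trains a small model, and then runs inference, using a GPU (via CUDA) when one is available.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# GPU/CUDA: use a GPU when available, otherwise fall back to CPU
device = "cuda" if torch.cuda.is_available() else "cpu"

# Dataset + features: synthetic, already-preprocessed stand-ins for real data
features = torch.randn(1000, 4)                           # 1000 samples, 4 features each
labels = (features.sum(dim=1) > 0).float().unsqueeze(1)   # made-up binary labels
loader = DataLoader(TensorDataset(features, labels), batch_size=32, shuffle=True)

# Hyperparameters: chosen before training begins
learning_rate = 1e-3
epochs = 5

# Model: a tiny feed-forward network for binary classification
model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 1)).to(device)
loss_fn = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=learning_rate)

# Training: feed the data through the model and adjust parameters to reduce error
for epoch in range(epochs):
    for batch_features, batch_labels in loader:
        batch_features = batch_features.to(device)
        batch_labels = batch_labels.to(device)
        optimizer.zero_grad()
        loss = loss_fn(model(batch_features), batch_labels)
        loss.backward()
        optimizer.step()

# Inference: use the trained model to make a prediction on new data
model.eval()
with torch.no_grad():
    new_sample = torch.randn(1, 4).to(device)
    probability = torch.sigmoid(model(new_sample))
    print(f"predicted probability: {probability.item():.3f}")
```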

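And for the endpoint term: a rough sketch of calling an OpenAI-style chat completions endpoint over HTTP. The URL, model name, and API-key environment variable are placeholders; swap in whatever your deployed model actually exposes.

```python
import os
import requests

# Placeholder values: point these at your own deployment or provider
ENDPOINT = "https://api.openai.com/v1/chat/completions"
API_KEY = os.environ.get("OPENAI_API_KEY", "")

response = requests.post(
    ENDPOINT,
    headers={"Authorization": f"Bearer {API_KEY}", "Content-Type": "application/json"},
    json={
        "model": "gpt-4o-mini",  # placeholder model name
        "messages": [{"role": "user", "content": "Explain Kubernetes in one sentence."}],
    },
    timeout=30,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```
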
CUDA and NVIDIA and GPUs

Python and PyTorch

Containers and K8s

Longer Learning

Coursera with Andrew Ng

A three-course specialization that gives a deeper introduction to ML, including the math principles that make it work: Supervised Machine Learning (Regression and Classification), Advanced Learning Algorithms, and Unsupervised Learning.

MORE WILL GO HERE
