Building with Open-Source Generative AI (AIARC-101)

Price: $2,595.00
Duration: 3 days

Learn how to build practical AI applications through hands-on labs. You will design and develop Transformer models while keeping data security in mind. The course covers Transformer architectures, Python programming, hardware requirements, training techniques, and AI tasks such as classification and regression. It includes hands-on exercises with open-source LLM frameworks, covers advanced topics such as fine-tuning and quantization, and offers AI certification from Alta3 Research. Ideal for Python Developers, DevSecOps Engineers, and Managers or Directors, the course requires basic Python skills and provides access to a GPU-accelerated server for practical experience.

Upcoming Class Dates and Times

All Sunset Learning courses are guaranteed to run

Course Outline and Details

Prerequisites

  • Python – PCEP Certification or Equivalent Experience
  • Familiarity with Linux

Who Should Attend

  • Project Managers
  • Architects
  • Developers
  • Data Acquisition Specialists

Course Objectives

  • Train and optimize Transformer models with PyTorch.
  • Master advanced prompt engineering.
  • Understand AI architecture, especially Transformers.
  • Write a real-world AI web application.
  • Describe tokenization and word embeddings.
  • Install and use frameworks like Llama 2.
  • Apply strategies to maximize model performance.
  • Explore model quantization and fine-tuning.
  • Compare CPU vs. GPU hardware acceleration.
  • Understand chat vs. instruct interaction modes.

Learning Your Environment

  • Using Vim
  • Tmux
  • VS Code Integration
  • Revision Control with GitHub

Deep Learning Intro

  • What is Intelligence?
  • Generative AI Unveiled
  • The Transformer Model
  • Feed Forward Neural Networks
  • Tokenization
  • Word Embeddings
  • Positional Encoding
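
The topics above map directly onto a few lines of PyTorch. As an illustrative sketch only (the sample text, vocabulary, and model width below are toy values, not course material), tokenization turns characters into integer ids, an embedding table turns ids into vectors, and sinusoidal positional encoding adds position information:

  import math
  import torch
  import torch.nn as nn

  # Toy character-level "tokenizer": map every character to an integer id.
  text = "hello transformers"
  vocab = sorted(set(text))
  stoi = {ch: i for i, ch in enumerate(vocab)}
  token_ids = torch.tensor([stoi[ch] for ch in text])   # shape: (seq_len,)

  # Token embeddings: each id becomes a dense vector of size d_model.
  d_model = 16
  embed = nn.Embedding(num_embeddings=len(vocab), embedding_dim=d_model)
  x = embed(token_ids)                                  # shape: (seq_len, d_model)

  # Sinusoidal positional encoding, added so the model can tell positions apart.
  seq_len = token_ids.size(0)
  pos = torch.arange(seq_len).unsqueeze(1)              # (seq_len, 1)
  div = torch.exp(torch.arange(0, d_model, 2) * (-math.log(10000.0) / d_model))
  pe = torch.zeros(seq_len, d_model)
  pe[:, 0::2] = torch.sin(pos * div)
  pe[:, 1::2] = torch.cos(pos * div)

  x = x + pe                                            # token meaning + position
  print(x.shape)                                        # torch.Size([18, 16])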

Build a Transformer Model from Scratch

  • PyTorch
  • Construct a Tensor from a Dataset
  • Orchestrate Tensors in Blocks and Batches
  • Initialize PyTorch Generator Function
  • Train the Transformer Model
  • Apply Positional Encoding and Self-Attention
  • Attach the Feed Forward Neural Network
  • Build the Decoder Block
  • Transformer Model as Code
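
As a rough preview of the from-scratch build, here is a minimal sketch of a single masked (causal) self-attention head in PyTorch; the class name, dimensions, and toy batch are illustrative assumptions, not the course's exact lab code:

  import torch
  import torch.nn as nn
  import torch.nn.functional as F

  class SelfAttentionHead(nn.Module):
      """One head of masked self-attention, the core of a decoder block."""
      def __init__(self, d_model: int, head_size: int, block_size: int):
          super().__init__()
          self.key   = nn.Linear(d_model, head_size, bias=False)
          self.query = nn.Linear(d_model, head_size, bias=False)
          self.value = nn.Linear(d_model, head_size, bias=False)
          # Lower-triangular mask: each position may only attend to earlier positions.
          self.register_buffer("tril", torch.tril(torch.ones(block_size, block_size)))

      def forward(self, x):                    # x: (batch, time, d_model)
          B, T, C = x.shape
          k, q, v = self.key(x), self.query(x), self.value(x)
          w = q @ k.transpose(-2, -1) * k.size(-1) ** -0.5   # scaled dot-product scores
          w = w.masked_fill(self.tril[:T, :T] == 0, float("-inf"))
          w = F.softmax(w, dim=-1)
          return w @ v                          # (batch, time, head_size)

  # Toy usage: a batch of 4 sequences, 8 tokens each, embedding size 32.
  x = torch.randn(4, 8, 32)
  head = SelfAttentionHead(d_model=32, head_size=16, block_size=8)
  print(head(x).shape)                          # torch.Size([4, 8, 16])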

Prompt Engineering

  • Introduction to Prompt Engineering
  • Getting Started with Gemini
  • Developing Basic Prompts
  • Intermediate Prompts: Define Task/Inputs/Outputs/Constraints/Style
  • Advanced Prompts: Chaining, Set Role, Feedback, Examples
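
To give a feel for the "intermediate prompt" structure (task, inputs, outputs, constraints, style), here is a small illustrative Python snippet; the scenario text is made up, and the resulting string would be sent to Gemini (or any other model) through whatever client the labs provide:

  # Build an intermediate prompt by spelling out task, inputs, outputs, constraints, and style.
  task        = "Summarize the incident report below for an executive audience."
  inputs      = "Incident report: the API gateway returned 5xx errors for 42 minutes."
  outputs     = "A summary of at most 3 bullet points."
  constraints = "Do not speculate about root cause; only use facts stated in the report."
  style       = "Plain business English, no jargon."

  prompt = "\n".join([
      f"Task: {task}",
      f"Input: {inputs}",
      f"Expected output: {outputs}",
      f"Constraints: {constraints}",
      f"Style: {style}",
  ])
  print(prompt)   # this string is what you would send to the model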

Hardware Requirements

  • The GPU's role in AI performance (CPU vs. GPU)
  • Current GPUs: cost vs. value
  • Tensor Core vs. older GPU architectures
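
A quick way to see the CPU-vs-GPU gap discussed here is to time the same matrix multiplication on each device. The PyTorch sketch below is an informal illustration, not a benchmark; the matrix size is arbitrary:

  import time
  import torch

  # Pick the GPU if one is visible; fall back to CPU otherwise.
  device = "cuda" if torch.cuda.is_available() else "cpu"
  print("Running on:", device)

  a = torch.randn(4096, 4096)
  b = torch.randn(4096, 4096)

  start = time.time()
  a @ b                                        # large matmul on the CPU
  print(f"CPU matmul: {time.time() - start:.3f}s")

  if device == "cuda":
      a_gpu, b_gpu = a.to(device), b.to(device)
      torch.cuda.synchronize()
      start = time.time()
      a_gpu @ b_gpu                            # same matmul on the GPU
      torch.cuda.synchronize()                 # wait for the kernel to finish before timing
      print(f"GPU matmul: {time.time() - start:.3f}s")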

Pre-trained LLM

  • A History of Neural Network Architectures
  • Introduction to the llama.cpp Interface
  • Preparing an A100 GPU for Server Operations
  • Operate Llama 2 Models with llama.cpp
  • Selecting Quantization Level to Meet Performance and Perplexity Requirements
  • Running the llama.cpp Package
  • Llama Interactive Mode
  • Persistent Context with Llama
  • Constraining Output with Grammars
  • Deploy the Llama API Server
  • Develop a Llama Client Application
  • Write a Real-World AI Application using the Llama API
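
As an illustrative sketch of the client-side work in this module, the snippet below posts a prompt to a locally running llama.cpp server. The host, port, model file, and request fields are assumptions and may differ by llama.cpp version and by how the lab server is configured:

  import json
  import urllib.request

  # Assumed: a llama.cpp server started locally, e.g.
  #   ./llama-server -m llama-2-7b-chat.Q4_K_M.gguf --port 8080
  url = "http://localhost:8080/completion"
  payload = {
      "prompt": "Q: What is quantization in the context of LLMs?\nA:",
      "n_predict": 128,        # maximum number of tokens to generate
      "temperature": 0.7,
  }

  req = urllib.request.Request(
      url,
      data=json.dumps(payload).encode("utf-8"),
      headers={"Content-Type": "application/json"},
  )
  with urllib.request.urlopen(req) as resp:
      body = json.loads(resp.read().decode("utf-8"))

  print(body.get("content", ""))   # the generated continuation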

Fine-Tuning

  • Using PyTorch to Fine-Tune Models
  • Advanced Prompt Engineering Techniques
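
For orientation only, here is a minimal PyTorch fine-tuning loop in the spirit of this module: freeze the "pretrained" layers and train a small task-specific head. The model, data, and hyperparameters below are toy assumptions, not the course's lab code:

  import torch
  import torch.nn as nn

  # Stand-in "pretrained" model; in the labs you would load real weights instead.
  model = nn.Sequential(
      nn.Embedding(1000, 64),          # frozen "pretrained" layer
      nn.Flatten(),
      nn.Linear(64 * 16, 2),           # small task head that we actually train
  )

  # Freeze everything except the final head, a common lightweight fine-tuning pattern.
  for p in model[0].parameters():
      p.requires_grad = False

  optimizer = torch.optim.AdamW(
      [p for p in model.parameters() if p.requires_grad], lr=1e-4
  )
  loss_fn = nn.CrossEntropyLoss()

  # Toy batch: 8 sequences of 16 token ids, binary labels.
  x = torch.randint(0, 1000, (8, 16))
  y = torch.randint(0, 2, (8,))

  model.train()
  for step in range(3):                # real fine-tuning runs far more steps/epochs
      optimizer.zero_grad()
      loss = loss_fn(model(x), y)
      loss.backward()
      optimizer.step()
      print(f"step {step}: loss {loss.item():.4f}")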

Testing and Pushing Limits

  • Maximizing Model Limits
  • Curriculum Path: GenerativeAI

Course Delivery Options

  • Train face-to-face with a live instructor. (Please note, not all classes will have this option.)
  • Access on-demand training content anytime, anywhere. (Please note, not all classes will have this option.)
  • Attend the live class from the comfort of your home or office.
  • Interact with a live, remote instructor from a specialized, HD-equipped classroom near you. An SLI sales rep will confirm location availability prior to registration confirmation.