Rust for Large Language Model Operations (LLMOps)
- Offered by Coursera
Rust for Large Language Model Operations (LLMOps) at Coursera Overview
| Duration | 16 hours |
| Start from | Start Now |
| Total fee | Free |
| Mode of learning | Online |
| Official Website | Explore Free Course |
| Credential | Certificate |
Rust for Large Language Model Operations (LLMOps) at Coursera Highlights
- Earn a certificate from Duke University
- Add to your LinkedIn profile
Rust for Large Language Model Operations (LLMOps) at Coursera Course details
- What you'll learn
- Mastery in deploying Rust for intricate LLMOps workflows.
- Do you aspire to be a Rust developer at the forefront of the AI revolution? This groundbreaking course is designed specifically to train you in Large Language Model Operations (LLMOps) using Rust. It doesn't just scratch the surface; it takes a deep dive into integrating Rust with sophisticated LLM frameworks like Hugging Face Transformers, and explores how to effectively deploy these large models on cloud infrastructure such as AWS, all while incorporating DevOps methodologies tailored for LLMOps.
Rust for Large Language Model Operations (LLMOps) at Coursera Curriculum
DevOps Concepts for MLOps
Instructor Intro
A Function, the Essence of Programming
Operationalize Microservices
Continuous Integration for Microservices
What is Makefile and how do you use it?
What is DevOps?
Kaizen methodology
Infrastructure as Code for Continuous Delivery
Responding to Compromised Resources and Workloads
Designing and Implementing Monitoring and Alerting
Audit Network Security
Rust Secure by Design
Preventing Data Races with Rust Compiler
Using AWS Config for Security
AWS Security Hub Demo
Explain How to Secure Your Account with 2FA
Understanding Access Permissions
Repository Permission Levels Explained
Repository Privacy Settings and Options
Unveiling Key Concepts of the GitHub Ecosystem
Demo: Implementing GitHub Actions
Demo: GitHub Codespaces
Demo: GitHub Copilot
Source Code Resources
Infrastructure as code
Continuous integration
Continuous delivery
Automation and tooling
Shared responsibility
Identity and access management
Infrastructure protection
Incident response
External Lab: Use GitHub Actions and Codespaces
About two-factor authentication
Access permissions on GitHub
About Continuous Integration
About continuous deployment
Final Week-Reflections
DevOps Concepts for MLOps
Lab: Using a Makefile with Rust
Lab: Preventing Data Races in Rust
Rust Hugging Face Candle
Candle: A Minimalist ML Framework for Rust
Using GitHub Codespaces for GPU Inference with Rust Candle
VSCode Remote SSH development AWS Accelerated Compute
Building Hello World Candle
Exploring StarCoder: A State-of-the-Art LLM
Using Whisper with Candle to Transcribe
Exploring Remote Dev Architectures on AWS
Advantages of Rust for LLMs
Serverless Inference
Rust CLI Inference
Rust Chat Inference
Continuous Build of Binaries for LLMOps
Chat Loop for StarCoder
Invoking Rust Candle on AWS G5-Part One
Invoking BigCode on AWS G5-Part Two
rust-candle-demos
Configuring NVIDIA CUDA for your codespace
Getting Started Candle
Candle Examples
External Lab: Candle Hello World
External Lab: Run an LLM with Candle
Developer Guide cuDNN
cuDNN Webinar
Programming Tensor Cores in CUDA 9
Tensor Ops Made Easier in cuDNN
External Lab: Using BigCode to Assist With Coding
StarCoder: A State-of-the-Art LLM for Code
Falcon LLM
Whisper LLM
Candle Structure
Final Week Reflection
Rust Hugging Face Candle
Key LLMOps Technologies
Introduction to Rust Bert
Installation and Setup
Basic Syntax and Model Loading
Building a sentiment analysis CLI
Introduction to Rust PyTorch
Running a PyTorch Hello World
PyTorch Pretrained
Running PyTorch Pretrained
Introduction to ONNX
ONNX Conversions
Getting Started Rust Bert
External Lab: Translate a Spanish song to English
Rust Bert pipelines
ONNX Support Rust Bert
Loading pretrained and custom model weights
External Lab: Run a Pretrained model
Rust bindings for PyTorch
ONNX Concepts
ONNX with Python
Converters
ONNX Model Hub
Final Week-Reflections
External Lab: Use ONNX
Using Rust Bert
Key Generative AI Technologies
Extending Google Bard
Exploring Google Colab with Bard
Exploring Colab AI
Exploring Gen App Builder
Responsible AI with AWS Bedrock
AWS Bedrock with Claude
Summarizing text with Claude
Using the AWS Bedrock API
Live Coding AWS CodeWhisperer Part One
Live Coding AWS CodeWhisperer Part Two
Live Coding AWS CodeWhisperer Part Three
Using AWS CodeWhisperer CLI
Bard FAQ
External Lab: Build a plot with Colab AI
External Lab: AWS Bedrock
AWS Cloud Adoption Framework for Artificial Intelligence, Machine Learning, and Generative AI
People perspective: Culture and change towards AI/ML-first
External Lab: Use CodeWhisperer for Rust Calculator
Key LLMOps Technologies
Final-Quiz