Duke University - Operationalizing LLMs on Azure 

  • Offered by Coursera

Operationalizing LLMs on Azure at Coursera

Overview

Ensure your LLM can handle varying workloads efficiently and deliver consistent response times, even at high traffic volumes

Duration

10 hours

Total fee

Free

Mode of learning

Online

Credential

Certificate

Highlights

  • Earn a certificate from Coursera
  • Learn from industry experts

Course details

Skills you will learn
  • Gain proficiency in leveraging Azure for deploying and managing Large Language Models (LLMs)
  • Develop advanced query crafting skills using Semantic Kernel to optimize interactions with LLMs within the Azure environment
  • Acquire hands-on experience in implementing patterns and deploying applications with Retrieval Augmented Generation (RAG)
More about this course
  • This course is designed for beginner- and intermediate-level learners, including data scientists, AI enthusiasts, and professionals seeking to harness the power of Azure for Large Language Models (LLMs)
  • Tailored for those with foundational programming experience and familiarity with Azure basics, this comprehensive program takes you through a four-week journey. In the first week, you'll delve into Azure's AI services and the Azure portal, gaining insights into large language models, their functionalities, and strategies for risk mitigation. Subsequent weeks cover practical applications, including leveraging Azure Machine Learning, managing GPU quotas, deploying models, and utilizing the Azure OpenAI Service
  • As you progress, the course explores nuanced query crafting, Semantic Kernel implementation, and advanced strategies for optimizing interactions with LLMs within the Azure environment
  • The final week focuses on architectural patterns, deployment strategies, and hands-on application building using RAG, Azure services, and GitHub Actions workflows
  • Whether you're a data professional or AI enthusiast, this course equips you with the skills to deploy, optimize, and build robust large-scale applications leveraging Azure and Large Language Models

Curriculum

Introduction to LLMOps with Azure

Meet your instructor: Alfredo Deza

About this course

Introduction

Introduction to the Azure Portal

Using Microsoft Learn

Identifying the Azure AI solutions

Introduction to Azure Machine Learning

Introduction to Azure OpenAI Service

Summary

Introduction

What are LLMs and how do they work?

Benefits and risks of using LLMs

Mitigating risks of LLMs

Introduction to LLMOps

Summary

Introduction

Discover and evaluate LLMs in Azure

Deployment options for inferencing

What is the Azure AI Content Safety feature?

Azure Machine Learning and Azure OpenAI Service differences

Summary

Connect with your instructor

Course structure and discussion etiquette

External lab: Create an Azure account

What is Azure OpenAI Service?

What is Azure Machine Learning?

An introduction to LLMOps

External lab: Explore Azure Machine Learning Studio

What is Azure Content Safety?

Introduction to LLMOps with Azure

Meet and greet (optional)

LLMs with Azure

Introduction

GPU quotas and availability

Creating a compute resource

Deploying the model

Using the inference API

Summary

Introduction

Getting access to Azure OpenAI Service

Creating an Azure OpenAI Service resource

Deploy an OpenAI model

Using the playground

Summary

Introduction

Using keys and endpoints

Creating a simple Python example

Reviewing usage and quotas

Cleaning up resources

Summary

Azure ML: Create Resources

External lab: Create a compute resource

External lab: Use the Azure OpenAI Service Playground

External lab: Using Azure OpenAI APIs

LLMs with Azure
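
The "Using keys and endpoints" and "Creating a simple Python example" lessons above cover calling a deployed model over HTTPS. As a hedged sketch of that step (the resource endpoint, deployment name, and api-version below are placeholders, not values from the course), the chat-completions request can be assembled like this before any call is made:

```python
import json

# Placeholder values -- in practice these come from the Azure portal's
# "Keys and Endpoint" blade or from environment variables.
ENDPOINT = "https://my-resource.openai.azure.com"  # hypothetical resource
DEPLOYMENT = "gpt-35-turbo"                        # hypothetical deployment name
API_VERSION = "2024-02-01"                         # assumed api-version

def build_chat_request(endpoint: str, deployment: str, api_version: str,
                       api_key: str, user_message: str):
    """Assemble the URL, headers, and JSON body for an Azure OpenAI
    chat-completions request, without sending anything."""
    url = (f"{endpoint}/openai/deployments/{deployment}"
           f"/chat/completions?api-version={api_version}")
    headers = {"api-key": api_key, "Content-Type": "application/json"}
    body = json.dumps({
        "messages": [{"role": "user", "content": user_message}],
        "max_tokens": 100,
    })
    return url, headers, body

url, headers, body = build_chat_request(
    ENDPOINT, DEPLOYMENT, API_VERSION, "fake-key", "Hello")
print(url)
```

In the labs this request would be sent with an HTTP client (or the `openai` Python package); separating request construction from sending also makes the "Cleaning up resources" and quota-review steps easier to audit.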

Extending with Functions and Plugins

Introduction

What is Semantic Kernel?

Using Semantic Kernel with Azure

Using a system prompt

Advanced system prompts

Summary

Introduction

Overview of functions

Defining functions

Using the function with the LLM

Working with errors

Summary

Introduction

Creating a glue function

Consuming function arguments

Using a native function

Overview of a microservice for functions

Using an external microservice API

Summary

External lab: Using Semantic Kernel with Azure

External lab: Using Functions

External lab: Use native functions

Functions and Plugins
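
The native-function lessons above describe exposing plain code to the model. A minimal stand-in for that pattern (plain Python, not the actual Semantic Kernel API) registers callables under names and invokes them with string arguments, roughly the shape a planner or LLM tool call produces:

```python
# Toy registry illustrating the "native function" idea. This is an
# illustrative sketch, not Semantic Kernel's real API.
registry = {}

def native_function(name):
    """Decorator that registers a plain Python callable under `name`."""
    def wrap(fn):
        registry[name] = fn
        return fn
    return wrap

@native_function("uppercase")
def uppercase(text: str) -> str:
    return text.upper()

@native_function("word_count")
def word_count(text: str) -> str:
    # Kernel-style functions typically return strings for prompt injection.
    return str(len(text.split()))

def invoke(name: str, argument: str) -> str:
    """Glue function: look up the callable and handle unknown names,
    mirroring the 'working with errors' lesson."""
    fn = registry.get(name)
    if fn is None:
        return f"error: no function named {name!r}"
    return fn(argument)

print(invoke("uppercase", "hello azure"))    # HELLO AZURE
print(invoke("word_count", "one two three")) # 3
```

The same shape extends to the "external microservice API" lesson: the registered callable simply wraps an HTTP call instead of local logic.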

Building an End-to-End LLM application in Azure

Introduction

Architectural overview

What is RAG?

Overview of Azure AI Search

Automation and deployment with GitHub

Summary

Introduction

Create the Azure resources

Create the embeddings

Create and upload the index

Verifying the embeddings

Using RAG with Azure OpenAI

Summary

Introduction

Application overview

Setting up Azure components

Architectural overview

Using GitHub Actions with Azure

Verifying and troubleshooting deployments

Summary

External lab: Create an Azure AI Search resource

Azure AI Document Intelligence and Azure OpenAI

Introduction to RAG

External lab: Create embeddings in an index

Azure Container Apps

External lab: Deploy an end-to-end application

Next steps

End-to-end LLM applications
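
The embeddings-and-index lessons above are the core of the RAG pattern: embed documents, retrieve the most similar one for a query, and ground the prompt in it. A toy, self-contained sketch of that loop (bag-of-words vectors stand in for Azure OpenAI embeddings, and a plain list stands in for an Azure AI Search index):

```python
import math
from collections import Counter

# Sample corpus; in the course, documents would be embedded with an
# Azure OpenAI embedding model and stored in an Azure AI Search index.
documents = [
    "Azure OpenAI Service hosts chat and embedding models.",
    "GitHub Actions can deploy container apps to Azure.",
    "Semantic Kernel orchestrates prompts and native functions.",
]

def embed(text: str) -> Counter:
    """Toy embedding: bag-of-words term counts."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# "Create and upload the index": pair each document with its vector.
index = [(doc, embed(doc)) for doc in documents]

def retrieve(query: str) -> str:
    """Return the document most similar to the query."""
    qv = embed(query)
    return max(index, key=lambda pair: cosine(qv, pair[1]))[0]

def build_prompt(query: str) -> str:
    """Ground the model's answer in retrieved context (the RAG step)."""
    context = retrieve(query)
    return f"Context: {context}\n\nQuestion: {query}"

print(build_prompt("How do I deploy with GitHub Actions?"))
```

The assembled prompt would then go to the chat model; swapping the toy pieces for real embeddings and a hosted index changes the components but not this retrieve-then-prompt flow.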

Admission Process

Important Dates

Course commencement date: May 25, 2024
