Week 6 - Cloud and Azure Essentials
Introduction to Cloud and Azure
Week 6 Assignment: Deploy Your Pipeline to Azure
This week connects containers to the cloud. Students have read the material on Azure services, blob storage, Postgres, Container Apps Jobs, and cost awareness. Today is about consolidation and hands-on practice.
We are not just teaching "how to use Azure CLI"; we are showing students how to deploy a data pipeline to the cloud end to end. By the end of class, every student should have a working Container App Job that writes to blob storage and Postgres.
The teacher must provision shared resources before class:
```bash
# Create resource group
az group create --name rg-hyf-data --location westeurope

# Create storage account
az storage account create \
  --name hyfstoragedev \
  --resource-group rg-hyf-data \
  --location westeurope \
  --sku Standard_LRS

# Create blob containers for raw and processed data
az storage container create \
  --name raw \
  --account-name hyfstoragedev

az storage container create \
  --name processed \
  --account-name hyfstoragedev

# Create Postgres server (if not already created)
az postgres flexible-server create \
  --name hyf-data-pg \
  --resource-group rg-hyf-data \
  --location westeurope \
  --admin-user hyfadmin \
  --admin-password "<STRONG_PASSWORD>" \
  --sku-name Standard_B1ms \
  --tier Burstable

# Create a database per student/team
az postgres flexible-server db create \
  --server-name hyf-data-pg \
  --resource-group rg-hyf-data \
  --database-name team1

# Create Container Apps environment
az containerapp env create \
  --name env-hyf-data \
  --resource-group rg-hyf-data \
  --location westeurope

# Create Key Vault and store connection strings
az keyvault create \
  --name kv-hyf-data \
  --resource-group rg-hyf-data \
  --location westeurope

az keyvault secret set \
  --vault-name kv-hyf-data \
  --name postgres-url \
  --value "postgresql://pipeline_user:<PASSWORD>@hyf-data-pg.postgres.database.azure.com:5432/team1?sslmode=require"

az keyvault secret set \
  --vault-name kv-hyf-data \
  --name storage-connection-string \
  --value "<STORAGE_CONNECTION_STRING>"
```
If the Postgres server was stopped to save costs, start it before class:
```bash
az postgres flexible-server start \
  --name hyf-data-pg \
  --resource-group rg-hyf-data
```
Ensure every student/team has:

- A database on the shared Postgres server (e.g. `team1`, `team2`)

By the end of this lesson, students should be able to:
- **Navigate** Azure resources using the CLI and portal.
- **Retrieve** connection strings from Key Vault.
- **Upload** files to Azure Blob Storage from Python.
- **Connect** a Python app to Azure Postgres and query data with DBeaver.
- **Deploy** a container image as a Container App Job.
- **Verify** outputs using logs, blob listing, and SQL queries.
- **Estimate** costs and stop idle resources.
| Time | Activity | Duration |
|---|---|---|
| 0:00 | Welcome & Week 5 Recap | 5 min |
| 0:05 | Kahoot Quiz (Knowledge Check) | 10 min |
| 0:15 | Demo: Azure CLI, portal, and Key Vault | 15 min |
| 0:30 | Demo: Upload to Blob Storage from Python | 10 min |
| 0:40 | Demo: Connect to Postgres from Python + DBeaver | 15 min |
| 0:55 | Break | 10 min |
| 1:05 | Workshop: Students connect to Postgres and upload a blob | 20 min |
| 1:25 | Demo: Create and run a Container App Job | 15 min |
| 1:40 | Cost Awareness and cleanup rules | 5 min |
| 1:45 | Assignment Launch and Q&A | 15 min |
| 2:00 | End | - |
Total: 2 hours
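For the cost-awareness segment, a back-of-the-envelope estimate can be scripted and shown live. The hourly rates below are illustrative placeholders, not real Azure prices; check the Azure pricing calculator for current numbers:

```python
# Rough monthly cost estimate for a single resource.
# NOTE: the hourly rate used below is a made-up placeholder,
# NOT a real Azure price.

HOURS_PER_MONTH = 730  # average hours in a month

def monthly_cost(hourly_rate: float, hours_running: int = HOURS_PER_MONTH) -> float:
    """Cost of one resource given an hourly rate and the hours it stays running."""
    return round(hourly_rate * hours_running, 2)

# A Burstable Postgres server left running 24/7 vs. stopped outside class time
always_on = monthly_cost(0.02)           # 730 h at a placeholder $0.02/h
class_only = monthly_cost(0.02, 8 * 4)   # ~8 h/week for 4 weeks

print(f"Always on:  ${always_on}")   # $14.6
print(f"Class only: ${class_only}")  # $0.64
```

The point for students: the rate matters less than the hours, which is why stopping idle resources is the cleanup rule that actually saves money.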
Goal: Verify understanding of Week 6 reading material before diving into hands-on work.
Example question: Why does `sslmode=require` matter on Azure?

Goal: Orient students in the Azure portal and CLI. Show them how to retrieve their connection strings.
- Show the resource group (`rg-hyf-data`).
- Run `az resource list --resource-group rg-hyf-data --output table` to see what is provisioned.
- Demonstrate output formats (`--output table` vs `--output json`) and `--query`.
- Retrieve a connection string from Key Vault:

```bash
az keyvault secret show \
  --vault-name kv-hyf-data \
  --name postgres-url \
  --query value -o tsv
```
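The secret retrieved above is a full Postgres URL. To show students what each part means (and where the DBeaver connection fields come from later), it can be picked apart with the standard library. The URL below is a dummy with the same shape as the shared `postgres-url` secret:

```python
from urllib.parse import urlparse, parse_qs

# Dummy URL with the same shape as the postgres-url secret (not a real password)
url = "postgresql://pipeline_user:secret@hyf-data-pg.postgres.database.azure.com:5432/team1?sslmode=require"

parts = urlparse(url)
print("host:    ", parts.hostname)              # hyf-data-pg.postgres.database.azure.com
print("port:    ", parts.port)                  # 5432
print("database:", parts.path.lstrip("/"))      # team1
print("user:    ", parts.username)              # pipeline_user
print("sslmode: ", parse_qs(parts.query)["sslmode"][0])  # require
```

This maps one-to-one onto the fields DBeaver asks for, which helps students see that the URL and the GUI form are the same information.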
Goal: Show students how to upload files to cloud storage using the Python SDK.
- Show the `raw` and `processed` containers in the portal -- explain the naming convention.
- Upload a blob from Python:

```python
import os
from azure.storage.blob import BlobServiceClient

# Connection string comes from the environment (retrieved from Key Vault)
conn_str = os.environ["AZURE_STORAGE_CONNECTION_STRING"]
client = BlobServiceClient.from_connection_string(conn_str)

container_client = client.get_container_client("raw")
container_client.upload_blob(name="demo.json", data='{"hello": "world"}', overwrite=True)
```
```bash
az storage blob list \
  --account-name hyfstoragedev \
  --container-name raw \
  --output table
```
"Without `overwrite=True`, uploading the same blob name twice will fail."

Goal: Show students two ways to interact with Postgres: code and a GUI tool.
Connect with `psycopg2` using the shared connection string (from Key Vault):

```python
import os
from contextlib import closing

import psycopg2

# Connection URL retrieved from Key Vault, passed in via an environment variable
db_url = os.environ["DATABASE_URL"]

with closing(psycopg2.connect(db_url)) as conn:
    cur = conn.cursor()
    cur.execute("SELECT version()")
    print(cur.fetchone())
    cur.close()
```
- Point out `sslmode=require` in the connection string and the `closing()` pattern.
- In DBeaver, create a new connection with the host (`hyf-data-pg.postgres.database.azure.com`), port (5432), database, user, password.
- Set SSL mode to `require`.
- Run `SELECT version()`, and show the result grid.
- "Your pipeline talks to Postgres through `psycopg2`, but DBeaver is great for checking what your pipeline wrote."

Goal: Critical Path. Every student leaves with a working Postgres connection and blob upload.
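A forgotten `sslmode=require` is a likely workshop failure, so it can help to normalize connection URLs before students use them. This helper is a sketch of one way to do that with the standard library, not part of any Azure SDK:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def require_ssl(db_url: str) -> str:
    """Return db_url with sslmode=require added if no sslmode is set.

    An explicitly set sslmode (e.g. verify-full) is left untouched.
    """
    parts = urlparse(db_url)
    query = dict(parse_qsl(parts.query))
    query.setdefault("sslmode", "require")
    return urlunparse(parts._replace(query=urlencode(query)))

print(require_ssl("postgresql://user:pw@host:5432/team1"))
# postgresql://user:pw@host:5432/team1?sslmode=require
```

Students can wrap the URL they pull from Key Vault with this before calling `psycopg2.connect`, turning a confusing connection error into a non-event.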