Week 6 - Cloud and Azure Essentials

🗓️ Week 6 Lesson Plan

This week connects containers to the cloud. Students have read the material on Azure services, blob storage, Postgres, Container Apps Jobs, and cost awareness. Today is about consolidation and hands-on practice.

We are not just teaching "how to use Azure CLI"; we are showing students how to deploy a data pipeline to the cloud end to end. By the end of class, every student should have a working Container App Job that writes to blob storage and Postgres.

Preparation (before class)

The teacher must provision shared resources before class:

# Create resource group
az group create --name rg-hyf-data --location westeurope

# Create storage account
az storage account create \
  --name hyfstoragedev \
  --resource-group rg-hyf-data \
  --location westeurope \
  --sku Standard_LRS

# Create blob containers for raw and processed data
az storage container create \
  --name raw \
  --account-name hyfstoragedev

az storage container create \
  --name processed \
  --account-name hyfstoragedev

# Create Postgres server (if not already created)
az postgres flexible-server create \
  --name hyf-data-pg \
  --resource-group rg-hyf-data \
  --location westeurope \
  --admin-user hyfadmin \
  --admin-password "<STRONG_PASSWORD>" \
  --sku-name Standard_B1ms \
  --tier Burstable

# Create a database per student/team
az postgres flexible-server db create \
  --server-name hyf-data-pg \
  --resource-group rg-hyf-data \
  --database-name team1

# Create Container Apps environment
az containerapp env create \
  --name env-hyf-data \
  --resource-group rg-hyf-data \
  --location westeurope

# Create Key Vault and store connection strings
az keyvault create \
  --name kv-hyf-data \
  --resource-group rg-hyf-data \
  --location westeurope

az keyvault secret set \
  --vault-name kv-hyf-data \
  --name postgres-url \
  --value "postgresql://pipeline_user:<PASSWORD>@hyf-data-pg.postgres.database.azure.com:5432/team1?sslmode=require"

az keyvault secret set \
  --vault-name kv-hyf-data \
  --name storage-connection-string \
  --value "<STORAGE_CONNECTION_STRING>"

If the Postgres server was stopped to save costs, start it before class:

az postgres flexible-server start \
  --name hyf-data-pg \
  --resource-group rg-hyf-data
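For larger cohorts, the per-team database step above can be scripted instead of repeated by hand. A minimal sketch that loops the same `az postgres flexible-server db create` command via Python's subprocess module (the team names are illustrative placeholders):

```python
# Sketch: create one database per team by looping the same
# `az postgres flexible-server db create` command shown above.
# Team names are illustrative placeholders.
import subprocess

TEAMS = ["team1", "team2", "team3"]

def create_team_db(team: str) -> None:
    """Create a database for one team on the shared Postgres server."""
    subprocess.run(
        ["az", "postgres", "flexible-server", "db", "create",
         "--server-name", "hyf-data-pg",
         "--resource-group", "rg-hyf-data",
         "--database-name", team],
        check=True,
    )

# Run once before class (requires `az login`):
# for team in TEAMS:
#     create_team_db(team)
```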

Ensure every student/team has:

- an Azure login with access to the shared resource group (rg-hyf-data)
- a database on the shared Postgres server (team1, team2, ...)
- their connection strings, either via Key Vault access or distributed directly by the teacher

Goals

By the end of this lesson, students should be able to:

Navigate Azure resources using the CLI and portal.

Retrieve connection strings from Key Vault.

Upload files to Azure Blob Storage from Python.

Connect a Python app to Azure Postgres and query data with DBeaver.

Deploy a container image as a Container App Job.

Verify outputs using logs, blob listing, and SQL queries.

Estimate costs and stop idle resources.

Schedule

Time   Activity                                                   Duration
0:00   Welcome & Week 5 Recap                                     5 min
0:05   Kahoot Quiz (Knowledge Check)                              10 min
0:15   Demo: Azure CLI, portal, and Key Vault                     15 min
0:30   Demo: Upload to Blob Storage from Python                   10 min
0:40   Demo: Connect to Postgres from Python + DBeaver            15 min
0:55   Break                                                      10 min
1:05   Workshop: Students connect to Postgres and upload a blob   20 min
1:25   Demo: Create and run a Container App Job                   15 min
1:40   Cost Awareness and cleanup rules                           5 min
1:45   Assignment Launch and Q&A                                  15 min
2:00   End                                                        -

Total: 2 hours


Kahoot Quiz

Goal: Verify understanding of Week 6 reading material before diving into hands-on work.

Topics to include

  1. Cloud models: What is the difference between IaaS, PaaS, and SaaS?
  2. Blob Storage: What is a blob container? Can you upload to a container that does not exist?
  3. Postgres: Why does sslmode=require matter on Azure?
  4. Container Apps: What is the difference between an App and a Job?
  5. Cost: Which resource bills 24/7 even when idle -- Postgres server or Container App Job?
  6. Security: Why should you not use admin credentials in your pipeline?

Demo: Azure CLI, Portal, and Key Vault

Goal: Orient students in the Azure portal and CLI. Show them how to retrieve their connection strings.

Talking points

  1. Open the Azure portal and show the shared resource group (rg-hyf-data).
  2. Run az resource list --resource-group rg-hyf-data --output table to see what is provisioned.
  3. Explain output formats (--output table vs --output json) and --query.
  4. Show Key Vault in the portal -- explain that secrets are stored here, not in code or Slack.
  5. Retrieve a connection string live:
az keyvault secret show \
  --vault-name kv-hyf-data \
  --name postgres-url \
  --query value -o tsv
  6. Emphasise: "You will learn how Key Vault works in Week 12. For now, this is just a secure way to get your credentials."
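If students would rather fetch secrets from Python than copy-paste CLI output, a thin wrapper around the same `az keyvault secret show` command works without installing any Azure SDKs. A sketch (assumes a prior `az login`):

```python
# Sketch: fetch a Key Vault secret from Python by wrapping the same
# `az keyvault secret show` command demonstrated above.
import subprocess

def get_secret(vault_name: str, secret_name: str) -> str:
    """Return the plain value of a Key Vault secret via the Azure CLI."""
    result = subprocess.run(
        ["az", "keyvault", "secret", "show",
         "--vault-name", vault_name,
         "--name", secret_name,
         "--query", "value", "-o", "tsv"],
        check=True, capture_output=True, text=True,
    )
    return result.stdout.strip()

# Usage (requires `az login` first):
# db_url = get_secret("kv-hyf-data", "postgres-url")
```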

Demo: Upload to Blob Storage from Python

Goal: Show students how to upload files to cloud storage using the Python SDK.

Talking points

  1. Show the raw and processed containers in the portal -- explain the naming convention.
  2. Walk through the upload code from Chapter 3:
import os
from azure.storage.blob import BlobServiceClient

conn_str = os.environ["AZURE_STORAGE_CONNECTION_STRING"]
client = BlobServiceClient.from_connection_string(conn_str)
container_client = client.get_container_client("raw")
container_client.upload_blob(name="demo.json", data='{"hello": "world"}', overwrite=True)
  3. List blobs using the CLI to confirm:
az storage blob list \
  --account-name hyfstoragedev \
  --container-name raw \
  --output table
  4. Point out: "If you forget overwrite=True, uploading the same blob name twice will fail."

Demo: Connect to Postgres from Python + DBeaver

Goal: Show students two ways to interact with Postgres: code and a GUI tool.

Talking points: Python

  1. Connect with psycopg2 using the shared connection string (from Key Vault):
import os
import psycopg2
from contextlib import closing

# db_url is the postgres-url secret from Key Vault; here it is assumed
# to have been exported as the DATABASE_URL environment variable
db_url = os.environ["DATABASE_URL"]

with closing(psycopg2.connect(db_url)) as conn:
    cur = conn.cursor()
    cur.execute("SELECT version()")
    print(cur.fetchone())
    cur.close()
  2. Emphasise sslmode=require in the connection string and the closing() pattern.
  3. Create a table and insert a row to show the full flow.
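The create-a-table-and-insert-a-row step can be sketched as below. The table and column names are illustrative, and psycopg2 is imported inside the function so the file loads even on a machine without the driver installed:

```python
# Sketch of the create-table / insert-row demo. Table and column names
# are illustrative; db_url is the same Key Vault connection string.
from contextlib import closing

CREATE_SQL = "CREATE TABLE IF NOT EXISTS demo_events (id SERIAL PRIMARY KEY, note TEXT)"
INSERT_SQL = "INSERT INTO demo_events (note) VALUES (%s)"

def insert_demo_row(db_url: str, note: str) -> None:
    """Create the demo table if needed and insert one row."""
    import psycopg2  # imported here so this module loads without the driver

    with closing(psycopg2.connect(db_url)) as conn:
        with conn:  # commits on success, rolls back on error
            with conn.cursor() as cur:
                cur.execute(CREATE_SQL)
                cur.execute(INSERT_SQL, (note,))

# insert_demo_row(db_url, "hello from class")
```

Parameterised queries (`%s` with a tuple) keep the demo row out of the SQL string itself, which is worth calling out to students before they build their own pipelines.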

Talking points: DBeaver

  1. Open DBeaver and create a new PostgreSQL connection.
  2. Fill in host (hyf-data-pg.postgres.database.azure.com), port (5432), database, user, password.
  3. On the SSL tab, set SSL mode to require.
  4. Test the connection, then open a SQL script, run SELECT version(), and show the result grid.
  5. Point out: "DBeaver is useful for exploring tables visually. Your pipeline code uses psycopg2, but DBeaver is great for checking what your pipeline wrote."

Workshop: Students Connect and Upload

Goal: This is the critical path: every student leaves with a working Postgres connection and blob upload.

Steps

  1. Students retrieve their connection strings from Key Vault (or teacher distributes them directly if Key Vault access is not set up).