Week 6 - Cloud and Azure Essentials

🛠️ Practice

<aside> 📝 These exercises combine concepts from multiple chapters. Use them to verify your understanding before starting the assignment.

</aside>

Exercise 1: Trace a resource group

Concepts: Azure CLI, resource hierarchy, cost awareness.

Instructions:

  1. Run az resource list --resource-group <your-group> --output table to see all resources in your shared resource group.
  2. For each resource, identify: what type it is (Postgres, storage, container app, etc.), which chapter introduced it, and whether it bills when idle.
  3. Write a short table (in a text file or on paper) with columns: Resource Name, Type, Chapter, Bills When Idle (yes/no).

Why this helps: In the assignment, you will create resources yourself. Knowing what already exists prevents duplicates and wasted credits.
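If you want to go one step further than pen and paper, the inventory table from steps 2–3 can be built from the CLI's JSON output. This is a minimal sketch: the resource-type strings follow Azure's provider naming, but the idle-billing classifications in the mapping are assumptions for this exercise — always confirm against the pricing page for your SKU.

```python
import json

# Assumed classification of whether a resource type bills while idle --
# verify each one against the Azure pricing page for your SKU.
BILLS_WHEN_IDLE = {
    "Microsoft.DBforPostgreSQL/flexibleServers": True,   # compute billed while the server runs
    "Microsoft.Storage/storageAccounts": True,           # billed per GB stored, even when unused
    "Microsoft.App/jobs": False,                         # billed only while an execution runs
}

def summarize(resources):
    """Build (name, type, bills_when_idle) rows from `az resource list --output json`."""
    rows = []
    for r in resources:
        idle = BILLS_WHEN_IDLE.get(r["type"])  # None means unknown -- look it up
        rows.append((r["name"], r["type"], idle))
    return rows

# Usage: save the CLI output first, e.g.
#   az resource list --resource-group <your-group> --output json > resources.json
# then:
#   with open("resources.json") as f:
#       for name, rtype, idle in summarize(json.load(f)):
#           print(f"{name:30} {rtype:50} bills when idle: {idle}")
```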


Exercise 2: End-to-end blob verification

Concepts: Blob Storage (Python + CLI), naming conventions.

Instructions:

  1. Write a Python script that generates a JSON file with today's date in the filename (e.g. test/practice_2026-04-01.json) and uploads it to the shared storage account.
  2. Without looking at your script, use only the CLI to: list blobs with the test/ prefix, download the blob you uploaded, and verify the contents match.
  3. Delete the test blob using the CLI: az storage blob delete --account-name <name> --container-name raw --name test/practice_2026-04-01.json.

Why this helps: The assignment requires you to verify pipeline output. This exercise practices the verification loop (upload from code, check from CLI) that you will use to prove your pipeline works.
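A sketch of the script from step 1 might look like the following. It assumes the azure-storage-blob package is installed, that your connection string is in the AZURE_STORAGE_CONNECTION_STRING environment variable, and that the container is named raw (matching the delete command in step 3) — adjust to however your class's storage account is set up.

```python
import json
import os
from datetime import date

def blob_name_for(today: date) -> str:
    """Build the dated blob name, e.g. test/practice_2026-04-01.json."""
    return f"test/practice_{today.isoformat()}.json"

def upload_practice_blob():
    # Assumes azure-storage-blob is installed and the connection string is in
    # AZURE_STORAGE_CONNECTION_STRING -- swap in your class's auth method if different.
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string(
        os.environ["AZURE_STORAGE_CONNECTION_STRING"]
    )
    payload = json.dumps({"generated_on": date.today().isoformat(), "purpose": "practice"})
    blob = service.get_blob_client(container="raw", blob=blob_name_for(date.today()))
    blob.upload_blob(payload, overwrite=True)
    print(f"uploaded {blob.blob_name}")
```

After running it, switch to the CLI for step 2 — the point of the exercise is that the upload and the verification use different tools.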


Exercise 3: Debug a broken connection string

Concepts: Postgres connection, error reading, SSL.

Instructions:

Your teacher gives you this connection string (it has three problems):

postgresql://admin:password@hyf-data-pg/weather_db

  1. Try to connect using psql or Python's psycopg. Read the error message.
  2. Identify and fix the three issues (hint: host format, SSL, port).
  3. Once connected, run SELECT version(); to confirm you reached Azure Postgres.

Why this helps: Connection string errors are the #1 cause of "it works locally but not on Azure" failures. Practicing error diagnosis now saves time during the assignment.
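Before reaching for psql, you can reason about the three hinted issues mechanically. This sketch checks a DSN for the host-format, port, and SSL problems; the "fixed" string shows the Azure hostname pattern, not a guaranteed value for your server.

```python
from urllib.parse import urlparse, parse_qs

def dsn_problems(dsn: str) -> list[str]:
    """Flag the three issue categories from the exercise hint: host, port, SSL."""
    parsed = urlparse(dsn)
    problems = []
    if parsed.hostname and not parsed.hostname.endswith(".postgres.database.azure.com"):
        problems.append("host is not a full Azure hostname")
    if parsed.port is None:
        problems.append("no explicit port (Postgres default is 5432)")
    if parse_qs(parsed.query).get("sslmode") != ["require"]:
        problems.append("missing sslmode=require (Azure enforces SSL)")
    return problems

# The broken string from the exercise trips all three checks:
broken = "postgresql://admin:password@hyf-data-pg/weather_db"
print(dsn_problems(broken))

# A corrected shape (hostname shown follows the Azure pattern; yours may differ):
fixed = ("postgresql://admin:password@hyf-data-pg.postgres.database.azure.com:5432"
         "/weather_db?sslmode=require")
print(dsn_problems(fixed))  # []
```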


Exercise 4: Dry-run a container job

Concepts: Container Apps Jobs, environment variables, log reading.

Instructions:

  1. Before creating a job, write down (on paper or in a file) the exact az containerapp job create command you would run, including: image name, registry server, environment variables, replica timeout.
  2. Check your command against the Gotchas list. Did you include --registry-server? Is your --container name correct?
  3. Create the job, start it manually, and read the logs. If the execution fails, use the gotchas chapter to diagnose the issue.
  4. Verify output: check that rows appeared in Postgres (SELECT COUNT(*)) and a blob appeared in storage (az storage blob list).

Why this helps: Writing the command before running it forces you to think about each flag. Self-reviewing against the gotchas list catches the most common mistakes (missing --registry-server, wrong --container name).
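The self-review in step 2 can itself be automated. This sketch checks a drafted command for required flags; the flag set reflects the gotchas named above plus assumptions, and the draft command's names (weather-job, myregistry.azurecr.io) are hypothetical — substitute your own.

```python
# Flags to self-check before running `az containerapp job create`; the set below
# covers the gotchas mentioned in this chapter -- extend it for your own job.
REQUIRED_FLAGS = ["--image", "--registry-server", "--replica-timeout"]

def missing_flags(command: str, required=REQUIRED_FLAGS) -> list[str]:
    """Return the required flags that do not appear in the drafted command."""
    return [flag for flag in required if flag not in command]

# Hypothetical draft -- it forgets --registry-server, the most common mistake:
draft = (
    "az containerapp job create --name weather-job --resource-group hyf-week6 "
    "--image myregistry.azurecr.io/pipeline:latest --replica-timeout 300"
)
print(missing_flags(draft))  # ['--registry-server']
```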


Exercise 5: Cost estimation challenge

Concepts: Cost awareness, pricing calculator, resource lifecycle.

Instructions:

  1. Open the Azure pricing calculator.
  2. Estimate the monthly cost of: a Standard_B1ms Postgres server running 24/7, the same server stopped 16 hours/day, and a Container App Job running 5 times/day for 60 seconds each.
  3. Calculate how much the class saves per month if the shared Postgres server is stopped outside of class hours (assume 8 hours/day, 5 days/week).
  4. Write your findings as a short paragraph (3-4 sentences) explaining what you would do differently in a real project.

Why this helps: The assignment requires a running pipeline on Azure. Understanding costs before you start prevents surprises and builds a habit you will need in any professional data engineering role.
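The arithmetic in steps 2–3 is simple enough to sketch. The hourly rate below is a placeholder, not the real Standard_B1ms price — take the actual figure from the pricing calculator, and remember that a stopped server still bills for storage, so this models compute only.

```python
# Back-of-envelope compute-cost model; HOURLY_RATE is a placeholder value --
# replace it with the real Standard_B1ms rate from the Azure pricing calculator.
HOURLY_RATE = 0.02  # USD per hour, hypothetical

def monthly_compute_cost(hours_per_day: float, days: float = 30,
                         rate: float = HOURLY_RATE) -> float:
    """Compute-only monthly cost; storage bills regardless of server state."""
    return hours_per_day * days * rate

always_on = monthly_compute_cost(24)
class_hours_only = monthly_compute_cost(8, days=22)  # ~5 days/week over a month
print(f"24/7: ${always_on:.2f}/month, class hours only: ${class_hours_only:.2f}/month")
print(f"savings: ${always_on - class_hours_only:.2f}/month")
```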


All resources live in the shared resource group your teacher created. Do not delete the resource group itself.


The HackYourFuture curriculum is licensed under CC BY-NC-SA 4.0 (https://hackyourfuture.net/).


Built with ❤️ by the HackYourFuture community · Thank you, contributors

Found a mistake or have a suggestion? Let us know in the feedback form.