2. AI Foundations and Career Readiness for Grades 11-12
Course Positioning
This course prepares students for college, entrance exams, projects, internships, and future careers. It balances conceptual understanding, responsible use, productivity, research skills, and exposure to AI careers. Students should leave with a portfolio project and a personal AI study workflow.
Learning outcomes
- Describe the major branches of AI: machine learning, deep learning, generative AI, computer vision, speech, robotics, and agents.
- Use AI to support studying, research, writing, coding, presentations, and career exploration without outsourcing thinking.
- Understand academic integrity rules and how to cite or disclose AI assistance.
- Build a small project using no-code, low-code, or beginner-code methods.
- Map AI-related career pathways across engineering, medicine, law, design, business, research, and public policy.
Expanded Topic-by-Topic Coverage
Module 1. AI landscape and careers
Module focus: Types of AI systems, current applications, career roles, interdisciplinary opportunities. Primary live activity or lab: Career card sorting: match AI roles with skills and industries. Expected take-home output: AI career map.
Topics and coverage
Types of AI systems
- What it means: distinguish the main kinds of AI systems, from rule-based software to machine learning, deep learning, generative models, and agents, and connect each to the module focus.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Current applications
- What it means: survey where AI is already used day to day (search, recommendations, translation, assistants, medical imaging) and connect these examples to the module focus.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Career roles
- What it means: show where career roles appears in the learner's real workflow and which parts are judgment-heavy versus draftable.
- What to cover: current workflow, pain points, AI-assisted steps, human review checkpoints, quality standard, and ownership of the final decision.
- Demonstration: convert one messy real-world input into a structured brief, draft, analysis, checklist, or next action.
- Evidence of learning: learners produce a reusable template or playbook entry that can be used after the course.
Interdisciplinary opportunities
- What it means: show how AI combines with other fields such as medicine, law, design, business, and policy, so learners see pathways beyond pure engineering.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Practice and evidence of learning
- Learners complete or discuss: Career card sorting: match AI roles with skills and industries.
- Learners produce: AI career map.
- Instructor checks for accuracy, practical usefulness, clear assumptions, appropriate human review, and fit with the course audience.
- Learners revise once after feedback so the module contributes to the final project, portfolio, or playbook.
Minimum coverage before moving on
- Learners can explain the module vocabulary without relying on tool-generated text.
- Learners have seen one worked example, one hands-on application, and one limitation or failure case.
- Learners know what must be verified, what data must be protected, and who remains accountable for the output.
Module 2. How generative AI works
Module focus: Tokens, probability, training, fine-tuning, context windows, multimodal models, why models hallucinate. Primary live activity or lab: Token prediction game and prompt experiments. Expected take-home output: Concept notes.
Topics and coverage
Tokens
- What it means: tokens are the small units of text, often word fragments, that a model reads and predicts one at a time; connect this to every other idea in the module.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Probability
- What it means: a language model assigns a probability to every possible next token and samples from that distribution, so its output is a likely continuation, not a verified fact.
- What to cover: source reliability, missing or biased data, leakage, assumptions, calculations, and the difference between correlation and decision-ready evidence.
- Demonstration: walk through a small dataset or example table and mark the checks required before trusting the result.
- Evidence of learning: learners produce a short analysis note that includes assumptions, limitations, and verification steps.
Training
- What it means: place training inside the AI system stack so learners know what problem it solves and what tradeoffs it introduces.
- What to cover: inputs, outputs, system boundaries, evaluation criteria, cost or latency implications, and common failure cases.
- Demonstration: use a diagram, small code sample, worksheet, or tool trace to make the mechanism visible.
- Evidence of learning: learners compare two approaches and explain which one they would choose for a realistic constraint.
Fine-tuning
- What it means: fine-tuning adapts a pretrained model to a narrower task or style by continuing training on a smaller, targeted dataset.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Context windows
- What it means: the context window is the maximum amount of text, measured in tokens, that a model can consider at once; anything beyond it is effectively forgotten.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Multimodal models
- What it means: multimodal models accept or generate more than one kind of input, such as text, images, and audio; place them inside the AI system stack so learners know what problems they solve and what tradeoffs they introduce.
- What to cover: inputs, outputs, system boundaries, evaluation criteria, cost or latency implications, and common failure cases.
- Demonstration: use a diagram, small code sample, worksheet, or tool trace to make the mechanism visible.
- Evidence of learning: learners compare two approaches and explain which one they would choose for a realistic constraint.
Why models hallucinate
- What it means: because models generate statistically plausible continuations rather than looking facts up, they can produce fluent, confident, and wrong output; place this failure mode inside the AI system stack so learners know when to verify.
- What to cover: inputs, outputs, system boundaries, evaluation criteria, cost or latency implications, and common failure cases.
- Demonstration: use a diagram, small code sample, worksheet, or tool trace to make the mechanism visible.
- Evidence of learning: learners compare two approaches and explain which one they would choose for a realistic constraint.
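The token prediction game pairs naturally with a toy demonstration. The sketch below is a deliberately simplified bigram word model, not how real LLMs work (they use neural networks over subword tokens), but it makes the core idea visible: the model only knows which continuations are probable given its training text, which is also why it can confidently emit a likely-but-wrong next word.

```python
from collections import Counter, defaultdict

# Toy "training data" for a bigram model.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each previous word.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def next_word_probs(word):
    """Return each candidate next word with its estimated probability."""
    counts = following[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

print(next_word_probs("the"))  # {'cat': 0.5, 'mat': 0.25, 'fish': 0.25}
```

Learners can change the corpus and watch the probabilities shift, which grounds the later discussion of training data and hallucination.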
Practice and evidence of learning
- Learners complete or discuss: Token prediction game and prompt experiments.
- Learners produce: Concept notes.
- Instructor checks for accuracy, practical usefulness, clear assumptions, appropriate human review, and fit with the course audience.
- Learners revise once after feedback so the module contributes to the final project, portfolio, or playbook.
Minimum coverage before moving on
- Learners can explain the module vocabulary without relying on tool-generated text.
- Learners have seen one worked example, one hands-on application, and one limitation or failure case.
- Learners know what must be verified, what data must be protected, and who remains accountable for the output.
Module 3. AI for learning and exams
Module focus: Socratic tutoring, revision plans, flashcards, worked examples, feedback loops, avoiding passive learning. Primary live activity or lab: Build a 14-day revision plan with AI, then critique it. Expected take-home output: Personal study workflow.
Topics and coverage
Socratic tutoring
- What it means: prompting the AI to ask guiding questions and probe the learner's reasoning instead of handing over answers; connect this to the module focus on active learning.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Revision plans
- What it means: using AI to draft a realistic, time-boxed study schedule that the learner then critiques and adjusts to fit their actual constraints.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Flashcards
- What it means: generating question-answer cards for spaced retrieval practice, with the learner verifying each card against trusted material before relying on it.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Worked examples
- What it means: asking AI for fully worked, step-by-step solutions to study from, then attempting similar problems unaided.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Feedback loops
- What it means: cycling between attempting work, getting targeted AI critique, and revising, rather than accepting a first draft.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Avoiding passive learning
- What it means: keeping the learner doing the retrieval, explanation, and problem-solving, with AI acting as a coach rather than a substitute.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
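The flashcards topic can be made concrete with a tiny scheduler. This is a minimal Leitner-style sketch; the three boxes and their review intervals are illustrative assumptions, not a prescribed system.

```python
# Minimal Leitner-style flashcard scheduler (illustrative intervals).
REVIEW_AFTER_DAYS = {1: 1, 2: 3, 3: 7}  # box number -> days until next review

def review(card, correct):
    """Move a card between boxes based on whether the answer was correct."""
    if correct:
        card["box"] = min(card["box"] + 1, 3)  # promote, capped at box 3
    else:
        card["box"] = 1                        # demote to daily review
    card["due_in_days"] = REVIEW_AFTER_DAYS[card["box"]]
    return card

card = {"front": "What is a token?", "box": 1}
print(review(card, correct=True))  # promoted to box 2, due again in 3 days
```

Learners on the non-coding track can run the same logic with a paper box system; the point is the feedback loop, not the code.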
Practice and evidence of learning
- Learners complete or discuss: Build a 14-day revision plan with AI, then critique it.
- Learners produce: Personal study workflow.
- Instructor checks for accuracy, practical usefulness, clear assumptions, appropriate human review, and fit with the course audience.
- Learners revise once after feedback so the module contributes to the final project, portfolio, or playbook.
Minimum coverage before moving on
- Learners can explain the module vocabulary without relying on tool-generated text.
- Learners have seen one worked example, one hands-on application, and one limitation or failure case.
- Learners know what must be verified, what data must be protected, and who remains accountable for the output.
Module 4. Research and source evaluation
Module focus: Search strategy, summaries, citations, literature notes, fact-checking, source hierarchy. Primary live activity or lab: Compare AI answer, search results, and textbook explanation. Expected take-home output: Research note template.
Topics and coverage
Search strategy
- What it means: choosing keywords, operators, and tools deliberately, and knowing when an AI answer should be cross-checked with a direct search.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Summaries
- What it means: using AI to condense sources while checking the summary against the original for omissions and distortions.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Citations
- What it means: recording where claims come from, and never citing a reference an AI produced without confirming it actually exists.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Literature notes
- What it means: keeping structured notes (claim, source, page, own commentary) so research can be traced and reused later.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Fact-checking
- What it means: verifying AI output against primary or authoritative sources before relying on it.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Source hierarchy
- What it means: ranking sources by reliability, from peer-reviewed work and official data down to blogs and unverified AI-generated text.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Practice and evidence of learning
- Learners complete or discuss: Compare AI answer, search results, and textbook explanation.
- Learners produce: Research note template.
- Instructor checks for accuracy, practical usefulness, clear assumptions, appropriate human review, and fit with the course audience.
- Learners revise once after feedback so the module contributes to the final project, portfolio, or playbook.
Minimum coverage before moving on
- Learners can explain the module vocabulary without relying on tool-generated text.
- Learners have seen one worked example, one hands-on application, and one limitation or failure case.
- Learners know what must be verified, what data must be protected, and who remains accountable for the output.
Module 5. Writing and presentation support
Module focus: Outlines, feedback, style transfer, slide planning, speech practice, citation boundaries. Primary live activity or lab: Improve a weak essay outline using AI and peer critique. Expected take-home output: AI-assisted essay plan.
Topics and coverage
Outlines
- What it means: using AI to propose and stress-test the structure of an essay or talk while the argument remains the learner's own.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Feedback
- What it means: requesting specific, criteria-based critique on drafts rather than wholesale rewrites.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Style transfer
- What it means: rewriting the same content for a different audience, register, or length, and judging what is gained or lost in the process.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Slide planning
- What it means: turning a written argument into a slide-by-slide plan with one idea per slide.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Speech practice
- What it means: rehearsing a talk with AI-generated questions, timing cues, and delivery feedback.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Citation boundaries
- What it means: knowing which kinds of AI assistance must be disclosed or cited and which count as ordinary tool use under the school's policy.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Practice and evidence of learning
- Learners complete or discuss: Improve a weak essay outline using AI and peer critique.
- Learners produce: AI-assisted essay plan.
- Instructor checks for accuracy, practical usefulness, clear assumptions, appropriate human review, and fit with the course audience.
- Learners revise once after feedback so the module contributes to the final project, portfolio, or playbook.
Minimum coverage before moving on
- Learners can explain the module vocabulary without relying on tool-generated text.
- Learners have seen one worked example, one hands-on application, and one limitation or failure case.
- Learners know what must be verified, what data must be protected, and who remains accountable for the output.
Module 6. Coding and computational thinking
Module focus: Pseudocode, Python/notebooks or no-code automation, debugging with AI, explaining code. Primary live activity or lab: Build a small calculator, quiz app, or data visualization. Expected take-home output: Mini coding artifact.
Topics and coverage
Pseudocode
- What it means: writing the logic of a program in plain, structured language before committing to any particular syntax.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Python/notebooks or no-code automation
- What it means: choosing an implementation route that matches the learner's track, whether a Python notebook or a no-code automation tool, for the same underlying logic.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Debugging with AI
- What it means: pasting an error message plus the relevant code, forming a hypothesis, and testing the AI's suggested fix rather than applying it blindly.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Explaining code
- What it means: being able to explain every line of submitted code, whether it was written by hand or with AI help.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Practice and evidence of learning
- Learners complete or discuss: Build a small calculator, quiz app, or data visualization.
- Learners produce: Mini coding artifact.
- Instructor checks for accuracy, practical usefulness, clear assumptions, appropriate human review, and fit with the course audience.
- Learners revise once after feedback so the module contributes to the final project, portfolio, or playbook.
Minimum coverage before moving on
- Learners can explain the module vocabulary without relying on tool-generated text.
- Learners have seen one worked example, one hands-on application, and one limitation or failure case.
- Learners know what must be verified, what data must be protected, and who remains accountable for the output.
Module 7. Ethics, bias, jobs, and society
Module focus: Bias, fairness, privacy, labor displacement, deepfakes, regulation, environmental cost. Primary live activity or lab: Structured debate: should AI tools be allowed in school assignments? Expected take-home output: Position paper.
Topics and coverage
Bias
- What it means in this course: bias is systematic error that disadvantages a group of people, traced to training data or design choices; define it in operational terms, not as an abstract principle.
- What to cover: sensitive data boundaries, affected stakeholders, approval paths, documentation, and what senior school students must never delegate blindly to AI.
- Use case: present one acceptable use, one borderline use, and one prohibited use, then ask learners to justify the classification.
- Evidence of learning: learners add a risk control, review step, or escalation rule to their course project.
Fairness
- What it means: fairness asks whether a system's errors and benefits are distributed acceptably across groups, and who gets to decide what "acceptable" means.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Privacy
- What it means in this course: privacy means controlling what personal data enters AI tools and what happens to it afterward; define it in operational terms, not as an abstract principle.
- What to cover: sensitive data boundaries, affected stakeholders, approval paths, documentation, and what senior school students must never delegate blindly to AI.
- Use case: present one acceptable use, one borderline use, and one prohibited use, then ask learners to justify the classification.
- Evidence of learning: learners add a risk control, review step, or escalation rule to their course project.
Labor displacement
- What it means: which tasks AI automates, which jobs change rather than disappear, and what that implies for learners' skill and career choices.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Deepfakes
- What it means: synthetic audio, images, or video that impersonate real people, and how to detect, report, and respond to them.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Regulation
- What it means: the emerging rules that govern AI use, such as disclosure requirements, data protection law, and school policy.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Environmental cost
- What it means: training and serving large models consumes significant energy and water; place this cost inside the AI system stack so learners weigh it as a real tradeoff.
- What to cover: inputs, outputs, system boundaries, evaluation criteria, cost or latency implications, and common failure cases.
- Demonstration: use a diagram, small code sample, worksheet, or tool trace to make the mechanism visible.
- Evidence of learning: learners compare two approaches and explain which one they would choose for a realistic constraint.
Practice and evidence of learning
- Learners complete or discuss: Structured debate: should AI tools be allowed in school assignments?
- Learners produce: Position paper.
- Instructor checks for accuracy, practical usefulness, clear assumptions, appropriate human review, and fit with the course audience.
- Learners revise once after feedback so the module contributes to the final project, portfolio, or playbook.
Minimum coverage before moving on
- Learners can explain the module vocabulary without relying on tool-generated text.
- Learners have seen one worked example, one hands-on application, and one limitation or failure case.
- Learners know what must be verified, what data must be protected, and who remains accountable for the output.
Module 8. Capstone project studio
Module focus: Project definition, user needs, prototype, testing, presentation. Primary live activity or lab: Build an AI-assisted project in teams. Expected take-home output: Demo and portfolio write-up.
Topics and coverage
Project definition
- What it means: stating the problem, the intended user, and the success criteria before building anything.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
User needs
- What it means: grounding the project in what a real user must accomplish, gathered from conversation or observation rather than assumption.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Prototype
- What it means: the smallest working version that lets the team test the core idea with a real user.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Testing
- What it means: checking the prototype against the success criteria with realistic inputs, including inputs designed to break it.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Presentation
- What it means: show where presentation appears in the learner's real workflow and which parts are judgment-heavy versus draftable.
- What to cover: current workflow, pain points, AI-assisted steps, human review checkpoints, quality standard, and ownership of the final decision.
- Demonstration: convert one messy real-world input into a structured brief, draft, analysis, checklist, or next action.
- Evidence of learning: learners produce a reusable template or playbook entry that can be used after the course.
Practice and evidence of learning
- Learners complete or discuss: Build an AI-assisted project in teams.
- Learners produce: Demo and portfolio write-up.
- Instructor checks for accuracy, practical usefulness, clear assumptions, appropriate human review, and fit with the course audience.
- Learners revise once after feedback so the module contributes to the final project, portfolio, or playbook.
Minimum coverage before moving on
- Learners can explain the module vocabulary without relying on tool-generated text.
- Learners have seen one worked example, one hands-on application, and one limitation or failure case.
- Learners know what must be verified, what data must be protected, and who remains accountable for the output.
Labs, projects, and assessments
- Lab 1: Build an AI tutor prompt for one school subject and evaluate it on five questions.
- Lab 2: Use AI to research a career path, then verify salary, skills, and education requirements independently.
- Capstone options: study assistant, subject explainer, local-language learning aid, science-fair assistant, no-code chatbot, or data storytelling project.
Evaluation approach
- 20% concept quizzes.
- 25% prompt and verification notebook.
- 20% ethics/career reflection.
- 35% capstone prototype and presentation.
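The four weights above sum to 100%, so a final grade is a straightforward weighted average. A minimal sketch (component names and sample scores are illustrative):

```python
# Course weights from the evaluation approach above.
weights = {"quizzes": 0.20, "notebook": 0.25, "reflection": 0.20, "capstone": 0.35}

def final_grade(scores):
    """Weighted average of component scores, each on a 0-100 scale."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must sum to 100%
    return sum(weights[k] * scores[k] for k in weights)

print(final_grade({"quizzes": 80, "notebook": 90, "reflection": 70, "capstone": 85}))
# 82.25
```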
Recommended tools and materials
- AI assistant, search engine, spreadsheet/notebook, presentation tool, Canva/Figma optional, beginner coding environment optional.
- Use school-approved tools and avoid entering private student data.
Safety, ethics, and governance emphasis
- Academic honesty policy should be explicit from the first session.
- Require students to document how AI was used in every major submission.
- Encourage effortful learning: AI should quiz, critique, and explain; it should not simply produce final homework.
Delivery notes
- Offer two tracks: non-coding and coding.
- Include local examples: agriculture, healthcare, education, small business, climate, and Indian-language AI.
- Invite college students or professionals for a career Q&A if available.
Instructor Build Checklist
- Prepare one short demo for each module and one learner activity that creates a saved artifact.
- Prepare examples that match the audience, local context, and likely tools learners can access.
- Add a verification step to every AI-generated output: factual check, source check, data sensitivity check, and quality review.
- Keep a running portfolio folder so each module contributes to the final project or learner playbook.
- Reserve time for reflection on what the learner did, what AI did, what was checked, and what remains uncertain.