15. AI for Designers
Course Positioning
This course teaches designers to use AI as an ideation partner, research assistant, moodboard generator, prototyping accelerator, copy helper, and production support tool. It emphasizes design judgment, taste, accessibility, brand consistency, IP awareness, and human-centered design.
Learning outcomes
- Use AI to support ideation, moodboards, visual exploration, UX research synthesis, and design critique.
- Create prompts and briefs that preserve brand, audience, constraints, and accessibility.
- Integrate AI-generated assets into a professional design workflow without losing authorship or quality control.
- Use AI to prototype interfaces, write microcopy, generate variants, and prepare design handoff.
- Build a design AI playbook for repeatable use.
Expanded Topic-by-Topic Coverage
Module 1. AI in the design process
Module focus: Research, ideation, visual exploration, prototyping, testing, production, handoff. Primary live activity or lab: Map a design workflow and identify AI support points. Expected take-home output: Design AI workflow map.
Topics and coverage
Research
- What it means: show where research appears in the learner's real workflow and which parts are judgment-heavy versus draftable.
- What to cover: current workflow, pain points, AI-assisted steps, human review checkpoints, quality standard, and ownership of the final decision.
- Demonstration: convert one messy real-world input into a structured brief, draft, analysis, checklist, or next action.
- Evidence of learning: learners produce a reusable template or playbook entry that can be used after the course.
Ideation
- What it means: define ideation clearly and connect it to the module focus: Research, ideation, visual exploration, prototyping, testing, production, handoff.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Visual exploration
- What it means: define visual exploration clearly and connect it to the module focus: Research, ideation, visual exploration, prototyping, testing, production, handoff.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Prototyping
- What it means: define prototyping clearly and connect it to the module focus: Research, ideation, visual exploration, prototyping, testing, production, handoff.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Testing
- What it means: define testing clearly and connect it to the module focus: Research, ideation, visual exploration, prototyping, testing, production, handoff.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Production
- What it means: define production clearly and connect it to the module focus: Research, ideation, visual exploration, prototyping, testing, production, handoff.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Handoff
- What it means: define handoff clearly and connect it to the module focus: Research, ideation, visual exploration, prototyping, testing, production, handoff.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Practice and evidence of learning
- Learners complete or discuss: Map a design workflow and identify AI support points.
- Learners produce: Design AI workflow map.
- Instructor checks for accuracy, practical usefulness, clear assumptions, appropriate human review, and fit with the course audience.
- Learners revise once after feedback so the module contributes to the final project, portfolio, or playbook.
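To make the take-home workflow map concrete, it can be captured as plain data that pairs each stage with its AI support point and its review gate. The stage names, support notes, and checkpoint flags below are illustrative assumptions, not a prescribed schema:

```python
# A design workflow map as plain data: each stage records what AI may
# draft and whether a human review checkpoint is required before the
# stage's output is used. All values here are invented examples.
WORKFLOW = {
    "research":           {"ai_support": "summarize interview notes",  "human_checkpoint": True},
    "ideation":           {"ai_support": "generate concept variants",  "human_checkpoint": False},
    "visual exploration": {"ai_support": "draft moodboard directions", "human_checkpoint": False},
    "prototyping":        {"ai_support": "propose alternative flows",  "human_checkpoint": True},
    "testing":            {"ai_support": "cluster usability findings", "human_checkpoint": True},
    "production":         {"ai_support": "resize and rename assets",   "human_checkpoint": True},
    "handoff":            {"ai_support": "draft developer notes",      "human_checkpoint": True},
}

def review_gates(workflow):
    """List the stages whose output must pass a human review checkpoint."""
    return [stage for stage, info in workflow.items() if info["human_checkpoint"]]
```

Calling review_gates(WORKFLOW) on the example above returns the five stages that keep a human checkpoint, which is a quick way to audit whether a map leaves any stage unreviewed.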
Minimum coverage before moving on
- Learners can explain the module vocabulary without relying on tool-generated text.
- Learners have seen one worked example, one hands-on application, and one limitation or failure case.
- Learners know what must be verified, what data must be protected, and who remains accountable for the output.
Module 2. Creative briefs and prompt direction
Module focus: Audience, brand voice, constraints, references, do/don't lists, aspect ratios, output specs. Primary live activity or lab: Turn a vague design request into a strong creative brief. Expected take-home output: Promptable creative brief.
Topics and coverage
Audience
- What it means: define audience clearly and connect it to the module focus: audience, brand voice, constraints, references, do/don't lists, aspect ratios, output specs.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Brand voice
- What it means: define brand voice clearly and connect it to the module focus: Audience, brand voice, constraints, references, do/don't lists, aspect ratios, output specs.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Constraints
- What it means: define constraints clearly and connect it to the module focus: Audience, brand voice, constraints, references, do/don't lists, aspect ratios, output specs.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
References
- What it means: define references clearly and connect it to the module focus: Audience, brand voice, constraints, references, do/don't lists, aspect ratios, output specs.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Do/don't lists
- What it means: define do/don't lists clearly and connect it to the module focus: Audience, brand voice, constraints, references, do/don't lists, aspect ratios, output specs.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Aspect ratios
- What it means: define aspect ratios clearly and connect it to the module focus: Audience, brand voice, constraints, references, do/don't lists, aspect ratios, output specs.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
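Aspect ratios are one of the few topics here with an exact calculation behind them. A minimal sketch of turning a ratio string and a target width into pixel dimensions (the function name and the rounding choice are assumptions for illustration):

```python
# Turn an aspect ratio string and a target width into pixel dimensions.
def size_for_ratio(width_px, ratio):
    """Return (width, height) for a ratio like '16:9', height rounded to whole pixels."""
    w, h = (int(part) for part in ratio.split(":"))
    return width_px, round(width_px * h / w)
```

For example, size_for_ratio(1920, "16:9") gives (1920, 1080), and a 4:5 portrait crop at 1080 wide gives (1080, 1350).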
Output specs
- What it means: define output specs clearly and connect it to the module focus: Audience, brand voice, constraints, references, do/don't lists, aspect ratios, output specs.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Practice and evidence of learning
- Learners complete or discuss: Turn a vague design request into a strong creative brief.
- Learners produce: Promptable creative brief.
- Instructor checks for accuracy, practical usefulness, clear assumptions, appropriate human review, and fit with the course audience.
- Learners revise once after feedback so the module contributes to the final project, portfolio, or playbook.
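One way to make a creative brief "promptable" is to store it as structured fields and refuse to generate anything until the required fields are filled. A minimal sketch; the field names are illustrative assumptions, not a standard:

```python
# Render a creative brief into a single prompt, refusing to proceed when
# required fields are missing. Field names are illustrative assumptions;
# adapt them to your own brief template.
REQUIRED_FIELDS = ["audience", "brand_voice", "constraints", "output_specs"]

def brief_to_prompt(brief):
    """Join brief fields into labeled prompt lines, or raise if the brief is incomplete."""
    missing = [f for f in REQUIRED_FIELDS if not brief.get(f)]
    if missing:
        raise ValueError("brief is incomplete, missing: " + ", ".join(missing))
    return "\n".join(f"{field.replace('_', ' ').title()}: {value}"
                     for field, value in brief.items())
```

The refusal on missing fields is the point: it forces the vague-request-to-brief conversation to happen before any generation starts.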
Minimum coverage before moving on
- Learners can explain the module vocabulary without relying on tool-generated text.
- Learners have seen one worked example, one hands-on application, and one limitation or failure case.
- Learners know what must be verified, what data must be protected, and who remains accountable for the output.
Module 3. Moodboards and visual exploration
Module focus: Styles, composition, color, typography, art direction, reference ethics. Primary live activity or lab: Generate and critique moodboard directions. Expected take-home output: Moodboard rationale.
Topics and coverage
Styles
- What it means: define styles clearly and connect it to the module focus: styles, composition, color, typography, art direction, reference ethics.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Composition
- What it means: define composition clearly and connect it to the module focus: Styles, composition, color, typography, art direction, reference ethics.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Color
- What it means: define color clearly and connect it to the module focus: Styles, composition, color, typography, art direction, reference ethics.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Typography
- What it means: define typography clearly and connect it to the module focus: Styles, composition, color, typography, art direction, reference ethics.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Art direction
- What it means: define art direction clearly and connect it to the module focus: Styles, composition, color, typography, art direction, reference ethics.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Reference ethics
- What it means in this course: define reference ethics in operational terms, not as an abstract principle.
- What to cover: sensitive data boundaries, affected stakeholders, approval paths, documentation, and what graphic designers, UX/UI designers, product designers, design students, and creative teams must never delegate blindly to AI.
- Use case: present one acceptable use, one borderline use, and one prohibited use, then ask learners to justify the classification.
- Evidence of learning: learners add a risk control, review step, or escalation rule to their course project.
Practice and evidence of learning
- Learners complete or discuss: Generate and critique moodboard directions.
- Learners produce: Moodboard rationale.
- Instructor checks for accuracy, practical usefulness, clear assumptions, appropriate human review, and fit with the course audience.
- Learners revise once after feedback so the module contributes to the final project, portfolio, or playbook.
Minimum coverage before moving on
- Learners can explain the module vocabulary without relying on tool-generated text.
- Learners have seen one worked example, one hands-on application, and one limitation or failure case.
- Learners know what must be verified, what data must be protected, and who remains accountable for the output.
Module 4. UX research synthesis
Module focus: Interview notes, affinity mapping, pain points, personas, journey maps, caveats. Primary live activity or lab: Summarize fictional user interviews into themes. Expected take-home output: Research synthesis board.
Topics and coverage
Interview notes
- What it means: define interview notes clearly and connect it to the module focus: interview notes, affinity mapping, pain points, personas, journey maps, caveats.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Affinity mapping
- What it means: define affinity mapping clearly and connect it to the module focus: Interview notes, affinity mapping, pain points, personas, journey maps, caveats.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
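Affinity mapping itself is a judgment call, but once a human has tagged each note, the bookkeeping can be automated. A small sketch with invented tags and notes:

```python
from collections import defaultdict

# Group tagged interview notes into candidate themes. The affinity
# judgment (choosing the tags) stays with a human; this only automates
# the grouping. Tags and notes below are invented examples.
def affinity_groups(tagged_notes):
    """Map each tag to the list of notes carrying it."""
    groups = defaultdict(list)
    for tag, note in tagged_notes:
        groups[tag].append(note)
    return dict(groups)

NOTES = [
    ("navigation", "couldn't find the export button"),
    ("trust", "unsure whether the draft was saved"),
    ("navigation", "expected settings under the avatar menu"),
]
```

This split (human tags, machine groups) is a useful pattern for discussing where AI-assisted synthesis starts to replace judgment rather than labor.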
Pain points
- What it means: define pain points clearly and connect it to the module focus: Interview notes, affinity mapping, pain points, personas, journey maps, caveats.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Personas
- What it means: define personas clearly and connect it to the module focus: Interview notes, affinity mapping, pain points, personas, journey maps, caveats.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Journey maps
- What it means: define journey maps clearly and connect it to the module focus: Interview notes, affinity mapping, pain points, personas, journey maps, caveats.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Caveats
- What it means: define caveats clearly and connect it to the module focus: Interview notes, affinity mapping, pain points, personas, journey maps, caveats.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Practice and evidence of learning
- Learners complete or discuss: Summarize fictional user interviews into themes.
- Learners produce: Research synthesis board.
- Instructor checks for accuracy, practical usefulness, clear assumptions, appropriate human review, and fit with the course audience.
- Learners revise once after feedback so the module contributes to the final project, portfolio, or playbook.
Minimum coverage before moving on
- Learners can explain the module vocabulary without relying on tool-generated text.
- Learners have seen one worked example, one hands-on application, and one limitation or failure case.
- Learners know what must be verified, what data must be protected, and who remains accountable for the output.
Module 5. UI and prototype acceleration
Module focus: Wireframes, component ideas, microcopy, empty states, flows, design system support. Primary live activity or lab: Generate alternative flows for a simple app screen. Expected take-home output: Prototype concept.
Topics and coverage
Wireframes
- What it means: define wireframes clearly and connect it to the module focus: wireframes, component ideas, microcopy, empty states, flows, design system support.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Component ideas
- What it means: define component ideas clearly and connect it to the module focus: Wireframes, component ideas, microcopy, empty states, flows, design system support.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Microcopy
- What it means: define microcopy clearly and connect it to the module focus: Wireframes, component ideas, microcopy, empty states, flows, design system support.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Empty states
- What it means: define empty states clearly and connect it to the module focus: Wireframes, component ideas, microcopy, empty states, flows, design system support.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Flows
- What it means: define flows clearly and connect it to the module focus: Wireframes, component ideas, microcopy, empty states, flows, design system support.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Design system support
- What it means: show where design system support appears in the learner's real workflow and which parts are judgment-heavy versus draftable.
- What to cover: current workflow, pain points, AI-assisted steps, human review checkpoints, quality standard, and ownership of the final decision.
- Demonstration: convert one messy real-world input into a structured brief, draft, analysis, checklist, or next action.
- Evidence of learning: learners produce a reusable template or playbook entry that can be used after the course.
Practice and evidence of learning
- Learners complete or discuss: Generate alternative flows for a simple app screen.
- Learners produce: Prototype concept.
- Instructor checks for accuracy, practical usefulness, clear assumptions, appropriate human review, and fit with the course audience.
- Learners revise once after feedback so the module contributes to the final project, portfolio, or playbook.
Minimum coverage before moving on
- Learners can explain the module vocabulary without relying on tool-generated text.
- Learners have seen one worked example, one hands-on application, and one limitation or failure case.
- Learners know what must be verified, what data must be protected, and who remains accountable for the output.
Module 6. AI critique and iteration
Module focus: Design review prompts, accessibility checks, brand checks, usability heuristics. Primary live activity or lab: Run a structured critique on a design mockup. Expected take-home output: Design critique report.
Topics and coverage
Design review prompts
- What it means: explain how design review prompts change the interaction between human intent, model behavior, external information, and final output.
- What to cover: inputs, constraints, examples, output format, grounding, iteration, failure modes, and when a human must intervene.
- Demonstration: show a weak attempt, a stronger structured attempt, and a reviewed final version with explicit checks.
- Evidence of learning: learners create a reusable prompt, schema, retrieval note, or workflow pattern and test it on at least two examples.
Accessibility checks
- What it means: define accessibility checks clearly and connect it to the module focus: Design review prompts, accessibility checks, brand checks, usability heuristics.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
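Contrast checking is one accessibility check that can be computed exactly rather than eyeballed. This sketch follows the WCAG 2.x relative luminance and contrast ratio formulas; the function names are our own:

```python
# WCAG 2.x contrast check. The luminance and ratio formulas follow the
# WCAG definition; the function names are our own.
def relative_luminance(rgb):
    """Relative luminance of an (r, g, b) color with channels in 0-255."""
    def linearize(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(color_a, color_b):
    """Contrast ratio (1:1 to 21:1); WCAG AA requires >= 4.5 for normal text."""
    lighter, darker = sorted(
        (relative_luminance(color_a), relative_luminance(color_b)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)
```

Black on white scores 21:1, and the familiar #767676 gray on white lands just above the 4.5:1 AA threshold, which makes it a good in-class boundary case.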
Brand checks
- What it means: define brand checks clearly and connect it to the module focus: Design review prompts, accessibility checks, brand checks, usability heuristics.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Usability heuristics
- What it means: define usability heuristics clearly and connect it to the module focus: Design review prompts, accessibility checks, brand checks, usability heuristics.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Practice and evidence of learning
- Learners complete or discuss: Run a structured critique on a design mockup.
- Learners produce: Design critique report.
- Instructor checks for accuracy, practical usefulness, clear assumptions, appropriate human review, and fit with the course audience.
- Learners revise once after feedback so the module contributes to the final project, portfolio, or playbook.
Minimum coverage before moving on
- Learners can explain the module vocabulary without relying on tool-generated text.
- Learners have seen one worked example, one hands-on application, and one limitation or failure case.
- Learners know what must be verified, what data must be protected, and who remains accountable for the output.
Module 7. Production and handoff
Module focus: Asset resizing, copy variants, specs, developer notes, QA checklists. Primary live activity or lab: Create handoff notes from a mock design. Expected take-home output: Handoff package.
Topics and coverage
Asset resizing
- What it means: define asset resizing clearly and connect it to the module focus: asset resizing, copy variants, specs, developer notes, QA checklists.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
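Part of asset resizing reduces to arithmetic: given a master asset, compute the export sizes that preserve its aspect ratio. A minimal sketch (the actual pixel work still happens in a design or image tool; the function name is an assumption):

```python
# Plan export sizes from a master asset, preserving aspect ratio.
def resize_plan(master_w, master_h, target_widths):
    """For each target width, give (width, height) matching the master's
    aspect ratio, with height rounded to whole pixels."""
    return {w: (w, round(master_h * w / master_w)) for w in target_widths}
```

For a 3000x2000 master, resize_plan(3000, 2000, [1200, 600]) yields 1200x800 and 600x400 exports, which is the kind of spec table that belongs in a handoff package.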
Copy variants
- What it means: define copy variants clearly and connect it to the module focus: Asset resizing, copy variants, specs, developer notes, QA checklists.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Specs
- What it means: define specs clearly and connect it to the module focus: Asset resizing, copy variants, specs, developer notes, QA checklists.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Developer notes
- What it means: define developer notes clearly and connect it to the module focus: Asset resizing, copy variants, specs, developer notes, QA checklists.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
QA checklists
- What it means: define QA checklists clearly and connect it to the module focus: Asset resizing, copy variants, specs, developer notes, QA checklists.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Practice and evidence of learning
- Learners complete or discuss: Create handoff notes from a mock design.
- Learners produce: Handoff package.
- Instructor checks for accuracy, practical usefulness, clear assumptions, appropriate human review, and fit with the course audience.
- Learners revise once after feedback so the module contributes to the final project, portfolio, or playbook.
Minimum coverage before moving on
- Learners can explain the module vocabulary without relying on tool-generated text.
- Learners have seen one worked example, one hands-on application, and one limitation or failure case.
- Learners know what must be verified, what data must be protected, and who remains accountable for the output.
Module 8. IP, authenticity, and professional ethics
Module focus: Copyright, training data concerns, client disclosure, stock usage, originality, consent. Primary live activity or lab: Audit a design workflow for IP and disclosure risks. Expected take-home output: Design AI policy note.
Topics and coverage
Copyright
- What it means: define copyright clearly and connect it to the module focus: copyright, training data concerns, client disclosure, stock usage, originality, consent.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Training data concerns
- What it means: connect training data concerns to how generative tools are built: what work the underlying models were trained on, whether the creators of that work consented, and how that affects what a designer can safely ship.
- What to cover: provenance of training data, style mimicry of identifiable artists, opt-out and licensing programs, vendor terms of service, and the legal questions that remain unsettled.
- Demonstration: compare a prompt that names a living artist's style with one that describes visual attributes instead, and discuss how the risk profile differs.
- Evidence of learning: learners produce a short note on which tools they would trust for client work, including assumptions, limitations, and verification steps.
Client disclosure
- What it means: define client disclosure clearly and connect it to the module focus: Copyright, training data concerns, client disclosure, stock usage, originality, consent.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Stock usage
- What it means: define stock usage clearly and connect it to the module focus: Copyright, training data concerns, client disclosure, stock usage, originality, consent.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Originality
- What it means: define originality clearly and connect it to the module focus: Copyright, training data concerns, client disclosure, stock usage, originality, consent.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Consent
- What it means: define consent clearly and connect it to the module focus: Copyright, training data concerns, client disclosure, stock usage, originality, consent.
- What to cover: the core concept, why it matters, what good usage looks like, and where learners are likely to misunderstand it.
- Demonstration: give one simple example, one realistic example, and one failure or limitation example.
- Evidence of learning: learners explain the topic in their own words and apply it to a small artifact or decision.
Practice and evidence of learning
- Learners complete or discuss: Audit a design workflow for IP and disclosure risks.
- Learners produce: Design AI policy note.
- Instructor checks for accuracy, practical usefulness, clear assumptions, appropriate human review, and fit with the course audience.
- Learners revise once after feedback so the module contributes to the final project, portfolio, or playbook.
Minimum coverage before moving on
- Learners can explain the module vocabulary without relying on tool-generated text.
- Learners have seen one worked example, one hands-on application, and one limitation or failure case.
- Learners know what must be verified, what data must be protected, and who remains accountable for the output.
Labs, projects, and assessments
- Lab 1: Create a campaign moodboard with three visual territories and critique each.
- Lab 2: Turn fictional user research into personas and journey pain points.
- Lab 3: Produce a UI flow, microcopy variants, and accessibility checklist.
- Capstone: AI-assisted design system mini-kit or campaign concept with rationale, prompts, iterations, and handoff notes.
Evaluation approach
- 20% creative brief and prompt quality.
- 20% research synthesis.
- 25% prototype/design artifact.
- 15% critique and accessibility review.
- 20% final design playbook or capstone.
Recommended tools and materials
- Figma, Canva, and Adobe tools (if available); image generation tools; an AI assistant; accessibility checkers; design system references.
- Optional: prototyping tools and no-code front-end builders.
Safety, ethics, and governance emphasis
- Avoid uploading confidential client assets to unapproved tools.
- Use disclosure and licensing practices appropriate to client and platform.
- Treat AI images as drafts requiring art direction, editing, accessibility checks, and legal review where needed.
Delivery notes
- Designers respond well to live critique. Include before/after examples and prompt iteration logs.
- Build assignments around taste and decision-making, not only speed.
Instructor Build Checklist
- Prepare one short demo for each module and one learner activity that creates a saved artifact.
- Prepare examples that match the audience, local context, and likely tools learners can access.
- Add a verification step to every AI-generated output: factual check, source check, data sensitivity check, and quality review.
- Keep a running portfolio folder so each module contributes to the final project or learner playbook.
- Reserve time for reflection on what the learner did, what AI did, what was checked, and what remains uncertain.