AI Study Support + Insight
Study Support is an AI-powered tool within Brightspace, a cloud-based learning management system (LMS) designed for both educational institutions and corporate training.

For learners, it uses AI to deliver instant, personalized feedback and tailored study materials based on their class performance.

For educators, it provides insights into learner progress and content effectiveness, and leverages AI to support timely interventions.

Role
Leading Product Designer
Team
1 Product manager
1 Dev manager
1 ML expert
5 Developers
1 Principal designer (whom I reported to)
Timeline
Nov 2024 - Jan 2025
I'm unable to show detailed designs as this project is in beta.
Continue reading if you want to learn about my design process ;)

A sneak peek into the solution

💬 Personalized Quiz Feedback
Use AI to generate 2-3 sentences of feedback that highlight where the learner is doing well and where they need improvement
📚 Recommend personalized review material
Use AI to suggest the most relevant study material (course content or outside sources) for each student based on their quiz achievement
🤖 Feedback that reinforces machine learning
1. Instructors rate the quality of the recommended material -> informs future suggestions

2. Track learner engagement with recommended material and performance on subsequent assessments
🗄 The Background

Starting with the business needs

There is a gap/opportunity in the market
There are few effective learner-facing AI tools on the market. Most of our competitors (e.g., Canvas, Blackboard, Moodle) focus on building AI tools for instructors.

So we had a strategic plan to “Improve learner Achievement by utilizing Gen AI technology”
⚠️ The Problem

We are facing many challenges...

😵‍💫
Ambiguous project direction and scope
WHAT aspect of the learner experience should we focus on? WHERE should the experience exist on the platform? HOW should we leverage AI?
See how I dealt with it
🚀
Short timeline + High team pressure
We needed to ship something in 3 months, roughly 3x faster than other design projects of similar scope.
See how I dealt with it
🤨
Gen AI is unreliable
This is especially concerning in the education industry. No teacher wants to expose their learners to unreliable content, so it is important to control risk across the entire process.
See how I dealt with it
🎯 The Goal
Empower learners with AI-driven support that enhances their achievement, while ensuring reliability, clarity, and seamless integration within their learning experience
But, how do we do that?
🔎 Finding a direction

User needs pointed us in the right direction

We looked into secondary research about learning science and hosted a workshop where all content designers were invited to share ideas and knowledge about known user needs.
📘 According to Learning Science:
Learners tend to be overconfident and have inaccurate judgements of their own knowledge.
Academic user: K-12 or Higher Education
Learner
“I want to get good grades in school”
Studies course material prepared by the instructor because it is more likely to appear on a quiz or test
Instructor
“I want my students to master important learning concepts”
Strong focus on content accuracy
Uses learning outcomes/objectives to measure student success.
Corporate user
Learner
“I want to pass the compliance course” “I want to learn a new skill”
Studies course material and outside sources, whichever aids understanding.
Instructor
“I want employees to effectively apply key skills and knowledge in their roles.”
Lower focus on content accuracy; more willing to provide different media that help learners understand concepts.

Feasibility consideration

As we were in a time crunch, instead of building an entirely new system, it was more reasonable to “stand on the shoulders of giants” and build on top of existing workflows.

I identified the most suitable place to support learners by conducting a system analysis of the learning management system and analyzing the core education workflow.
This feature will initially live in the Quiz tool, as it is the only auto-graded assessment in the LMS, making it ideal for providing instant feedback and personalized support.
*System analysis of all types of activities in Brightspace LMS
*Workflow analysis
🎯 Define

A sneak peek into the solution

💬 Personalized Quiz Feedback
Use AI to generate 2-3 sentences of feedback that highlight where the learner is doing well and where they need improvement
📚 Recommend personalized review material
Use AI to suggest the most relevant study material (course content or outside sources) for each student based on their quiz achievement
🤖 Feedback that reinforces machine learning
1. Instructors rate the quality of the recommended material -> informs future suggestions

2. Track learner engagement with recommended material and performance on subsequent assessments
*This is 1/10 of the solution
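To illustrate the feedback loop, here is a minimal sketch (in Python) of the two signals that could feed back into the model: instructor ratings of the suggested material, and learner engagement plus performance on the next assessment. The data model and field names are my own assumptions for illustration, not the actual implementation.

```python
# A minimal sketch of the feedback-loop signals described above.
# The data model and field names are hypothetical, for illustration only.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class RecommendationSignal:
    material_id: str
    instructor_rating: Optional[int] = None  # e.g., 1-5; informs future suggestions
    learner_opened: bool = False             # engagement with the recommended material
    next_quiz_score: Optional[float] = None  # performance on the subsequent assessment


@dataclass
class FeedbackLog:
    signals: List[RecommendationSignal] = field(default_factory=list)

    def record(self, signal: RecommendationSignal) -> None:
        self.signals.append(signal)


log = FeedbackLog()
log.record(RecommendationSignal("unit-3-video", instructor_rating=4,
                                learner_opened=True, next_quiz_score=0.85))
```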
💡 System thinking

Now it’s time to dig into the details

I further analyzed the Quiz tool by:

1. Mapping instructor and student workflows and mental models.
2. Examining quiz settings to identify technical constraints for AI study support.
3. Identifying key touchpoints with other tools to enable AI nudges and transparency.
*Workflow, information architecture and setting analysis

UI Exploration

It's important to know which parts of the design process to prioritize vs. deprioritize, especially when working under time pressure.
🚀
How I dealt with short timeline
I focused on the ‘Understand’ and ‘Define’ phases over the ‘Explore’ phase to build a solid foundation, allowing me to move quickly without unnecessary iteration.

Complete workflow

✨ Designing AI 

Identify where and how to use AI

Gen AI is great at processing information, making it an optimal tool to evaluate learner achievement and provide feedback and suggestions.

However, there are two things we need to consider when using AI: 
- How does AI do what we want it to do?
- How do we ensure high-quality results?
HOW does AI provide feedback and recommend study material?
Our system uses learning outcomes 🏷️ to connect assessments with course materials. Instructors align these outcomes—set by their institution or the Ministry of Education—with quizzes and content.

For example:
If a student struggles with a quiz question, AI references these links to suggest relevant materials.

If no outcomes are linked—or too many are—AI falls back to semantic analysis of the quiz and content to surface the most relevant recommendations.
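To make that rule concrete, here is a minimal sketch in Python. The data model, the "too many outcomes" threshold, and the semantic_score function are all assumptions for illustration, not the production logic.

```python
# A minimal sketch of the recommendation rule described above.
# The data shapes, MAX_OUTCOMES threshold, and semantic_score function
# are hypothetical, for illustration only.

MAX_OUTCOMES = 5  # hypothetical cutoff for "too many" linked outcomes


def recommend_materials(question, materials, semantic_score):
    """Suggest study materials for a quiz question the learner struggled with.

    question:       dict with an "outcomes" list of aligned learning outcomes
    materials:      list of dicts, each with its own "outcomes" list
    semantic_score: callable(question, material) -> float, e.g. an
                    embedding-based similarity model
    """
    outcomes = set(question.get("outcomes", []))

    # Preferred path: follow the instructor-aligned learning-outcome links.
    if outcomes and len(outcomes) <= MAX_OUTCOMES:
        linked = [m for m in materials if outcomes & set(m["outcomes"])]
        if linked:
            return linked

    # Fallback: no outcomes linked, or too many to be meaningful, so rank
    # content by semantic relevance to the question instead.
    return sorted(materials, key=lambda m: semantic_score(question, m),
                  reverse=True)[:3]
```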
There are certain edge cases where the above rules do not apply.
I worked with the dev team to define solutions for these different use cases.
I ranked all conditions by how much they would affect the output and handled each use case accordingly.
*Edge case ranking and solution document
Quality and risk control with AI output
🏷️
Well-labeled dataset
We ensure AI quality by structuring data around learning outcomes, enabling AI to trace student performance and recommend relevant course materials when needed.
🛂
Human oversight
Instructors can set guidelines and guardrails, such as selecting approved course materials or credible sources, ensuring AI suggestions stay relevant and reliable.
📈
Continuous improvement
Instructors can review AI-generated feedback and recommendations, providing ratings and corrections that continuously refine AI learning.
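As a sketch of how the human-oversight guardrail could look in practice, recommendations can be filtered against instructor-approved sources before learners ever see them. The data shapes and the approved-source list below are assumptions for illustration, not the actual implementation.

```python
# Illustrative sketch of the instructor-oversight guardrail.
# The data shapes and approved-source list are hypothetical.

def apply_instructor_guardrails(recommendations, approved_sources):
    """Keep only suggestions that come from instructor-approved sources."""
    return [r for r in recommendations if r["source"] in approved_sources]


approved = {"Course content", "Khan Academy"}  # hypothetical whitelist set by the instructor
suggestions = [
    {"title": "Unit 3: Photosynthesis", "source": "Course content"},
    {"title": "Random blog post", "source": "Unverified web"},
]
print(apply_instructor_guardrails(suggestions, approved))
# Only the course-content suggestion survives the guardrail.
```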
🤖 Prompt Design

Prompt design is a collaborative effort

Each member of the product triad brings a unique perspective to the table:
🎤 Designer (User and Pedagogy perspective)
What kind of output is BEST for the user?
What is most optimal for education use?
🛠️ Prompt Engineer (Tech perspective)
How to write an effective prompt?
💰 Product manager (Business perspective)
Which solution is most cost-effective?

Introducing a
user-centred approach
to prompt design

The designer (myself) considered user needs, intentions, pedagogy, and ethics, and provided a guideline to the machine learning experts describing the ideal result for different scenarios.

As results were generated, I worked closely with the team to evaluate whether the output met our intent, and then iterated repeatedly until it did.
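To show what this collaboration can produce, here is a hedged sketch of how scenario-specific design guidelines might be folded into the feedback prompt. The scenarios, wording, and function below are hypothetical examples, not our actual prompt.

```python
# Illustrative sketch of folding scenario-specific design guidelines into the
# feedback prompt. The scenarios, wording, and function are hypothetical.

SCENARIO_GUIDELINES = {
    "low_score": "Be encouraging; name one concrete concept to review first.",
    "high_score": "Affirm mastery; suggest one stretch topic to explore next.",
}


def build_feedback_prompt(learner_name, quiz_summary, scenario):
    guideline = SCENARIO_GUIDELINES[scenario]
    return (
        "You are a supportive study assistant.\n"
        f"Guideline: {guideline}\n"
        "Write 2-3 sentences of feedback. Highlight what the learner did "
        "well and what needs improvement. Do not invent quiz content.\n\n"
        f"Learner: {learner_name}\nQuiz summary: {quiz_summary}"
    )


print(build_feedback_prompt("Ana", "7/10; missed both questions on osmosis",
                            "low_score"))
```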
🎙️
The process was very well received
I also shared my experience with the entire design team and at the company-wide annual conference, INFUSION.
Beyond project work

🎤 Becoming the voice of AI designer

As a designer on the AI team, I am grateful to have first-hand experience working with this new technology, and now I am sharing my knowledge with others.
Some things that I am doing:
📋  Helping establish a Design x AI approach
💡  Introducing new practices and AI tools to the team
👩🏻  Mentoring other designers on prompt design
🎤  Advocating for designers' involvement at the company-wide conference