
DreamCollege AI

Building trust in AI-powered college counseling — from blank Figma file to 150,000 students in 4 months.

Role Lead Designer (0→1)
Timeline 4 months
Team 4 Devs · 4 UX · CEO · CFO
Type Web & Mobile SaaS
DreamCollege AI platform across devices


Outcomes

150K users · 4.2/5 AI trust · 63% retention lift — in 4 months

150K+ lifetime users scaled in 4 months
+63% retention through conversational AI improvements
73% task success rate improvement via simplified workflows
4.2/5 AI trust score (up from 2.8/5 pre-redesign)
+40% boost in average session duration
-52% task completion time (41% → 19% time-on-task)
-45% design-to-dev handoff time via design tokens
-33% issue resolution time in month one
01
Overview

Starting with a thank you

I appreciate you taking the time to review this case study. I chose this example because it demonstrates three things: shipping fast on a 0→1, high-visibility product; identifying and optimizing a key friction issue using real data; and influencing cross-functional roadmap decisions through user research.

Headquarters

Frisco, Texas

Founded

2023

Industry

SaaS · EdTech

Company Size

11–50

Skills

UX Research · Systems Thinking · Prototyping · IA · Usability Testing · Localization

Platform

Web · Mobile · Conversational AI

02
Problem

A critical trust deficit in a $591B market

When I joined DreamCollege AI at the ideation phase, the college admissions landscape was facing a critical trust deficit. Students and families were making life-changing decisions worth hundreds of thousands of dollars — yet they were hesitant to trust AI with guidance that could shape their futures.

Core Challenge: How might we design an AI-powered college counseling platform that feels as trustworthy, personal, and reliable as working with a human counselor — while serving users at scale?
  • High stakes: College decisions represent $591.1B in the US education market in 2025
  • Trust barrier: 70% of prospective users expressed skepticism about AI for major life decisions
  • Market gap: Traditional counselors serve 1–2 dozen students; we needed to serve thousands
  • Timing pressure: 4-month runway with constrained budget and talent pool
  • Slow Access to Advice: limited personalized support available
  • Daunting Application Process: complex applications, tight deadlines
  • Unclear Career Paths: lack of guidance on majors
  • Too Many Options: complex choices, complex criteria

Key problem areas identified in discovery

Conversational AI drove four outcome areas:
  1. User Growth: 150K+ users in 4 months
  2. Engagement Metrics: increased session duration
  3. Operational Efficiency: reduced issue resolution
  4. Team Efficiency: faster design handoff

Conversational AI as the driver of all four outcome areas

03
Research

End-to-end design ownership — Research → IA → UX → UI → Testing

I led end-to-end design from 0→1, translating ambiguous AI-heavy business and user requirements into clear, user-centered experiences across: conversational AI interfaces, multi-step onboarding, payment flows, and core product features.

01
End-to-End Design
Led 0→1 design across conversational AI chatbots, hub-and-spoke onboarding, payment flows, subscription management, and all core product surfaces.
02
Design Foundation
Built a 48-component design system with tokens, naming conventions mirroring front-end frameworks, and Loom walkthroughs — cutting handoff time by 45%.
03
Drove Research
35+ student interviews, 28 usability sessions (Hotjar + Tobii eye-tracking), 12 stakeholder interviews. Iterative A/B testing on all critical conversion points.
  • 35+ user interviews: high school students interviewed
  • 28 usability testing sessions: heatmaps and eye-tracking
  • 12 stakeholder interviews: counselors and parents interviewed
  • 12 iterative A/B testing rounds

Research scope — 75 total sessions across 4 methods

Research touchpoints (n=75 sessions): student interviews 35 (47%) · usability sessions 28 (37%) · stakeholder interviews 12 (16%)

Research touchpoints breakdown (n=75)

Three trust barriers that shaped everything

Through 35+ student interviews and 12 parent/counselor sessions, three patterns kept surfacing. Naming them gave the whole team a shared framework for every design decision that followed.

01
"Black Box" Anxiety
Users felt uncomfortable when they couldn't see what the AI was thinking. "It's like asking for directions from someone wearing a blindfold." The fix: make AI reasoning visible at every step.
02
The Comparison Trap
Students weren't comparing us to other AI tools — they were comparing us to a "good counselor." The fix: personalization signals at every touchpoint.
03
Emotional ≠ Logical Stakes
Even when users logically understood AI capabilities, their emotional response was skepticism. The fix: progressive trust built through small wins, not big asks.

"It's like asking for directions from someone wearing a blindfold — even if they're right, I can't trust it."

— High school student, discovery interview
04
Process

Six phases across twelve weeks

Discovery and design ran in parallel due to the 4-month runway. Competitive analysis of 8 platforms, journey mapping across 6 personas, and heuristic evaluation of AI chatbots in education informed every decision.

Project timeline — 12 weeks, 6 phases

01
Discovery & Research
Weeks 1–2. 35+ student interviews, 12 parent/counselor interviews. Competitive audit of 8 platforms. Identified 3 trust barriers.
02
Problem Framing
Weeks 2–3. Established 3 design principles. Journey mapping across 6 personas. Aligned roadmap with CEO on Trust-First strategy.
03
IA & Journey Redesign
Weeks 3–4. Hub-and-spoke onboarding architecture. Information hierarchy redesign. Navigation structure validated with 8 users.
04
Visual System & DS
Weeks 4–6. 48-component design system. Design tokens for color, type, spacing. Weekly design-dev syncs to align implementation.
05
Conversational AI UX
Weeks 5–8. LLM response design, streaming states, trust layers. Drop-off reduced from 31% to 7%.
06
Testing & Iteration
Weeks 7–12. Hotjar heatmaps, Tobii eye-tracking, 12 A/B tests. Continuous iteration until all key metrics hit targets.
05
Design

Three north-star principles · 48 components · Zero design drift

01
Transparent by Default
Show users what's happening behind the scenes at every step. Visual cues for AI reasoning. If the AI can't explain it, it shouldn't say it.
02
Human-Centered AI
Make AI feel like an intelligent assistant, not a replacement for human judgment. Every interaction should feel like support, never surveillance.
03
Progressive Trust
Earn trust incrementally through small wins before asking for big commitments. Always give before you take. Users should feel confident before they commit.

Visual Design & Design System

Working closely with the CEO and branding stakeholders, I developed a visual language balancing approachability with credibility. A gradient-heavy palette (blues → purples) conveyed innovation while differentiating us from corporate competitors.

  • 48 reusable components (buttons, forms, cards, modals, nav)
  • Design tokens for colors, typography, spacing, and shadows
  • Naming conventions mirroring front-end frameworks (e.g., btn-primary-lg)
  • Documentation for states, variations, and usage guidelines
  • Weekly design-dev syncs with shared Figma libraries
Result: Design-to-development handoff time dropped 45%, and developers could implement 80% of designs without clarification questions.
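The token layer above can be sketched as a small TypeScript module. This is a minimal illustration of the approach, not the production system: the color values, scale steps, and the `btnPrimaryLg` alias are all placeholder assumptions.

```typescript
// Illustrative design tokens mirroring front-end naming conventions.
// All values are placeholders, not the production DreamCollege palette.
export const tokens = {
  color: {
    primary: { 500: "#4f46e5", 700: "#4338ca" }, // blue → purple gradient stops
    neutral: { 100: "#f5f5f5", 900: "#171717" },
    semantic: { success: "#16a34a", error: "#dc2626" },
  },
  spacing: { sm: 8, md: 16, lg: 24 }, // px
  type: { body: 16, h1: 32 }, // px
} as const;

// Component-level aliases follow the btn-{variant}-{size} convention,
// so Figma layer names and code identifiers match one-to-one.
export const btnPrimaryLg = {
  background: tokens.color.primary[500],
  paddingX: tokens.spacing.lg,
  fontSize: tokens.type.body,
};
```

Because components reference tokens rather than raw values, a palette or spacing change propagates everywhere without touching component code — the property that made the 45% handoff reduction possible.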
DreamCollege AI color design system

Color system — primary, accent, neutral, semantic

Full DreamCollege AI design system overview

Design system overview — typography, components, screens

Conversational AI — from 31% drop-off to 7%

One of the biggest UX challenges: keeping users engaged while the LLM generated responses (3–8 second delays). Initial testing showed 31% of users clicked away during loading, reporting they felt "ignored" or unsure if the system was working.

I designed a three-layer trust system: immediate acknowledgment (streaming animation), progressive disclosure (structured response format), and explainability (inline "why" callouts for every recommendation).
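The three layers map naturally onto UI states. A hedged sketch of that flow, in TypeScript: the token stream and `fetchExplanation` are hypothetical stand-ins for the real LLM backend, and the state names are illustrative.

```typescript
// Sketch of the three-layer response flow: acknowledge immediately,
// stream tokens progressively, then attach a "why" explanation.
type ChatState =
  | { phase: "acknowledged" }                     // layer 1: instant feedback
  | { phase: "streaming"; text: string }          // layer 2: progressive disclosure
  | { phase: "done"; text: string; why: string }; // layer 3: explainability

async function* respond(
  llmTokens: AsyncIterable<string>,               // hypothetical LLM token stream
  fetchExplanation: () => Promise<string>,        // hypothetical "why" lookup
): AsyncGenerator<ChatState> {
  yield { phase: "acknowledged" }; // render a typing indicator right away
  let text = "";
  for await (const t of llmTokens) {
    text += t;
    yield { phase: "streaming", text }; // UI repaints as tokens arrive
  }
  yield { phase: "done", text, why: await fetchExplanation() };
}
```

The key design choice is that the user never faces a silent 3–8 second gap: the "acknowledged" state renders before any model output exists, which is what addressed the "felt ignored" feedback.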

AI transparency: 3 layers

Three-layer AI transparency model

Drop-off −77% · Follow-up AI engagement +63%

Progressive profile building and transparent AI in production


06
Testing

Three decisions that changed the product — backed by data

Three critical design decisions with heatmaps

Decision 1: Profile-First Onboarding · Decision 2: Copy Profile from Sample · Decision 3: Collapsible Navigation

Continuous validation — heatmaps, eye-tracking, A/B tests

Hotjar Heatmaps revealed that users were clicking on non-interactive elements (icons, headings) expecting them to be buttons. I redesigned these as genuine interactive elements or removed them to reduce confusion.

Tobii Eye-Tracking (12 sessions) discovered users were skipping important context text styled too similarly to body copy. I introduced icon badges, colored callouts, and larger font sizes for key decision points.

  • 12 iterative A/B testing rounds across critical conversion points
  • 28 usability sessions using Hotjar + Tobii
  • 12 stakeholder interviews to align on roadmap priorities
  • 35+ user interviews across grades 9–12
  • CTA Button Text (onboarding save action): "Save and Continue" at 54% click-through vs. winner "Save My Progress" at 62% (+8%)
  • AI Response Styling (chat bubble format): plain text in bubbles at 3.2/5 trust rating vs. winner bullets + bold headers at 3.8/5 (+0.6)
  • College Card Layout (browse/discovery view): horizontal cards with small images at 23% click-through vs. winner vertical cards with large images at 31% (+8%)
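As a sanity check on results like the CTA test, a two-proportion z-test can be run on each variant pair. A minimal sketch in TypeScript — the sample sizes (500 users per arm) are assumptions, since the case study reports only the rates:

```typescript
// Two-proportion z-test for A/B click-through results.
// x = clicks, n = users per variant; n values here are assumed.
function twoProportionZ(x1: number, n1: number, x2: number, n2: number): number {
  const p1 = x1 / n1;
  const p2 = x2 / n2;
  const pPool = (x1 + x2) / (n1 + n2); // pooled proportion under H0
  const se = Math.sqrt(pPool * (1 - pPool) * (1 / n1 + 1 / n2));
  return (p2 - p1) / se;
}

// e.g. 54% of 500 vs. 62% of 500 clicked the CTA:
const z = twoProportionZ(270, 500, 310, 500);
// |z| > 1.96 → significant at the 5% level
```

With these assumed sample sizes the 54% → 62% lift clears the 1.96 threshold; with much smaller arms it would not, which is why each of the 12 rounds ran until the sample was large enough to call.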
07
Insights

What the data surfaced that nobody expected

Stakeholder Management

The CEO wanted to prioritize monetization features (payment, upgrades) while research clearly showed we needed to nail core trust and usability first. I created a "Trust-First Roadmap" presentation showing user research quotes, conversion funnel data with a 47% drop-off before payment, and a proposed sequence: fix core experience → build trust → introduce premium features.

Outcome: CEO agreed to delay payment v2 by 2 weeks to focus on profile UX improvements. This resulted in 22% higher conversion when payment was eventually released.

The lesson: Stakeholder alignment isn't about winning arguments. It's about making the data so clear that the right decision becomes obvious. Frame around business outcomes, not design preferences.

Designing for AI is designing for the unknown

AI products introduce design challenges that traditional UX patterns don't solve: responses change as the model learns, user control must be preserved, and a user base spanning 30+ countries brings unexpected localization complexity.

AI design challenges — surface vs. depth

Three things research found that nobody expected

01
Students Don't Read Instructions
Hotjar showed 94% of users skipped instructional text. We replaced text-heavy instructions with visual examples and progressive disclosure. Result: 41% improvement in form completion rates.
02
Parents Are Hidden Users
Eye-tracking revealed parents were frequently present — sometimes completing profiles themselves. We added a "Share with Parents" feature and rewrote copy for both audiences. Result: 19% increase in completed applications.
03
Anxiety-Driven Behavior
Students checked the platform 3–4× more than necessary. We built a "Your Journey" dashboard with "No changes since you last checked" status. Result: 28% reduction in unnecessary sessions.
08
Reflection

What I'd do differently — and what I'm proud of

What I'd Do Differently

  • Conduct usability tests in Week 2 rather than Week 5 — we could have caught navigation issues sooner and saved 2 weeks of iteration
  • Involve parents earlier as a distinct user group — their influence was underestimated until mid-project
  • Establish design tokens before the first production component — retrofitting is painful

What I'm Proud Of

  • Solving the trust problem — users feel confident making life-changing decisions because the AI earns trust through transparency, not authority
  • A design system architected to handle 10× growth without major refactoring — serving 150K today and built for more
  • Cross-functional leadership — bridging design, engineering, and business to translate user needs into measurable outcomes

Six lessons from building at the intersection of AI and trust

01
Trust is designed, not assumed
Users don't trust AI because it's capable — they trust it when the interface makes reasoning legible. Explainability is a design feature, not an engineering afterthought.
02
Reduce before you add
The biggest wins came from removing complexity. Simplification required more design judgment than building new features — and delivered more measurable impact.
03
Emotion is a UX signal
Students making life-defining decisions brought anxiety into every session. Designing for emotional state — not just task completion — was what set this apart.
04
Systems thinking multiplies impact
A well-built design system doesn't just save time — it creates organizational clarity. When everyone speaks the same design language, decisions get made faster and better.
05
Microcopy shapes confidence
At high-stakes moments, words do as much work as UI. Clear, reassuring copy around AI outputs reduced hesitation and helped users move forward with conviction.
06
Measure behavior, not opinions
Self-reported trust mattered, but completion rate, drop-off, and return usage told the real story. Instrumenting those signals made iteration faster and far more objective.

Design System & Collaboration

Weekly Design-Dev Sync

  • Demo new designs with interactive Figma prototypes
  • Review implementation against specs side-by-side
  • Discuss technical feasibility of upcoming features
  • Align on responsive behaviors and edge cases

Design Handoff Process

  • Figma files with organized layers, auto-layout, and interaction annotations
  • Assets exported at 1×, 2×, 3× with consistent naming
  • Spec document: user flows, copy, error states, success metrics
  • Loom video walkthrough (5–8 min per feature)

Conclusion

DreamCollege AI taught me that great design isn't about making things pretty — it's about building trust when stakes are high, simplifying complexity without losing depth, and creating systems that scale.

In 4 months, we went from a blank Figma file to a platform serving 150,000 students making one of the biggest decisions of their lives. The real achievement was earning their trust.

Glad we could cross paths.
Out of anywhere you could be, you're here.
