Persona Development
Stakeholder Interviews
Journey Maps
Heuristic Evaluation
Usability Testing
Analytics Audits

UX Research Project
Municipal Building Permit Application System Redesign
CLIENT: City of Richmond Hill
ROLE: Lead UX Researcher
DURATION: 5 Months
RESEARCH FOCUS: Qualitative & Quantitative
Project Summary
As Lead UX Researcher, I designed and executed a comprehensive research initiative to identify and resolve critical usability issues in the City of Richmond Hill's online building permit application system. I led a mixed-methods research approach combining qualitative and quantitative techniques, personally conducting 20+ interviews, analyzing 6 months of analytics data, and running 3 rounds of usability testing with 16 participants.
My research uncovered significant user experience barriers causing a 62% drop-off rate and overwhelming customer support teams, and my recommendations directly led to measurable improvements in completion rates and user satisfaction.
Metrics & Results
Task Completion Rate: 52% to 94%
Drop-off Reduction: 70%
Support Calls: -40%
Completion Time: reduced from 42 min to 15 min






Research Objective
What, Where, How
I partnered with the Planning Director to define the research scope and established five primary research questions that would guide my investigation:
Primary Research Questions:
- What are the primary pain points preventing users from completing permit applications?
- Where in the application flow do users experience the most friction?
- What are the differences in needs between residential homeowners and professional contractors?
- How do we reduce cognitive load and improve information architecture?
- What accessibility barriers exist for users with varying digital literacy levels?
My Role & Responsibilities
As the Lead UX Researcher on this project, I was responsible for:
Research Strategy & Design:
- Designed the comprehensive mixed-methods research approach
- Selected appropriate methodologies based on research questions and constraints
- Created all research protocols, screeners, and discussion guides
- Secured stakeholder buy-in and a research budget of $45,000
Research Execution:
- Personally conducted 8 stakeholder interviews and 12 user interviews
- Recruited and screened all research participants
- Moderated all 16 usability testing sessions across 3 rounds
- Led collaborative journey mapping workshops with users and stakeholders
Analysis & Synthesis:
- Analyzed 6 months of analytics data and identified critical drop-off points
- Conducted thematic coding of 20+ hours of interview transcripts using NVivo
- Synthesized findings from multiple sources to identify patterns
- Developed 2 research-based personas validated with stakeholders
Strategic Recommendations:
- Translated research findings into 10 prioritized, actionable recommendations
- Presented findings to executive leadership and secured implementation approval
- Advocated for user needs when balancing business and technical constraints
- Collaborated with design and development teams throughout implementation
Impact Measurement:
- Designed post-launch evaluation framework and success metrics
- Conducted longitudinal studies to measure sustained impact
- Reported ROI findings to demonstrate research value ($136K annual savings)
Research Methodology
I designed a comprehensive mixed-methods research approach that would triangulate findings across qualitative and quantitative data sources. This strategic combination ensured both depth of understanding and statistical confidence in our findings.
Phase 1: Discovery (Weeks 1-3)
I conducted stakeholder interviews, user interviews, and led a heuristic evaluation to understand the problem space from multiple perspectives.

Phase 2: Analysis (Weeks 4-5)
I synthesized data across sources, audited analytics, developed personas, and facilitated journey mapping workshops.

Phase 3: Validation (Weeks 6-14)
I designed and executed 3 rounds of usability testing, collaborating with the design team on iterative refinements.

Phase 4: Post-Launch Evaluation (Weeks 15-20)
I conducted longitudinal studies, deployed satisfaction surveys, and measured performance metrics to prove research ROI.
Research Methods Deep Dive
Stakeholder Interviews
- Sample: 8 city staff members (permit clerks, building inspectors, customer service reps)
- Method: Semi-structured 45-minute interviews I personally conducted
- Location: On-site at City Hall
- My Role: I designed the interview protocol and personally conducted all 8 interviews to understand internal pain points and organizational constraints
User Interviews
- Sample: 12 participants (6 homeowners, 6 contractors)
- Method: In-depth interviews with task walkthroughs
- Duration: 60 minutes per session
- My Role: I recruited all participants using targeted social media ads and community outreach, screened for digital literacy diversity, and personally moderated all 12 sessions
Heuristic Evaluation
- Framework: Nielsen's 10 Usability Heuristics
- Evaluators: 3 UX experts
- Focus: Navigation, error prevention, visual hierarchy
- My Role: I coordinated the evaluation, compiled findings from 3 evaluators, and prioritized issues by severity
Usability Testing
- Rounds: 3 iterations
- Method: Moderated remote testing via Zoom
- Tasks: 8 scenario-based tasks per session
- My Role: I designed all test scenarios, recruited participants, personally moderated all 16 sessions, and analyzed session recordings to identify patterns
Persona Development
- Method: Affinity mapping from interview data
- Personas: 2 primary user archetypes
- Validation: Reviewed with 4 stakeholders
- My Role: I led affinity mapping workshops, synthesized behavioral patterns into 2 distinct personas, and validated them through stakeholder review sessions
Journey Mapping
- Scope: End-to-end application process
- Method: Collaborative workshop with users
- Output: Pain points and opportunity areas
- My Role: I facilitated a full-day journey mapping workshop with 6 users and 4 stakeholders, synthesizing insights into a comprehensive journey map
Post-Launch Surveys
- Sample: 147 respondents (post-launch)
- Method: CSAT, NPS, SUS scales (see the NPS scoring sketch below)
- Response Rate: 34%
- My Role: I designed the survey instrument, distributed it through email and in-app prompts, and analyzed results to measure post-launch satisfaction
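For readers unfamiliar with NPS, the standard calculation is the percentage of promoters (ratings 9-10) minus the percentage of detractors (ratings 0-6). A minimal Python sketch, using made-up ratings rather than actual survey responses from this project:

```python
# Illustrative NPS calculation (standard formula: % promoters minus % detractors).
# The ratings below are made-up examples, not actual survey data from this project.

def net_promoter_score(scores: list[int]) -> float:
    """Compute NPS from 0-10 'likelihood to recommend' ratings."""
    promoters = sum(1 for s in scores if s >= 9)   # ratings 9-10
    detractors = sum(1 for s in scores if s <= 6)  # ratings 0-6
    return 100 * (promoters - detractors) / len(scores)

sample_ratings = [10, 9, 8, 7, 10, 6, 9, 3, 10, 8]
print(f"NPS: {net_promoter_score(sample_ratings):.0f}")  # NPS: 30
```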

Detailed Method 1: Stakeholder Interviews
Objective: Understand internal pain points, technical constraints, and organizational goals
I designed this research phase to ensure my recommendations would be both user-centered and organizationally feasible. I knew that understanding staff perspectives would be crucial for stakeholder buy-in later.
Participants:
- 3 Permit Clerks (frontline processing)
- 2 Building Inspectors (technical review)
- 2 Customer Service Representatives (support desk)
- 1 IT Manager (systems integration)
Key Questions I Asked:
- What are the most common issues users face when calling for help?
- Which parts of the application process cause the most errors?
- What technical limitations exist with the current system?
- What would make your job easier when processing applications?
My Analysis Approach:
I transcribed all interviews and conducted thematic coding using NVivo software, identifying 7 recurring themes across transcripts. This revealed that staff were just as frustrated with the system as users were.
Key Insight I Uncovered:
The IT Manager revealed that the document upload system had zero validation, meaning users could upload any file type, which explained the 47% rejection rate I later found in analytics.
Detailed Method 2: User Interviews with Task Analysis
Objective: Uncover user mental models, pain points, and contextual usage scenarios
I designed this method to observe real user behavior, not just hear what users said they did. The screen-sharing task walkthrough proved invaluable for seeing actual friction points in real-time.
Recruitment Strategy I Used:
I created targeted Facebook and community board ads that reached both homeowners and contractors. I intentionally screened for digital literacy diversity using a self-assessment scale I developed, ensuring I included users who struggle with technology, not just power users.
Recruitment Criteria:
- Homeowners: Applied for at least one permit in past 12 months
- Contractors: Submit 5+ permits annually
- Mix of successful and abandoned applications
- Age range: 28-67 years
- Digital literacy: Varied (screened with self-assessment)
Protocol I Designed:
- Background interview (15 min): Experience, motivations, previous permit history
- Screen-sharing task walkthrough (25 min): Attempt to start a new permit application
- Retrospective think-aloud (15 min): Discuss frustrations and expectations
- Card sorting exercise (5 min): Organize permit types and documentation requirements
My Analysis:
I created a behavioral coding matrix, built affinity diagrams to cluster pain points, and calculated task success metrics for each participant. This revealed stark differences between homeowner and contractor needs.
Critical Discovery I Made:
During session #4, a homeowner stopped mid-task and said "I feel completely overwhelmed." I probed deeper and discovered that the 847-word instruction page was the primary barrier. This insight led me to prioritize content simplification in my recommendations.

Detailed Method 3: Analytics Deep Dive

Objective: Quantify user behavior patterns and identify drop-off points
I knew qualitative research alone wouldn't convince stakeholders to invest in a redesign. I needed hard data showing business impact.
Data I Analyzed:
- Funnel analysis: Entry → Step 1 → Step 2 → Step 3 → Step 4 → Submission
- Heatmaps: Scroll depth, click patterns, rage clicks
- Session recordings: 50 randomly sampled abandoned sessions I personally reviewed
- Device breakdown: 45% mobile, 48% desktop, 7% tablet
- Error rates: Form validation failures by field
My Critical Finding:
I discovered that 62% of users abandoned at Step 3 (Document Upload), with an average time on page of 8.5 minutes before exit.
When I cross-referenced this with session recordings, I saw users repeatedly trying to upload files, getting errors, and giving up. This became the #1 priority in my recommendations.
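The funnel arithmetic behind a finding like this is straightforward. A minimal sketch with illustrative step counts, not the actual analytics export:

```python
# Minimal sketch of a step-to-step funnel drop-off calculation.
# Step names mirror the application flow; the counts are illustrative placeholders.

funnel = [
    ("Entry", 10000),
    ("Step 1: Applicant Info", 8200),
    ("Step 2: Project Details", 7100),
    ("Step 3: Document Upload", 6400),
    ("Step 4: Review", 2400),
    ("Submission", 2300),
]

for (prev_name, prev_count), (name, count) in zip(funnel, funnel[1:]):
    drop = 1 - count / prev_count
    print(f"{prev_name} -> {name}: {drop:.0%} drop-off")
```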
How I Used This Data:
In my stakeholder presentation, I showed the funnel drop-off visualization alongside video clips of users struggling.
This combination of quantitative and qualitative evidence was irrefutable and secured immediate buy-in.
Detailed Method 4: Three-Round Usability Testing
I advocated for three rounds of testing instead of the originally planned single round. I convinced the team that iterative testing would prevent costly post-launch fixes by validating solutions progressively.
Round 1: Low-Fidelity Wireframes (n=5)
Prototype: Grayscale wireframes in Figma
My Testing Approach: I moderated all 5 remote sessions via Zoom, using scenario-based tasks like "You want to add a fence to your backyard. Start the application process."
Key Findings I Identified:
- 3/5 participants couldn't locate FAQ/help resources
- Terminology like "setback requirements" confused all homeowners
- Progress indicator wasn't clear enough (users didn't know how many steps remained)
Success Rate: 40% task completion
My Action: I immediately met with the design team and advocated for three changes: clearer navigation, plain-language tooltips, and a numbered progress bar. I showed them video clips of users struggling to make my case.

Round 2: Mid-Fidelity Prototype (n=6)
Prototype: Interactive prototype with placeholder content
My Testing Approach: I recruited 2 additional participants for this round to increase confidence in findings.
Key Findings I Identified:
- Document upload requirements still unclear (4/6 participants uploaded wrong file types)
- File size limits not communicated until after upload failure
- Mobile users struggled with form field sizing
- Positive feedback on contextual tooltips (83% found them helpful)
Success Rate: 67% task completion
My Action: I pushed back on the design team's suggestion to move forward. I insisted we needed to solve the document upload problem before launch, showing them that 4/6 users still failed at this step. I recommended specific solutions: upfront format guidance, sample documents, and real-time validation.

Round 3: High-Fidelity Prototype (n=5)
Prototype: Fully designed, interactive prototype with real content
My Testing Approach: I used the same tasks as Round 1 to measure improvement against baseline.
Key Findings I Celebrated:
- 90% task completion rate
- Average time to complete: 12 minutes (down from 38 minutes in baseline)
- All participants praised the progress bar and status tracker
- Mobile responsiveness received positive feedback (100% said it was "easy to use")
- Minor accessibility issues I identified with color contrast (addressed before launch)
System Usability Scale (SUS) Score: 82.5 (Grade A) - I calculated this based on standardized scoring
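For reference, standardized SUS scoring subtracts 1 from each odd-numbered item, subtracts each even-numbered item from 5, and multiplies the sum by 2.5. A short sketch with a hypothetical participant's responses, not actual session data:

```python
# Standard SUS scoring: odd-numbered items contribute (response - 1),
# even-numbered items contribute (5 - response); the sum is scaled by 2.5.
# The responses below are for a hypothetical participant.

def sus_score(responses: list[int]) -> float:
    """responses: ten 1-5 Likert ratings for the ten SUS items, in order."""
    assert len(responses) == 10
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

participant = [5, 1, 4, 2, 5, 1, 4, 2, 5, 2]
print(sus_score(participant))  # 87.5 for this example
```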
My Recommendation: I confidently recommended proceeding to development, presenting executive leadership with a comparison deck showing improvement from 40% to 90% task completion across testing rounds.

Key Research Findings
Through my analysis of multiple data sources, I identified four critical insights that I prioritized for immediate action:
Critical Insight #1: Cognitive Overload
"I felt completely overwhelmed. There was so much text on the screen and I didn't know where to start. I gave up and just called the city office instead." - Homeowner, Age 52
Evidence I Gathered:
- Average reading level required: 14th grade (I analyzed content using Hemingway Editor; a programmatic version of this check is sketched below)
- Average words per page: 847 words (I manually counted across all pages)
- 8/12 interview participants mentioned feeling "overwhelmed"
- Heatmaps I reviewed showed only 23% scroll depth on instruction pages
My Interpretation: Users were experiencing decision paralysis from information overload. I recognized this as a fundamental information architecture problem, not just a content problem.
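For context, reading-level checks like this can also be reproduced programmatically. A small sketch using the textstat library on a made-up excerpt (the project analysis itself used Hemingway Editor):

```python
# A quick programmatic readability check using the textstat library
# (pip install textstat). The excerpt is invented permit-style language,
# not the actual site content.

import textstat

page_text = (
    "Applicants must furnish a certified plat of survey delineating all "
    "existing structures, easements, and setback encroachments prior to "
    "the issuance of a residential accessory structure permit."
)

print("Flesch-Kincaid grade level:", textstat.flesch_kincaid_grade(page_text))
print("Word count:", textstat.lexicon_count(page_text))
```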
Critical Insight #2: Document Upload Confusion
"I uploaded my property survey, but then it said 'invalid format.
' What format do they want? There were no examples or instructions." - Contractor, Age 41
Evidence I Gathered:
- 62% drop-off rate at Step 3 that I identified in analytics
- 47% of uploaded files were rejected (I pulled this from system logs)
- Average 3.2 support calls per rejected upload (I analyzed customer service data)
- No upfront guidance on acceptable file types, sizes, or naming conventions (I documented through heuristic evaluation)
My Interpretation: This was a classic "error prevention" failure (Nielsen's Heuristic #5). Users needed guidance BEFORE attempting upload, not error messages AFTER failure. This became my #1 design recommendation.
Critical Insight #3: Mobile Usability Failures
Evidence I Compiled:
- 45% of traffic was mobile, but mobile completion rate was only 18% (I segmented analytics data)
- Form fields too small: average tap target 38px, below the WCAG minimum of 44px (I measured in browser dev tools)
- No mobile-optimized document upload (I identified through device testing)
- 6/12 participants I interviewed mentioned attempting on mobile first, then switching to desktop
My Interpretation: The team had deprioritized mobile, assuming permits were "desktop tasks." I used data to prove mobile was critical, not optional. I advocated for mobile-first design despite initial resistance.
Critical Insight #4: Lack of Progress Transparency
"I submitted my application three weeks ago and have no idea what's happening. Is it being reviewed? Is it approved? I have no clue." - Homeowner, Age 35
Evidence I Analyzed:
- 76% of support calls were status inquiries (I categorized call center logs)
- No automated status updates (I confirmed with the IT Manager)
- No estimated timeline provided (I documented in heuristic evaluation)
- 10/12 participants I spoke with expressed frustration about the lack of transparency
My Interpretation: This was causing significant operational burden (835 calls/month) that could be eliminated through automation. I calculated that this represented $47K in annual support costs.
Positive Finding I Discovered: Trust in Government Systems
Despite frustrations, 91% of participants I interviewed expressed willingness to use an improved online system, citing convenience and time savings as primary motivators. This insight helped me secure stakeholder buy-in—users WANTED digital, they just needed it to work better.
Synthesized Pain Points by User Type
I created this comparison matrix to help the design team understand where to prioritize mobile optimization (homeowners) versus power-user features (contractors):

Research-Based Personas
I developed two primary personas by conducting affinity mapping workshops with interview data. I validated these personas with 4 stakeholders to ensure they reflected real user segments, not assumptions.

Persona 1: "Homeowner Amy"
Demographics: Age 42, Marketing Manager, First-time permit applicant
Digital Literacy: Moderate (comfortable with email and social media, less so with government websites)
Goals:
- Get a fence permit for backyard renovation
- Understand requirements clearly without calling for help
- Complete application quickly during lunch break
Pain Points:
- Overwhelmed by legal language and technical terms
- Doesn't know what documents are needed upfront
- Anxious about making mistakes that could delay approval
- Limited time during work hours to deal with government paperwork
Quote: "I just want to know: what do I need to provide, how long will it take, and am I doing this right?"
Research Evidence: Represents 58% of applicant base (occasional/one-time users) based on my analysis of application frequency data
How I Used This Persona: I referenced Amy when advocating for plain language and mobile-first design. When stakeholders suggested keeping technical terminology, I asked "Would Amy understand this?"

Persona 2: "Contractor Ethan"
Demographics: Age 38, Licensed Contractor, Submits 15-20 permits annually
Digital Literacy: High (uses multiple project management and trade software daily)
Goals:
- Submit permit applications efficiently for multiple projects
- Track status of multiple applications simultaneously
- Access permit history and documentation for client records
Pain Points:
- Repetitive data entry for similar permit types
- No bulk submission or template functionality
- Can't easily track which permits are approved vs. pending
- Client pressure to provide accurate timelines
Quote: "Time is money. I need a system that lets me submit fast, track everything in one place, and doesn't make me re-enter the same info over and over."
Research Evidence: Represents 34% of applicant base but 71% of total permit volume (I calculated this from application logs)
How I Used This Persona: I used Ethan to justify contractor-specific features like templates and bulk upload, showing that serving 34% of users efficiently would process 71% of permits.
Before State: Baseline Metrics
I established comprehensive baseline metrics to prove research ROI. I knew I needed clear before/after comparisons to demonstrate impact.


Research-Informed Recommendations
I synthesized all research findings into 10 prioritized recommendations. I organized these using a 2x2 matrix (impact vs. effort) to help stakeholders make implementation decisions.
My Prioritization Framework: I weighted recommendations based on the factors below (a small scoring sketch follows the list):
- Frequency of issue across data sources (how many users affected?)
- Severity of impact (how much does it hurt task completion?)
- Business value (what's the cost of NOT fixing?)
- Implementation feasibility (can we do it within constraints?)
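A minimal sketch of how such a weighted score could be computed; the weights and the example ratings are illustrative assumptions, not the actual values used on this project:

```python
# Illustrative impact-vs-effort weighting for ranking recommendations.
# Weights and the example entry are assumptions for demonstration only.

WEIGHTS = {"frequency": 0.3, "severity": 0.3, "business_value": 0.25, "feasibility": 0.15}

def priority_score(ratings: dict[str, int]) -> float:
    """ratings: 1-5 rating for each criterion; returns a weighted score out of 5."""
    return sum(WEIGHTS[k] * ratings[k] for k in WEIGHTS)

upload_redesign = {"frequency": 5, "severity": 5, "business_value": 5, "feasibility": 4}
print(priority_score(upload_redesign))  # 4.85
```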
Recommendation 1: Simplify Content and Reading Level
- Finding I Identified: 14th grade reading level, 847 words per page, 77% scroll abandonment
- My Recommendation: Rewrite all content at 8th grade reading level; reduce to max 300 words per page; use progressive disclosure for detailed information
- Expected Impact: Reduce cognitive overwhelm, improve comprehension for 80%+ of users
- How I Advocated: I partnered with the Communications team and personally rewrote the first 3 pages as examples, showing stakeholders it was possible to maintain legal compliance while improving readability
Recommendation 2: Redesign the Document Upload Experience
- Finding I Identified: 62% drop-off at upload step, 47% file rejection rate
- My Recommendation:
  - Display accepted formats upfront (PDF, JPG, PNG, max 10MB)
  - Show file examples and templates
  - Provide real-time validation before upload (see the validation sketch after this recommendation)
  - Add drag-and-drop functionality
  - Implement auto-saving for partial progress
- Expected Impact: Reduce drop-off by 50%, decrease support calls by 35%
- How I Advocated: I showed stakeholders video clips of 4 different users failing at this step, making the problem undeniable. I then presented a competitive analysis of 5 other municipal systems that solved this well.
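To illustrate the "validate before upload" idea, here is a minimal sketch of an upfront file check using the formats and size limit from this recommendation; the function name and messages are illustrative, not the production implementation:

```python
# Sketch of the upfront file validation recommended above: check format and size
# before the upload starts, and return a plain-language message the user can act on.

import os

ACCEPTED_EXTENSIONS = {".pdf", ".jpg", ".jpeg", ".png"}
MAX_SIZE_BYTES = 10 * 1024 * 1024  # 10 MB, per the recommendation

def validate_document(filename: str, size_bytes: int) -> tuple[bool, str]:
    ext = os.path.splitext(filename)[1].lower()
    if ext not in ACCEPTED_EXTENSIONS:
        return False, f"'{ext or filename}' is not accepted. Please upload a PDF, JPG, or PNG."
    if size_bytes > MAX_SIZE_BYTES:
        return False, "This file is larger than 10 MB. Try scanning at a lower resolution."
    return True, "Looks good. Ready to upload."

print(validate_document("property_survey.heic", 2_000_000))
print(validate_document("site_plan.pdf", 1_500_000))
```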
Recommendation 3: Mobile-First Optimization
- Finding I Identified: 45% mobile traffic, only 18% mobile completion
- My Recommendation:
  - Increase tap targets to 44px minimum (WCAG compliance)
  - Optimize form fields for mobile keyboards
  - Enable camera upload for documents
  - Implement mobile-optimized file picker
- Expected Impact: Increase mobile completion to 70%+
- How I Advocated: The team initially wanted to deprioritize mobile. I presented data showing 45% of traffic was mobile and calculated that we were losing 37% of potential completions. I reframed mobile not as "nice-to-have" but as serving nearly half our users.
Recommendation 4: Application Status Tracking and Notifications
- Finding I Identified: 76% of support calls are status inquiries
- My Recommendation:
  - Dashboard showing all applications and current status
  - Automated email/SMS notifications at each stage
  - Estimated timeline for review and approval
  - Direct messaging with permit reviewers
- Expected Impact: Reduce status inquiry calls by 60%
- How I Advocated: I calculated that 76% of 847 monthly calls = 644 status inquiries at $140 average handling cost = $90K annual waste. This ROI calculation convinced the IT Manager to prioritize automated notifications.
Recommendation 5: Auto-Save and Resume
- Finding I Observed: 5/6 homeowners needed to gather documents mid-application
- My Recommendation: Auto-save every 30 seconds; email resume link; show completion percentage
- Expected Impact: Reduce abandonment by 25%
Recommendation 6: Plain-Language Tooltips and Contextual Help
- Finding I Documented: All homeowners confused by technical jargon
- My Recommendation: Hover tooltips explaining terms like "setback," "easement," "zoning variance" in plain language; embedded FAQ; contextual help videos
- Expected Impact: Reduce help-seeking behavior by 40%
- How I Contributed: I personally wrote the first 15 tooltip definitions, working with the Planning team to ensure legal accuracy while maintaining plain language
Recommendation 7: Contractor Power-User Features
- Finding I Calculated: 71% of permit volume from 34% of users (contractors)
- My Recommendation:
  - Template system for repeat permit types
  - Bulk upload functionality
  - Multi-project dashboard
  - Pre-filled property information from previous permits
- Expected Impact: Reduce contractor submission time by 60%
- How I Advocated: I showed that serving power users efficiently would process the majority of permits, directly addressing the department's backlog concerns
Recommendation 8: Address-Based Auto-Fill
- My Recommendation: Auto-fill property details, zoning information, and requirements based on address
- Expected Impact: Save 3-5 minutes per application
Recommendation 9: Visible Progress Indicator
- My Recommendation: Step-by-step progress bar showing completion percentage and remaining steps
- Expected Impact: Improve user confidence and reduce anxiety
Recommendation 10: Permit Requirements Wizard
- My Recommendation: Interactive tool that helps users determine exact requirements before starting
- Expected Impact: Reduce errors and incomplete submissions by 30%
Design Solutions (Research-Validated)
Solution 1: Simplified Step-by-Step Flow
- Before: All information on single long scrolling page
- After: 5 discrete steps with clear progress indicator
- My Validation: 90% of test users I observed successfully navigated all steps without confusion
- My Contribution: I pushed back on the initial 8-step design, arguing from research that users wanted simplicity. I advocated for consolidating to 5 steps maximum.
Solution 2: Document Upload Redesign
- Before: Generic upload button, no guidance
- After:
  - Visual card layout showing each required document
  - Checkmarks when uploaded successfully
  - Format requirements and file size limits displayed
  - Sample documents linked
  - Drag-and-drop interface
- My Validation: File upload success rate increased from 53% to 96% in testing I conducted
- My Contribution: I created the list of required documents for each permit type, working with permit clerks to ensure accuracy. I also sourced sample documents users could reference.
Solution 3: Mobile-Optimized Experience
- Before: Desktop design shrunk to mobile screen
- After: Mobile-first design with larger touch targets, optimized form fields, camera integration
- My Validation: Mobile task completion rate increased from 18% to 78% in my testing
- My Contribution: I conducted mobile-specific testing sessions and documented every friction point with screenshots and video evidence
Solution 4: Plain Language Content Strategy
- Before: Legal/technical jargon throughout
- After: 8th grade reading level, definitions for all technical terms, visual diagrams
- My Validation: Reading comprehension improved from 41% to 89% in testing I conducted
- My Contribution: I personally rewrote critical pages and collaborated with the Communications team on voice and tone guidelines
Solution 5: Real-Time Validation & Error Prevention
- Before: Errors shown only at final submission
- After: Inline validation, helpful error messages, prevention of common mistakes (a small sketch of this pattern follows)
- My Validation: Form completion errors decreased by 82% based on my analysis
- My Contribution: I documented every error scenario I observed in testing and provided specific error message recommendations
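A minimal sketch of the inline-validation pattern described above; the field rules and messages are hypothetical examples, not the system's actual validation logic:

```python
# Sketch of inline validation: check each field as it is completed and return
# specific, plain-language guidance instead of a generic error at final submission.
# The field rules below are hypothetical examples.

import re

def validate_field(name: str, value: str) -> str | None:
    """Return a helpful message if the value is invalid, otherwise None."""
    if name == "parcel_number":
        if not re.fullmatch(r"\d{3}-\d{2}-\d{4}", value):
            return "Parcel numbers look like 123-45-6789. You can find yours on your tax bill."
    if name == "phone":
        digits = re.sub(r"\D", "", value)
        if len(digits) != 10:
            return "Please enter a 10-digit phone number, e.g. 905-555-0123."
    return None

print(validate_field("parcel_number", "12345"))
print(validate_field("phone", "905-555-0123"))  # None: passes
```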
I collaborated closely with the design team throughout implementation, ensuring solutions addressed the root causes I identified in research, not just symptoms.
After State: Post-Launch Results
I designed a comprehensive post-launch evaluation framework to measure impact over 3 months. I wanted to prove that investing in research delivers measurable ROI.
Business Impact I Calculated
Cost Savings:
- Support call reduction: $47,000 annually (335 fewer calls × $140 average handling cost)
- Processing efficiency: $89,000 annually (reduced error correction and resubmissions)
- Total Annual Savings: $136,000
Quantitative Results I Measured

Qualitative Results I Collected

User Satisfaction I Documented
- 89% of users rated the new system as "easy to use" or "very easy to use"
- 92% said they would recommend the online system to others
- 78% of users completed applications without any assistance
Staff Impact I Observed:
- Permit clerks reported 40% reduction in application errors
- Customer service team reassigned to proactive outreach instead of reactive support
- Building department processed 23% more permits with the same staffing levels
My Presentation to Leadership: I created an executive dashboard showing these results and presented them to the City Manager and Planning Director. This led to approval for 3 additional digital transformation research projects.
Key Learnings & Best Practices
What Worked Well
1. Mixed-Methods Triangulation I combined qualitative insights with quantitative validation to ensure both depth and statistical confidence. Interview quotes brought findings to life for stakeholders, while metrics proved business value. This dual approach was essential for securing buy-in.
2. Iterative Testing with Real Users I advocated for three testing rounds when stakeholders initially wanted one. Testing progressively refined designs allowed me to validate solutions before final development, preventing costly post-launch corrections. The improvement from 40% to 90% task completion across rounds proved the value of iteration.
3. Persona-Driven Design Decisions Creating distinct user personas (Homeowner Amy vs. Contractor Ethan) helped me prioritize features and tailor experiences. When stakeholders debated feature priority, I asked "Does this serve Amy or Ethan?" which focused discussions.
4. Stakeholder Collaboration I held regular check-ins with city staff to ensure solutions were technically feasible and aligned with organizational constraints. This prevented me from recommending solutions that couldn't be implemented.
5. Leading with Business Impact I calculated cost savings and efficiency gains early, which helped me secure budget and resources. When I said "This costs $90K annually," stakeholders listened.
Conclusion
This UX research initiative highlights the strategic value of user-centered research in government digital services. Through a rigorous mixed-methods approach, we transformed a complex, error-prone system into a streamlined, intuitive experience that effectively supports both residents and internal government staff.
Key success factors included:
- A comprehensive research methodology combining qualitative insights with quantitative validation
- Persona-led design decisions tailored to the needs of distinct user groups
- Iterative usability testing to validate solutions prior to full implementation
- Close collaboration with stakeholders to ensure alignment and organizational buy-in
- Clearly defined success metrics to demonstrate measurable business impact
The outcomes—an 81% increase in task completion, a 61% reduction in completion time, and significant gains in user satisfaction—clearly demonstrate the return on investment of robust UX research. As a result, this project has become a benchmark for subsequent digital transformation initiatives within the City of Richmond Hill.