Established OKX UX metrics across global markets via surveys
I designed and launched a quantitative and qualitative survey to establish quarterly UX metrics that track and guide usability improvements, and to gather insights that inform the product roadmap.
Design became a priority — but measurement was broken
1. No Consistent UX Metrics
Rich qualitative insights existed, but there was no consistent, quantitative way to measure UX. Leadership couldn't validate whether changes improved UX or compare performance across markets.
2. Qual Data Alone Wasn't Convincing
Small-sample findings were sometimes dismissed. Quantitative benchmarks were needed to make insights more persuasive and actionable.
3. Siloed, Low-Quality Surveys
Different teams sent surveys independently with varying quality, compromising feedback, risking survey fatigue, and undermining brand credibility.
16 questions. Three sections. One clear framework.
I designed a survey combining SUPR-Q and CSAT for quantitative UX metrics with qualitative questions to identify the "why" behind scores.
Section 1: UX Metrics & Usability
SUPR-Q selected as the core UX metric for comprehensive coverage of usability, trust, loyalty, and credibility.
Section 2: Content Quality & Satisfaction
Evaluated localisation investments across multiple markets with open-ended and multiple-choice questions.
Section 3: About Users
Frequency, device usage, primary exchange, and switching intent — to understand behavioural segments.
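For context on how the quantitative scores in Sections 1–2 are typically computed: a minimal sketch of standard SUPR-Q and top-box CSAT scoring (the function names and sample responses are illustrative, not actual survey data):

```python
def suprq_score(five_point_items, likelihood_to_recommend):
    """Overall SUPR-Q score for one respondent (standard scoring).

    Seven items use a 1-5 agreement scale; the likelihood-to-recommend
    item uses a 0-10 scale and is halved before averaging.
    """
    items = list(five_point_items) + [likelihood_to_recommend / 2]
    return sum(items) / len(items)

def csat(ratings):
    """Top-box CSAT: % of respondents rating 4 or 5 on a 1-5 scale."""
    return 100 * sum(1 for r in ratings if r >= 4) / len(ratings)

# Illustrative respondent: mostly "agree" (4s) and an 8/10 recommend score
print(suprq_score([4, 4, 5, 4, 3, 4, 4], 8))  # → 4.0
print(csat([5, 4, 3, 4, 2]))                   # → 60.0
```

Averaging per-respondent scores per market then gives the comparable cross-market benchmark described above.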
Multi-channel distribution plan: email, KOLs & followers, and in-app surveys to reach different user segments and reduce method-specific biases.
10,000+ responses in under 5 days across 6 markets
With support from 6 teams (marketing, customer support, localisation, content design, legal, and engineering), I distributed the survey in 7 languages, orchestrating cross-team review of questions and translations before launch.
Set up the survey across SurveyMonkey and Pollfish, optimised to reduce fatigue and improve response quality.
Piloted with 100 users per market to validate translations and flow before full launch.
Rolled out in controlled batches, pausing once target responses per market were reached.
⚠️ Due to confidentiality, UX metrics and some qualitative details have been omitted. The chart below is illustrative only.
Satisfactory performance overall — with clear gaps
⭐ Wins
Strong UX metrics across all markets. NPS rose 27% in one market, validating UX and localisation improvements. The survey also surfaced small wins and unmet needs to build on.
🚨 Gaps
Usability challenges for new users; technical issues such as slow loading; futures and margin trading interfaces less intuitive than competitors'; and concerns about P2P transaction reliability.
For illustrative purposes only — not actual survey data.
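For reference, the NPS movement cited above follows the standard promoter-minus-detractor formula; a minimal sketch (the sample ratings are invented):

```python
def nps(ratings):
    """Net Promoter Score on the 0-10 likelihood-to-recommend scale:
    % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

print(nps([10, 9, 8, 7, 6, 10, 3, 9]))  # → 25.0
```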
Making findings actionable at every level
Prepared both consolidated and market-specific reports. Organised readouts for all teams, followed by smaller sessions with product owners and designers to drive action.
"The survey results really helped my team in project prioritisation and also validated some usability issues we previously identified as well!" — Localisation Product Lead
Tracking & Follow-up
Tracked all findings in Jira and followed up regularly to ensure progress on identified issues.
Dual Recommendations
Provided both ideal long-term fixes and feasible quick wins to keep momentum while larger changes were in development.
The Outcome
OKX's first UX benchmark, validating improvements and steering future product and research decisions.
Outcomes
- Repeatable survey system across multiple markets
- Proof of effectiveness of localisation enhancements
- Qualitative findings made more persuasive
- Streamlined analysis with templates & AI translation
Learnings
- Apply stronger statistical tests for rigour
- Pair ideal recommendations with quick wins
- Cross-team partnerships accelerate delivery
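On the first learning, a minimal sketch of the kind of test that would add rigour: a two-proportion z-test comparing, say, CSAT top-box rates between two markets (pure stdlib; the counts are hypothetical):

```python
import math

def two_proportion_z(hits1, n1, hits2, n2):
    """Two-sided z-test for a difference between two proportions,
    e.g. CSAT top-box rates in two markets."""
    p1, p2 = hits1 / n1, hits2 / n2
    pooled = (hits1 + hits2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical: 820/1000 satisfied in market A vs 780/1000 in market B
z, p = two_proportion_z(820, 1000, 780, 1000)
print(round(z, 2), round(p, 3))  # z ≈ 2.24, p ≈ 0.025 → significant at α = 0.05
```

A test like this distinguishes real cross-market differences from sampling noise before findings reach a readout.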