DeFi UX Execution Audit

Reduce drop-off at swap, bridge, and borrow confirmation. I find the UI ambiguity that causes hesitation right before users commit.

Deliverable in 1 week: annotated findings, prioritized fixes, and a walkthrough call.

Public teardowns across 41 protocols. No client data.

Teardowns published

Public teardowns. No client data.

41 teardowns

Where DeFi loses users

Swap confirmation

Users abandon when fees, slippage, or route details arrive too late in the decision sequence.

Abandonment

Bridge & borrow flows

Missing destination validation introduces doubt at exactly the moment commitment is required.

Support Burden

High-intent moments

Visual hierarchy collapse forces scanning instead of deciding, raising abandonment risk at peak intent.

Mis-execution Risk

What I audit for

Each issue is tagged by type, severity, and where it appears in the flow.

Verification Gap

Contract details, fees, or slippage arrive after the user has already formed intent — or not at all.

Swap routes that surface price impact only after confirmation is triggered.

CTA / State Mismatch

The call-to-action doesn't reflect the next logical step given the current application state.

"Approve" and "Swap" rendered identically when the flow requires sequential completion.

Reversed Selection Flow

Network or chain selection is not enforced before token selection, creating irreversible mismatches.

Token dropdown available before destination chain is confirmed on bridge flows.

Hidden Primary Action

The committing action is embedded within a data table or requires implicit interaction discovery.

Borrow button inside a collateral table row with no visual affordance at scan depth.

Terminology Collision

Labels reuse terms across contexts with different meanings, or omit breakdown where precision is required.

"Price impact" and "fee" used interchangeably across swap and liquidity flows.

Visual Hierarchy Collapse

No dominant focal layer exists at the moment of commitment — every element competes at equal weight.

Yield dashboards where APY, TVL, risk rating, and CTA share identical typographic treatment.

The deliverable

Included

  • Up to 3 transaction flows audited end to end
  • 1-week turnaround
  • Friction classification (type + severity)
  • Risk assessment per friction point
  • Structural improvement guidance
  • Report format: Figma board or PDF with screenshots, callouts, and fix notes

Not included

  • Full redesign
  • Ongoing retainer
  • Generic UI feedback

Why me

15 years in frontend engineering and product. I started in fintech building trading systems and payment flows, then optimised conversion points across SaaS products, and now focus on DeFi, where UX friction doesn't just frustrate users; it loses them funds.

I've published 41 public teardowns of real protocols — swap, bridge, borrow, and perps flows. No AI-generated templates or generic checklists. Pattern recognition built from shipping high-stakes interfaces.

Follow the teardowns on X →

Pricing

£3,000

Fixed. No ranges. 1 protocol, up to 3 flows, 1 week.

Best for: teams with measurable completion issues on core transaction flows.

Not for: early prototypes without traffic.

Request the audit

Or DM "AUDIT" on X

FAQ

What does a DeFi UX audit cost?
£3,000 fixed fee. No ranges, no retainer, no hourly billing. One protocol, up to three transaction flows, delivered in one week.
How long does the audit take?
One week from kick-off to delivery. That includes flow walkthroughs, friction classification, annotated findings, and the walkthrough call.
Which flows do you audit?
Swap, bridge, borrow, and any other transaction flow where users commit funds. Up to three flows per engagement. I focus on the moments where hesitation and abandonment are highest — confirmation screens, state transitions, and multi-step approvals.
What do I get at the end?
An annotated findings report (Figma board or PDF with screenshots, callouts, and fix notes), a prioritized list of issues tagged by type and severity, and a one-hour walkthrough call to discuss recommendations.
Do you need codebase or backend access?
No. The audit is based entirely on the live interface — I walk through flows exactly as a user would. No internal access required.
Is this right for an early-stage protocol?
This audit is best suited to live protocols with measurable completion issues on core flows. If you're still in prototype with no real traffic, the findings won't have enough signal to act on meaningfully.