Flagship Skill · Comparison tool design

The comparison tool design skill.

Comparisons that earn the choice by earning the user's trust.

A senior product marketing director's playbook for designing side-by-side comparison tools that help users decide. Axis selection, default-comparison logic, recommendation discipline.

Audience: product marketers, growth marketers, content marketers running vs-pages and decision-support tooling.

What this skill is for

Growth tooling, grouped by funnel stage.

Comparison-tool-design sits in the Convert cluster. It is the decision-support tool when audiences are choosing between known options.

Capture → Activate → Convert → Architect

The keystone distinction

Three positions. Both extremes are failure modes.

Failure mode: feature-list-dump

Every option's every feature in a giant grid. No decision support. The user is asked to weigh 40 cells against each other; most leave without choosing.

Failure mode: hidden-recommendation

"Comparison" tool that is actually a sales pitch. Defaults favor one option; framing weights the answer. Trust erodes when users notice.

The discipline: honest-comparison-with-guidance

Genuine like-for-like comparison plus an explicit opinionated recommendation. Visible, defended, overridable. Acknowledges competitor strengths.

Anatomy of an honest comparison

Recommendation, axes, audience filter.

Recommendation banner at top, with audience fit and tradeoffs disclosed. Six axes shown (10 available); each cell shows specifics, not just checkmarks; winners are highlighted. Audience filter visible at bottom.

Recommended for mid-market teams

Option B: Best fit for 50-200 person teams prioritizing scale and support response time.

Tradeoff: Option A wins for under-50-person teams. Option C is enterprise-class but slower to deploy.

Axis                   Option A   Option B (rec)   Option C
Per-user pricing       $X         $Y               $Z
Scale (5000+ users)    Yes        Yes              No
Time to deploy         2 weeks    1 week           4 weeks
Custom integrations    Limited    Full             Full
Support response       24h        1h               12h
Security audits        SOC 2     SOC 2 + ISO       SOC 2

Showing 6 of 10 axes · Filter audience: Mid-market

The framework

Twelve considerations for comparison tool design.

  1. The comparison-tool decision
  2. Honest-comparison-with-guidance, not feature-list-dump or hidden-recommendation
  3. Axis selection (8-12 axes)
  4. Default-comparison logic honest
  5. Recommendation engine designed
  6. Filter and toggle UX
  7. Methodology disclosed
  8. Mobile parity
  9. Maintenance discipline
  10. Honest about competitor strengths
  11. Audience-fit measured
  12. Conversion as success metric
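The honest-defaults consideration can be sketched in a few lines: the default comparison pair comes from the user's declared audience, not from whichever pairing flatters the brand. The segment and option names below are placeholders from the example above, not a prescribed implementation.

```python
# Hypothetical mapping: which pair of options a given audience sees by default.
# Honest defaults derive from audience fit, not from brand preference.
DEFAULT_COMPARISONS = {
    "under-50": ("Option A", "Option B"),
    "mid-market": ("Option B", "Option C"),
    "enterprise": ("Option C", "Option B"),
}

def default_pair(audience: str) -> tuple[str, str]:
    # Unknown audiences get a neutral, disclosed fallback rather than a
    # silently preselected favorite.
    return DEFAULT_COMPARISONS.get(audience, ("Option A", "Option B"))
```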

What is in the skill

Twelve sections covered in the body.

  1. What this skill covers

     Side-by-side comparison tools. Distinct from calculators (numbers), quizzes (categories), configurators (build custom).

  2. The comparison-tool decision

     When tools earn the build vs when written content serves.

  3. Feature-list-dump vs hidden-recommendation vs honest-comparison-with-guidance

     The keystone framing.

  4. Axis selection

     Decision-relevant capabilities, costs, constraints, service, risk. The 8-12 axis rule.

  5. Default-comparison logic

     Honest defaults vs bias-flattering defaults.

  6. Recommendation engine design

     Visible, defended, overridable. Audience-fit recommendations.

  7. Filter and toggle UX

     Filterable elements vs fixed elements. The filter-fatigue trap.

  8. Comparison-fatigue patterns

     Why most comparisons fail to produce decisions.

  9. Common failure modes

     8+ patterns: traffic without conversion, biased reputation, mobile-broken, audience criticism.

  10. The framework: 12 considerations

      Decision, honest-comparison-with-guidance, axis selection, defaults, recommendation, filters, methodology, mobile, maintenance, competitor-honesty, audience-fit, conversion metric.

  11. Reference files

      Nine references covering decision criteria, axis selection, defaults, recommendation engine, filters, fatigue patterns, honest recommendation, anti-patterns, failures.

  12. Closing: comparison tools earn the choice when they earn the user's trust

      The comparison tools that compound conversion are the ones that help users decide honestly.

Reference files

Nine references that go alongside the SKILL.md.

  • references/comparison-tool-decision-criteria.md

    When comparison tools earn the build vs when written content serves.

  • references/axis-selection-patterns.md

    Strong axes, weak axes, the 8-12 rule.

  • references/default-comparison-logic.md

    Honest defaults vs bias-flattering defaults.

  • references/recommendation-engine-design.md

    When to recommend, how to defend, override path.

  • references/filter-and-toggle-patterns.md

    Filterable vs fixed elements; filter-fatigue trap.

  • references/comparison-fatigue-patterns.md

    Why most comparisons fail to produce decisions.

  • references/honest-recommendation-discipline.md

    The discipline that distinguishes hidden from honest recommendations.

  • references/comparison-anti-patterns.md

    The patterns that look like comparisons but degrade trust.

  • references/common-comparison-failures.md

    8+ failure patterns with diagnoses and cures.

Browse all reference files on GitHub

Pairs with these platforms

Three platforms with comparison-relevant workflows.

The skill is platform-agnostic. These platforms ship workflows that fit comparison-tool programs: Webflow (host the comparison page), Notion (axis documentation and methodology), PostHog (per-segment conversion analytics).

Bridges to other skills

Five sister skills that compose with comparison tools.

  • Sister: numbers

    calculator-design

    Calculators give a number from inputs. This skill compares known options.

  • Sister: categories

    quiz-and-assessment-design

    Quizzes give a category from answers. This skill compares known options.

  • Adjacent: build custom

    product-configurator-design

    Configurators build a custom option. This skill compares known options.

  • Pricing-page application

    landing-page-copy

    Pricing pages are one specific application of comparison tools.

  • Upstream context

    content-strategy

    Content strategy decides which comparison topics earn investment.

Open source under MIT

Read the SKILL.md on GitHub.

The skill source lives in the rampstackco/claude-skills repository. MIT licensed.

Frequently asked questions

How is comparison-tool-design different from calculator-design and quiz-and-assessment-design?
Calculators give a number from inputs. Quizzes give a category from answers. Comparison tools help users decide between known options. Different decision-support pattern; different methodology.
What is honest-comparison-with-guidance?
Genuine like-for-like comparison plus an explicit opinionated recommendation: "For X audience, choose Y." The recommendation is visible, defended, and not the only path; users can override.
What is feature-list-dump?
Every option's every feature in a giant grid. No decision support. The user is asked to weigh 40 cells against each other; most leave without choosing.
What is hidden-recommendation?
Comparison tool that is actually a sales pitch. Defaults favor one option; framing weights the answer. Trust erodes when users notice the bias.
How many axes earn placement?
Most production comparison tools work well with 8-12 axes. Below 8 feels thin; above 12 produces cognitive overload.
Why does the recommendation need to acknowledge competitor strengths?
When the brand always wins, sophisticated audiences notice the bias. Acknowledging where competitors are stronger signals honesty; the recommendation that comes after is more trusted.