
Google Forms Ranking Question: Why It's Painful and What to Use Instead

You needed your team to rank 12 items, so you opened Google Forms, picked the "Multiple choice grid" question type, and built a matrix where each row is an item and each column is a rank from 1-12. Then you sent it. Then half the team filled it out wrong, and the responses came back as a spreadsheet you couldn't actually read, and now you're writing INDEX/MATCH formulas at 11pm.

Google Forms is the right tool for a lot of things. Ranking questions are not one of them. This post is about why, and what to do instead.

The four ways Google Forms ranking goes wrong

1. The UX is hostile to the respondent

Google Forms gives you two options for ranking, and both are bad:

Multiple choice grid. A grid where each item is a row and each rank position is a column. Respondents have to pick exactly one cell per row, with no duplicates across rows. There's no visual indication of "which ranks are still available." On mobile, the grid is unreadable. People give up.

Dropdown per item. One dropdown per item, each with the ranks 1 through N. Respondents pick a rank for each. Nothing prevents them from picking rank "1" for three different items, so you get duplicate ranks in your data and have to clean it. Also, on mobile, you're scrolling through 12 dropdowns to fill out one form.

Neither option is drag-and-drop, which is what humans naturally want when ranking. Both options take 5-10× longer to fill out than they should, and that shows up as a lower completion rate.

2. The data is unusable without spreadsheet surgery

Even when respondents fill it in correctly, the output is a wide-format spreadsheet:

Respondent    Item A    Item B    Item C    ...
Alice         3         1         5         ...
Bob           1         4         2         ...

To get a group ranking out of this, you need to:

  1. Decide what aggregation method to use. (Average rank? Median? Borda count? Most teams default to average rank, which is mathematically the wrong choice — see below.)
  2. Build formulas to compute the aggregate.
  3. Sort.
  4. Hope nobody had a typo.

There's no way to see who agreed with whom. There's no way to see where the team disagreed. The data is technically there but not in a form you can act on.
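
For concreteness, here is roughly what step 2 looks like if you script it instead of writing formulas. This is a minimal sketch in Python, assuming a hypothetical wide-format export shaped like the table above (the names and values are illustrative); it computes the average-rank default that the next section argues against:

    from statistics import mean

    # Hypothetical wide-format export: respondent -> {item: rank given}
    responses = {
        "Alice": {"Item A": 3, "Item B": 1, "Item C": 5},
        "Bob":   {"Item A": 1, "Item B": 4, "Item C": 2},
    }

    items = sorted({item for ranks in responses.values() for item in ranks})
    # Average rank per item; the lowest average "wins" under the default method
    avg_rank = {i: mean(r[i] for r in responses.values()) for i in items}
    for item, avg in sorted(avg_rank.items(), key=lambda kv: kv[1]):
        print(f"{item}: {avg:.2f}")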

3. Average rank is the wrong aggregation method

The default thing most people do with ranking data is average it: the item with the lowest average rank wins. This is intuitive and almost always wrong.

Average rank treats all rank positions as equal-magnitude steps, which they aren't: the gap between a respondent's #2 and #3 picks is a much weaker signal than the gap between #1 and #10. It also produces near-ties on noisy data and gives no way to break them. For example, if three voters rank A > B > C, A > B > C, and B > C > A, items A and B both average 1.67, a dead tie, even though A beats B head-to-head two voters to one.

The mathematically sound approach for aggregating ranks is the Schulze method, which compares every pair of items head-to-head, counts the pairwise wins, and finds the ordering that beats every alternative in those comparisons. It's used in academic elections, open-source governance, and serious group-decision contexts. It cannot be done in a Google Sheet without writing a script.

So either you implement Schulze yourself, or you ship a ranking based on average rank that you know is wrong.
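
To make "implement Schulze yourself" concrete, here is a compact sketch in Python. It is not a production implementation: it assumes every ballot is a complete ranking of the same items (best first) and breaks any remaining ties arbitrarily.

    from itertools import combinations

    def schulze(ballots):
        """Aggregate full rankings (best first) using the Schulze method."""
        items = ballots[0]
        # d[a][b] = number of voters who rank a above b
        d = {a: {b: 0 for b in items} for a in items}
        for ballot in ballots:
            pos = {item: i for i, item in enumerate(ballot)}
            for a, b in combinations(items, 2):
                if pos[a] < pos[b]:
                    d[a][b] += 1
                else:
                    d[b][a] += 1
        # p[a][b] = strength of the strongest beatpath from a to b
        p = {a: {b: d[a][b] if d[a][b] > d[b][a] else 0 for b in items}
             for a in items}
        for m in items:  # widest-path computation, Floyd-Warshall style
            for a in items:
                for b in items:
                    if a != b and a != m and b != m:
                        p[a][b] = max(p[a][b], min(p[a][m], p[m][b]))
        # an item's score = how many rivals its strongest beatpath defeats
        wins = {a: sum(p[a][b] > p[b][a] for b in items if b != a)
                for a in items}
        return sorted(items, key=lambda a: -wins[a])

    print(schulze([["X", "Y", "Z"], ["Y", "X", "Z"], ["X", "Z", "Y"]]))
    # -> ['X', 'Y', 'Z']

In a spreadsheet, each of those nested loops becomes a script you have to write and maintain yourself.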

4. No way to see disagreement

The single most valuable output of a group ranking exercise is knowing where the group disagreed. A team that aligned 95% on the top 3 ships Monday. A team that aligned 40% needs a 30-minute conversation before they ship anything.

Google Forms gives you the raw data. It doesn't show you the alignment. To get the alignment, you'd build a heatmap or correlation matrix in Sheets — possible, but you've now spent 90 minutes on the analysis side of an exercise that should have taken your team 20 minutes total.
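
If you do build that analysis yourself, the core of it is a pairwise-agreement matrix. Here is a minimal sketch in Python, using the same hypothetical wide-format data as above: for each pair of respondents, it reports the fraction of item pairs they put in the same order (a rescaled Kendall tau; all names are illustrative).

    from itertools import combinations

    # Hypothetical wide-format export, as before
    responses = {
        "Alice": {"Item A": 3, "Item B": 1, "Item C": 5},
        "Bob":   {"Item A": 1, "Item B": 4, "Item C": 2},
        "Cara":  {"Item A": 2, "Item B": 1, "Item C": 4},
    }

    def agreement(r1, r2):
        """Fraction of item pairs that two respondents order the same way."""
        pairs = list(combinations(r1, 2))
        same = sum((r1[a] < r1[b]) == (r2[a] < r2[b]) for a, b in pairs)
        return same / len(pairs)

    for p1, p2 in combinations(responses, 2):
        print(f"{p1} vs {p2}: {agreement(responses[p1], responses[p2]):.0%}")

The low numbers in that matrix are the 30-minute conversations.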

When Google Forms is fine

To be fair, Google Forms is the right call for a lot of survey work.

  • Single-choice questions. "Which option do you prefer?" — Google Forms does this perfectly.
  • Open-ended feedback. Google Forms is great for collecting written input.
  • Likert scales. "Rate this from 1-5" works fine.
  • Lists where order doesn't matter. Just collecting opinions.

The failure mode is specific: ordered ranking with aggregation. That's the use case where Google Forms breaks.

What to use instead

A purpose-built ranking tool solves the four problems directly:

Problem with Google Forms             What a ranking tool does
Hostile UX (grid or dropdowns)        Drag-and-drop, mobile-friendly, no duplicate ranks possible
Wide-format spreadsheet output        Aggregated group ranking, automatically
Average-rank aggregation is wrong     Schulze method (mathematically optimal) built in
No way to see disagreement            Alignment view shows where individuals agreed and disagreed

The setup pattern looks like this:

  1. Type your items into the tool.
  2. Share a link with your team.
  3. Each person ranks via drag-and-drop in 2-5 minutes. No login. Mobile works.
  4. Results aggregate automatically. You see the group order and who disagreed with it.

For most teams, the time savings are on the analysis side, not the response side. What used to be "send Form, wait, export to Sheets, write formulas, sort, hope" becomes "send link, get results."

A specific example

Suppose 8 people ranked 10 items.

With Google Forms, the workflow is:

  • Build the grid form (15 min)
  • Send and collect (24-48 hr)
  • Export to Sheets, clean malformed responses (15 min)
  • Decide on aggregation, write formulas (30 min)
  • Sort, format, send to team (10 min)
  • Realize you don't know who disagreed and someone asks "why is X above Y?" — no answer

Total: ~70 min of facilitator work, no disagreement insight.

With a purpose-built ranking tool:

  • Type items into form (3 min)
  • Send link (1 min)
  • Collect (24-48 hr)
  • Open results page (instant)

Total: ~5 min of facilitator work, full disagreement view automatically.

The ratio gets worse as the team grows. Aggregating 30 responses by hand in Sheets is hours. Aggregating 30 responses in a tool that knows it's a ranking is still seconds.

What to do on Monday

If you have a ranking exercise on the calendar:

  1. Don't open Google Forms.
  2. Open a purpose-built ranking tool. Type your items. Share the link.
  3. When results come back, look at where the team disagreed before you look at the aggregate. That's where the actual conversation is.

The aggregate is the easy part. The disagreement is where the value comes from — and Google Forms gives you neither.


Try a real ranking tool. ForceRank is built for ranking questions specifically. Drag-and-drop, no signup for participants, automatic Schulze aggregation, and an alignment view that shows where the team agreed and disagreed. Free for groups up to 20. Works on mobile.