We Analyzed 100 Million Steam Reviews. Less Than 0.5% Got a Developer Response.
That's one reply for every 200 reviews, at best.
Here's what gets me about that number: when developers DO respond to negative reviews, those reviews flip to positive 21.6% of the time. No response? The review just sits there. Dragging your score down. Forever.
This isn't speculation. It's what 100 million reviews across the entire Steam catalog actually show. Let's walk through it.
The 0.5% problem
In January 2025, researcher Will at Truthful Toast published an analysis of Steam developer responses across 100 million reviews. The headline number: only about 450,000 of those reviews (roughly 0.5%) had received any developer response at all.
But the distribution is wild.
Half of all developer responses on the entire platform came from just 79 games. Not because only 79 games exist on Steam (there are over 70,000). Because only 79 developers consistently bother to reply.
Think about that for a second. Studios obsess over marketing spend, trailer performance, wishlist counts. Almost nobody is doing the one post-launch activity that costs nothing but time: talking back to the people who played your game.
Imagine a restaurant ignoring 99.5% of its Google reviews. In hospitality, that'd be insane. In gaming, it's Tuesday.
What happens when developers actually respond
When developers respond to negative reviews on Steam, the numbers move:
- 12.1% average sentiment improvement on negative reviews that get a developer response. That's the aggregate across all games studied. Some see much higher.
- 21.6% of negative reviewers update their review after getting a developer response. Many of those updates flip from negative to positive.
- Only 8.5% of positive reviewers update after a response, so your effort goes way further on negatives.
- Reviews that get developer responses and are later updated settle at a 63.5% positive rate.
- Reviews updated without any developer interaction drift 10.4% more negative over time. Without intervention, your score doesn't stay flat. It decays.
Steam lets players edit their reviews at any time, including flipping their thumbs-up or thumbs-down vote. When you respond to a negative review, you give the reviewer a reason to reconsider. And 21.6% of the time, they do.
Why players are 2x more likely to update after a response
Players who get a developer response are twice as likely to update their review compared to players who don't. Once you think about why, it's obvious.
When someone writes a negative review, they're either venting frustration or trying to be heard. Either way, a developer response gives them what they actually wanted: acknowledgment.
Basic reciprocity. You gave them attention. They reconsider. It doesn't work every time, but it works often enough to move your overall score.
The response matters more than the fix. Players don't need you to have solved the problem to update their review. They need to know you read it and you understand the issue. Feeling heard is the trigger, not a patch note.
And Steam's update mechanism makes this frictionless. The reviewer clicks edit, changes their vote, and it hits your score immediately. No form. No waiting period. One click.
The 55.9% improvement when studios are strategic
Not all response strategies are equal. The aggregate 12.1% improvement is the average across every studio that responds, including those who respond randomly, inconsistently, or only to positive reviews.
The data tells a completely different story when you isolate studios that specifically target negative reviews:
Games where developers focused their responses on negative reviews saw an average sentiment improvement of 55.9%.
That's nearly five times the aggregate average.
The difference is strategy. Instead of responding to everything equally, these studios put their energy into the reviews most likely to flip from negative to positive.
Some games did even better. The Truthful Toast analysis found Forza Horizon 4 hitting an 80% improvement rate, with DOOM Eternal, Gunfire Reborn, and Elite Dangerous at 60% or higher.
This makes intuitive sense. A player who writes a thoughtful negative review about a specific bug or balance issue is way more receptive to a response than someone who writes "this game sucks." The strategic move is to focus where a response can actually change minds. The data confirms it works.
Games near tier boundaries benefit the most. Those rating "bubbles" between Mixed and Mostly Positive, or between Mostly Positive and Very Positive, are where this really pays off. When you're at 68% and need 70% to cross into Mostly Positive, even a handful of flipped reviews can be the difference.
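The boundary math above is easy to check yourself. Here's a minimal sketch (illustrative only, using integer arithmetic to avoid rounding surprises) that computes how many negative-to-positive flips a game needs to cross a given threshold:

```python
def flips_needed(positive: int, negative: int, target_pct: int) -> int:
    """Minimum negative->positive flips to reach target_pct positive.

    Each flip adds one positive and removes one negative, so the
    total review count stays fixed.
    """
    total = positive + negative
    flips = 0
    # Integer comparison: (positive + flips) / total >= target_pct / 100
    while (positive + flips) * 100 < target_pct * total:
        flips += 1
        if flips > negative:
            raise ValueError("Not enough negative reviews to flip")
    return flips

# A game at 68% positive with 500 reviews (340 positive, 160 negative)
# needs this many flips to cross the 70% Mostly Positive line:
print(flips_needed(340, 160, 70))  # → 10
```

Ten flipped reviews out of 160 negatives is well within reach of the 21.6% update rate, which is exactly why games sitting just under a tier boundary have the most to gain.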
The revenue math nobody's doing
Here's where this gets concrete.
Steam's review tier system controls how the algorithm treats your game across the entire store: search results, discovery queues, "More Like This" recommendations, and sale event featuring. The thresholds are well-documented:
| Tier | Threshold | What It Means |
|---|---|---|
| Overwhelmingly Positive | 95%+ (500+ reviews) | Best possible placement. Maximum visibility. |
| Very Positive | 80%+ (50+ reviews) | Strong recommendation signal. |
| Mostly Positive | 70-79% | Solid. Players will consider purchasing. |
| Mixed | 40-69% | Yellow warning label. Players scroll past. |
| Mostly Negative | 20-39% | Active deterrent. |
The jump from Mixed to Mostly Positive — from the yellow "warning" label to blue "safe" label — is the most valuable tier transition for most games. According to research from Gamesight, moving from Mixed to Very Positive can nearly triple a game's advertising conversion rate. Every game they tracked with a conversion rate above 2% had 80% or higher positive reviews.
Now combine that with the review-to-sales multiplier. The commonly used estimate (originated by Mike Boxleiter and refined by GameDiscoverCo) suggests that each Steam review represents roughly 40-60 copies sold, depending on the game's age, genre, and price point. The median for recent releases is approximately 58 copies per review.
Let's do the math:
- Your game has 500 reviews at 66% positive (330 positive, 170 negative). You're Mixed.
- You respond to all 170 negative reviews. The data says 21.6% will update; assume for simplicity that each update flips to positive, giving roughly 37 flipped reviews.
- Each flip counts double in the ratio: minus one negative, plus one positive. New score: (330 + 37) / 500 = 73.4%.
- You just crossed into Mostly Positive. No new reviews needed. Just responses.
- At ~50 sales per review and a $20 price point, your 500 reviews represent roughly $500,000 in gross revenue. A tier change that improves conversion by even 30% is worth $150,000+ over the game's lifetime.
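The steps above fit in a throwaway script. This uses the article's illustrative figures (500 reviews, 21.6% flip rate, ~50 sales per review, $20 price), so swap in your own numbers:

```python
# Back-of-napkin model using the article's illustrative figures.
reviews_pos, reviews_neg = 330, 170      # 500 reviews at 66% positive: Mixed
flip_rate = 0.216                        # share of negative reviewers who update
sales_per_review = 50                    # rough Boxleiter-style multiplier
price = 20.00                            # USD

flipped = round(reviews_neg * flip_rate)       # ~37 reviews change their vote
total = reviews_pos + reviews_neg
new_score = (reviews_pos + flipped) / total * 100

gross = total * sales_per_review * price       # revenue the reviews represent
print(f"Flipped: {flipped}, new score: {new_score:.1f}%")
print(f"Estimated gross represented: ${gross:,.0f}")
```

Running it reproduces the 73.4% score and the $500,000 gross figure from the bullets above; the point is that the inputs are few enough that any developer can sanity-check the claim against their own store page in a minute.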
This is back-of-napkin math. Your actual numbers will differ. But the magnitude is clear: responding to reviews isn't a marginal activity. It could be the single highest-return thing you do after launch.
Want to see the actual numbers for your game? Run a free Steam review audit. Enter your AppID and see your current score, tier position, and how far you are from the next threshold. Takes 30 seconds.
Why 99.5% of studios still don't do this
If the data is this clear, why does almost nobody respond?
The time cost is real. A thoughtful response takes 3-5 minutes to write. If your game gets 20 reviews a day, that's 60-100 minutes of writing, or roughly 7-12 hours per week. For a solo dev, that's time that should go toward actually building your game. At a $50/hour opportunity cost, it adds up to $1,500-2,500 per month in developer time.
Then there's the emotional toll. The games industry has a serious burnout problem. The 2025 Games Industry Employment Survey found that roughly half of developers report experiencing professional burnout, and the IGDA Developer Satisfaction Survey consistently finds that crunch affects 28% or more of the workforce. Reading negative reviews all day makes both worse. It's psychologically draining on top of the time cost.
A lot of developers also assume negative reviewers will never change their mind. "They already decided they hate it. Why bother?" The data directly contradicts this: 21.6% isn't a rounding error. But the assumption sticks because the feedback loop is invisible. You respond to a review, the player quietly updates it two days later, and you never notice the connection.
And finally, there are almost no tools for this. The Google Reviews ecosystem has dozens of response management platforms (BirdEye, Podium, ReviewTrackers, etc.). Steam has almost nothing purpose-built for review response management. Until recently, the only option was doing it by hand in the Steam developer dashboard.
The result: developers know reviews matter, but the cost of responding manually pushes them into the 99.5% who do nothing. And their scores pay the price.
What a response strategy actually looks like
This doesn't have to be a 15-hour-per-week commitment. The studios seeing the 55.9% improvement aren't spending more time. They're spending smarter time.
The approach is a simple triage:
- Negative reviews about issues you've already fixed. Highest priority. Best flip potential. "We actually fixed this in patch 1.3 — would love to hear if it works better now."
- Negative reviews with valid criticism you haven't fixed yet. Acknowledge, give a timeline if possible, and convert the reviewer from adversary to someone "watching for updates."
- Positive reviews. A brief thank-you that shows you read the specific review. Two sentences. Builds community.
- Low-effort or joke reviews. Skip. The ROI isn't there.
With this triage system, a daily review session takes 10-15 minutes, not 2 hours. You focus on the reviews where engagement has the highest probability of changing your score, and you let the rest go.
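As a rough illustration of that triage in code (the keyword buckets, word-count cutoff, and priority labels here are made up for the example, not from the source analysis):

```python
def triage(review_text: str, is_positive: bool, fixed_keywords: set[str]) -> str:
    """Toy priority-bucket assignment for a daily review session.

    fixed_keywords: terms tied to issues you've already patched
    (assumption: you maintain this list yourself from your patch notes).
    """
    text = review_text.lower()
    if not is_positive:
        if len(text.split()) < 5:
            return "skip"           # low-effort / joke review: no ROI
        if any(k in text for k in fixed_keywords):
            return "respond-first"  # already fixed: best flip potential
        return "acknowledge"        # valid criticism, not fixed yet
    return "quick-thanks"           # brief thank-you for positives

print(triage("crashes on startup every time", False, {"crash", "startup"}))
# → respond-first
```

A real workflow would sort the day's reviews by these buckets and work top-down until the 10-15 minutes are up, which is the whole point of the triage: the highest-probability flips get answered first.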
For the complete implementation of this system, including response templates, a daily workflow, and exactly how to measure results, read The Steam Review Response Framework: How to Respond to Every Review in 10 Minutes a Day.
So what do you do with this?
Less than 0.5% of Steam reviews get a developer response. The games that do respond — strategically targeting negative reviews — see a 55.9% improvement in sentiment. Players are twice as likely to update their review after a response, and 21.6% of negative reviewers flip their vote.
Almost nobody does this. Not because it doesn't work, but because it takes time, it takes emotional energy, and until recently there were no tools built to make it sustainable.
Your players are already telling you what they think. The question is whether you're going to answer.
See how your game's response rate compares. Run a free review audit.
Data in this article references the Truthful Toast analysis of 100 million Steam reviews (January 2025), with additional data from GameDiscoverCo, Gamesight, and the IGDA Developer Satisfaction Survey. The author acknowledges methodological limitations of the source data, including the use of AI-based sentiment analysis and the snapshot nature of the dataset.