100 Million Steam Reviews, Analyzed. Less Than 0.5% Got a Developer Response.
Across 100 million Steam reviews, less than 0.5% received a developer response.
That means for every 200 reviews your game gets, statistically one might hear back from you. Maybe.
And here is why that number matters more than you think: when developers DO respond to negative reviews, 21.6% of those reviewers go back and update their review, and many of those updates flip to positive. The baseline without a response? The review just sits there, dragging your score down. Forever.
This is not speculation. It is what the data shows when you analyze 100 million reviews across the entire Steam catalog. Let us walk through the findings.
The 0.5% problem
In January 2025, researcher Will at Truthful Toast published an analysis of Steam developer responses across 100 million reviews. The headline finding: only about 450,000 of those reviews (roughly 0.5%) had received any developer response at all.
But the distribution is wild. Half of all developer responses on the entire platform were concentrated in just 79 games. Not because only 79 games exist on Steam (there are over 70,000), but because only 79 developers consistently bother to respond.
Think about that. Studios obsess over marketing spend, trailer performance, wishlist counts. Almost nobody is doing the one post-launch activity that costs nothing except time: talking back to the people who played your game.
Imagine a restaurant ignoring 99.5% of its Google reviews. In hospitality, that would be considered insane. In gaming, it is the norm.
What happens when developers actually respond
When developers respond to negative reviews on Steam, the numbers move:
- 12.1% average sentiment improvement on negative reviews that receive a developer response. That is the aggregate across all games studied. Some see much higher.
- 21.6% of negative reviewers update their review after getting a developer response. Many of those updates flip from negative to positive.
- Only 8.5% of positive reviewers update after a response, so your effort goes much further when directed at negative reviews.
- Reviews that get developer responses and are later updated settle at a 63.5% positive rate.
- Reviews updated without any developer interaction drift 10.4% more negative over time. Without intervention, your score does not stay flat. It decays.
Steam allows players to edit their reviews at any time, including changing their thumbs-up or thumbs-down vote. When a developer responds to a negative review, it creates a reason for the reviewer to reconsider. And 21.6% of the time, they do.
Why players are 2x more likely to update after a response
Players who receive a developer response are twice as likely to update their review as players who do not. Once you think about why, it is obvious.
When someone writes a negative review, they are either venting frustration or trying to be heard. Either way, a developer response gives the reviewer what they actually wanted: acknowledgment.
Basic reciprocity. You gave them attention. They reconsider their position. It does not work every time, but it works often enough to move your overall score.
The response itself matters more than the fix. Players do not need you to have solved the problem to update their review. They need to know you read it and you understand the issue. Feeling heard is the trigger, not a patch note.
And the update mechanism on Steam makes this frictionless. The reviewer clicks edit, changes their vote, and the effect is immediate on your score. No form to fill out. No waiting period. One click.
The 55.9% improvement when studios are strategic
Not all response strategies are equal. The aggregate 12.1% improvement is the average across every studio that responds, including those who respond randomly, inconsistently, or only to positive reviews.
The data tells a different story when you isolate studios that specifically target negative reviews:
Games where developers focused their responses on negative reviews saw an average sentiment improvement of 55.9%.
That is nearly five times the aggregate average. The difference is strategy: instead of responding to everything equally, these studios prioritized the reviews where engagement had the highest chance of flipping a negative to a positive.
Some individual games performed even better. The Truthful Toast analysis found titles like Forza Horizon 4 with an 80% improvement rate, and games like DOOM Eternal, Gunfire Reborn, and Elite Dangerous at 60% or higher.
This makes intuitive sense. A player who writes a thoughtful negative review about a specific bug or balance issue is far more receptive to a developer response than someone who writes "this game sucks." The strategic approach is to focus on the reviews where a response can actually change minds. The data confirms it works.
Games near tier boundaries (the rating "bubbles" between Mixed and Mostly Positive, or between Mostly Positive and Very Positive) benefit the most. When you are at 68% and need 70% to cross into Mostly Positive, even a handful of flipped reviews can be the difference.
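That boundary math is easy to sanity-check. Here is a minimal sketch (the `flips_needed` helper is illustrative, not part of any Steam API):

```python
import math

def flips_needed(positive: int, negative: int, threshold: float) -> int:
    """Negative-to-positive flips required to reach `threshold`.

    A flip adds one positive and removes one negative, so the
    total review count stays fixed.
    """
    total = positive + negative
    needed = math.ceil(threshold * total) - positive
    return max(needed, 0)

# A game at 68% with 500 reviews (340 up, 160 down) needs only
# 10 flipped reviews to reach the 70% Mostly Positive boundary.
print(flips_needed(340, 160, 0.70))  # -> 10
```

Ten flips out of 160 negative reviews is comfortably within the 21.6% update rate the data describes.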
The revenue math nobody is doing
Here is where this gets concrete.
Steam's review tier system controls how the algorithm treats your game across the store: search results, discovery queues, "More Like This" recommendations, and sale event featuring. The thresholds are well-documented:
| Tier | Threshold | What It Means |
|---|---|---|
| Overwhelmingly Positive | 95%+ (500+ reviews) | Best possible placement. Maximum visibility. |
| Very Positive | 80%+ (50+ reviews) | Strong recommendation signal. |
| Mostly Positive | 70-79% | Solid. Players will consider purchasing. |
| Mixed | 40-69% | Yellow warning label. Players scroll past. |
| Mostly Negative | 20-39% | Active deterrent. |
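In code, the table reads as a simple lookup. This sketch covers only the tiers listed above; Steam's real summary labels also distinguish games with very small review counts:

```python
def review_tier(positive_pct: float, review_count: int) -> str:
    """Map a positive-review percentage and review count to the
    rating tiers from the table above (simplified)."""
    if positive_pct >= 95 and review_count >= 500:
        return "Overwhelmingly Positive"
    if positive_pct >= 80 and review_count >= 50:
        return "Very Positive"
    if positive_pct >= 70:
        return "Mostly Positive"
    if positive_pct >= 40:
        return "Mixed"
    if positive_pct >= 20:
        return "Mostly Negative"
    return "Negative"  # below the lowest tier in the table

print(review_tier(73.4, 500))  # -> Mostly Positive
```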
The jump from Mixed to Mostly Positive, from the yellow "warning" label to blue "safe" label, is the most valuable tier transition for most games. According to research from Gamesight, moving from Mixed to Very Positive can nearly triple a game's advertising conversion rate. Every game they tracked with a conversion rate above 2% had 80% or higher positive reviews.
Now, combine this with the review-to-sales multiplier. The commonly used estimate (originated by Mike Boxleiter and refined by GameDiscoverCo) suggests that each Steam review represents roughly 40-60 copies sold, depending on the game's age, genre, and price point. The median for recent releases is approximately 58 copies per review.
So let us work an example:
- Your game has 500 reviews at 66% positive (330 positive, 170 negative). You are Mixed.
- You respond to all 170 negative reviews. The data says 21.6% will update, which is roughly 37 reviews.
- Each flip counts double: one fewer negative, one more positive, total unchanged. If every update flipped, your new score would be (330 + 37) / 500 = 73.4%. Applying the 63.5% positive settle rate instead, roughly 23 flips land you at (330 + 23) / 500 = 70.6%.
- Either way, you cross into Mostly Positive. No new reviews needed. Just responses.
- At ~50 sales per review and a $20 price point, your 500 reviews represent roughly $500,000 in gross revenue. A tier change that improves conversion by even 30% is worth $150,000+ over the game's lifetime.
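The same example as a quick calculation. The rates are the article's aggregates; `flip_rate` applies the 63.5% settle figure, and setting it to 1.0 gives the upper bound where every update flips (function name is illustrative):

```python
def flip_projection(positive: int, negative: int,
                    update_rate: float = 0.216,
                    flip_rate: float = 0.635) -> float:
    """Projected positive % after responding to every negative review.

    update_rate: share of negative reviewers who edit after a response.
    flip_rate:   share of those updates that settle positive.
    """
    total = positive + negative
    flips = negative * update_rate * flip_rate
    return (positive + flips) / total * 100

# 500 reviews at 66% positive (330 up, 170 down):
print(round(flip_projection(330, 170), 1))                 # -> 70.7
print(round(flip_projection(330, 170, flip_rate=1.0), 1))  # -> 73.3

# Revenue scale: at ~50 sales per review and a $20 price,
# those 500 reviews represent roughly 500 * 50 * 20 = $500,000 gross.
```

Both the conservative and the upper-bound projection clear the 70% Mostly Positive line.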
This is back-of-napkin math. Your actual numbers will differ. But the magnitude is clear: responding to reviews is not a marginal activity. It could be the single highest-return thing you do after launch.
Want to see the actual numbers for your game? Run a free Steam review audit. Enter your AppID and see your current score, tier position, and how far you are from the next threshold. Takes 30 seconds.
Why 99.5% of studios still don't do this
If the data is this clear, why does almost nobody respond to reviews?
The time cost is real. A thoughtful review response takes 3-5 minutes to write. If your game gets 20 reviews a day, that is 60-100 minutes of writing, or roughly 7-12 hours per week. For a solo developer, that is time that should go toward actually building your game. At a $50/hour opportunity cost, that is roughly $2,000 per month in developer time.
Then there is the emotional toll. The games industry has a serious burnout problem. The 2025 Games Industry Employment Survey found that roughly half of developers report experiencing professional burnout, and the IGDA Developer Satisfaction Survey consistently finds that crunch affects 28% or more of the workforce. Reading negative reviews all day makes both worse. It is psychologically draining on top of the time cost.
Many developers also assume negative reviewers will never change their mind. "They already decided they hate it. Why bother?" The data directly contradicts this: 21.6% is not a rounding error. But the assumption persists because the feedback loop is invisible. You respond to a review, the player quietly updates it two days later, and you never notice the connection.
And finally, there are almost no tools for this. The Google Reviews ecosystem has dozens of response management platforms (BirdEye, Podium, ReviewTrackers, etc.). Steam has almost nothing purpose-built for review response management. Until recently, the only option was doing it by hand in the Steam developer dashboard.
The result: developers know reviews matter, but the cost of responding manually pushes them into the 99.5% who do nothing. And their scores pay the price.
What a response strategy actually looks like
This does not have to be a 15-hour-per-week commitment. The studios seeing the 55.9% improvement are not spending more time. They are spending smarter time.
The approach breaks down into a simple triage:
- Negative reviews about issues you have already fixed. Highest priority. These have the best flip potential. "We actually fixed this in patch 1.3, would love to hear if it works better now."
- Negative reviews with valid criticism you have not fixed yet. Acknowledge, give a timeline if possible, and convert the reviewer from adversary to someone "watching for updates."
- Positive reviews. A brief thank you that shows you read the specific review. Two sentences. Builds community.
- Low-effort or joke reviews. Skip. The ROI is not there.
With this triage system, a daily review session takes 10-15 minutes, not 2 hours. You focus on the reviews where engagement has the highest probability of changing your score, and you let the rest go.
For the complete implementation of this system, including response templates, a daily workflow, and exactly how to measure results, read The Steam Review Response Framework: How to Respond to Every Review in 10 Minutes a Day.
So what do you do with this?
Less than 0.5% of Steam reviews receive a developer response. The games that do respond, strategically targeting negative reviews, see a 55.9% improvement in sentiment. Players are twice as likely to update their review after a response, and 21.6% of negative reviewers flip their vote.
Almost nobody does this. Not because it does not work, but because it takes time, it takes emotional energy, and until recently there were no tools built to make it sustainable.
Your players are already telling you what they think. The question is whether you are going to respond.
See how your game's response rate compares. Run a free review audit.
Data in this article references the Truthful Toast analysis of 100 million Steam reviews (January 2025), with additional data from GameDiscoverCo, Gamesight, and the IGDA Developer Satisfaction Survey. The author acknowledges methodological limitations of the source data, including the use of AI-based sentiment analysis and the snapshot nature of the dataset.