Every week, during the Monday-night D&D game I play in, I reconsider the notion that players should hit monsters about 60% of the time. (I play an arcane striker, lately, so the to-hit rate is of great concern to me.) The idea being that a 9 or higher on a d20 roll should, on average, be enough to hit the appropriate defense score of a monster at the same level as the PCs. Maybe it’s just because I play with a chatty group of six players (the DM makes seven) — and I’m as much a loudmouth as anybody, believe me — but when I fall into that 40% part of the equation, I get pretty bummed.
It could be a long while before my turn comes around again, and the odds are decent (they’re, you know, about 4-in-10) that I’ll miss again, and spend more time waiting around. On a night when the dice run cold, it’s easy to become a spectator. But what’s the alternative?
Chris Sims suggests using 3d6 in place of a d20. The idea, there, is to use the bell curve generated by 3d6 (added together) to increase that hit rate of 9+ to something more like 74% (his math, not mine), thereby cutting down on whiffs during each at-bat. Is this crazy? If it protects the fun at Sims' table, how crazy can it be? He's built in crits and everything.
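For the curious: both numbers check out, and it's easy to verify with a quick brute-force count (my own throwaway sketch, nothing from Sims' post):

```python
from itertools import product

# d20: hit on 9+ means 12 of the 20 faces succeed.
d20_hit = sum(1 for roll in range(1, 21) if roll >= 9) / 20

# 3d6: enumerate all 216 outcomes and count totals of 9 or higher.
totals = [sum(dice) for dice in product(range(1, 7), repeat=3)]
d3d6_hit = sum(1 for t in totals if t >= 9) / len(totals)

print(f"d20, hit on 9+: {d20_hit:.0%}")   # 60%
print(f"3d6, hit on 9+: {d3d6_hit:.1%}")  # 74.1%
```

The bell curve is doing the work there: 160 of the 216 possible 3d6 results total 9 or more, versus a flat 12-in-20 on the d20.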
It’s a simple enough fix, if you don’t find the d20 to be such an integral part of the D&D experience that getting rid of it is unthinkable. It’s a more drastic approach than mine, which was just to throw more lower-level monsters at the party (thereby increasing the number of player hits and the number of enemy hit points on the table all at once), but drastic’s not always bad. (But I’m the guy who tried rolling first for a while, too, so what do I know?)
You’ve played some D&D 4E. How often do you think a player should miss? Where’s that sweet spot between risk and reward for you?