Effects of Armor Class on Character Longevity
I don't see this being a very long blog post, but I wanted to write it as a companion to one I wrote this past August (link) about comparing different combat systems. This time, I'd like to show how armor class as treated in typical D&D (introduced in 1974 as the 'alternate combat system' to Chainmail) prolongs the life of a defending character, but not in the way you'd expect.
Ava Islam says in her blog post (link) that armor class is "a 'hidden' pool of hit points that every monster has which fluctuates depending on the to-hit bonus of the character they are going up against". This is because armor class determines whether an attack connects at all, such that the lower the attacker's chance to hit, the more virtual hit points the defender effectively has. I've talked about this in my previous blog post in the following terms:
- How many rounds it will take to defeat a target.
- How likely a target will be killed in one hit.
My earlier conclusion in this respect is as follows, and it's going to ground my analysis here today:
> However, even when the countdown to defeat is statistically equivalent between these two methods (i.e. having armor class and HP versus having HP alone), they necessarily result in different experiences of play. It seems to me that the purpose of the to-hit roll is not just to prolong the lifespan of the target but, literally and directly, to introduce the likelihood that a hit fails altogether. That sounds obvious, but it's a distinction worth drawing out with different implications for play than simply increasing hit points.
Now, I'd like to analyze mathematically the translation of armor class into virtual hit points, and how this makes armor class a different beast than actual hit points.
My basic formula is to divide the average value of an individual hit die (~3.5 in 1974 D&D) by the likelihood of being hit. Keep in mind that one hit deals damage roughly equal to one hit die, except in later editions of D&D, where weapon hits tend to deal 1-6 points of damage while monster hit dice are 1-8.
For example, you have a 55% chance to hit someone with an AC of 9 (unarmored). By dividing 3.5 by 0.55, we can see the target has a total 'virtual' HP of 6.36. Hence, as it were, the target has virtually 2.86 extra hit points by virtue of the to-hit roll. However, if the target had an AC of 2 (plate mail and shield), they would have a total virtual HP of 17.5 (+14)! Below is a table of virtual HP scores per HD, given the target's AC and given that the attacker is a regular level 1 character.
There are two percent difference columns. The first (A) shows how much the HP/HD improves over the flat average of 3.5. This isn't all that useful, though, because even the worst possible AC (9) already gets some benefit from the to-hit roll. The second column (B) is therefore the more accurate measure: it compares the HP/HD of each AC to that of the worst possible AC. By this, we can see that a fully armored character has about 175% more virtual HP than an unarmored character (17.5 versus 6.36). In other words, if it takes about 2 average hits to drop an unarmored character, it takes 5 to drop a fully armored one.
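If you want to reproduce those numbers yourself, here is a minimal Python sketch of the calculation. It assumes the attack matrix for a level 1 character as described above, where AC 9 needs a 10 or better and every point of armor raises the target number by one, up to 17+ for AC 2; swap in whatever matrix your game actually uses.

```python
# Virtual HP per hit die, for a level 1 attacker.
# Target number to hit each AC: AC 9 needs 10+, and each point of armor
# raises the target by one, up to 17+ for AC 2.
AVG_HIT_DIE = 3.5  # average of one hit die in 1974 D&D

def chance_to_hit(target_number):
    """Probability of rolling target_number or higher on a d20."""
    return max(0, min(20, 21 - target_number)) / 20

def virtual_hp(target_number, avg_hd=AVG_HIT_DIE):
    """Average hit die divided by the chance that a hit lands at all."""
    return avg_hd / chance_to_hit(target_number)

for ac in range(9, 1, -1):          # AC 9 (unarmored) down to AC 2 (plate + shield)
    target = 10 + (9 - ac)          # 10+ for AC 9, 17+ for AC 2
    vhp = virtual_hp(target)
    print(f"AC {ac}: needs {target}+, virtual HP/HD = {vhp:.2f} (+{vhp - AVG_HIT_DIE:.2f})")
```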
The story does not end there, however. The above table is basically useless on its own because it does not take into account how higher level characters improve their fighting accuracy, and virtual HP does not change linearly as accuracy changes, whether in absolute or relative terms. The only thing that changes is the chance of the to-hit roll succeeding. What we're dealing with is the function f(x) = 3.5/x, where x is the chance to hit, over the interval (0, 1]. I even plugged it into my calculator!
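If you'd rather not take my calculator's word for it, here is a quick sketch of the same curve using numpy and matplotlib (neither is required; any graphing tool will show the same shape).

```python
import numpy as np
import matplotlib.pyplot as plt

# The chances to hit that a d20 can actually produce: 5% up to 100%.
x = np.arange(1, 21) / 20
plt.plot(x, 3.5 / x, marker="o")
plt.xlabel("Chance to hit")
plt.ylabel("Virtual HP per hit die")
plt.title("f(x) = 3.5 / x over (0, 1]")
plt.show()
```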
Below is a table similar to the one above, except that it is a function of the d20 score necessary to hit rather than of armor class. Again I include the two types of percent difference, though in this case we might consider type (A) to be more useful, since we are now looking at the whole range of possible to-hit chances: with a 100% chance to hit, the target would have no virtual HP from armor class whatsoever.
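For the curious, the numbers in that table fall out of a loop like the one below; the percent column shown here is type (A), measured against the flat 3.5 average.

```python
# Virtual HP per hit die as a function of the d20 score needed to hit.
AVG_HIT_DIE = 3.5

print(f"{'needs':>5} {'chance':>7} {'HP/HD':>7} {'(A) vs 3.5':>11}")
for target in range(1, 21):
    chance = (21 - target) / 20   # treating a needed score of 1 as an automatic hit
    vhp = AVG_HIT_DIE / chance
    pct_a = (vhp - AVG_HIT_DIE) / AVG_HIT_DIE * 100
    print(f"{target:>4}+ {chance:>7.0%} {vhp:>7.2f} {pct_a:>10.0f}%")
```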
Let us consider a fighter who levels up from level 3 to 6, effectively gaining a +2 bonus to hit. The chance to hit a target with AC 9 increases from 55% to 65%, so the virtual HP/HD falls from 6.36 to 5.38 (-0.98). Likewise, the chance to hit a target with AC 2 increases from 20% to 30%, but the virtual HP/HD falls from 17.5 to 11.67 (-5.83). A linear change in accuracy thus produces a decidedly nonlinear change in virtual HP: the function is hyperbolic, so the worse the attacker's odds were to begin with, the bigger the swing.
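To make that concrete, here is the same +2 bonus applied across a few different target numbers (the specific numbers are just illustrative); the harder the target was to hit in the first place, the more virtual HP the bonus strips away.

```python
# Change in virtual HP per hit die when the attacker gains a +2 to-hit bonus.
AVG_HIT_DIE = 3.5

def virtual_hp(chance_to_hit):
    return AVG_HIT_DIE / chance_to_hit

for needed in (10, 13, 15, 17, 19):               # old d20 score required to hit
    before = virtual_hp((21 - needed) / 20)
    after = virtual_hp((21 - (needed - 2)) / 20)  # the bonus lowers the required score by 2
    print(f"needed {needed}+: {before:5.2f} -> {after:5.2f} HP/HD ({after - before:+.2f})")
```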
Keep in mind that all this is due to the virtual nature of AC as HP. It is not a pool of points that goes down the more you hit; it is (in a more literal sense than one might have guessed) a sort of armor that shields the pool of HP from going down. I've discussed this in the earlier post (link again), but combat without accuracy rolls has very different goals than combat with accuracy rolls. So I hope this is not taken as a critique of the former as a mere abstraction of the latter (which it isn't, really), but as an explanation of what makes the latter tick on its own terms.
Were you to convert monsters from a system with AC to one without, the formula that Ava explains in her post (HP = AC/2 * HD, where AC is ascending over the domain [10, ∞)) is elegant and scales well! Here is a small table for that, because I like how AC/2 stands in for a d8 roll (which, as I have mentioned, became the standard hit die after 1974 D&D): a d8 averages 4.5, and AC/2 for an unarmored AC of 10 gives 5, which is close enough. It also works out that each point of AC improves each HD by 0.5, were you to roll HD instead of using the formula directly.
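And here is that conversion as a tiny function, if you'd rather compute it than consult a table. It assumes ascending AC with 10 as unarmored; the example stat lines are made up.

```python
# Convert a monster's ascending AC and hit dice into one flat HP pool,
# per the formula HP = (AC / 2) * HD. At AC 10 this gives 5 per HD,
# close to the d8 average of 4.5, and each point of AC adds 0.5 per HD.
def flat_hp(ascending_ac, hit_dice):
    return ascending_ac / 2 * hit_dice

for ac, hd in [(10, 1), (12, 1), (14, 2), (16, 4), (18, 8)]:
    print(f"AC {ac}, {hd} HD -> {flat_hp(ac, hd):.0f} HP")
```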
I hope this fills in the gaps left by my earlier post on this topic, and that it gives y'all a good basis to think about how you prefer to handle to-hit rolls and damage in your campaigns.
 In my original post on this topic, it didn't occur to me that Ava was the author of the blog post that I couldn't find. Oopsie doodle! Hope this makes up for it, sis. :)