While this blog does not contain material published by Wizards of the Coast it does contain materials summarized and extrapolated from the D&D Next playtest packets. By continuing to read this blog you are consenting to the terms of the Wizards online playtest agreement, which you can view at dndnext.com.
In the last couple of articles Surf examined both Armor Class and Attack bonus. He now turns his attention to Hit Points....
One thing that quickly becomes obvious to anyone browsing D&D Next monsters is that Bounded Accuracy doesn’t apply to hitpoints. The system uses hitpoints, and by extension damage, to scale up through the levels. What we expect here is a steady linear increase over levels, but one higher (or “faster”) than the increase in AC and Attack bonus.
During this particular analysis we consider both the “natural” creature hitpoints and the “modified” hitpoints.
[Table: HP Std | HP Mod]
What we see here is that hitpoints are much more variable within the data than AC or Attack Bonus. This provides some support for our initial thoughts on hitpoints.
[Table: StdDevp of HP Std by XP]
If we are correct we should also see a relatively linear increase in deviation each level. Considering only our green and blue cells that does seem to be the case.
The problem here is that it’s more difficult to be sure we have picked the right target number at the upper levels, where there’s less data. We’ll do what we have done previously - try to ensure levels 1-7 are quite close and that levels 10, 12 and 13 aren’t too far off. And we’ll hopefully end up in roughly the right neighbourhood for level 20.
Something else we need to be mindful of here is the “by HP” division. We are examining the hitpoints of our data, so categorising by our own arbitrary hitpoint divisions will likely introduce an artificial skew into the results. We can still look at those divisions, but where there’s a discrepancy between the “by HP” and “by XP” categories we should go with the XP category.
[Table: HP Std Average by XP]
The first thing that jumps out to me here is the obvious relationship between the different types of monster at any given level. If we determine the mathematical relationship between these then determining the hitpoint progression for all of them will be simplified – we can determine any other type based on a single type of the same level.
Consider how RPG designers typically build these tables – they decide on a starting value, a progression and how the adjacent types are derived. It follows that the easiest way to reconstruct a table is to use the same approach.
So we’ll determine an Average progression and build the entire table from there. That will let us ensure that our green and blue data points align closely and that our handful of creatures in the level 14+ region are in the right neighbourhood.
The other thing I notice is that a number of the creatures seem to be miscategorised, mainly because their XP value seems low for their type. Most importantly, the Tough creatures at levels 12 and 13 are probably Solo creatures. The “by HP” table supports this idea, and being able to use the average of these two rows should help us build a more accurate table.
[Table: HP Calc Average by XP]
Both the “HP Std” and “HP Calc” tables are promising. In addition the “by HP” and “by XP” divisions both have some interesting variations within them.
Any of these four tables might provide the data that D&D Next monsters are based upon.
Of course, the best approach is to build all four tables! Then we can reflect on what it took to reproduce the tables and which we believe is the right one.
But before we do that let’s have a look at the graphs...
The very first thing that I noticed with these graphs was… Well, to be honest, it was the variability of most of the trendlines! But that led me to consider which trendlines show some kind of stability. And my, isn’t that Solo trendline stable! Yes, this is mainly because there are Solo monsters scattered across the levels. And yes, many of those levels are sparse and we shouldn’t have too much confidence in them. But all in all we should be able to rely on our Solo progression running somewhere near this trendline.
The Average trendline is also more stable than a first glance leads one to believe. Yes, it gets variable between our two categories (“by HP” and “by XP”), but again we did expect that.
Overall I feel confident that determining a good Average progression, extending it across the adjacent monster types, and ensuring the end result aligns with our green and blue cells is still the best plan.
Choosing Between HP Tables
As previously noted, I went ahead and built out all four variations of the hitpoints table. There was a lot of boring to-ing and fro-ing that doesn’t bear detailing here.
While almost any numerical table can be built, assuming sufficient computing power, that really isn’t necessary. When we start having to use complex polynomials and sliding variables we are getting well outside the complexity of roleplaying game tables! This simplified matters somewhat, and four tables were built with a minimum of fuss.
So how do we decide which one to use? Well, as I said, things aren’t normally that complex. When I have to decide between a table that scales using a long decimal and one that doesn’t, and the simpler one aligns better with the actual observed data, the decision is pretty easy.
And the upshot of that is that Wizards of the Coast don’t appear to be using a higher value for hitpoints and subtracting “notional hitpoints” from it to allow for resistances, healing and similar... Not in most cases, anyway. Rather they are assuming that creatures at a certain level will have a similar amount of damage mitigation in addition to their hitpoints. Yep, DR just became part of the core mechanics again.
If we take 10 as our starting value for a level 1 Average creature and add 5.5 for each subsequent level we have a progression very close to what our sample data shows. Next we make an Easy worth 70% of an Average, a Tough 130% of Average and Solo worth 200% of Average.
This can be expressed as...
- Average_HP = 4.5 + (level x 5.5)
- Easy_HP = Average_HP x 0.7
- Tough_HP = Average_HP x 1.3
- Solo_HP = Average_HP x 2.0
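Those four lines translate directly into code. Here is a minimal sketch (the function names and the decision not to round are my own choices, not anything from the packets):

```python
def average_hp(level):
    """Hitpoints for an Average monster: 10 at level 1, +5.5 per level."""
    return 4.5 + level * 5.5

def monster_hp(level, category="Average"):
    """Hitpoints by monster category, as multiples of the Average line."""
    multipliers = {"Easy": 0.7, "Average": 1.0, "Tough": 1.3, "Solo": 2.0}
    return average_hp(level) * multipliers[category]
```

So a level 1 Average creature comes out at 10 hitpoints, and a level 10 Solo at 119.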
This puts us very, very close to our observed data, providing maximum alignment with our blue and green data points...
We are out in a few places, though… The amount of data we have to analyse is quite sparse, and even our “green data points” are bound to deviate from the underlying prescribed value.
Adjustments seem to be pretty minimal. Regeneration does appear to be factored in (so subtract estimated regeneration per combat from total hitpoints), however resistance does not. Neither do most other forms of damage mitigation. I wouldn’t recommend varying hitpoints by more than +/-2% without being careful to compensate elsewhere.
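As a worked example of that regeneration adjustment, here is how I’d compare a regenerating creature against the table (the function name and the three-round combat length are my own assumptions for illustration):

```python
def table_hp(listed_hp, regen_per_round=0, rounds_per_combat=3):
    """Listed hitpoints minus expected regeneration over one combat.

    Use this value when comparing a regenerating creature against the
    hitpoint table; resistance and other mitigation are left alone,
    matching the observation above that they are not factored in.
    """
    return listed_hp - regen_per_round * rounds_per_combat
```

A creature listed at 60 hitpoints with regeneration 5 would thus compare against the table at 45.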
I could complain here about the low volume of sample data and the uncertainty that introduces, but the fact is that this table should work well for most DMs and is quite close to the current crop of monsters.
No, my big concern here is that this is an area where the math is probably wrong. The D&D Next community receives complaints almost daily about the weakness of monsters. And this weakness seems to become more pronounced at higher levels. Wizards of the Coast has generally acknowledged that this is the case and that monster math isn’t finalised yet. They quite rightly indicate that getting the core math correct is a necessary precursor to finalising monster math, since monster math is built upon the character system. Their current stop-gap is to simply adjust monster XP, with the occasional tweak to armor class or hitpoints.
What leads me to believe hitpoints are one of the main areas at fault? Well common sense and history provide some support. Consider the role hitpoints have played in basic monster math revisions in previous editions – most recently in 4th Edition. Hitpoints are one of the major design dials for monsters.
But more to the point the numbers support it. Let’s take a look at that now...
You might remember back in the Part 2: Class Development Profiling instalment I indicated I’d validate monster hitpoints against PC damage, as the primary comparator, and PC hitpoints, as a secondary comparator. So I expect to see some kind of stable and/or predictable relationship with PC damage output. Since PC hitpoints increase faster than PC damage output I wouldn’t expect to see any constant relationship against it, though I’d hope to see a trend of steady difference.
So imagine my surprise when I paste my proposed hitpoint table into my validation sheet and find a perfect fit… Against PC hitpoint progression!
Yes, the current monster hitpoint progression is completely stable against PC hitpoints! An Average monster of a given level has half the hitpoints of a PC of the same level. How constant is that? Well, the average is exactly 0.500 of PC hitpoints, with a variance of 0.0002 and a stdev of 0.0141. That’s very, very stable!
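Taking that 0.500 ratio at face value, the table derived earlier implies a PC hitpoint progression of roughly double the Average monster line. This is a back-of-envelope extrapolation from my own table, not a figure taken from the packets:

```python
def average_hp(level):
    """Average monster hitpoints, per the table derived above."""
    return 4.5 + level * 5.5

def implied_pc_hp(level):
    """PC hitpoints implied by the observed 0.500 monster/PC ratio,
    i.e. 9 + (level x 11)."""
    return 2 * average_hp(level)
```

That would put a level 1 PC at 20 hitpoints and a level 5 PC at 64, if the ratio holds exactly.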
This means that Average creatures take progressively more hits to kill, eventually many more than their XP award covers. This isn’t very noticeable at lower levels, where Average monsters are concentrated, and DMs and players using Averages won’t notice an issue. But as players level up, the Tough and Solo creatures come progressively into play. And these are quite weak! A Solo has four times the XP of an Average, but Solo creatures at higher levels only have double the hitpoints of an Average creature... An Average creature that doesn’t exist at those levels.
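To put a number on that mismatch: a Solo costs four times the XP of an Average but delivers only twice the hitpoints, so it offers half as many hitpoints per point of XP awarded. The multipliers are as stated above; the “hitpoints per XP” framing is my own:

```python
# Multipliers relative to an Average creature of the same level.
xp_mult = {"Average": 1.0, "Solo": 4.0}  # a Solo is worth 4x the XP
hp_mult = {"Average": 1.0, "Solo": 2.0}  # but has only 2x the hitpoints

def hp_per_xp(category):
    """Hitpoints delivered per point of XP, relative to Average."""
    return hp_mult[category] / xp_mult[category]
```

By this measure a Solo delivers only 0.5 hitpoints per XP relative to an Average, which is exactly the kind of shortfall players are reporting at the table.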
While the table included in this instalment matches the current crop of monsters I consider it faulty and in serious need of correction... Which we’ll tackle a couple of instalments from now.
Check back in a couple of days for the next instalment, Part 7: Damage Analysis...