Sunday, 29 December 2013

D&D Next Monsters: Part 10: Final Packet Analysis

While this blog does not contain material published by Wizards of the Coast it does contain materials summarized and extrapolated from the D&D Next playtest packets. By continuing to read this blog you are consenting to the terms of the Wizards online playtest agreement, which you can view at

Surf does a full review for the final playtest packet....

First up, my apologies to anyone waiting on these installments. Unimaginative real life has a great deal to answer for and this is a part-time gig.

With the final playtest packet release we saw some changes to monsters. Most monsters were relegated to the Older Playtest Adventures and Bestiaries folder, and Wizards of the Coast seem to have done a review pass over the main Bestiary itself, as well as including creatures for Murder In Baldur's Gate. So I started from scratch, redoing my entire analysis - after all, this was the final packet so I might as well be thorough, right? Of course, I didn't ignore all of my previous work. It is interesting, after all, to see what's been altered, what's been added and what's been removed.


So What Changed?

Well, no creatures have been removed, although a number of creatures have had a trait or action removed. A number of creatures were altered, mainly those in the level one through three range. This makes sense, since WotC have mentioned several times that they intend these low-level creatures to provide easier same-level fights than higher-level creatures do, and the changes made to them are consistent with this.

With a little analysis it becomes evident that many aspects of monsters now exhibit non-linear progression curves. There are no obvious exponential or polynomial curves, but we see both logarithmic and power curves. These help give us the easier level 1-3 creatures.

One of the other things that is obvious is the tweaking of these "easy" creatures' Armor Class. This seems to be about 2 points below that of a "normal" creature of the same level. This is mainly obvious at level 1, where we have 21 "easy" creatures and 15 "average" creatures. At level 2 it's more difficult to tell since we only have 12 "easy" creatures and 13 "average" creatures.


Sample Size

One of the issues with the changes in the final playtest is the greatly reduced sample size. With so few creatures at each level it's now more difficult to accurately determine the underlying numbers. That means we don't quite have the confidence in those numbers that we had with previous packets. We can extrapolate from our previous work and from related analyses, and we can also go back through sample monsters at each level and compare them to our numbers. Both of these help improve confidence in our results.

The D&D Next forums have been particularly useful for locating related analysis. For example, the DPR Calculations thread contains carefully calculated attack and damage numbers for many classes, thoroughly reviewed by the Wizards Community.



All the areas we've been looking at have seen some kind of change....

Armor Class

Observations: Overall we've seen some significant "tightening up" of AC, which we knew was coming. The effective range for an Average monster contracted from 12 through 18 to 13 through 17. What's glaringly obvious, and less expected, is that there is now a clear disparity between the Easy monsters and the other monsters. An Easy monster seems to have an AC 1-2 points lower than other monsters of its level. The formula for AC seems to have remained a linear curve.


  • AC ~= Level x 0.20 + 13
  • Easy AC = AC - 2

Hit Points

Observations: Considering the Average monster as the base of HP, this area has seen little change. Much of the actual change to hitpoints has been to Easy creatures. Previously these had about 70% of the hitpoints of Average creatures, but this now seems to have been reduced to 40%. In addition it seems that a Power curve is now a better fit for progression, which helps ensure lower level creatures tend to be more easily defeated by same-level PCs. Hit Points appear to have been changed to a fairly flat power curve.


  • HP ~= 10 x Level ^ 0.81
  • Easy HP = HP x 0.4
  • Tough HP = HP x 1.5
  • Solo HP = HP x 2.0

Attack Bonus

Observations: This is another area that saw significant change, standardising level 1 creatures to a +2 attack bonus and scaling standard creature attack bonus through to +9 at level 20. There seems to be no stable variation between Easy and Solo creatures of the same level. The formula for Attack Bonus now seems to be a logarithmic curve.


  • Attack ~= 2.06 x log(Level) + 2.38
  • No apparent variations


Damage

Observations: Damage seems to be the least impacted stat; in fact, all changes in this area simply seem to have brought most creatures more into line with previous analysis. That said, the Easy/Normal/Hard/Solo variations did shift a little and have been updated. Damage still appears to follow a linear formula, but it's a little different to what we previously used.


  • Damage ~= 3.10 x Level + 2.00
  • Easy Damage = Damage x 0.50
  • Hard Damage = Damage x 1.25
  • Solo Damage = Damage x 1.50


Monster Building Table

All of which yields the following table...

[Table: Level | AC * | Hitpoints | Attack | Damage]
* -2 AC for Easy creatures.
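For anyone who wants to play with the curves, here's a minimal Python sketch combining the four formulas above into a table generator. The function name and the whole-number rounding are my own guesses; the packet's actual tables may round differently.

```python
import math

def monster_stats(level, kind="average"):
    """Estimate final-packet monster stats from the fitted curves.

    Rounding to whole numbers is my own choice, not WotC's method.
    """
    ac = level * 0.20 + 13                   # linear AC curve
    hp = 10 * level ** 0.81                  # flat power curve
    attack = 2.06 * math.log(level) + 2.38   # logarithmic curve (natural log)
    damage = 3.10 * level + 2.00             # linear damage curve

    if kind == "easy":
        ac -= 2
        hp *= 0.4
        damage *= 0.50
    elif kind == "tough":
        hp *= 1.5
    elif kind == "hard":
        damage *= 1.25
    elif kind == "solo":
        hp *= 2.0
        damage *= 1.50

    return {"ac": round(ac), "hp": round(hp),
            "attack": round(attack), "damage": round(damage)}

for lvl in (1, 5, 10, 20):
    print(lvl, monster_stats(lvl))
```

Note that the attack curve lands on +2 at level 1 and +9 at level 20, matching the observations above.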



WotC have previously indicated that a design goal for D&D Next is for lower level monsters to be easier for same-level PCs to overcome than higher level monsters. The math changes in the final packet certainly nudge creature stats in that direction. However nothing has been done to address the feeling that higher level monsters are still much too easy.

There are a few factors at work here and we'll see what we can do to address these early in the new year.




Thursday, 29 August 2013

D&D Next Monsters: Part 9: XP Curves & KpL...


Surf looks at monster XP progression....

So +Jonathan Black asked +Mike Mearls about the discrepancies between low, mid and high level character progression. Specifically "Why does it take more xp to go from 10 to 11 than 11 to 12?". Mike didn't answer, but this raised some questions about monster XP progression in my mind, so I decided to take a look...


PC Progression


First up let's take a look at what Jonathan is talking about. Let's take the PC progression table and subtract the previous level's value from each level. This gives us the XP a character needs to gain to go up to the next level.

Now to gain a level we expect that we should need to gain more XP than we needed to gain last level. Or at least the same. But not less. Now take a look at the differences at levels 11 & 12.

The only explanation for this is human error. I believe this is a simple transposition mistake: if we swap the two values around then everything is perfect. This is the kind of thing I'd expect Wizards of the Coast to fix quietly.

In mathematical terms there isn't much real impact and it's probably not worth the headache of getting your whole group to understand the issue and agree that the XP to reach level 11 should be 75,000 rather than 77,000. But if you are keen, that's the only change required.
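For the curious, this sanity check is easy to automate. A quick Python sketch; the totals below are made-up illustrations of the same shape of anomaly, not the packet's actual progression table.

```python
def xp_to_gain(totals):
    """Per-level XP cost from a cumulative 'XP to reach level' table."""
    return [b - a for a, b in zip(totals, totals[1:])]

def dips(totals):
    """Flag levels where gaining that level costs LESS XP than gaining
    the previous one - the same anomaly seen at levels 11/12."""
    costs = xp_to_gain(totals)
    return [i + 2 for i in range(1, len(costs)) if costs[i] < costs[i - 1]]

# Made-up totals with one inflated value baked in: the total to reach
# level 4 is too high, so gaining level 5 looks suspiciously cheap.
broken = [0, 1000, 3000, 7000, 9000]
print(dips(broken))   # flags level 5

# Correcting the single bad total fixes every delta.
fixed = [0, 1000, 3000, 5000, 9000]
print(dips(fixed))    # no anomalies
```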

Of course, working through this got the old mental cogs turning...


PC Progression vs Monster XP

Looking at PC progression got me thinking about the relationship between it and monster-at-level XP values. There's a relationship between the two, regardless of how loosely Wizards of the Coast have defined it. Obviously the PC progression chart helps us here, as does the Average column of the encounter design table (which is essentially the XP for one Average monster of that level). One of the design concepts of D&D Next also tells us what to expect, that being "…advancement at lower levels with more gradual advancement at mid- and high levels".


Monster XP Curve

[Graph: Average Monster XP by Level]

Simply graphing the Average monster XP by level shows us that it's all over the place. I'd say "quite out of whack at higher levels". The trendline on the graph shows the kind of progression I'd expect.

What we see is that after level 11 there's a lot of XP that's just way higher than we'd expect. One could argue that this doesn't matter because the encounter math is built on the assumption that monster XP values use the same table. And running some quick pivots shows that's the case.


And yet this argument misses something important - the impact to PC progression. These higher XP rewards, and higher associated encounter XP budgets, mean that PCs level up with fewer fights than if the correct values were being used.

When we are levelling up too quickly we feel that the game is too easy. So these over-curve XP values likely play a part in the feeling that D&D Next combats are too easy at mid and high level. Not that they are the only factor.

Regardless of whether it's necessary, it is pretty easy to determine the "right" values. We just add an extra column to our graph, copy the current values into it and then adjust them until each point sits on our trendline. We can even sprinkle in some of the usual RPG industry "rounding magic" to make the final table a bit more palatable to the human mind (people like to see zeros on the ends of longer numbers).
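The "rounding magic" step can be sketched too. Rounding to two significant figures is just one guess at the kind of rounding RPG tables use, not WotC's actual method.

```python
def pretty_round(x):
    """Round to two significant figures so larger values end in zeros,
    which reads more naturally in an XP table (e.g. 86,250 -> 86,000)."""
    if x < 100:
        return round(x)
    magnitude = 10 ** (len(str(int(x))) - 2)
    return round(x / magnitude) * magnitude

print(pretty_round(1437))    # 1400
print(pretty_round(86250))   # 86000
```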

Which gives us a handy new encounter budget/monster XP chart.

Now, let's see if it proves useful...


Kills Per Level

[Graphs: Kills per Level (Current); Kills per Level (Revised)]

If we subtract the XP needed to reach this level from the XP needed to reach the next level and divide the result by the XP value of an Average monster of this level, we learn the Kills per Level (aka Kills/Level aka KpL) for this level. We can trivially do this for every level and then graph the result.
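As a Python sketch (the function name and sample numbers are mine, not from the packet):

```python
def kills_per_level(xp_totals, avg_monster_xp):
    """Kills per Level: the XP needed to gain each level (next level's
    total minus this level's) divided by the XP of one Average monster
    of this level.

    xp_totals[i]      - cumulative XP needed to reach level i+1
    avg_monster_xp[i] - XP awarded by an Average level i+1 monster
    """
    return [(nxt - cur) / mxp
            for cur, nxt, mxp in zip(xp_totals, xp_totals[1:], avg_monster_xp)]

# Made-up numbers: 20 Average kills to gain level 2, 16 to gain level 3.
print(kills_per_level([0, 2000, 6000], [100, 250]))   # [20.0, 16.0]
```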

And the result is really interesting!

Yes, the values are scattered. And yes, we can correct this pretty easily simply by using our new Encounter/Monster XP table (see the "Revised" graph). But what's interesting is the curve! Wizards of the Coast seem to have decided that from level 10 we need to have far fewer fights each level. I'm guessing that's because they didn't plan to release many mid- or high-level creatures during the playtest and only planned to get a feel for what the epic tier of play was like.

And I reckon this is a factor in the feeling that the higher levels of play are too easy.


Correcting It...

[Table: Level | Curved Tail | Plateau]

There are pretty much two options for correcting this. The first is simple, we halve the XP budget and XP value for encounters and creatures over level 10.

The second option is not something I'd ordinarily recommend, but the more I look at it the more convinced I am that it's the right way to go. It's an option you'd have to sell to your group - updating the PC progression table. Generally I prefer options the DM can silently use behind the scenes, but if PC progression is at fault then that is what is most appropriate to correct as other corrections will have side effects. Some of which won't be immediately obvious.

If we do this we need to decide how we want our progression to look. For my money I like a "curved tail" curve: it starts like the current curve, climbs to 25 KpL at level 8, then gently curves down to 21 KpL at level 20. The other option I'll present is a "plateau" curve, which also starts similarly, climbing to 24 KpL at level 8 and then ever so gently dropping to 23 KpL at level 20. Again, I applied a little rounding to make the final numbers more natural for humans.

Then again, none of this matters if you don't use the PC progression table... Like my own group. When the DM simply decides how often everyone levels up this whole article becomes a moot point.

But it was an interesting moot point...



Check back next week for the Part 10: Final Packet Analysis...

Tuesday, 20 August 2013

D&D Next Monsters: Part 8: Putting It All Together


Surf brings together the AC, Attack, Damage and Hitpoint analysis....

Well, with all that number-crunching done let's put all of our results together into a consolidated table!


Monster Building Table



Where's The Art?

The DM Guidelines PDF says that "Encounter building is a mixture of art and science as you combine these threats together" and traditionally this has been even more the case with monster design. But with tables like the one above some feel that monster design is shallow and inflexible.

Not so, I say! Tables like this simply provide a baseline for monster design that lets us produce flexible monsters that provide a reliable threat at their target level. The art comes in the various traits, actions, reactions and adjustments that can and should be made.

Below follow my thoughts on modifying the aspects of monsters covered during this analysis. Some of it is based on analysis and math, some of it is based on my opinion. I've tried to indicate where this is the case, but your mileage may vary. Of course, I take no responsibility for how you choose to use the information in this article and if your creation eats your mother and destroys your house don't come looking for me!


Armor Class

Trivial changes to AC (those made without compensating in some other way) should generally be limited to +/-1 for most creatures.

Adjustments to AC are commonly compensated for by counter-adjusting hitpoints, though adjusting other aspects of a creature can work too. In a sense giving a creature an additional +1 to AC is like giving it 5% more hitpoints. So a sensible rule of thumb is to compensate for a +1 AC with -5% of original hitpoints, or vice versa.

Maximum adjustment will depend on the creature being created or modified, to some extent. Changes should probably be capped at about +/-5 AC and even these should be thought out and compensated for very carefully.

Traits which alter AC are fairly rare in the game. There are a few, such as "Soft Belly" (e.g., Ankheg) that are used to compensate for creatures with a somewhat high AC.



Hit Points

Trivial hitpoint modifications should be limited to around 2% (rounded up).

Adjustments to balance hitpoint changes are often made to AC or damage. As noted above a shift of 5% hitpoints can be compensated for by adjusting AC one point. A 2 point change to hitpoints can also be compensated for with a 1 point shift in damage (+2 hp, -1 damage). Another way of compensating for hitpoint changes is with traits that temporarily drop hitpoints or AC.

Maximum alterations to hitpoints should probably be no more than +/-50%.

Traits and actions directly affecting hit points are quite rare; they include Relentless (Orcs), Regeneration and any healing effects. These active effects are usually compensated for in some way, such as a vulnerability or an adjustment to the creature's base hitpoints.

Passive traits are a whole different kettle of fish. It seems that D&D Next assumes a certain level of damage mitigation for creatures by their level: the higher a creature's level, the more resistances, immunities and similar traits it has. This isn't an area I have analysed properly yet, but I hope to look at it in the future. In the meantime I strongly recommend looking at creatures of a similar level when building or levelling/delevelling monsters.



Attack Bonus

Trivial tweaking of attack bonus should generally be limited to +/-1 if one doesn't plan to compensate for the change elsewhere.

Adjustments to attack bonus are often made to damage or hitpoints. A 1 point shift in Attack Bonus can be balanced with a 5% shift in damage or a 10% shift in hitpoints.

Maximum attack bonus tweaking should probably be limited to +/-2.

Traits that adjust attack bonus are scarce. The most common of these is undoubtedly Pack Tactics, however this is capped at +5 and most creatures with this trait have a much lower attack bonus than is normal for their level. When used this way the trait is sort of self-balancing. Other traits like Bushwhacker (Goblins) and Captivating (Harpies) grant advantage on attacks conditionally and overall probably amount to no more than a +1 to attack over the course of a combat.



Damage

Trivial damage tweaks can be made in the order of +/-2% without compensating elsewhere.

Adjustments to compensate for damage changes are typically made to attack bonus, to hitpoints, or to both. A 5% change to damage can be adjusted for with a 1 point change in Attack Bonus or a 10% change in hitpoints.

Maximum changes to Damage, where compensating measures are taken, should probably be around +/-50%.

Traits that adjust damage are relatively common. However, most of these traits should be factored into the creature's base damage - this includes damaging auras, damage on death, bonus damage (e.g., on surprise), berserk and other similar traits.
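To tie the compensation guidelines above together, here's a rough Python sketch. The function names and rounding are mine; the 5% and 10% ratios are the rules of thumb from the sections above.

```python
def hp_for_ac_shift(base_hp, ac_shift):
    """Each +1 AC is compensated by dropping 5% of original hitpoints
    (and each -1 AC by adding 5%), per the Armor Class section."""
    return round(base_hp * (1 - 0.05 * ac_shift))

def damage_for_attack_shift(base_damage, attack_shift):
    """A 1-point attack bonus shift is balanced by a 5% damage shift
    in the opposite direction, per the Attack Bonus section."""
    return base_damage * (1 - 0.05 * attack_shift)

# A 50 hp creature given +2 AC gives up 10% of its hitpoints:
print(hp_for_ac_shift(50, +2))            # 45
# A creature dropped to -1 attack gets a 5% damage bump:
print(damage_for_attack_shift(20.0, -1))  # 21.0
```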



Check back in a few days for the next installment Part 9: XP Curves & KpL...

Monday, 19 August 2013

D&D Next Monsters: Part 7: Damage Analysis...


In the last article Surf unearthed the math behind hitpoints. Today we look at the last of these major monster stats - Damage....

Meh to crud with the intro... let's dive right in this time!


High-Level Data

During this particular analysis we consider a creature's “At-Will” damage and its “Damage Per Round” (aka DPR).


As we saw with character classes, back in Part 2, damage is lower than hitpoints and thus its variability is lower too. Because of Bounded Accuracy it's still considerably higher than AC or Attack Bonus, though. We expect monster damage to scale relative to PC hitpoints, so this is a good sign.

StdDevp of At-Will Damage by XP

Level | Easy | Average | Tough | Solo | Total
1 | 1.56 | 1.31 | 0.00 | | 1.62
2 | 2.12 | 1.85 | 2.56 | | 2.41
3 | | 2.68 | 3.01 | 0.00 | 2.94
4 | 0.00 | 3.59 | 3.77 | | 3.70
5 | 0.00 | 5.46 | 4.88 | | 5.22
6 | | | 6.89 | 4.00 | 6.87
7 | | | 9.29 | 7.55 | 8.36
8 | | | | 9.80 | 9.80
9 | | | | 2.24 | 2.24
10 | | | | 14.33 | 14.33
11 | | | | 7.59 | 7.59
12 | | | 9.17 | 9.40 | 11.85
13 | | | 9.09 | 11.11 | 10.61
14 | | | | 31.60 | 31.60
15 | | | | 0.00 | 0.00
18 | | | | 2.00 | 2.00
20 | | | | 6.00 | 6.00

When looking at the stdev table we need to remember to ignore the pink data points. The low amount of data makes these cells very "swingy". That's why we have some cells with a value of 0.00 (there's only one sample, or the small number of samples have the same value). In other cases it results in quite large values (like 31.60).

Again, there is the linear increase in variation each level.

We'll face similar issues to our hitpoint analysis here. That's mostly associated with sparseness of data at higher levels. But, as before, the blue datapoints at levels 10, 12 and 13 can be used to guide us and the pink data points may be handy as very rough indicators that we are in the neighbourhood of what we need.


At-Will Damage Average by XP

Level | Easy | Average | Tough | Solo | Total
1 | 3.61 | 4.77 | 8.00 | | 4.08
2 | 6.25 | 6.50 | 8.70 | | 7.10
3 | | 8.04 | 9.32 | 12.00 | 8.77
4 | 9.00 | 10.96 | 12.09 | | 11.43
5 | 11.00 | 14.44 | 14.17 | | 14.28
6 | | | 18.48 | 13.00 | 18.17
7 | | | 23.00 | 27.96 | 26.61
8 | | | | 27.70 | 27.70
9 | | | | 36.40 | 36.40
10 | | | | 36.61 | 36.61
11 | | | | 58.33 | 58.33
12 | | | 40.07 | 56.25 | 44.93
13 | | | 54.00 | 56.57 | 55.80
14 | | | | 66.50 | 66.50
15 | | | | 86.00 | 86.00
18 | | | | 94.00 | 94.00
20 | | | | 82.00 | 82.00

Average Tables

What we find when we start digging through the averages is a lot of similarity with hitpoints. The values may be lower, but the patterns are very similar. Some of this is due to our categorisation methods, but many of these similarities persist between the "by HP" and "by XP" views. This is not unexpected, and is another good sign that our theories on the relationships between the different PC and monster stats are on the right track.

Again, it's very obvious that there's a relationship between each type of monster at a given level. If you are having trouble seeing this try looking at the bottom total. That's an obviously skewed summary, one that inflates Solo creatures somewhat. But it does give you a feel of the relationships.

The same apparent miscategorisation we noticed in the hitpoints analysis is also present. Closer examination shows us there really don't seem to be any Tough creatures at levels 12 and 13. For those we'll again just use the Total in the Solo column's place.


DPR Average by XP

Level | Easy | Average | Tough | Solo | Total
1 | 3.75 | 4.98 | 8.00 | | 4.25
2 | 6.52 | 6.81 | 8.98 | | 7.39
3 | | 8.27 | 9.97 | 12.00 | 9.21
4 | 9.00 | 11.67 | 12.67 | | 12.07
5 | 11.00 | 15.17 | 15.18 | | 15.11
6 | | | 20.05 | 13.00 | 19.64
7 | | | 27.71 | 30.17 | 29.52
8 | | | | 35.93 | 35.93
9 | | | | 38.57 | 38.57
10 | | | | 39.57 | 39.57
11 | | | | 59.42 | 59.42
12 | | | 41.32 | 59.69 | 46.83
13 | | | 57.00 | 56.92 | 56.94
14 | | | | 69.88 | 69.88
15 | | | | 86.00 | 86.00
18 | | | | 94.00 | 94.00
20 | | | | 106.00 | 106.00

As with the hitpoints analysis all four tables look like promising ways of reconstructing monster damage data.

And, again, I'll opt to build all four and then compare them. I won't bore my dear readers with an explanation of why.

Time to check out the graphs.



[Graphs: At-Will Damage by HP; At-Will Damage by XP; DPR by HP; DPR by XP]

Right now many of you are probably sitting there saying to yourselves "Er. If the Damage data is so similar to the Hitpoints data, why do the graphs look so different?!?" And that's a fair question.

There are several reasons for the apparent differences. The scale is, of course, different and this accounts for some of the variation. The way the actual averages fall out within the scale is a little different too, which is to be expected but nonetheless contributes to the apparent differences.

The most obvious difference is in the trendlines, but we do expect some of our trendlines to wander because of the scant data above level 7. But if we remove the trendlines we see that the general placement of all the datapoints is very similar. And, as mentioned in the Average Tables section, the distances between the different types at each level are relatively close.

The patterns are close. And although the trendlines are really only a useful guide or indicator, they look more useful for damage analysis than for hitpoint analysis.

These graphs also give us another important clue. If you go back and look at the graphs for hitpoints and think about how the datapoints are clustered around the trendlines you'll see that it's about the same for the two "by HP" graphs and the same again for the two "by XP" graphs - the clustering is only very slightly "tighter" for the "HP Calc" graph in each pair... But these graphs are a different story. In these graphs the data for the DPR version of each pair is a good deal more tightly clustered around its trendline, which is predictably most noticeable with the Solo datapoints and trendline.

This is a pretty solid indicator that the math behind monster damage is built around projected DPR, rather than "At-Will" DPR. I'll still create all four tables, but that's my tip for this one.


Choosing Between Damage Tables

After building the four tables I like to go back and examine each. How close is the table to the original? Is there a section of the table that doesn't match the table we are trying to duplicate? Or are there small variations scattered throughout? What about the base numbers? Are they unusual numbers like 4.23791? Or are they close to more "natural" numbers for humans to use? And the derived numbers?

What's quickly obvious is that the "by HP" tables aren't a great match. We can set up formulas so that most of the table matches fairly well - but there's always a section of the resultant table that "drifts" away from the source table. That's a pretty good indication that we are on the wrong track.

The "by XP" tables, on the other hand, tend to line up easily and naturally. We do have small variations scattered throughout the table, but that's something we'd expect - it's highly unlikely our sample data will just spell the table out for us. The "DPR by XP" table, in particular, is a very close match that uses natural values and increments. It's the best match, so my tip panned out on this occasion. Sometimes we get lucky, other times we have to keep slogging through until we find what we need.


Damage Table


The damage table is even simpler than the hitpoints table! Average damage for a level is obtained by taking the level, adding one and multiplying the result by 2.5. An Easy creature only inflicts 80% of an Average creature's damage, while a Hard creature inflicts 120% of that damage and a Solo inflicts 200% of that damage.

This can be expressed as...

  •   Average_Dmg = (level + 1) x 2.5
  •   Easy_Dmg = Average_Dmg x 0.8
  •   Tough_Dmg = Average_Dmg x 1.2
  •   Solo_Dmg = Average_Dmg x 2.0
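As a quick Python sketch of those bullets (the function name is mine):

```python
def damage_by_type(level):
    """Reconstructed Part 7 damage table: the Easy/Tough/Solo values
    are multiples of the Average damage for the level."""
    avg = (level + 1) * 2.5
    return {"easy": avg * 0.8, "average": avg,
            "tough": avg * 1.2, "solo": avg * 2.0}

print(damage_by_type(3))   # average 10.0, easy 8.0, tough 12.0, solo 20.0
```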

This particular table aligns quite nicely with our source table, with most of the green/blue values being very close. A few do, of course, deviate from our source data. But this is easily accounted for by variations in the source data and certainly within expectations.

So how much can we adjust a creature's damage without causing problems? We do see variation of 25%-30% of DPR within any given level and type, but that will normally be accompanied by adjustment of other attributes - as it should be. I'll stick with what I said about hitpoints - I wouldn't recommend adjusting by more than +/-2% of damage without compensating elsewhere.

My biggest concern with damage is, again, Solo creatures. As with hitpoints, there's a disparity between their XP award and their damage output. I do understand that having a creature that does an average of almost an entire PC's hitpoints in damage is a problem. But I'm not sure that the answer to that is docking their damage output. We'll look at how we can improve Solo creatures in a later installment of this series.



Our primary validation of monster damage is against PC hitpoints. If we divide the Damage for an Average creature at a given level by the hitpoints of an average PC of the same level we get a fraction: the percentage of an average PC's hitpoints that an Average monster deals. If there is a direct relationship between the two then this number should stay fairly constant through the levels. And what we see is that this number averages 0.23 (23%) with a variance of 0.000 and a stdev of 0.008. Which is pretty compelling!

Our secondary validation was against PC damage. Monster damage starts out at about 41% of PC damage and slowly progresses to 131% of PC damage. This sounds weird, but if we look more closely we see that most of this disparity is in the first couple of levels. At level three it's 75% of PC damage and it progresses evenly from there.

So, while Solo creatures have some shortcomings that need to be addressed, this table should closely reflect the current damage progression and be suitable for most DMs to use.



Check back tomorrow for the next installment Part 8: Putting It All Together...

Monday, 12 August 2013

D&D Next Monsters: Part 6: Hitpoint Analysis...


In the last couple of articles Surf examined both Armor Class and Attack bonus. He now turns his attention to Hit Points....

One thing that becomes quickly obvious to someone browsing D&D Next monsters is that Bounded Accuracy doesn’t apply to hitpoints. The system uses hitpoints, and by association damage, to scale up levels. What we expect here is a steady linear increase over levels, but one higher (or “faster”) than the increase of AC and Attack bonus.


High-Level Data

During this particular analysis we consider both the “natural” creature hitpoints and the “modified” hitpoints.

[Table: HP Std | HP Mod]

What we see here is that hitpoints are much more variable within the data than AC or Attack Bonus. This provides some support for our initial thoughts on hitpoints.

StdDevp of HP Std by XP

Level | Easy | Average | Tough | Solo | Total
1 | 3.04 | 2.89 | 0.00 | | 3.74
2 | 3.06 | 4.52 | 9.97 | | 7.03
3 | | 11.59 | 16.79 | 0.00 | 14.49
4 | 0.00 | 7.32 | 11.10 | | 9.51
5 | 0.00 | 9.94 | 14.19 | | 12.48
6 | | | 16.99 | 8.00 | 16.61
7 | | | 17.32 | 24.59 | 25.24
8 | | | | 12.35 | 12.35
9 | | | | 30.10 | 30.10
10 | | | | 45.86 | 45.86
11 | | | | 28.71 | 28.71
12 | | | 37.60 | 23.70 | 34.95
13 | | | 30.24 | 41.26 | 41.26
14 | | | | 29.30 | 29.30
15 | | | | 0.00 | 0.00
18 | | | | 28.50 | 28.50
20 | | | | 25.50 | 25.50

If we are correct we should also see a relatively linear increase in deviation each level. Considering only our green and blue cells that does seem to be the case.

The problem here is that it’s more difficult to be sure we have picked the right target number at the upper levels, where there’s less data. We’ll do what we have done previously - try to ensure levels 1-7 are quite close and that levels 10, 12 and 13 aren’t too far off. And we’ll hopefully end up in roughly the right neighbourhood for level 20.

Something else we need to be mindful of here is using the “by HP” division. We are examining the hitpoints of our data and using the category based on our arbitrary ideas of hitpoint division will likely introduce an artificial skew to our results. So we can look at those divisions, but where there’s a discrepancy between it and the “by XP” category we should go with the XP category.


HP Std Average by XP

Level | Easy | Average | Tough | Solo | Total
1 | 5.50 | 10.17 | 11.00 | | 7.22
2 | 13.05 | 15.90 | 20.36 | | 16.47
3 | | 24.21 | 26.65 | 27.00 | 25.52
4 | 22.00 | 30.90 | 35.59 | | 32.84
5 | 44.00 | 40.86 | 49.96 | | 44.40
6 | | | 58.61 | 60.00 | 58.69
7 | | | 60.89 | 85.00 | 78.62
8 | | | | 65.80 | 65.80
9 | | | | 126.60 | 126.60
10 | | | | 118.64 | 118.64
11 | | | | 124.67 | 124.67
12 | | | 146.71 | 129.33 | 141.50
13 | | | 175.00 | 141.43 | 151.50
14 | | | | 149.00 | 149.00
15 | | | | 142.00 | 142.00
18 | | | | 178.50 | 178.50
20 | | | | 224.50 | 224.50

Average Tables

The first thing that jumps out to me here is the obvious relationship between the different types of monster at any given level. If we determine the mathematical relationship between these then determining the hitpoint progression for all of them will be simplified – we can determine any other type based on a single type of the same level.

Consider how RPG designers typically build these tables – they decide on a starting value, a progression and how the adjacent types are derived. It follows that the easiest way to reconstruct a table is to use the same approach.

So we’ll determine an Average progression and build the entire table from there. That will let us ensure that our green and blue data points align closely and that our handful of creatures in the level 14+ region are in the right neighbourhood.

The other thing I notice is that a number of the creatures seem to be miscategorised, mainly because their XP value seems low for their type. Most importantly, the Tough creatures at levels 12 and 13 are probably Solo creatures. The “by HP” table supports this idea, and being able to use the average of these two rows should help us build a more accurate table.


HP Calc Average by XP

Level | Easy | Average | Tough | Solo | Total
1 | 5.58 | 10.50 | 11.00 | | 7.42
2 | 13.85 | 16.17 | 21.50 | | 17.15
3 | | 25.91 | 29.34 | 27.00 | 27.70
4 | 22.00 | 33.63 | 37.59 | | 35.21
5 | 44.00 | 45.02 | 52.39 | | 47.83
6 | | | 64.15 | 66.00 | 64.26
7 | | | 70.44 | 90.76 | 85.38
8 | | | | 90.00 | 90.00
9 | | | | 130.20 | 130.20
10 | | | | 132.86 | 132.86
11 | | | | 134.00 | 134.00
12 | | | 163.86 | 159.33 | 162.50
13 | | | 183.67 | 163.71 | 169.70
14 | | | | 177.25 | 177.25
15 | | | | 180.00 | 180.00
18 | | | | 214.50 | 214.50
20 | | | | 264.50 | 264.50

Both the “HP Std” and “HP Calc” tables are promising. In addition the “by HP” and “by XP” divisions both have some interesting variations within them.

Any of these four tables might provide the data that D&D Next monsters are based upon.

Of course, the best approach is to build all four tables! Then we can reflect on what it took to reproduce the tables and which we believe is the right one.

But before we do that let’s have a look at the graphs...



[Graphs: “HP Std by HP”, “HP Std by XP”, “HP Calc by HP”, “HP Calc by XP”]

The very first thing that I noticed with these graphs was… well, to be honest, it was the variability of most of the trendlines! But that led me to consider which trendlines show some kind of stability. And my, isn’t that Solo trendline stable! Yes, this is mainly because there are Solo monsters scattered across the levels. And yes, many of the data points are sparse and we shouldn’t have too much confidence in them. But all in all we should be able to rely on our Solo progression running somewhere near this trendline.

The Average trendline is also more stable than a first glance leads one to believe. Yes, it gets variable between our two categories (“by HP” and “by XP”), but again we did expect that.

Overall I feel confident that determining a good Average progression, extending it across the adjacent monster types, and ensuring the end result aligns with our green and blue cells is still the best plan.


Choosing Between HP Tables

As previously noted I went ahead and built out all four variations of the hitpoints table. There was a lot of boring toing and froing that doesn’t bear detailing here.

While almost any numerical table can be reproduced, given sufficient computing power, that really isn’t necessary. When we start having to use complex polynomials and sliding variables we are getting well outside the complexity of roleplaying game tables! This simplified matters somewhat and the four tables were built with a minimum of fuss.

So how do we decide which one to use? Well, as I said, things aren’t normally that complex. When I have to decide between a table that scales using long decimals and one that doesn’t, and the simpler one aligns better with the actual observed data, the decision is pretty easy.

And the upshot of that is that Wizards of the Coast don’t appear to be using a higher value for hitpoints and subtracting “notional hitpoints” from it to allow for resistances, healing and similar... Not in most cases, anyway. Rather they are assuming that creatures at a certain level will have a similar amount of damage mitigation in addition to their hitpoints. Yep, DR just became part of the core mechanics again.


Hitpoints Table


If we take 10 as our starting value for a level 1 Average creature and add 5.5 for each subsequent level we have a progression very close to what our sample data shows. Next we make an Easy worth 70% of an Average, a Tough 130% of Average and Solo worth 200% of Average.

This can be expressed as...

  •   Average_HP = 4.5 + (level x 5.5)
  •   Easy_HP = Average_HP x 0.7
  •   Tough_HP = Average_HP x 1.3
  •   Solo_HP = Average_HP x 2.0
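As a sanity check, the whole table can be generated in a few lines of code. This is just a sketch of the formulas above; the base progression and the 70%/130%/200% multipliers are my fit to the playtest data, not official numbers:

```python
# Sketch of the proposed hitpoint table. The 4.5 + (level x 5.5) base and the
# 70% / 130% / 200% multipliers are my fit to the data, not published values.

def average_hp(level: int) -> float:
    """Average-creature hitpoints: 10.0 at level 1, +5.5 per level."""
    return 4.5 + level * 5.5

def hp_row(level: int) -> dict:
    """One row of the proposed table, rounded to one decimal place."""
    avg = average_hp(level)
    return {
        "level": level,
        "easy": round(avg * 0.7, 1),
        "average": round(avg, 1),
        "tough": round(avg * 1.3, 1),
        "solo": round(avg * 2.0, 1),
    }

table = [hp_row(lvl) for lvl in range(1, 21)]
print(table[0])   # level 1: average 10.0, easy 7.0, tough 13.0, solo 20.0
```

The multipliers keep every type driven off a single Average progression, which is exactly the “build from one type” approach discussed earlier.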

This puts us very, very close to our observed data, providing maximum alignment with our blue and green data points...

We are out in a few places though… The amount of data we have to analyse is quite sparse and even our “green data points” are bound to deviate from the underlying prescribed values.

Adjustments seem to be pretty minimal. Regeneration does appear to be factored in (so subtract estimated regeneration per combat from total hitpoints), however resistance does not. Neither do most other forms of damage mitigation. I wouldn’t recommend varying hitpoints by more than +/-2% without being careful to compensate elsewhere.

I could complain here about the low volume of sample data and the uncertainty behind the samples that introduces, but the fact is that this table should work well for most DMs and is quite close to the current crop of monsters.

No, my big concern here is that this is an area where the math is probably wrong. The D&D Next community receives complaints almost daily about the weakness of monsters. And this weakness seems to become more pronounced at higher levels. Wizards of the Coast has generally acknowledged that this is the case and that monster math isn’t finalised yet. They quite rightly indicate that getting the core math correct is a necessary precursor to finalising monster math, since monster math is built upon the character system. Their current stop-gap is to simply adjust monster XP, with the occasional tweak to armor class or hitpoints.

What leads me to believe hitpoints are one of the main areas at fault? Well common sense and history provide some support. Consider the role hitpoints have played in basic monster math revisions in previous editions – most recently in 4th Edition. Hitpoints are one of the major design dials for monsters.

But more to the point the numbers support it. Let’s take a look at that now...



You might remember back in the Part 2: Class Development Profiling installment I indicated I’d validate monster hitpoints against PC damage, as the primary comparator, and PC hitpoints, as a secondary comparator. So I expect to see some kind of stable and/or predictable relationship with PC damage output. Since PC hitpoints increase faster than PC damage output I wouldn’t expect to see any constant relationship against them, though I’d hope to see a steadily trending difference.

So imagine my surprise when I paste my proposed hitpoint table into my validation sheet and find a perfect fit… Against PC hitpoint progression!

Yes, the current monster hitpoint progression is completely stable against PC hitpoints! An Average monster of a given level has half the hitpoints of a PC of the same level. How constant is that? Well, the average is exactly 0.500 of PC hitpoints with a variance of 0.0002 and a stdev of 0.0141. That’s very, very stable!

This means that Average creatures take progressively more hits to kill, eventually many more than their XP award covers. This isn’t very noticeable at lower levels, where Average monsters are concentrated, and DMs and players using Averages won’t notice an issue. But as players level up, Tough and Solo creatures come progressively into play, and these are quite weak! A Solo has four times the XP of an Average, but Solo creatures at higher levels only have double the hitpoints of an Average creature... an Average creature that doesn’t exist at those levels.

While the table included in this instalment matches the current crop of monsters I consider it faulty and in serious need of correction... Which we’ll tackle a couple of instalments from now.



Check back in a couple of days for the next installment Part 7: Damage Analysis...

Friday, 9 August 2013

D&D Next Monsters: Part 5: Attack Bonus Analysis...

While this blog does not contain material published by Wizards of the Coast it does contain materials summarized and extrapolated from the D&D Next playtest packets. By continuing to read this blog you are consenting to the terms of the Wizards online playtest agreement, which you can view at

In Part 4 Surf broke down Armor Class progression, showing how he approaches the analysis of these simpler linear progressions, and finished off by producing an AC Progression table. This time he moves on to its related neighbour – Attack Bonus...

So looking at Armor Class was interesting, what will we see with monster accuracy? Well, let’s get stuck into it – it should be shorter than last time since we already covered the hows and whys...


High-Level Data

As with Armor Class, the progression for Attack Bonus is linear and short. It’s also, predictably, less variable than Armor Class.


The standard deviation and variability at each level and data point match up with this: across all our green cells the stddev averages 1.05, with most green and blue data points below 1.00. So, even more so than with AC, it’s obvious that attack bonus is a linear progression.


Attack Mod – Average by XP category

Level   Easy   Average   Tough   Solo   Total
1       3.73   4.28      5.00    -      3.94
2       4.65   4.67      4.68    -      4.67
3       -      5.00      4.90    6.00   4.97
4       -      -         -       -      5.22
5       6.00   5.83      6.04    -      5.92
6       -      -         6.72    6.50   6.71
8       -      -         -       7.40   7.40
9       -      -         -       7.20   7.20
10      -      -         -       7.43   7.43
11      -      -         -       8.33   8.33
13      -      -         9.00    7.71   8.10
14      -      -         -       9.00   9.00
15      -      -         -       8.00   8.00
18      -      -         -       8.50   8.50
20      -      -         -       9.00   9.00

Getting Average

If you are new to this game you could be forgiven for assuming that we can simply draw a linear progression from 3.94 to 9.00. That would likely get us in the right ballpark, although the sparse data from level 8 onwards makes the second half a bit uncertain – the progression from level 1 to level 7 is on firmer ground. What we will need to do is ensure our progression aligns closely with the green data point values. The blue and pink ones… well, they aren’t quite as reliable and will only serve as a rough guide.


Graphical Consideration

[Graphs: “Attack bonus by HP”, “Attack bonus by XP”]

So what jumps out with the graphs is that almost all of the trendline bands align very closely. Yes the “thinness” of some of our data does give us a trendline or two that wanders, but we can ignore that, as long as we check that’s the reason the trendline wanders.

Both graphs show the trends are firmly anchored around 4 and head straight towards the neighbourhood of 10.

Looking at the data points instead of the lines, we get a sense that our target is probably on the higher side of the end range.

Let’s see what we can construct to match this.


Attack Table


If we take 3.9 at level one and 8.1 at level 13 and draw a simple linear progression between them we get a step of 0.35/level. Comparing this against our averages table we see that we are very close to the Total column for the green cells – in many cases within about 0.1 of the value. Extending this progression out to level 20 gives us 10.55 before rounding off the decimal portion.

We can tweak this quite easily from here to suit our purposes. Since my handy validation table is driven directly off my proposed table, it’s very easy to seek a step value that matches our observed data and results in minimal variability against our validation data.

In the end a starting value of 4.00 and a step size of 0.30 is pretty much spot on.
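For anyone following along outside Excel, that final progression is trivial to generate. Again, the starting value and step size are my fitted numbers, not anything published:

```python
# Proposed monster attack bonus: 4.00 at level 1, +0.30 per level.
# Both constants are my fit against the playtest data, not published values.

def attack_bonus(level: int) -> float:
    return 4.0 + (level - 1) * 0.3

# The table as presented rounds off the decimal portion.
attack_table = {lvl: round(attack_bonus(lvl)) for lvl in range(1, 21)}
print(attack_bonus(1), attack_table[20])  # 4.0 at level 1; rounds to 10 at level 20
```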

So what can we normally adjust, within the bands of whatever we choose as “normal”? Well as we noted the variance within these numbers is pretty low. I wouldn’t adjust Attack bonus more than +/-1 for most creatures. Say +/-2 for named opponents.

I have a couple of concerns about this progression. First, I’m not convinced Wizards of the Coast have properly anticipated the PCs’ ability to gain attack bonuses – for example, we can expect most parties to have some buffs running in every combat by, say, level 7. My other concern is, again, the lack of clear instruction about magic items – you really need to apply magic items to creatures before the fight if you are running a game with significant levels of magic. Simply rolling for loot after the fight and deciding then whether there’s a magic item puts your monsters at a disadvantage. Creatures should actually use any of their items that have a combat impact.



The main validation I used here was to subtract the attack bonus for each level from the average AC for PCs at the same level. This gave an average of 9.54 with a variance of 0.188 (stdev 0.434).

As a secondary validation I divided the attack bonus for the level by the average PC attack bonus for the same level. The result of this was an average of 0.89 with a variance of 0.002 (stdev 0.050).

Again, this gives me a high degree of confidence in the result.
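Both validation checks are simple enough to reproduce outside a spreadsheet. Here is a sketch in Python; the PC curve below is a placeholder I invented for illustration, not values from the packet:

```python
# Sketch of the validation step: measure how stable the gap (or ratio) between a
# proposed monster progression and a PC progression is across levels.
# The PC numbers below are placeholders, not actual playtest values.
from statistics import mean, pvariance, pstdev

def stability(monster, pc, ratio=False):
    """Return (average, variance, stdev) of PC-minus-monster differences,
    or of monster/PC ratios when ratio=True."""
    series = [m / p if ratio else p - m for m, p in zip(monster, pc)]
    return mean(series), pvariance(series), pstdev(series)

monster_attack = [4.0 + (lvl - 1) * 0.3 for lvl in range(1, 21)]
pc_ac = [13.0 + (lvl - 1) * 0.3 for lvl in range(1, 21)]  # made-up PC AC curve
avg, var, sd = stability(monster_attack, pc_ac)
print(f"avg={avg:.2f} var={var:.4f}")  # a flat gap: avg=9.00 var=0.0000
```

Because the validation references the proposed table directly, any change to the starting value or step shows up immediately in these three numbers.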



Check back in a couple of days for the next installment Part 6: Hitpoint Analysis...

Wednesday, 7 August 2013

D&D Next Monsters: Interlude: A New Packet...

As the D&D Next community reels from the 2nd August 2013 packet release Surf takes a moment to tell us what impact the new packet will have on his analysis...


Well if you haven't been living under a rock you are probably hearing the screams about the new packet release!

Many of the underpinnings of PC Classes have changed, most classes have seen significant modification, skills (as they were) have been completely removed and replaced. Feats saw significant modification.

And changes were made to some monsters.

Naturally many readers are wondering what this means to this series of articles... Strangely enough, it doesn't impact my analysis much! Not yet, at any rate.

I can almost hear the puzzled voices. Let me explain...

The thing is this packet doesn't bring the sweeping changes to monsters that it brought to PCs. As 00 Read First.pdf indicates some creatures have been updated with damage, AC or accuracy adjusted downwards. And some creatures have had their hitpoints and/or XP adjusted. The encounter table for XP hasn't changed and spot-checking of a number of key creatures shows that few are changed and those that have been altered have minimal adjustments made.

I wouldn't rule out a big overhaul of monsters. But my guess is that we are an update or two away from that.

But shouldn't the changes to classes have some impact on my analysis?

Sure. And after I've posted the consolidated tables in Part 8 I'll put subsequent parts on hold and do a full review. That means readers will have to wait a week or two for those subsequent articles. But hey, it can't be helped...



Check back tomorrow for the next installment Part 5: Attack Bonus Analysis...

Monday, 5 August 2013

D&D Next Monsters: Part 4: AC Analysis...

While this blog does not contain material published by Wizards of the Coast it does contain materials summarized and extrapolated from the D&D Next playtest packets. By continuing to read this blog you are consenting to the terms of the Wizards online playtest agreement, which you can view at

Having broken down class progression stats and dealt with the demon Data Entry Surf dives into getting usable numbers out of his monster analysis...Starting with Armor Class.


View From A Jumbo Jet

So now that we’ve entered and nicely categorised all that data let’s dive in and tackle some of the easier stats head-on. Bounded Accuracy suggests Armor Class (along with Attack bonus, which we’ll do next time) should be pretty stable, and we saw with PCs that AC scales quite slowly as they gain levels. It shouldn’t be too different for monsters. In fact, it’s quite unlikely that different ACs are in play for the different categories of creature within each level, even though that was certainly the case in some previous editions of the game.

While we think AC progression should be easy to work out, we should make sure before simply plowing ahead. It’s an easy task, so there’s no real excuse for not doing it.

To get a 10,000 meter idea all we need to do is grab the average and standard deviation for our data. This is laughably simple in Excel...
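It’s just as simple outside Excel. Here’s the Python equivalent of AVERAGE and STDEVP over some illustrative (made-up) AC values:

```python
# Excel's AVERAGE and STDEVP have direct equivalents in Python's statistics module.
from statistics import mean, pstdev

ac_samples = [12, 13, 11, 14, 12, 13, 12]  # illustrative ACs, not real packet data
print(f"average={mean(ac_samples):.2f}  stdevp={pstdev(ac_samples):.2f}")
```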


The low standard deviation and variation tell us that AC is pretty stable across our data, as expected.


StdDevp of AC by XP category

Level   Easy   Average   Tough   Solo   Total
1       1.99   2.61      0.00    -      2.23
2       2.18   1.83      3.50    -      2.57
3       -      3.33      2.73    0.00   3.03
4       0.00   2.30      2.73    -      2.51
5       0.00   2.36      2.35    -      2.34
6       -      -         2.48    0.00   2.41
8       -      -         -       2.56   2.56
9       -      -         -       2.14   2.14
10      -      -         -       3.10   3.10
11      -      -         -       0.00   0.00
12      -      -         1.18    0.82   1.11
13      -      -         1.25    1.85   2.09
14      -      -         -       1.48   1.48
15      -      -         -       0.00   0.00
18      -      -         -       0.50   0.50
20      -      -         -       0.00   0.00

Digging In The Details

One of the great things that spreadsheets do is let us create pivot tables – tabular summaries that aggregate our data at the intersections of chosen fields. With a few simple clicks we can create a pivot table. They are trivial to copy and modify and they are information rich. So when I am digging into this kind of data I use them quite a lot.

This pivot table shows the standard deviation of AC for each level and XP category. Notice that the standard deviation at most level/category points is lower than the global standard deviation? At the same time, the variability within a given data point is often higher than the overall variability for that whole level.

The upshot of this is that the data is fairly tightly grouped not just within a level/category data point, but across a level.

Note: There are, of course, some differences between the “by XP” and “by HP” pivots, but that’s more a matter of a slightly different distribution of data points due to the different categorisation criteria. Unless there are compelling reasons to show both I’ll generally just show one as an example.
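For anyone working outside Excel, a pivot like this is easy to mimic with the standard library. The records below are made-up examples in the shape of my spreadsheet rows, not actual packet data:

```python
# Mimic a pivot table (StdDevP of AC by level and category) with the stdlib.
from collections import defaultdict
from statistics import mean, pstdev

# Made-up records in the shape of my spreadsheet rows - not actual packet data.
monsters = [
    {"level": 1, "category": "Easy", "ac": 11},
    {"level": 1, "category": "Easy", "ac": 13},
    {"level": 1, "category": "Average", "ac": 12},
    {"level": 1, "category": "Average", "ac": 14},
    {"level": 2, "category": "Average", "ac": 13},
]

# Group AC values by the (level, category) intersection...
cells = defaultdict(list)
for m in monsters:
    cells[(m["level"], m["category"])].append(m["ac"])

# ...then aggregate each cell, just like a pivot table does.
pivot = {key: (mean(vals), pstdev(vals)) for key, vals in cells.items()}
for (level, cat), (avg, sd) in sorted(pivot.items()):
    print(f"level {level} {cat:8} avg={avg:.2f} stdevp={sd:.2f}")
```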


AC Average by XP category

Level   Easy    Average   Tough   Solo    Total
1       11.69   12.24     13.00   -       11.90
2       12.20   12.67     13.09   -       12.67
3       -       13.04     13.23   16.00   13.18
4       10.00   13.18     13.03   -       13.07
5       13.00   14.03     14.04   -       14.02
6       -       -         13.52   13.00   13.49
7       -       -         13.89   13.68   13.74
8       -       -         -       16.80   16.80
9       -       -         -       13.80   13.80
10      -       -         -       14.71   14.71
11      -       -         -       18.00   18.00
12      -       -         15.43   16.00   15.60
13      -       -         16.67   14.00   14.80
14      -       -         -       17.75   17.75
15      -       -         -       16.00   16.00
18      -       -         -       16.50   16.50
20      -       -         -       17.00   17.00

Dammit, Give Me Some AC!

Well, that’s all just fine and dandy, but what about the actual numbers? Let’s take a look at a pivot of average AC...

What’s compelling here is that the variability within a given level and category is often greater than the variability for the entire level itself. This supports the idea that AC progression is by level, not by category within level.

Notice that the Total column is already quite close to a nice linear progression? Even with the various gaps and low quantities of data at higher levels, the current AC progression for D&D Next is quite close to the surface.

In fact it looks as if plotting a simple 20 step linear progression from 11.90 through 17.00 would yield a usable AC progression table.

But let’s not get ahead of ourselves, there’s one more thing we should look at before trying to reproduce the data progression.


Graphs? Really?

[Graph: “AC by HP”]

Let’s graph the pivot tables and put some trendlines on each series. What I really want here is an X/Y graph, but Excel doesn’t let us do that when we graph a pivot, so we have to settle for a line graph. We really aren’t interested in where the lines between the data points wander. What we are interested in is the trendlines and the patterns they form. I’ve gone and reformatted these to look like XY graphs for the reader’s convenience in these articles – but I usually don’t bother.

Now our missing pieces of data do make some of this a little “fuzzy” and the “by HP” data is a bit all over the place. But even so it’s pretty easy to see the bands the trendlines form and the areas they highlight. If you ignore everything except the bands you’ll think to yourself “why it’s an easy linear progression from 10-ish to 20-something”.

So there are a lot of ways of looking at this data. It takes much longer to write about than to do. It probably takes much longer to read and understand than to do, if you haven’t done it before. But once you are used to it it’s pretty quick and easy to do.



AC Table

It’s trivial to build a progression table from 11.90 to 17.00 in Excel, rounding off the decimal position (a common trick in RPG design). An increase of 5 over 20 levels marries up nicely with the PCs’ average attack progression (from +5 through +10), so that’s a good starting point. With a little tinkering we find that the range 11.90 to 17.60 provides the closest alignment with the green and blue data points. That’s a neat progression of 0.30 per level.
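The same progression is easy to generate in code. The 11.90 start and 0.30 step are my fitted values, not numbers from WotC:

```python
# Proposed monster AC: 11.90 at level 1, +0.30 per level, decimal rounded off.
# The 11.90 start and 0.30 step are my fitted values, not published numbers.

def monster_ac(level: int) -> int:
    return round(11.9 + (level - 1) * 0.3)

ac_table = {lvl: monster_ac(lvl) for lvl in range(1, 21)}
print(ac_table[1], ac_table[20])  # 12 at level 1, 18 at level 20
```

A span of roughly 12 to 18 over twenty levels keeps the same +5-ish shape as the PC attack progression noted above.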

So do I have any concerns? Well, yes. I agree with most playtesters on the WotC forums that monsters and encounters are too easy, as things stand. And most of this comes back to the monster math. Where does AC factor into this? Well these numbers assume, like the matching Class stats, no power scaling outside the base progression. If your PCs have a stack of magic weapons and buffs, for example, that makes a difference and you need to adjust for it.

Without considering magic items and similar, how much can we adjust AC without skewing things too badly against the PCs? Well, the variance of AC for most data points is around 5, so +/-5 AC for the creature’s level is within normal bounds. A creature with an AC 5 higher than its fellows is quite special, though. Looking through random examples of these fringe cases, most are named monsters. So I’d recommend using +/-2 AC for most creatures and going beyond that only for special creatures.

So a level 10 creature with AC16 (instead of AC14) will be a little harder to hit, but for a named enemy you could go up as high as AC19 to put a little extra pressure on the PCs. But this is more about understanding your group and the variability in the system than the base numbers. So it’s probably sufficient to simply make some footnotes on our table.



So how do we know, with any certainty, that our table is correct? I used a couple of basic mathematical verifications.

First, I simply subtracted the average PC attack bonus for a given level from the AC for a creature of that level. The result was an average 7.08 for each level with a variance of 0.111 (stdev 0.334), which was randomly distributed throughout the progression.

Second, I divided the average PC AC for the level by the level’s recommended monster AC. The result was an average of 0.90 with a variance of 0.001.

So I have quite a high degree of faith in this table.

In reality this validation is built right into the process – it’s a separate table in my Excel spreadsheet that references the AC table I am building, right alongside the AC table. So visually I can immediately see the results of changing one cell or the whole series. Sometimes I hide this table, because I don’t want it to influence my tests and conjectures. But it does allow me to seek the closest match when I need to.



Check back in a couple of days for the next installment Interlude: A New Packet...