Posts Tagged ‘assumptions’

So things have been quiet here at the Toolbox; a part of that is because I’m doing a lot of prep work for a thorough investigation of Pathfinder feats, but an even bigger part is boring “personal life” stuff, like a big move for work that I’m in the middle of.  Anyways, hopefully I’ll have interesting things to talk about here soon, but my “hobby time” has been pretty scarce lately.

But that’s for another time.  Today is Swords & Wizardry Appreciation Day!

So, Swords & Wizardry is one of many game systems representative of the “Old School Renaissance” movement in tabletop RPGs. The idea is that RPGs these days aren’t like they were “back in the old days,” and that we’ve lost something in modern games that we had back then.  I generally agree with the notion, with the caveat that I don’t think modern games are bad, just different, and that there’s value in reviving this older style of play.  S&W itself claims to be a “restated” version of the “Original Game” written by Gygax and Arneson in 1974.

In a lot of ways, I feel like Swords & Wizardry matches my assumptions about characters and the world better than modern interpretations of Pathfinder and Dungeons & Dragons do – not because those games don’t match my expectations, but because they are more general systems that allow for a wider range of experiences, while Swords & Wizardry intentionally restricts itself to the grittier core of fantasy RPGing.

So, let’s look at some of what S&W does and how it does it, and I’ll throw my thoughts in as well.


So I’ve been scouting around the Internet for dice stats since LS posted his “race-weighted attributes” post, because work blocks me from Anydice.com but lets me wander around all sorts of message forums (and with two sick daughters at home, work is the most likely time I can do this sort of research). I found a link to an old (circa ’93) newsgroup post that lists probabilities and expected values for 3d6-drop-zero through 9d6-drop-six (they’re arranged by “drop lowest,” but you can reverse the tables to get “drop highest”). That’s useful information for a number-crunching nerd like me.
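If you’re similarly stuck without Anydice, brute force gets you the same tables. Here’s a minimal Python sketch (the function name and drop-lowest framing are mine) that enumerates every possible roll and averages the kept dice:

```python
from itertools import product

def avg_drop_lowest(n_dice, keep=3):
    """Average sum of the `keep` highest dice out of n_dice d6s, by enumeration."""
    total = 0
    for roll in product(range(1, 7), repeat=n_dice):
        total += sum(sorted(roll)[-keep:])   # keep only the highest dice
    return total / 6 ** n_dice

for n in (3, 4, 5):
    print(f"{n}d6 drop lowest {n - 3}: {avg_drop_lowest(n):.3f}")
```

(Enumeration is fine up to about 7d6; past that you’d want the closed-form tables from that newsgroup post, or a quick Monte Carlo instead.)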

But in a couple places in the thread I found a meme that seems all-too-common in certain parts of the hobby, and I wanted to address that.  Specifically, it’s the notion that 3d6-roll-in-order or other systems that approach it are better because you’ll get low scores, and low scores “provide much color to a good ROLE-playing experience.” I submit to the reader that this is crap.

I’m not saying that all characters should have 12+ in every stat to be “worth” playing. I’m not saying that playing a character with some (or many) low stats can’t be fun. I’m not saying that stretching your horizons and playing out of type isn’t a good thing. But I am saying that the notion that a statistically-average or mathematically-likely character, especially one that is wholly or substantially generated randomly, makes for a better roleplaying experience is disingenuous at best.

At its core, role-playing has nothing to do with statistics. Role-playing is about taking on a persona and acting through scenarios, making decisions as though you were your character. We make a game out of it and attach mechanics so that you can understand and predict the likely outcomes of your decisions in a consistent way, but those are structures we build up around the core of role-playing.

The statistics are simply a way of describing our persona in a common language so that players and GM all understand the character and how he interacts with the environment. To say that a mathematically-likely character is better than any less-mathematically-likely character is to first assert that one persona is better than another for role-playing, and then to further assert that it is better because of the randomness of its generation. Or, perhaps, it is better because it “forces” the player to “deal with” a flawed character. But why is that better, for role-playing? Can you not have just as satisfying an experience role-playing as Superman as you can role-playing as Jimmy Olsen?

Even if your character is stronger, faster, smarter, and better-looking than everyone else, there can still be interesting motivations, internal struggles, and decisions to be made, and that is what makes for good role-playing. Statistics say that my character is weak or clumsy or stupid, and that’s one class of flaws, but they don’t say if he’s an alcoholic, a misogynist, bound by his word, or an extreme pacifist.  That’s another class of flaws. You can have an interesting, flawed character whose stats are all 15+.

And here’s the crux of it: you can have an interesting time with a character who’s statistically perfect, but that wouldn’t be a terribly interesting character to me. I wouldn’t choose to play that character, much the same as I wouldn’t choose to play a character who was randomly handicapped. I might choose to play a character with low INT or WIS or DEX, but the love affair that gamers have with random generation has rarely made much sense to me. I have a couple of theories:

It’s a game, and since it’s a game the notion of “fairness” comes into play.  People want to know that they’re on even footing with their opponents, that no one is starting out with undue favor. But the problem here is two-fold – firstly, your fellow players are not your ‘opponents’ (nor is your DM, if you’re “doing it right”), and secondly, how is random generation “fair,” exactly? It’s like the card game “We Didn’t Playtest This At All,” where the rules note that star cards are simply better than other cards, and for game balance every player has an equal chance of drawing a star card. Rolling 3d6 is only “fair” in the sense that everyone has an even chance of rolling a superstar (or a dead weight).

I suspect that another factor is that “that’s the way it was done” in the old days, and that’s the way it continued to be done out of tradition (and probably the above notion of fairness), and so people who played back then (or have adopted that mentality) had to live with bad rolls.  And occasionally, having to live with sub-optimal results causes some people to rationalize and justify and find some reason to believe that sub-optimal is better, or at least not so bad. And from what I can tell, in old-school rules attributes meant a lot less than they do in more-modern games. In Swords & Wizardry (ostensibly based off the 1974 rules), most stats give either +1, +0, or -1, so the swing between a “good” score and a “bad” score is minor. In 3.X, though, the swing is from +6 to -6, which is +/- 30% (a swing of 60%) on a d20! That is significant. Modern stats try to cover a larger range of variation, from vegetative 3s and dull 6s to genius 14s and Ozymandian 18s. I suspect that all of old D&D’s 3-18 range covers just 7-14 in modern stats, because old D&D had a narrower focus.

My point is this: yeah, random-rolling characters makes things quick and ‘fair’ and can give you the ‘opportunity’ to play a character you might not have chosen for yourself. That’s fine and good, and if it’s what you like, have at it! But it isn’t going to fit everyone’s tastes, so please don’t act like it’s objectively better in any way. The core of role-playing doesn’t care about stats, except in that they’re how we describe our personas to the game. Hand-picking stats is just as valid, so long as everyone in the game agrees on what an acceptable character looks like.

So I mentioned before that the Crafting rules in Pathfinder are essentially useless as-written. I went through the whole of it in my previous post, but here’s the short version: success is practically guaranteed given time and materials (if you’re bad, you’ll waste a lot of money on ruined materials), and crafting time is inversely related to the DC of the item being crafted, so more difficult items take proportionally less time to craft than simpler items (of the same value).

That’s actually the only complaint I can really level squarely at the system, I think. I have a little bit of concern about impractical craft times (my calculations show a Master smith taking upwards of 9 months to make a suit of plate armor, but I have no notion of how realistic that is) and the fact that there are several different systems for crafting common items, traps, magic items… But each of those things is mechanically different in the game as well, so having different mechanics for crafting them isn’t absurd on its face. I think I’ll need to address both of these, but it’s more a matter of argument and investigation, whereas my primary complaint is simply math.

Of course, the fly in that ointment is that it takes less time to craft a more-difficult item of the same price, and in general more-difficult items tend to cost more, so the otherwise-wonky math just offsets the escalation, and it doesn’t take a hundred years to forge plate armor.

Here’s a comparable pair: a hook hand (DC 12, 100 sp) and a short sword (DC 15, 100 sp). Assuming a Master of moderate talent, we have a Take 10 score of 18 (10 base, +1 attribute, +1 skill, +3 class skill, +3 Skill Focus). His weekly crafting score (check times DC, in silver pieces) is 216 for the hook and 270 for the short sword; in each case it’s double-but-not-triple the target (100), so they each take a half-week to complete. Huh. That doesn’t tell us anything.

If instead of a Master we assume a craftsman of minimum capable skill, with a Take 10 of 12 and 15 respectively, the hook will score 144 for the week and the sword will score 225, so the hook is done in a week and the sword is done in a few days… but the sword was done by a better craftsman. But that same craftsman would score a 180 on the hook and take a full week!  Aha!

I’m… not sure what this proves. Maybe hooks are harder to make than swords? Maybe the abstraction is good enough, without getting into the minutiae of every item’s form and composition?

One more.  A Dwarven Longaxe (DC 18, 500 sp) and a Greatsword (DC 15, 500 sp). Our Master would get a weekly score of 324 for the longaxe and 270 for the greatsword, so it would take two weeks (648 and 540) to complete each. Again, that doesn’t really tell us anything, I think.
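Since I’ve been grinding these out by hand, here’s the same arithmetic as a small Python sketch, under my reading of the Craft rule: weekly progress is your check times the DC, in silver pieces, and making double or triple the item’s price in one week finishes the job in a half or a third of a week (the function name is mine):

```python
import math

def craft_weeks(check, dc, price_sp):
    """Weeks to finish an item: each week of work yields check * DC sp
    of progress; hitting 2x or 3x the price in one week means the job
    only took 1/2 or 1/3 of that week."""
    weekly = check * dc
    if weekly >= price_sp:                    # done within a single week
        return 1 / (weekly // price_sp)
    return math.ceil(price_sp / weekly)       # whole weeks otherwise

print(craft_weeks(18, 12, 100))   # Master, hook hand        -> 0.5
print(craft_weeks(18, 15, 100))   # Master, short sword      -> 0.5
print(craft_weeks(12, 12, 100))   # novice, hook hand        -> 1
print(craft_weeks(15, 15, 100))   # novice, short sword      -> 0.5
print(craft_weeks(15, 12, 100))   # sword-smith on a hook    -> 1 ("Aha!")
print(craft_weeks(18, 18, 500))   # Master, Dwarven longaxe  -> 2
print(craft_weeks(18, 15, 500))   # Master, greatsword       -> 2
```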

So, here’s the question that I’m left with, given a flawed system that seems to work out alright in practice: why are we doing this?  What are we trying to accomplish? LS at Papers & Pencils was actually looking to change the Crafting skill; he says as much deep in his first post on Crafting: “If characters are to be able to craft magic items using the crafting system (as is my goal)…”. As-written, D&D 3.X/Pathfinder Crafting isn’t intended to create magic items, as those are covered by a Feat and a separate system that (if I recall) requires no roll. By declaring that Craft should allow players to create magical items and then declaring that Crafting is broken because you can’t find a happy medium where Decent Characters and Focused Characters can co-exist, he’s kind of making his own problem. (Sorry for the slight, LS.)

Crafting in D&D is meant to model, to some rough level of “good enough,” mundane craftsmanship. A first-level character with moderate talent and training can master all but the most difficult of crafts – Alchemy has DCs in the 20 to 25 range, but most other items top off at DC 18; a Master craftsman with a few apprentices (or high-quality tools) can Take 10 on a DC 26. I propose that it is mostly a tool for gauging the efficiency of NPC craftsmen – it’s deep enough that it can be applied to PCs because NPCs and PCs exist in the same world and abide by the same mechanics.

I think this comes down to a difference in philosophy: why have a skill in your game system if it’s only really meaningful to NPCs? One of the things I like about D&D is that, for the most part, it is a complete system. That is, it can model the whole world. Others don’t like this, and there are game systems designed with minimal mechanics, or mechanics that only pertain to PCs, or that rely on GM fiat to cover anything the designers didn’t think was important. And although I think Craft (and other skills) are mainly intended for NPCs, that doesn’t mean they aren’t useful for PCs. It’s unlikely that a Player will have the time or opportunity to forge plate armor while on an adventure, but if the group uses downtime well (and I propose that all groups should use downtime, and use it often) he might have a few months to put a suit together. Will it be better than the magical gear he can find while adventuring? Probably not, unless the DM decides to fudge things the way LS intends to.  But is it a pointless endeavor? Again, no – it’s cheaper to forge your paladin a new suit of armor (if you have the time and talent) than to buy one, and crafting can be used (if you have the time and talent) to pad your coin purse a bit if you can find an interested buyer. It’s not directly related to dungeon crawling, but I propose that it doesn’t need to be, and it doesn’t even need to be directly related to PCs. The power of the D&D system is its completeness.

(As an end note: this isn’t where I expected to be when I started the post, but in investigating the actual application of the Craft rules I don’t think it’s as broken as I thought.  Wonky? Sure. Perfect? No way. But definitely meaningful and workable.)

LS over at Papers & Pencils has been doing some great stuff re-inspecting Pathfinder, much of which I’m still catching up on.  And seeing as last night was a “no sleep for daddy” night and this morning has been a “coffee weak as water” kind of morning, this probably isn’t the best time for me to try digging into such a topic.  But I go where the spirit moves me!

Both LS and I agree that the D&D/Pathfinder Crafting skills are pretty much useless as-written. We both think there should be a way to re-cast the crafting system so that it still works within the bounds of the Skill System (skill points, roll d20+bonuses against a DC to determine success or failure, etc.). But LS and I are working from different sets of assumptions; he wants to balance Crafting PC-to-PC (focusing on game balance and utility), and I’m interested in balancing PC-to-NPC (focusing on in-world modeling and meaning). I think LS and I have had words over this difference of opinion before, but it’s mostly a matter of taste and interpretation.

LS draws up a table comparing a moderately-invested PC (we’ll call him Min) versus a heavily-invested PC (he’ll be Max), level for level. Min has a +2 attribute bonus, has the skill as a Class Skill (+3), and takes a point in the skill every level (+Lvl); Max has a +5 attribute bonus at Level 1, adds to his attribute at every chance (+1 at 8 and 16), takes Skill Focus (+3 at Lvl 1, another +3 at Lvl 10), has the skill as a Class Skill (+3), and takes a point in the skill every level (+Lvl). Right off, the problem is clear: Min has a score of 5+Lvl, while Max has a score of 11+Lvl at Level 1, 12+Lvl at Level 8, 15+Lvl at Level 10, and 16+Lvl at Level 16. Max starts out at essentially double Min’s effectiveness and has several hops in his progression where Min increases linearly. LS concludes that crafting cannot be balanced; I conclude that we’re trying to balance the wrong thing.
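For the record, here are both progressions transcribed into a quick Python sketch (function names mine), which reproduces the 5+Lvl versus 11-to-16+Lvl numbers above:

```python
def min_bonus(lvl):
    # +2 attribute, +3 class skill, +1 rank per level
    return 2 + 3 + lvl

def max_bonus(lvl):
    attr = 5 + int(lvl >= 8) + int(lvl >= 16)   # attribute bumps at 8 and 16
    focus = 3 + 3 * int(lvl >= 10)              # Skill Focus improves at 10
    return attr + focus + 3 + lvl               # +3 class skill, +1 rank/level

for lvl in (1, 8, 10, 16, 20):
    print(f"Level {lvl:2}: Min +{min_bonus(lvl)}, Max +{max_bonus(lvl)}")
```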

Based on my assumptions, I think there are three characters to consider when determining how we should treat the skill: the Amateur NPC (Al), the Professional NPC (Paul), and the Master NPC (Matt). Like most people in the world, they are all Level 1 and do not advance. Al has an average attribute (+0) and no formal training (not a class skill), just what he’s able to pick up by doing (+1 skill point). Paul is talented (+1 attribute) and has been trained (+3 class skill) in addition to applying the skill (+1 skill point).  Matt is truly gifted (+2 attribute) and has been not only trained (+3) but focused on his craft (+3 Skill Focus) in addition to applying the skill (+1).  So we have three flat values that most of the world will conform to: +1 for Al, +4 for Paul, and +9 for Matt. With an assistant (+2 help) and taking their time (Take 10), they can respectively hit DC 13, DC 16, and DC 21. Reaching beyond their skill (i.e., rolling the die) gives them the chance to hit DC 23, DC 26, and DC 31, but risks ruining the whole effort.
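The same sort of sketch shows where those DC figures come from (the +2 assistant and Take 10 assumptions are as stated above; the dictionary layout is mine):

```python
npc_bonus = {"Al": 0 + 1,            # +0 attribute, +1 rank
             "Paul": 1 + 3 + 1,      # +1 attribute, class skill, +1 rank
             "Matt": 2 + 3 + 3 + 1}  # +2 attribute, class skill, Skill Focus, +1 rank

for name, bonus in npc_bonus.items():
    reliable = 10 + bonus + 2        # Take 10, with a +2 assistant
    ceiling = 20 + bonus + 2         # natural 20, same help
    print(f"{name}: Take 10 hits DC {reliable}, a lucky roll hits DC {ceiling}")
```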

Player characters will start out as an amateur, professional, or master – possibly with some variation, and potentially with much more raw talent (if the GM allows high ability scores). But unlike most of the rest of the world, PCs perform deeds that gain them Experience and raise their level, gradually becoming more than mundane. Higher-level NPCs may exist, but just like PCs they are suitably Heroic, Mythic, Legendary, or God-like as well.

Masterwork items should have a DC of 20, so that a talented Master can create them reliably. The entirety of mundane crafting should be achievable at DC 30 or less, noting that the high end of that range is beyond the normal ability of a Master. Beyond that (and I might even say beyond DC 25), we enter the realm of crafting things that are more than mundane.

LS tosses out this notion, concluding from his treatment of Min and Max that there’s no good way to make the skill useful for Min without it being broken by Max if item quality alone determines the DC. But this is because he’s comparing players to players in a competitive sense, whereas I’m comparing players to the world being modeled, with the understanding (or even expectation) that players will quickly outshine all others. (That’s part of the point, isn’t it?) I also think there’s a component of Skill bonuses that LS is neglecting – yes, a bonus determines the maximum range of feats you’re able to pull off, but it also determines the complications you can cope with and still be successful. Crafting an item without proper tools, in an unsuitable environment, or clandestinely (such as creating weapons in a jail cell without the guards catching on) might heap on a bunch of penalties, and it would take a suitably talented and skilled individual to pull it off.

As-written, the Crafting skill uses time, cost, and DC in an interconnected way that leads to non-intuitive results and/or absurd crafting times.  I’d like to address that, probably just by de-coupling the three of them.  But I’ll have to save that for another time.

The Gaping Wound, Part 4

Posted: 19 September 2012 in Game Structure
Death and Dying

We’ve established that the average person in D&D has 3 hit points. They can take 3 points of damage before they collapse from their wounds, and they can suffer a total of 13 damage (10 more than their HP) before their body gives up and they are dead.  Down to -9 HP it is possible for them to stabilize and recover naturally.

According to the SRD, a character who is dying (below 0 hp) has a 10% chance per turn to stabilize naturally; otherwise they lose another hp.  At this rate (and understanding a D&D Turn to be 10 minutes), every dying character is either dead or stabilized within an hour and a half: from -1 hp it takes at most nine failed rolls to reach -10, and roughly two in five bleed out before stabilizing.  Even those who stabilize naturally aren’t safe: unless they have someone to aid them, they lose an hp every hour until they regain consciousness (again, a 10% chance per hour), and even then they will not begin healing naturally for some time (10% chance per day, or lose another hp). If there’s someone around to help, the character stops losing hp and starts healing naturally as soon as they stabilize — pro tip: always travel with a group.
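Since the whole process is a chain of flat 10% rolls, it’s easy to check by simulation. Here’s a minimal Monte Carlo sketch of just the dying phase, under my 10-minute-turn reading, starting at -1 hp (the function name is mine):

```python
import random

def dying_outcome(hp=-1, dead_at=-10):
    """One dying character: each 10-minute turn there is a 10% chance
    to stabilize; otherwise the character loses 1 hp, dying at -10."""
    minutes = 0
    while hp > dead_at:
        minutes += 10
        if random.random() < 0.10:
            return minutes, "stabilized"
        hp -= 1
    return minutes, "dead"

results = [dying_outcome() for _ in range(100_000)]
deaths = sum(1 for _, fate in results if fate == "dead")
print(f"{deaths / len(results):.0%} die before stabilizing,"
      f" everything resolves within {max(m for m, _ in results)} minutes")
```

Running it, about 39% die outright (0.9 to the 9th power), and every case resolves within 90 minutes.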

Having established that hit points are real measures of actual injury, this becomes a model for the body’s ability to repair itself; it is literally a measure of how close your character is to death, whether he’s being pummeled or just bleeding out.

Natural Healing

Natural healing in D&D 3.X is 1 hp per day, per level — or 2 hp per day per level if you get complete bed rest.  Generally, any fight you can walk away from you can recover from in a day or two; this is partly because one hp is roughly 1/13th of a character’s vitality, and partly because an injury does not need to be completely healed to be mechanically irrelevant.  Bruises, scratches, and the like are too small for the coarse-grained HP system to track, and day-old wounds appear to fall into the same category.

I have to admit that I’m surprised to read that an Nth-level character heals N times faster than other people; given three characters stabilized at -9 hp, the 1st-level character is on his feet in a week (6 days at 2 hp/day), the 2nd-level character is on his feet in a few days (3 days at 4 hp/day), and the 3rd-level character is on his feet in only a couple days (2 days at 6 hp/day).  One way to explain it would be to say that a higher-level character recovers quicker even if he doesn’t actually heal quicker, but that undermines my intent of “1 hp means 1 hp, a wound is a wound,” and I’d probably scrap the idea in my own games — characters heal 1 hp per day regardless of level. (As an aside, actually healing quicker makes sense for a Level 5+ character, and below that level it might be easy enough to handwave the difference.)
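Those recovery times are easy to verify: a climb from -9 back to a commoner’s 3 hp is 12 hp of healing. A quick check (the helper function is mine, not anything from the book):

```python
def days_to_recover(level, missing_hp=12, bed_rest=True):
    rate = level * (2 if bed_rest else 1)   # hp healed per day
    return -(-missing_hp // rate)           # ceiling division

for lvl in (1, 2, 3):
    print(f"Level {lvl}: {days_to_recover(lvl)} days of bed rest")
```

This prints 6, 3, and 2 days, matching the figures above.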

Magical Healing

A major benefit of “hp is wounds,” in my opinion, is that it makes magical healing more reasonable.  This is more important than the quirks of natural healing, partly because it’s conceivable that a higher-level character actually does heal quicker, but more particularly because in most games natural healing doesn’t come up.  With access to Clerics and potions, most groups will take the time to refresh themselves to as close to full health as they can as often as they can.  Asserting that “hp is wounds” normalizes magical healing, so that it affects people the same way regardless of Class or Level.  A 1st-Level Wizard and a 5th-Level Fighter receive the same, objective benefit from a potion of Cure Light Wounds, rather than having the Wizard’s sucking chest wound close up while the Fighter’s cuts and bruises just sting slightly less.

This also lets us talk objectively about the Cure spells and what they mean.  Cure Light Wounds does 1d8+1 points of healing, about 5 on average.  “Light” in this case is something of a misnomer, as 5 hp is enough to leave most people dying.  Cure Moderate Wounds does 2d8+3, or 12 on average.  “Moderate” wounds are enough that most people would be on death’s door.  Cure Serious Wounds does 3d8+5, 18 on average; Serious wounds put a trained fighter into his grave.  Finally, Cure Critical Wounds does 4d8+7, or 25 on average.  That’s nearly two mortal wounds for a regular person.
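The averages are simple expected values; here’s a one-line check (the helper is mine):

```python
def avg_heal(n_dice, bonus):
    return n_dice * (8 + 1) / 2 + bonus     # the average of a d8 is 4.5

for name, n, b in [("Light", 1, 1), ("Moderate", 2, 3),
                   ("Serious", 3, 5), ("Critical", 4, 7)]:
    print(f"Cure {name} Wounds: {avg_heal(n, b):.1f} hp on average")
```

This gives 5.5, 12.0, 18.5, and 25.0 hp respectively.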

Closing

I think that wraps up the notion of hit points as actual wounds.  By the book there are three factors to consider: how long you can fight (positive hit points), how long you can survive (negative hit points and the death threshold), and how quickly you can recover (whether 1 hp/day or 1 hp/day/level).  Regular people can sustain 12 to 14 damage, with trained soldiers weathering as much as 17.  Higher-level characters can fight longer and survive more serious wounds, and may recover from their injuries quicker.  A few points of damage are enough to put someone out of the fight (it’s a coarse-grained system), and 10-13 points of damage can be considered a mortal wound.  Scratches, superficial cuts and bruises, scars, and the like are too small to be tracked by hp, and the actual details of any given wound/attack are abstracted into the damage roll.  If you know what the wound is (i.e., slitting someone’s throat), the hp system probably isn’t appropriate.

Part 3

Baseline

When my last post ended, we had established that there was a baseline in D&D that 14 to 20 points of damage is enough to kill a man, with 4 to 8 generally being enough to ‘drop’ him and cause him to start dying.  This is based off of die type and Constitution score and (importantly) assumes a Level 1 character.  That most people are Level 1 is one of my guiding principles, and I believe it will serve us well here.

Let’s take our notional baseline and put a finer point on it: the statistically average Level 1 Commoner (human, for what it’s worth).  His hit die type is a d6 and he has a 10 CON, so his (statistically average) hit points are 3 (rounding down) — he will begin dying after just a few points of damage and will be dead after a maximum of 13 damage.  A Warrior will, on average, have 5 hp and die after a total of 15 damage, making him a bit more resilient but still in the same ballpark.  PC classes are comparable.  Extra points in CON effectively add 1.5 points to the total damage a character can take before death, so a tough Warrior might be able to survive up to 18 points of damage, but he’s still down after 6.

So far we can make sense of this.  Hit points represent the body’s ability to sustain damage.  After so much punishment, you will begin dying and, eventually, your body will be beyond the point where it can recover; you are dead.  If you’ve been hurt and survived, rest and medical attention can, over time, return you to health.  Hit Points only measure proximity to death; they do not track scars, broken bones, pulled tendons, torn muscles, etc., except insofar as those things bring a character closer to death.  Hit Points on their own cannot tell you if you lost a limb, or an eye, or threw out your back.  Hit Points (on their own) can’t track bruises, fatigue, hunger, or exposure to the elements.  They just tell you how close you are to dying, in a coarse-grained kind of way.  But for that, they do a pretty good job: some people are tougher than others, but everyone is effectively within a few points of everyone else (with the exception of extreme Constitution), and everyone heals at the same rate.

The real problem comes from scaling hit points with level and, perhaps to a greater extent, random hit points.

Scaling Hit Points

The way D&D does hit points is that you get X hit dice of type Y, where X is the level of your character.  So a Level 1 Commoner (on average) has 3hp, but once he hits level 2 he jumps up to 7 hp!  It’s worth noting here, though, that this isn’t really twice the vitality; he has 7hp, but he’s still dead at -10, so instead of dying after 13 damage he’s dead after 17.  It’s not a huge leap in those terms, but it does mean that he can take a lot more punishment before he ‘drops.’  What’s more, the average Level 2 Commoner can take more punishment than the average Level 1 Warrior, both before he drops and before he’s dead.  That is to say Level matters, which I think is appropriate.  The difference between Level 1 and Level 2 in many respects is more important than the difference between Warrior and Commoner; the Level 2 character is better than the Level 1 character fundamentally (though a Level 2 Commoner who says that to a Level 1 Warrior is unlikely to ever see Level 3).

Does this mean that the Level 2 character has more meat to them?  That their bones are stronger, that they’re more resistant to decapitation?  The answer is no: hit points don’t track those sorts of things, and if they’re important hit points are the wrong tool to use.  All it means is that the Level 2 character can keep fighting despite more severe punishment and that he can recover from graver wounds.  After 13 damage the Level 1 Commoner’s body can’t keep up and shuffles off this mortal coil; the Level 2 Commoner has taken the same punishment but is still holding on, and may yet recover.  The Level 2 character is more resilient.

At Level 3 the Commoner would have 10 hp and survive up to 20 damage before dying, and it starts to become clear that such a character can keep fighting despite having taken wounds that would drop a lesser man.  In fact, when the Level 3 Commoner has taken enough damage to drop, the Level 1 Commoner is on death’s door and fading fast.  The Level 3 character is truly heroic, though still within ‘normal’ bounds.  By the time he reaches Level 5, though, he has 17 hit points and can sustain 27 points of damage before his last gasp; he fights on after receiving a wound that would kill other men outright.  He is on the verge of the superhuman.
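Here’s the whole progression so far in one quick sketch (average hp rounds down; death comes 10 below zero; the function name is mine):

```python
def avg_commoner_hp(level, die=6):
    return int(level * (die + 1) / 2)    # the average d6 is 3.5, round down

for lvl in (1, 2, 3, 5):
    hp = avg_commoner_hp(lvl)
    print(f"Level {lvl}: {hp} hp, drops at 0, dead after {hp + 10} damage")
```

This reproduces the 3/13, 7/17, 10/20, and 17/27 figures from the last few paragraphs.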

Random Hit Points

But what if he’s not? This assumes that a character could increase in level without significantly increasing his resilience, but I don’t think that’s much of an assumption at all.  First, it’s easy to imagine a Wizard who becomes a better Wizard without becoming noticeably tougher.  Second, it’s already coded into the way we do hit points: statistically unlikely though it may be, that Level 5 Commoner could have only 7 hp (if he rolled ones for every Level after the first). And I think this flaw may be the worst of the lot, because it damages the very purpose of hit points: it divorces them from the character they’re meant to represent.

I think I get why we do it.  Dice are a thing that gamers love, they’re fair, and they help us determine otherwise uncertain things.  But my contention is that hit points are, in one sense, not uncertain.  The character either is or is not getting more resilient, and either by a lot or a little.  In a way, it’s as important as whether he’s a Wizard or a Warrior, a Gnome or a Half-Orc, Lawful or Chaotic.  It talks about his ability to act beyond his old limits; it is deliberate.  Determining this randomly causes problems because now anyone can suddenly be twice as resilient without a firm connection to the fiction; it’s random.

There’s not really a good ‘fix’ for this, and in many cases I’m not sure a fix is desired, but I think it’s important to recognize. If you don’t acknowledge that your character’s hit point increase is tied to the fiction then the mechanic is going to become divorced from the character’s reality.

Part 2
Part 4

It’s been about a month since I first opened the topic of hit points in D&D.  Although I still haven’t had the time to get into the meat of it, I did want to look at a little bit of the history of hit points.  That being said, I didn’t enter the hobby until the late ’90s, so none of my history lessons come first-hand.

In Chainmail, as near as I can tell, there was no notion of hit points; a unit was hit or not, and once hit, the unit was dead.  There was apparently a set of rules made to model Civil War-era ironclads (as noted by Roles, Rules, and Rolls), where the structure of a ship could take so much damage before being sunk.  According to the interview RR&R references, those rules were incorporated into D&D in order to ease the harshness of the sudden, random death that Chainmail would otherwise impose. This let D&D players act and feel like heroes.

Anecdotal evidence (the first comment, not the linked post) suggests that hit points for humans were originally set at 7, but player complaints led to an increase.  It seems, though, that OD&D had starting hit points ranging from 1 to 7 (d6+CON), and 0 hit points was dead.  Then in AD&D, various hit dice were introduced based on class, giving a range of 1 to 9 (depending on class and CON) for first-level characters.  Again, evidence suggests these numbers were historically meant to represent regular people; the average Joe.

So originally (for some value of “originally”) something around 4 points of damage was enough to kill a man (on average), but a burly fighter might be able to withstand twice that.  This is a rather coarse-grained system, in that it can really only measure one quarter of a man’s vitality — anything less is too small to be measured.  This was alleviated a little bit with the addition of “below zero” rules, where a character was incapacitated (and dying) at 0 hp but wasn’t dead until -10 — it effectively takes 14 points of damage to kill the average man and 19 to kill a burly fighter, which is a much closer ratio than 4:9.  This addition makes the system a bit more granular and levels the playing field a bit (fighters are no longer taking two mortal wounds before they die). With 15 to 20 points of granularity (though less than half of those count as “action-ready”), there’s a lot more room to address ‘lesser’ wounds, but we’re still talking about the sorts of things that are going to leave a mark and require a bit of time to recover from.  A busted lip is probably not hit point damage.

Hopefully it won’t take me another month to address ‘modern’ notions of hit points (I don’t think current systems stray too far from the historic baseline, at least not at first) and the contention that hit points are incoherent, that they are explained as “luck, divine favor, etc” but treated as actual physical wounds.

Part 1
Part 3