Sanctimonia: October and November Updates
As I mentioned previously, October was a bad month. Fortunately my insane roommate has left the building and peace has been restored to the universe, which means I can actually concentrate on getting work done on the game (cue November).
October’s improvements, though generally subtle, include user-controlled zoom modes (full zoom-out and dynamic), a new soil sublayer for a total of three, and a new vegetation sublayer for a total of four.
November’s improvements were some of the best to date in any given month, naturally occurring post-harpy. They include:
- Inter-player dialogue (multiplayer and NPC conversation)
- Persistent player skins
- Persistent player portraits as seen in my “Avatar” application
- Initial implementation of persistent player inventory (“gear”)
The specific details may be found here under months 9 and 10.
The game is nearing the point of attaining early implementations of the holy trinity of gameplay elements, those being exploration, interaction and communication. That moment is generally when even a primitive game becomes fun. I expect an open invitation to test in six months and an initial release in twelve.
Congrats :-). I’d love to see how far you pushed the NPC dialogue-tree handling. What does the format look like?
Same questions for the inventory too!
Count me in for the testing 🙂
Keep it up!
Jc
Just recorded a video of multiplayer dialogue:
http://youtu.be/14Arh7uLMig
@ FEARYOURSELF:
Thanks, and everyone on Aiera is invited to alpha and beta test when it gets to that point. Alpha will be for debugging mostly and beta will be more for gameplay and usability feedback.
Right now dialogue is simply send/receive based on the sender’s proximity to receivers, although the way it’s set up it could be done offline (AI) as easily as online (client app). Something to point out is that I’m making little to no differentiation between online and offline players, meaning NPCs generally are players and vice-versa.
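In rough Python (the game itself is written in a BASIC dialect), that proximity-gated send/receive might look like the sketch below. The hearing range, player names, and function names are all invented for illustration:

```python
import math

# Hypothetical sketch of proximity-gated chat delivery; these names are
# mine, not Sanctimonia's (the real code is in a BASIC dialect).
HEARING_RANGE = 30.0  # feet; assumed cutoff for spoken dialogue

def in_range(a, b, limit=HEARING_RANGE):
    """True if two (x, y) positions are within hearing range."""
    return math.dist(a, b) <= limit

def deliver(sender_pos, message, players):
    """Return the names of players close enough to receive the message.
    Works the same whether a receiver is an online client or an offline
    (AI-driven) player, mirroring the 'NPCs are players' design."""
    return [name for name, pos in players.items()
            if in_range(sender_pos, pos)]

players = {"Kevin": (0.0, 0.0), "Jc": (10.0, 5.0), "Stranger": (100.0, 80.0)}
print(deliver((0.0, 0.0), "Hail!", players))  # → ['Kevin', 'Jc']
```

Because delivery only cares about positions, an offline (AI) receiver and an online client are interchangeable here.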
Your player will have reasonable defaults configured to govern their offline behavior, but you’ll be able to customize it further. As well as defining your weekly/monthly/whatever routine, you’ll control your responses, both verbal and physical, to any number of conditions. It won’t be a tree, but actions based on combinations of conditions. For example, here are some off-the-cuff boolean conditions:
Player possesses sheathed weapon
Player draws weapon
Player sheathes weapon
Player picks up item marked as belonging to you
Player moves > x feet from position where they picked up your item
Player equips your item
Player drops your item more than one foot from where it was picked up
Player drops your item less than one foot from where it was picked up
Any of these conditions can be combined into “If X And/Or Y And/Or Z = True/False Then …” type responses to construct your offline AI. While it might seem like programming, the reasonable defaults will take care of most scenarios neatly, and players who want more control will have just that. AI templates will also let players perform common tasks like harvesting resources (hunting, gathering wood, feeding livestock), so setting up a job schedule will be pretty straightforward.
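A minimal sketch of those “combinations of conditions” in Python (the condition names, rule ordering, and responses here are my own inventions, not the game’s):

```python
# Each rule is a set of required boolean conditions plus a response,
# ordered most-specific first. This stands in for the
# "If X And Y Then ..." combinations described above.
RULES = [
    ({"picked_up_my_item", "moved_away"}, "Stop, thief!"),
    ({"drew_weapon"}, "Easy there, friend."),
    ({"picked_up_my_item"}, "A fine piece, do you like it?"),
]

def respond(active_conditions):
    """Return the response of the first rule whose conditions all hold."""
    for required, response in RULES:
        if required <= active_conditions:  # subset test: all required are active
            return response
    return None  # reasonable default: say nothing
```

So `respond({"picked_up_my_item", "moved_away"})` triggers the theft response, while `respond({"picked_up_my_item"})` alone only gets the friendly prompt.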
The inventory system is also an early implementation. It’s basically a gear/inventory class (currently a structure, which must change) embedded into an array of player classes that defines all the players. It looks something like:
Player[0].InHandRight.ObjectProperty
Player[0].InHandLeft.ObjectProperty
As an example, I plan on setting it up to where each finger can have a specific number of items, so you can have two rings on a thumb, three on other fingers, etc. I’ll differentiate between “in” and “on” for some body parts. The hands can have something in (sword) and on (glove), while the mouth can only have something in (small knife, bite of apple). If you have a scabbard but no belt, you have to hold the scabbard in your hand or stow it in a suitable container. When you obtain and equip a belt, you can equip a scabbard, then place your sword in it. A backpack will be “on” your back, but a leather pouch must be secured by a belt or else will be “in” your hand.
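Here’s a toy Python translation of that in/on slot idea and the Player[n].InHandRight.ObjectProperty layout (the real code is a BASIC structure; the slot names and capacities below are my own assumptions):

```python
from dataclasses import dataclass, field

@dataclass
class Slot:
    """A body location: things held 'in' it and things worn 'on' it."""
    in_items: list = field(default_factory=list)
    on_items: list = field(default_factory=list)
    in_cap: int = 1   # e.g. one sword *in* a hand
    on_cap: int = 1   # e.g. one glove *on* a hand

    def put(self, item, where="in"):
        """Try to add an item; refuse when the slot is at capacity."""
        items = self.in_items if where == "in" else self.on_items
        cap = self.in_cap if where == "in" else self.on_cap
        if len(items) >= cap:
            return False
        items.append(item)
        return True

@dataclass
class Player:
    hand_right: Slot = field(default_factory=Slot)
    hand_left: Slot = field(default_factory=Slot)
    # Two rings fit *on* a thumb; nothing fits *in* it.
    thumb_right: Slot = field(default_factory=lambda: Slot(in_cap=0, on_cap=2))
    # Something can be *in* the mouth, but nothing worn *on* it.
    mouth: Slot = field(default_factory=lambda: Slot(in_cap=1, on_cap=0))

players = [Player()]
players[0].hand_right.put("sword")        # in the hand → True
players[0].hand_right.put("glove", "on")  # on the same hand → True
players[0].thumb_right.put("ring", "on")  # first of two rings → True
```

A second sword in the same hand, or a third ring on the thumb, would simply return False instead of equipping.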
Getting all that to work on a gamepad will be a challenge, but ultimately rearrange, equip/stow and drop/pickup commands should suffice with a radial selection GUI.
If I don’t have that working in six months, someone please shoot a crossbow bolt into my forehead Ultima VI style. 🙂
Ok, I think I understand most of that. As you say, I don’t know what it will look like on a gamepad, but I’ll be curious to see that in action.
As for the scripting, I went with a Lua sub-engine to handle the NPCs. It seemed to make more sense, as it allows you to create a script file for each NPC separately and handle their code structure and their possibilities directly.
Jc
I remember you talking about using Lua for that. I was recommending you code your own stuff and you argued against it for convenience’s sake. I respect your choice; I’m not chastising or anything… Hell, I use SDL and OpenGL and am writing in a BASIC dialect, so I can’t say anything! I think Lua makes it more “open,” because more people know it and could modify it.
My first implementation of AI will probably be offline players saying random things randomly. You might walk up on one and have to wait for a minute and then they’d say, “No, I have no tomatoes. I have no tomatoes today.”
My second implementation would make them walk three feet to the side (perpendicular to the angle of movement, random side) if you were less than ten feet from them and closed the distance at a rate over 50% of your maximum running speed. I’d make them say, “Look out…” at the same time, so online loiterers could see who the asshole was. A subtle way of getting people to slow down, discouraging item snatching and run-by fruitings.
Basically what I’m saying is that everything needs to be dynamic because when a player isn’t playing he’s offline and becomes an NPC. That’s why instead of having the standard tree I have to use conditions. I’ll probably weight them and have a “score” that must be exceeded before a certain response is warranted. Like if a customer picks up two items and walks out the door of your shop, it will score higher than if he just had one.
Having the responses weighted might be cool too. If someone takes an item off your showcase you might automatically say, “A fine piece, do you like it?” If they walk more than one foot away from you, you might move around from behind the counter and tell them where you got it. The specific attention might intimidate them if they were thinking about stealing the item, even though it was a programmed response. Your character’s “threat to specific target” level might go up one for the player who lifted your item. If they put it down, the threat level would go down one and you’d resume conversation or they’d continue browsing, lesson learned about picking things up and walking around the shop. NPCs could also keep a memory of each player’s average negative and positive behaviors, or flag when the negative exceeds a certain threshold. Their overall personality could be governed by an average of all players’ positive and negative averages as experienced.
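That running per-target threat score could be sketched like this in Python; the weights, threshold, and event names are invented for illustration, not taken from the game:

```python
# Weighted observations feed a per-target 'threat' score; a response
# fires only once the score crosses a threshold.
WEIGHTS = {
    "picked_up_item": 1,
    "walked_away_with_item": 2,
    "put_item_back": -1,
}
THREAT_THRESHOLD = 3  # invented cutoff for an alarmed response

class Shopkeeper:
    def __init__(self):
        self.threat = {}  # target player name -> running score

    def observe(self, target, event):
        """Update the score for this target and return any response."""
        self.threat[target] = self.threat.get(target, 0) + WEIGHTS[event]
        if self.threat[target] >= THREAT_THRESHOLD:
            return "Stop, thief!"
        if event == "picked_up_item":
            return "A fine piece, do you like it?"
        return None  # keep quiet below the threshold
```

Picking up an item alone scores too low to alarm the shopkeeper and only earns the friendly prompt; walking out with it pushes the same target over the threshold.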
Subtle conditions and responses could hopefully persuade players to behave in different ways. Everything would appear to be alive, even if it were just people automatically stepping back against the bar when you ran through and past them.
What do you think about having a very basic set of multi-variable conditions (Frustration As Byte, Hunger As Byte, Proximity As Single, etc.), defining mixed boolean combinations of said conditions and assigning a response procedure to each combination?
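For concreteness, here’s roughly what I mean in Python rather than the BASIC I’m actually using; the variable names echo the “As Byte”/“As Single” declarations above, but every threshold and response is invented:

```python
from dataclasses import dataclass

@dataclass
class State:
    frustration: int  # 0-255, the 'As Byte' range
    hunger: int       # 0-255
    proximity: float  # feet, 'As Single'

# Boolean predicates over the multi-variable state (thresholds invented).
def hungry(s):  return s.hunger > 200
def annoyed(s): return s.frustration > 128
def crowded(s): return s.proximity < 3.0

# Mixed boolean combinations, each mapped to a response procedure,
# checked in order with the most specific combination first.
DISPATCH = [
    (lambda s: annoyed(s) and crowded(s), lambda s: "Give me some room."),
    (lambda s: hungry(s),                 lambda s: "No tomatoes today."),
]

def react(state):
    for predicate, procedure in DISPATCH:
        if predicate(state):
            return procedure(state)
    return None  # no combination matched; stay quiet
```

So a frustrated character with someone standing two feet away snaps about personal space, while a merely hungry one falls through to the tomato line.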