Procedural chatting: Crewman Greer’s lack of diplomacy concerning Ensign Crawford’s personal hygiene is poorly received.
These past weeks I’ve again spent finding stuff that’s broken and fixing it as I go. For example, I found a little bug that made faces look bad; now they should appear at least 900% better looking (according to the Daniel scale of face appreciation). I also got some more time to work on the sections. As you might imagine, a certain amount of work has to be done before we can release the game, and on my end of things, the sections are one of the larger remaining parts.
These past few weeks have been dedicated to more usability work. I focused on improving highlighting of characters and sections with a nice bright, contextually-colored outline. This allows us to draw attention to important elements without blotting them out entirely with opaque colors.
I also implemented functionality for quickly focusing on a selected element, zipping the camera towards it and framing it appropriately for the type of item.
Finally, once I was able to focus in this way, it became obvious that the info panel placement needed work. The goal was to place the info panel near the selected item without overlapping it, while keeping the panel onscreen. This proved to be an unholy challenge, but we pulled off a passable implementation.
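The core of that placement problem can be sketched in a few lines: prefer one side of the selected item, flip to the other side if the panel would run off the edge, then clamp the result into the screen. This is a minimal illustration of the approach, not the shipped code; all the names and numbers are ours.

```python
def place_panel(item_rect, panel_size, screen_size, margin=8):
    """Place an info panel beside a selected item without overlapping it,
    clamped so it stays fully onscreen. Rects are (x, y, w, h); y grows down.
    A hypothetical sketch of the technique described above."""
    ix, iy, iw, ih = item_rect
    pw, ph = panel_size
    sw, sh = screen_size

    # Prefer the right side of the item; fall back to the left side
    # if the panel would run off the right edge of the screen.
    x = ix + iw + margin
    if x + pw > sw:
        x = ix - margin - pw

    # Align vertically with the item, then clamp into the screen bounds.
    y = iy
    x = max(0, min(x, sw - pw))
    y = max(0, min(y, sh - ph))
    return (x, y)
```

The tricky part in practice is that the flip and the clamp fight each other near corners, which is where "unholy challenge" comes in.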
Hey guys, this last week I’ve been iterating on the final versions of the traits and their overall structure. I want to get them just right before moving on! In the words of Mark Twain: “The difference between the right word and the almost right word is the difference between lightning and a lightning bug.”
You can see above a first glimpse of our procedural chat generation. Dave has worked hard to create these “watercooler chats” that the characters will have as they interact. The results are driven by each character’s unique personality traits and are generated on the fly using our legacy trait system’s data. These aren’t canned phrases. They will make you cheer, chuckle or groan over and over, without tiring of the same lines of text continuously reappearing in different conversations. The conversations ultimately impact how the characters get along in the future.
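One way to picture trait-driven chat selection: tag each line of dialogue with the trait that favors it, and weight the pick by the speaker’s trait scores. The trait names and lines below are invented for illustration; the real system is far richer than this toy.

```python
import random

# Hypothetical dialogue pool: each line is tagged with the trait that
# favors it. These traits and lines are illustrative, not the game's data.
CHAT_LINES = {
    "blunt":      ["You smell like a maintenance shaft, Ensign."],
    "diplomatic": ["Perhaps a visit to the sonic shower, when convenient?"],
    "cheerful":   ["Great day for science, isn't it!"],
}

def pick_line(traits, rng=random):
    """Pick a chat line, weighting each candidate by the speaker's
    score for the trait it is tagged with (missing traits weigh zero)."""
    pool, weights = [], []
    for trait, lines in CHAT_LINES.items():
        score = traits.get(trait, 0)
        for line in lines:
            pool.append(line)
            weights.append(score)
    return rng.choices(pool, weights=weights, k=1)[0]
```

A speaker with a high "blunt" score and nothing else will reliably produce the undiplomatic option, which is roughly how Crewman Greer ends up in trouble above.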
Dave is in the process of wrapping up an enhanced version of the trait system, which should lead to an improved chat experience.
This week I’ve mainly been doing two things: firstly, hunting bugs and fixing the ones I’m able to; secondly, getting back to creating sections for the game.
Here you can see the Science Station, where the crew can commit science to their heart’s content.
This week I started my second pass on organizing and designing “story rooms”, the atomic units where the narrative takes place, in preparation for further story development. The goal is finding the right way to trigger events within the story (using the procedural generation), which is necessary at a fundamental level both for character injury and rewards, and for all story plot development.
This week, I spent some more time knocking out bugs that cropped up. Now that the game is much more playable, team-members have had a chance to run it for longer periods. As such, my bug list has been getting longer.
Also, the major changes to how module parts are loaded, and to what data the new AI system uses for pathing, created a bunch of work to get module rotation on placement, and validation of placement locations, working again.
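Rotation-on-placement plus placement validation boils down to two small operations on a module’s grid footprint: rotate the footprint in quarter turns, then check every cell against the grid bounds and already-occupied cells. This is a hypothetical sketch of that logic, not the engine code.

```python
def rotate_footprint(cells, quarter_turns):
    """Rotate a module footprint (list of (x, y) grid cells) by
    quarter turns around its origin cell."""
    for _ in range(quarter_turns % 4):
        cells = [(y, -x) for x, y in cells]
    return cells

def is_valid_placement(cells, origin, occupied, grid_w, grid_h):
    """A placement is valid when every footprint cell lands inside the
    grid and none collides with an already occupied cell."""
    ox, oy = origin
    for cx, cy in cells:
        x, y = ox + cx, oy + cy
        if not (0 <= x < grid_w and 0 <= y < grid_h):
            return False
        if (x, y) in occupied:
            return False
    return True
```

The subtle work is keeping the rotated footprint, the pathing data, and the visual mesh all agreeing on the same cells, which is where a loading-format change ripples outward.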
Finally, our move to a deferred rendering pipeline broke some of our graphical effects. I finally had a chance to get familiar with the related tech and wrote a solution to smooth out those issues.
This week has been a bit lower intensity than the previous one for me; going at too high a speed for too long is not good for humans.
Anyway, what I’ve been working on is more GUI stuff. Trying to figure out a good way to combine readability and style in order to get more contextual information to the player.
Like many things I’ve talked about in the past, it again ends up being a question of balance. You want as much of both as possible, but usually you need to choose. Most of the time the choice is a gradient between the two options, so it ends up being a quest to figure out what serves the game best on the scale from pretty to useful.
This week I worked on categorization for the new words going into the word bank, in order to match the recently completed trait re-design. This is part of the larger task of bringing the first-pass, demo-quality data in the narrative system up to its final form.
I spent the better part of this week catching up on admin that was set aside while we wrapped up this last milestone. However, since I couldn’t help myself, I spent some time addressing bugs that came up with the slew of new features we recently added.
These last couple of weeks we’ve been really busy working up to a major internal milestone. For my part, much of it has been behind-the-scenes work, like creating path graphs so the crew know where to go and what to do, working on GUI layouts, and other functional parts.
In addition to that, I got the chance to polish up the visuals behind both the Species Creation character visuals (the guys you see during species creation at the start of a game) and the Photobooth visuals (the little “photograph” images on ID-cards and character sheets). All of this makes the crew of your astrobase way more relatable.
I also got around to finishing the effects package I talked about previously: the teleporter effects. In any classic sci-fi series, the teleporter was always a fun place for special effects that reflected both the setting and the era the show was made in. Since I draw inspiration from several different classics, I combined some of my favourite elements into one whole. A gold star to the first one to figure out the three main influences (which may or may not be very apparent) for this effect.
The Teleporter in action, sans a brave volunteer to be taken apart piece by piece, atom by atom, and risk arriving at the destination as a puddle of fleshy goop.
Lots done this week! I finally settled on the distribution that gives us a great balance scheme for the traits, and went through and renamed them all based on their new configuration. Lots of cool synergies are already emerging. The next step is to take another pass on the word bank to match the adjustments.
As Daniel mentioned, we’ve been pushing for a major milestone with the game. While our last milestone revolved primarily around making the station look good and feel good to build, this one was more about making it a living, breathing place and allowing the player more interaction with it. As such, my main tasks revolved around three things: AI, contextual UI and mission flow.
As I’ve mentioned in previous weeks, I spent a bunch of time architecting and implementing an AI system that would meet our specific needs. With that in place, I developed the tools to allow us to design behaviors within that system. So, crew now have basic actions to meet all their needs: they can eat at the cantina when there’s room, sleep in their quarters if they have any, work their assigned jobs and socialize with their fellow crew members.
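The need-driven behavior described above can be sketched as a simple "most urgent need wins" rule. The four need names mirror the ones mentioned, but the scale, the rule, and the action strings are illustrative only; the real behavior system designed with those tools is more involved.

```python
def pick_action(needs):
    """Pick a crew member's next action by choosing the most depleted
    need. Needs are 0..100, where higher means more satisfied.
    A hypothetical sketch of need-driven action selection."""
    name, _level = min(needs.items(), key=lambda kv: kv[1])
    actions = {
        "Eat": "go to cantina",
        "Sleep": "go to quarters",
        "Work": "go to assigned job",
        "Socialize": "chat with crewmate",
    }
    return actions[name]
```

A real version would also gate actions on availability (room in the cantina, an assigned bed), which is exactly the "when there’s room" and "if they have any" caveats above.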
While that was a huge leap in itself, it wasn’t complete without contextual UI. There’s only so much you can glean from a character’s decision making when you can only observe their comings and goings up close. With the crew info panel, we can click on a crew member and get information about their current action (“Going to eat at bla”, “Going to Teleporter”), their current job assignment and the state of their needs (currently labelled as Eat, Sleep, Work and Socialize).
Once we had this in place in a way that felt good, we replicated the same contextual UI for sections. We can select a given section and get relevant information, like storage capacity and contents, available beds, as well as jobs assigned, who occupies those positions and which skills are used in that job.
Finally, once we had all of the necessary tools to understand what is going on, I spent some time making more of the active gameplay available. So, armed with crew AI and UI, I worked on assigning a crew member to a mission, watching them walk to the teleporter and teleport out, and seeing the mission being executed, receiving one mission report at a time. Also, we used Daniel’s punch cards as a notification system, dumping one into the inbox every time a new report comes in. Running that card through the datapad, you can now read the report (and the previous ones from the same mission) at your leisure.
As you can see, it’s been a busy week, and we’re inching ever closer to the point where we think we can get valuable feedback from playtesters. Stay tuned for that!
This week I’ve had the pleasure of going through a bunch of fun paperwork; I guess it was my turn. I did also finish the punch card, which sounds like something that shouldn’t be too much work. But if you think that, you don’t know what type of punch card we’d make for ABC.
Since we have an affinity for the procedural, our punch card is of course procedural too. Not only is the text added dynamically through GUI elements, but the actual holes are punched through the paper using a procedural texture. This not only lets us make every punch card unique; since the punched holes actually are bits, we can represent data from the mission reports in the holes themselves.
Here you can see the back and front of the punch card. Don’t worry about trying to figure out what the data means, it’s just set to some random values.
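Since the holes really are bits, the encoding side can be sketched as a round trip: each mission-report value becomes one row of holes, one bit per hole, and reading the grid back recovers the values. The row-per-byte layout here is our own illustrative choice, not necessarily the card’s actual layout.

```python
def punch_pattern(values, bits=8):
    """Encode mission-report values as rows of punched holes,
    most significant bit first: True = hole. An illustrative
    sketch of data-as-holes, not the game's actual card layout."""
    rows = []
    for v in values:
        rows.append([(v >> (bits - 1 - i)) & 1 == 1 for i in range(bits)])
    return rows

def read_pattern(rows):
    """Decode a hole grid back into the original values."""
    values = []
    for row in rows:
        v = 0
        for hole in row:
            v = (v << 1) | int(hole)
        values.append(v)
    return values
```

On the rendering side, a grid like this can drive a procedural texture directly: a hole cell punches through the paper material, a non-hole cell leaves it intact.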
This week I added an error-checking section to the trait-distribution spreadsheet, to make sure I did the maths right and that all traits are perfectly balanced (along a number of different balance lines). There’s a very minor fluctuation where some 3s and 6s are supposed to be 4s and 5s, but once I finish building out the matrix (a few more hours) it should be easy enough to find and correct.
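The kind of check the spreadsheet does can be illustrated as: every trait’s point values should sum to the same shared total, and anything that doesn’t gets flagged. The trait names, point values, and single-total rule here are all invented for the example; the real sheet checks several balance lines.

```python
def check_balance(trait_rows, expected_total):
    """Return the traits whose point values don't sum to the shared
    total, mapped to their actual sums. An illustrative sketch of a
    spreadsheet-style balance check."""
    return {name: sum(vals)
            for name, vals in trait_rows.items()
            if sum(vals) != expected_total}
```

An empty result means the distribution is balanced; a non-empty one points straight at the rows where the stray 3s and 6s are hiding.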
This week, I started the transplantation of the crew members’ brand-new AI brains. Integration went pretty smoothly! I’ve since started working on locomotion, the system that manages how an AI actually walks around (moment-to-moment changes to position and rotation) given a pre-calculated path to follow.
This has exposed some issues with how I generate path networks, which has given me an excuse to go back and rewrite not only the parts I need to fix, but also to be proactive about a bit of feedback from Daniel.
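The locomotion step described above, moving an agent along a pre-calculated path one tick at a time, can be sketched like this. It assumes constant speed and skips rotation smoothing; everything here is illustrative rather than the game’s actual implementation.

```python
import math

def step_along_path(position, path, speed, dt):
    """Advance an agent one tick along a path of 2D waypoints.
    Returns (new_position, remaining_path). A minimal locomotion
    sketch: constant speed, no rotation or acceleration."""
    budget = speed * dt  # distance we may cover this tick
    x, y = position
    while path and budget > 0:
        tx, ty = path[0]
        dist = math.hypot(tx - x, ty - y)
        if dist <= budget:
            # Reached this waypoint; spend the distance and continue.
            x, y = tx, ty
            budget -= dist
            path = path[1:]
        else:
            # Move partway toward the waypoint and stop for this tick.
            x += (tx - x) / dist * budget
            y += (ty - y) / dist * budget
            budget = 0
    return (x, y), path
```

Even a loop this simple leans hard on the path network being correct, which is exactly how locomotion work ends up exposing path-generation bugs.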