Last week Adam surprised me with the functionality to combine skinned meshes, which lets me build characters from pieces in-game instead of having one mesh per permutation or some other dirty, dirty solution to modular character visuals. So I began this week by weighting the skinning for all the character meshes we have in the game so far (when preparing a mesh to be animated, you bind its vertices to its bones; the default initial skinning is pretty good nowadays, but in some areas it is lacking). This means that characters no longer deform in an overly broken manner.
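To give a rough idea of what those skin weights do (this is a generic illustration, not our actual engine code): in linear blend skinning, each vertex's final position is the weighted blend of where each influencing bone would put it, so badly weighted vertices get dragged the wrong way. A minimal sketch:

```python
import numpy as np

def skin_vertex(rest_pos, bone_transforms, weights):
    """Linear blend skinning: blend the positions produced by each
    influencing bone, weighted by the painted skin weights."""
    assert abs(sum(weights) - 1.0) < 1e-6, "weights must sum to 1"
    v = np.append(rest_pos, 1.0)          # homogeneous coordinates
    pos = np.zeros(3)
    for m, w in zip(bone_transforms, weights):
        pos += w * (m @ v)[:3]            # each bone moves the vertex; blend
    return pos

# A vertex influenced equally by a static bone and one translated up by 2
# units ends up halfway: it receives half of the second bone's motion.
identity = np.eye(4)
up = np.eye(4)
up[1, 3] = 2.0
print(skin_vertex(np.array([1.0, 0.0, 0.0]), [identity, up], [0.5, 0.5]))
# -> [1. 1. 0.]
```

Fixing the weighting is essentially adjusting those per-vertex weights so the blend follows the joints sensibly.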
Aside from that, I also spent time making sure the projectile weapon assets export correctly, and I wrote the code so that they now actually build themselves properly in-game. Which helps when you don't want characters to look broken.
This week I continued tackling the landing phase of missions. As expected, it uncovered several over-arching systems required for missions in general, in addition to the systems used specifically in landing itself. For example, we need text generation for mission leaders to command their crew to perform a given task (in this case landing, but we’ll need it for all tasks).
So for most of this week I’ve been doing the design work to get command sentences — and how this relates to task performance by the taskee, using the trait system — added to procedural text generation.
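To illustrate the kind of thing command-sentence generation involves (a toy sketch with invented names and templates, not the actual design), the leader's traits can pick which phrasing of the same order gets generated:

```python
# Hypothetical trait-flavoured command templates; everything here is invented
# purely to illustrate the idea of trait-driven sentence generation.
TEMPLATES = {
    "stern":    "{leader} orders {crew} to {task}, no questions asked.",
    "friendly": "{leader} asks {crew} if they'd mind helping to {task}.",
}

def command_sentence(leader, crew, task, leader_trait):
    """Pick a template based on the leader's trait and fill in the blanks."""
    template = TEMPLATES.get(leader_trait, "{leader} tells {crew} to {task}.")
    return template.format(leader=leader, crew=crew, task=task)

print(command_sentence("Capt. Barrett", "Jones",
                       "begin the landing sequence", "stern"))
# -> Capt. Barrett orders Jones to begin the landing sequence, no questions asked.
```

The real system is of course far richer than a template lookup, but the core idea of traits steering phrasing is the same.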
This week I decided to change it up and get the combined skinned meshes working for Daniel. His mocap work was so cool that I wanted him to be able to use his great characters instead of some pretty ugly placeholders.
Then I jumped back into datapad work, tackling the beast that is the module builder program. If I can finish the GUI conversion this weekend, that’ll be it for the desk for a little while!
Greetings! This week I’ve been busy with shoestring-budget mo-cap. In other words, from now on I will be able to ambulate just like you humans, and my infiltration attempt will be successful.
Also, it means that we now have a bucket load of emotive animations to help out with visually indicating how, for example, a conversation between two crew-members is going.
This week I’ve been working on designs for the (hopefully final) version of the mission system. The difficulty is that I need to tie traits, skills, stats and planet hazards together to give characters some sort of non-linear problem solving that balances risk vs. reward, and can also be output as text descriptions.
What a character does has to make sense given the challenges they face and who they are. And it should be generic (I don’t want to rewrite the thing when I move on from planet landing to planet exploration, for example). Also, I prototyped the module builder and equipment manager datapad programs with Adam, which was a lot easier.
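As a rough sketch of what a generic resolution like that might look like (all names, numbers and phrasing invented for illustration, not the actual design): skills and traits push against hazards and a random factor, and the outcome doubles as a text description.

```python
import random

def attempt_task(name, task, skill, hazard, trait_bonus=0, rng=random.random):
    """Hypothetical generic task check: skill plus trait bonus against the
    hazard plus a random factor, returning both the result and a sentence.
    Everything here is a made-up placeholder for illustration."""
    difficulty = hazard + rng() * 10
    success = skill + trait_bonus >= difficulty
    verb = "manages" if success else "fails"
    return success, f"{name} {verb} to {task} despite the hazards."

ok, line = attempt_task("Jones", "land the shuttle", skill=7, hazard=2,
                        rng=lambda: 0.5)
print(line)
# -> Jones manages to land the shuttle despite the hazards.
```

Because the check only cares about numbers and a task name, the same shape works whether the task is landing, exploration, or something else entirely, which is the "generic" property mentioned above.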
In the continuing saga of Datapad programs, I completed global inventory management and am working on both the new module builder and the equipment manager. The last two are ones I’m particularly looking forward to playing with. Our first versions were un-intuitive and a little clunky so it took away from the fun of assembling a module from parts. These will be oh-so-satisfying.
Since Daniel was working on some great homebrew motion capture this week, I also snuck in the code to assemble the new character parts together and animate them correctly. This will make for a great visual upgrade over the last set of characters once they’re integrated.
This week I’ve been working on particle effects. Particle systems are one of the fields in game graphics that are relatively separate from the rest. They exist in the same world as everything else, but compared to the symbiotic relationship between a model and its animation, particles tend to be tacked on wherever they make sense.
Their main purpose is to indicate states. For example, if food is hot you can have steam coming off it, indicating its current state. If a character is poisoned, having puffs of poison come off them gives the player at-a-glance information that the character might be in trouble.
Most particles are limited to being textured planes, emitted and then behaving randomly inside the limits defined for them. In some cases you use custom meshes instead of planes and/or animated textures, which allows you to create effects that would otherwise be impossible.
Pictured: An example of a status effect highlighted by the use of a particle effect. In this case the status of the carpet is “on fire”.
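The "emit, then behave randomly inside set limits" pattern above can be sketched in a few lines (a CPU-side toy, not how a real engine does it; all names and numbers are invented):

```python
import random

class Particle:
    def __init__(self, x, y, vx, vy, life):
        self.x, self.y, self.vx, self.vy, self.life = x, y, vx, vy, life

class Emitter:
    """Toy emitter: spawn quad-style particles with random velocities inside
    configured limits, integrate their motion, and cull them when they age out."""
    def __init__(self, x, y, spread=1.0, lifetime=2.0):
        self.x, self.y = x, y
        self.spread, self.lifetime = spread, lifetime
        self.particles = []

    def emit(self, n):
        for _ in range(n):
            vx = random.uniform(-self.spread, self.spread)
            vy = random.uniform(0.5, 1.5)   # e.g. steam drifts upward
            self.particles.append(
                Particle(self.x, self.y, vx, vy, self.lifetime))

    def update(self, dt):
        for p in self.particles:
            p.x += p.vx * dt
            p.y += p.vy * dt
            p.life -= dt
        self.particles = [p for p in self.particles if p.life > 0]
```

The renderer then just draws a textured plane (or custom mesh) at each particle's position every frame, which is why the system stays so decoupled from the rest of the scene.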
I’ve spent the past few days working on the PR front, getting things into place for when the big day comes to let Astrobase out into the wild. It’s important to get your ducks in a row well ahead of time, since it can take quite some time for some groups to turn around and release a piece.
Since getting noticed is a huge issue for indie devs, we’re trying to stack the odds in our favour by investing some time early on.
I’ve also been working on getting some screenshots and wallpapers ready to go so you can keep a little digital piece of Astrobase close to your heart (or your screens). Here’s a work-in-progress preview!
Captain Barrett investigates the disappearance of her astrobase’s crew.
To cap things off, I’m currently working on a series of icons that will represent Daniel’s groovy new social modules.
This week I finished the procedural conversation that characters engage in on the station.
I spent this past week working on more updated versions of Datapad GUIs. I wrapped up the promotion program and got through most of inventory management. Here’s a preview of promotion to give you a feel for the visual style.
I’ve also had a chance to integrate a prototype version of Dave’s conversation work and see it in action. It’s kind of amazing! Looking forward to showing that off in-game!
This week I’ve spent making small objects to populate the world with. You’d probably be surprised by how much adding a bunch of everyday objects helps tie a room together, you know, unless you’re an interior decorator by trade or just an enthusiastic hobbyist.
A prime example of how the lack of everyday items makes an area feel too empty, even when full of furniture.
Of course, in games this means all these objects need to be modeled, UV-mapped and so on. A game aiming for a regular, contemporary, realistic design can just load up an asset store and buy a bunch of stuff to fill its rooms.
For us, however, that won’t quite work, as it would be a bit boring if these people, after living on a space station for many years, had the same Earth-style objects every other game has. So I’ve been making our own!
While each item is relatively quick to make, they all add up to quite a substantial amount of work to produce.
Last week I worked on the business-development side of the company. I’ll spare everyone the details, but it’s part of setting up a successful indie studio. Don’t worry, this week it’s back to game dev!
This week, I completed the tech conversion of our UI system, which was an absolute treat. The new solution is cleaner, simpler, performs better and is just an all-around improvement.
I spent the rest of the week working on a conversion of our existing datapad programs (module builders, sensors, mission planning, etc.) to the new tech. I used the opportunity to improve upon the previous version in terms of visual communication, layout and usability. I’m really happy with the new direction.
Hi Everybody! (Hi Dr. Nick!)
This week I’ve worked on the graphics for more social sections. It struck me that we had an abundance of really useful, super-practical areas, like reactors, command, and sleeping quarters.
However, what we lacked was somewhere for the crew to spend their time off. Not even the more militaristic sci-fi settings (Battlestar Galactica and Babylon 5 come to mind) avoid the types of activities that make up normal everyday life.
Of course, these areas might not be the first thing you build, but they are definitely important to on-station life. If you imagine it from the view of a survivor of a civilization after a cataclysmic event, the comfort of being able to do something like go to the gym would be immense.
As pictured above we clearly do not in any way view humans/humanoids as hamsters.
Hey everybody! My work this week is more non-sexy programming stuff. I’m in the process of gutting our GUI back-end and replacing it with one that plays nicer with Unity (the game engine we’re using). This’ll make it a lot easier to manage the 3D GUI components and won’t cause nearly as many visual glitches.
I’ve also been working with Daniel to get some of the coding problems of character appearance sorted for the models you’ll see walking around the station. He’s provided beautiful art for them as well as tech to stitch all the parts together. We just need to get the animations working with the merged version of that character.
I recently finished getting all the data behind procedural text generation from my design spreadsheets into sensibly structured database tables, and I’ve moved on to writing the back-end data repository classes that load that data from the DB and store it in code. Because ultimately the AI needs to be able to make heads or tails of the data, everything needs to be stored and structured correctly to make that happen.
My workflow is something like: spreadsheet design -> table/stage organization in Excel -> write exporters -> table organization in SQL -> export data from Excel to SQL -> write code-side data structures -> load data into those structures from the DB -> write the AI that uses the data -> output/test loop. Many of these steps happen in parallel.
For example, if I know the AI needs X, Y, and Z, then I want to build that into my tables and ultimately my design. There’s a lot of going back and forth to make sure I don’t accidentally paint myself into a corner by constraining the data in a way that suits one step but ends up being stupid for some future step.
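The "load from the DB into code-side structures" step above can be sketched roughly like this (table, column, and class names are invented for illustration; the real schema is obviously different):

```python
import sqlite3
from dataclasses import dataclass

# Hypothetical repository sketch: pull text-generation rows out of a small
# SQLite table into typed structures that AI code could then consume.

@dataclass
class SentenceTemplate:
    template_id: int
    stage: str
    text: str

class TemplateRepository:
    def __init__(self, conn):
        self.conn = conn

    def load_all(self):
        rows = self.conn.execute(
            "SELECT template_id, stage, text FROM sentence_templates")
        return [SentenceTemplate(*row) for row in rows]

# Tiny in-memory demo of the load step.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE sentence_templates (template_id INTEGER, stage TEXT, text TEXT)")
conn.execute(
    "INSERT INTO sentence_templates VALUES "
    "(1, 'command', '{leader} orders {crew} to {task}.')")
templates = TemplateRepository(conn).load_all()
print(templates[0].stage)  # -> command
```

The point of the typed structures is that the AI side never touches raw rows; it only sees data already shaped the way its logic expects.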
TTDR (too technical didn’t read): the progress bar on procedural text moved from about 50% to 75%. (100% is working, integrated, and playable).