During our sixth game project we were required to have animations in the game, so I began developing an animation system for the engine. The goal was to support a large number of animated characters while keeping the interface simple and easy to use, yet powerful enough to handle a lot of animations.
- Instanced skinned animations
- Seamless blending between animations
- Blend trees
- Animator state machines
- Simple code interface
Since we wanted a lot of enemies on the screen at once, it was obvious that instancing the animated characters would be a must if we wanted to maintain a playable framerate. Initially I used an animation ID in the model instance data to look up information about the animations in a constant buffer, and used that data to look up the keyframes in a second constant buffer, but as we started to get more animations from our animator a constant buffer was simply too small to hold them all. So I started researching a more robust way of doing GPU animation, and found this whitepaper by Bryan Dudash on how to use textures to encode animations.
At first I was worried that the texture lookups would be too slow to be efficient, but the result was still very fast, and the encoded animations produced a pretty cool image too.
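To make the indexing concrete, here is a minimal CPU-side sketch of the scheme (the names and layout are my own illustration, not the engine's actual code), assuming each bone matrix is packed as four consecutive RGBA texels and keyframes are laid out linearly, wrapping onto new texture rows when the width is exceeded:

```cpp
#include <cassert>

// Illustrative layout: each keyframe stores boneCount matrices, each
// matrix occupying 4 consecutive RGBA texels (one per matrix row).
// Keyframes are packed left to right, wrapping to the next texture row.

struct TexelCoord { int x; int y; };

// Returns the texel holding row `row` (0-3) of bone `bone` in keyframe `frame`.
TexelCoord animTexel(int frame, int bone, int row,
                     int boneCount, int textureWidth)
{
    int texelsPerFrame = boneCount * 4;
    int linear = frame * texelsPerFrame + bone * 4 + row;
    return { linear % textureWidth, linear / textureWidth };
}
```

The vertex shader performs the same arithmetic to fetch the four texels and reassemble the matrix for each bone influencing a vertex.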
After the animations were up and running, it was time to start working on the animation pipeline. Since our engine is similar to Unity in several areas I wanted to continue on that path and make something like Unity's Mecanim system. If I could make the system similar enough, the documentation and tools for Mecanim could serve as a rough guide for our engine as well. So I started by setting up an animation controller in Unity and playing around with it, testing it out and deciding which features to support and which we could get away with not using. Then I exported the data needed to replicate that system in our engine, so that I wouldn't end up building a system around data we couldn't get from Unity anyway.
Thanks to the very descriptive documentation I fairly quickly had a workable animation controller up and running, with a few states and some one-dimensional blend trees driven by basic parameters. From the programmer side it was quite easy to control the animations by setting those parameters and letting the system handle transitions and blending as they had been set up in Unity.
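For reference, a Mecanim-style 1D blend node boils down to placing clips at increasing threshold values (e.g. speed 0 = idle, 3 = walk, 6 = run) and linearly blending the two clips that bracket the parameter. A rough sketch of that evaluation, with illustrative names rather than our actual code:

```cpp
#include <cassert>
#include <utility>
#include <vector>

// A clip index paired with its blend weight.
struct ClipWeight { int clip; float weight; };

// Given clip thresholds in increasing order and a parameter value,
// return the two neighbouring clips and their linear blend weights.
std::pair<ClipWeight, ClipWeight>
blend1D(const std::vector<float>& thresholds, float param)
{
    // Clamp to the outermost clips when the parameter is out of range.
    if (param <= thresholds.front())
        return { {0, 1.0f}, {0, 0.0f} };
    if (param >= thresholds.back())
    {
        int last = static_cast<int>(thresholds.size()) - 1;
        return { {last, 1.0f}, {last, 0.0f} };
    }
    // Find the segment containing the parameter and interpolate.
    int i = 0;
    while (thresholds[i + 1] < param)
        ++i;
    float t = (param - thresholds[i]) / (thresholds[i + 1] - thresholds[i]);
    return { {i, 1.0f - t}, {i + 1, t} };
}
```

The resulting pair of clip indices and weights is what ends up in the per-instance data that the skinning shader consumes.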
For the rest of the sixth project the creation of animation controllers remained the responsibility of the programmers, as we did not feel we had enough time to deal with the issues that might arise from giving our animator free rein with the tool. But as the seventh project began we let him take over the creation of the controllers, and after a short learning period with some minor bugs and feature requests we had quite a nice workflow.
Issues / Planned changes
- 1D blending is the only supported blend type.
I'd like to allow more advanced blend types like 2D directional or freeform blending to give more freedom to our animator.
- The animations break on older / lower tier graphics cards.
When I first created the animation system I had no idea what a structured buffer was, so I just used the old animation texture technique described in the whitepaper linked above. I also set the animation texture to be 16384 pixels wide, which seems to be causing the animations to bug out, either due to a lack of memory or, more likely, because textures that large aren't supported on all cards (16384 is the maximum Texture2D dimension at Direct3D feature level 11_0; lower feature levels cap out at 8192 or less).
- Do animation blending in a compute shader instead of per-vertex
Animating the bones on a per-instance basis instead of per-vertex would definitely decrease the cost of blending since a lot of vertices are controlled by the same bones, which means there are a lot of redundant calculations being made that could fairly easily be removed if the bone poses were pre-calculated before the vertex shader in a compute shader. Perhaps keeping all the bones of all the instances in a large buffer and letting the instance keep track of where in that buffer its bones can be found?
- Only 4 simultaneously blended animations.
Currently we've only used 1D blend nodes and not utilized the tree structure since there is a maximum number of animations that can be blended at once. It's possible that extracting more of the animation data from the instance data into a separate buffer containing information about which animations to blend would remove that limit.
- Blend layers
We'd like to add support for blend layers so that we could have things like hit animations playing on top of the other animations.
I had 7 weeks to work on a project of my choosing for the specialization course at The Game Assembly, and I chose to work on a level editor for our custom engine, as it was something I had wanted to do for some time. Originally I had intended to use the Qt framework for the interface, but due to my relative inexperience with Qt and the time constraints I settled on using ImGui, so that I could focus on producing functional features rather than getting stuck learning a new tool.
- Multi-entity component editing
- Saving and loading levels
- Drag selection of entities
- Undo and redo
- Editing of entity transforms using ImGuizmo (rotation and scaling not fully functional)
- Visualization of entity hierarchy (not currently editable)
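The undo/redo feature in the list above is commonly built on a command stack, and a sketch of that pattern (illustrative names, not the editor's actual code) looks something like this: every edit is an object that knows how to apply and revert itself, and performing a new edit clears the redo history.

```cpp
#include <cassert>
#include <memory>
#include <vector>

// Base class for any undoable editor action.
struct Command
{
    virtual ~Command() = default;
    virtual void Apply() = 0;
    virtual void Revert() = 0;
};

class UndoStack
{
public:
    // Execute a new command; any previously undone commands are discarded.
    void Do(std::unique_ptr<Command> cmd)
    {
        cmd->Apply();
        myUndone.clear();
        myDone.push_back(std::move(cmd));
    }
    void Undo()
    {
        if (myDone.empty()) return;
        myDone.back()->Revert();
        myUndone.push_back(std::move(myDone.back()));
        myDone.pop_back();
    }
    void Redo()
    {
        if (myUndone.empty()) return;
        myUndone.back()->Apply();
        myDone.push_back(std::move(myUndone.back()));
        myUndone.pop_back();
    }
private:
    std::vector<std::unique_ptr<Command>> myDone;
    std::vector<std::unique_ptr<Command>> myUndone;
};
```

A multi-entity component edit then fits naturally as a single command wrapping the per-entity changes, so one undo reverts the whole selection.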
Initially the goal of my specialization was to create a functional editor that could be used in our seventh game project, but it quickly became apparent that the time available would not allow that without causing significant problems in our pipeline. Instead it was used to create modular pieces for a level generator that my fellow group member Jonatan Svensson worked on.