Ben
This past month I've been working almost exclusively on improving my tools for metaprogramming. Generating code at compile time has been an important part of Nirion's engine since almost the beginning. It generates the data necessary for reflection at runtime, which is then used to implement serialization of assets and entities as well as property editing in the editor. The way I was going about this before was really hacked together and rushed. The tool would basically go over every file in the project and look for special tokens in the code. These tokens were just macros that expanded to nothing. For example:

TYPE()
struct S
{
    PROPERTY()
    f32 x;
    PROPERTY()
    f32 y;
};


This struct would be recognized as a type that needs reflection data generated for it. The parser would see 'TYPE', read in the parts of the struct definition it needed, and skip the rest. 'PROPERTY' was used to signify that the member should be included in the reflection data, so that some members can be excluded. This also allowed me to specify extra metadata for members:

PROPERTY(Category="transform")
v2 position;


Besides the parser itself being a mess, this worked okay most of the time. I did come up against some cases where definition order was an issue. For example, the 'Entity' type in Nirion has a game data part that consists of all the game-specific entity data:

struct Entity
{
    // Lots of engine data that entities need
    ...

    // Overlapped data. Data that is never shared between entity types.
    union
    {
        Player player;
        Door door;
    };
    // Shared data. Data that any entity type can access.
    struct
    {
        HealthComponent health;
    };
};


To avoid having to update the Entity struct every time I add or remove a game entity data struct, my solution was to generate this part of the Entity struct using metaprogramming. Because Entity is always defined above the game entity code (so the entity type can actually be used), any types defined there could not be embedded in the Entity struct as shown above. So I ended up with this situation:

ENTITY_DATA
(
 struct AnimatorEntity
 {
     u32 animation;
     i32 loopCount;
     
     EntityId attach;
 };
 );


'ENTITY_DATA' would actually expand to nothing, but the metaprogramming tool would copy its contents and place them in a generated file that gets included before the Entity struct. This means that some types that should be visible to 'AnimatorEntity' were actually not. Ignoring the fact that structs like this now have to be wrapped in macros (which is just ugly), it leads to some weird issues that I would rather not have to think about.

This is just one example of the (relatively minor) problems I had with this approach. Overall, the biggest deciding factor in redoing this was just that the code was rushed and pretty bad. I'd constantly run into issues where the tool would crash because it encountered something unexpected.

Redesign
So after looking at how other people had approached metaprogramming, I decided to ditch using custom markup in the code itself and just parse a completely different format that roughly resembles C++. This was mostly inspired by Ryan Fleury's Data Desk project.

The big advantage to this is being able to add more complex and readable markup than you can with macros. Also, the resulting C++ code can be output in an order that's completely unrelated to where it's defined in this custom format. It also makes parsing a little easier, since you don't have to deal with skipping unsupported code.

So I called this new tool CG (just for 'code generation'). All code that needs reflection information or other metaprogramming functionality must be defined in a '.cg' file. The code is then parsed and transformed into an AST. The metaprogram then does some processing with this AST and outputs C++ code. Here is an example struct definition:

@Introspect @EntityDataOverlapped
struct WorldAreaEntity
{
    v2 size;
    i32 priority;
    f32 layerParallax;
    b32 fadeAreasAbove;
    f32 distanceBetweenLayers;
    b32 dontTintLowerLayers;
    b32 isSaveRoom;
    
    @HideInEditor
        i32 dependenciesCount;
    @AC_Mem=dependenciesCount @PersistentRef
        u32 dependencies[8];
    
    @PickableAsset=Asset_SoundQueue
        u32 ambientQueue;
    
    
    b32 clearFog;
    GameAreaType areaType;
};


Initially I started with parsing data definitions and adding some custom functionality that I knew I needed such as 'tags'. Tags just allow you to annotate different parts of the code. Tags are in the form @'key'='value' as can be seen above. A tag doesn't need a value, and a value can be a list of values or other tags by enclosing them in '{}' and separating each entry with a ','.

I then went on to add things like functions, loops, if/else, unions, switch, return, etc. It's important to note that this is all just being parsed, transformed by the metaprogram, and then output to C++. So most of what is involved here is just parsing. No part of this 'language' is actually evaluated in any way, so it's probably not as much work as it sounds. Parsing all these features took me only a day or two.

My initial motivation to go beyond just type definitions was for defining immutable array counters. For example:

@Introspect
struct Tilemap
{
    @NoIntrospect
        b32 invalidatedResource;
    @NoIntrospect
        void* rendererResource;
    @MetaData
        TilemapInfo info;
    
    @AC_Func=Tilemap_GetLayerCount @BulkDataPtr
        TilemapLayer* layers;
    @AC_Func=Tilemap_GetTotalTileCount @BulkDataPtr
        Tile* tiles;
    @AC_Func=Tilemap_GetTileCount @BulkDataPtr
        GlobalTileData* globalLayer;
    @AC_Mem=info.collisionCount @BulkDataPtr
        TilemapCollisionInfo* collisionInfo;
};


The 'AC_Mem' tag can be used to define a memory address that points to the i32 array counter. The editor can use this to add or remove array members automatically (when the user presses the '+' or '-' buttons on an array property). This works well for a lot of things, but to define an array counter based on transient state, such as the tile count in a 2D tilemap, a memory address isn't going to work (unless you're willing to duplicate data). So I decided to parse functions in order to define immutable array counters. The 'AC_Func' tag specifies a function that returns the array count. In this case, 'Tilemap_GetTileCount' looks like this:

proc i32 Tilemap_GetTileCount(Tilemap* tilemap)
{
    return tilemap->info.width * tilemap->info.height;
}


This is mostly just used for serialization. Since serialization is done at runtime, the serialization function can use this functionality to know how many tiles it should serialize without needing to serialize additional data. This was important to me, since one of the big goals with this redesign was to make serialization more automatic and robust.
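
To give a rough idea of how this fits together at runtime, the serializer can resolve a count from either kind of tag with something like the following sketch (the names and layout here are simplified stand-ins, not the engine's actual reflection structs):

// Sketch: resolving an array count during serialization, dispatching on
// whether the counter came from an AC_Mem tag (an i32 member at some offset)
// or an AC_Func tag (a function computing the count from transient state).
#include <cstddef>

typedef int i32;

typedef i32 ArrayCountFunction(void* owner);

enum ArrayCounterKind { ArrayCounter_None, ArrayCounter_Member, ArrayCounter_Function };

struct ArrayCounter
{
    ArrayCounterKind kind;
    size_t memberOffset;          // AC_Mem: offset of the i32 counter inside the owner
    ArrayCountFunction* function; // AC_Func: e.g. a function like Tilemap_GetTileCount
};

static i32 ResolveArrayCount(const ArrayCounter& counter, void* owner)
{
    switch(counter.kind)
    {
        case ArrayCounter_Member:   return *(i32*)((unsigned char*)owner + counter.memberOffset);
        case ArrayCounter_Function: return counter.function(owner);
        default:                    return 0;
    }
}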

Eventually I got back to where I was with my previous metaprogramming tool in terms of functionality. After this I started to look for ways to improve what I had.

Templates

While transitioning from the old system to this one, I found that I had a need for C++ template-like functionality. Around this point in the game's development I changed how I was designing container data structures. For things like linked lists and hash tables, I had designed their interfaces around holding void pointers to make them generic. I found this interface pretty clunky and so decided to redesign these data structures as macros. These macros take the types as inputs and expand to specific versions of the container.
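
For reference, the pattern looks roughly like this in plain C++ (a simplified sketch of the idea, not my actual container code):

// Sketch of the "macro as template" pattern: the macro takes the element type
// and expands to a concrete, fully typed container, so the interface has no
// void pointers in it.
#define DEFINE_LIST(type)                                        \
    struct List_##type                                           \
    {                                                            \
        type value;                                              \
        List_##type* next;                                       \
    };                                                           \
    static List_##type* List_##type##_Push(List_##type* head,    \
                                           List_##type* node)    \
    {                                                            \
        node->next = head;                                       \
        return node;                                             \
    }

DEFINE_LIST(int) // expands to List_int and List_int_Push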

I like this approach a lot better and needed a way to replicate it in CG so that these containers could be introspected (among other things). My solution was to recreate macros at the AST level using a 'template' block like so:

@OutputAsMacro=_DEFINE_STATIC_ARRAY(Name, type, size)
template StaticArray(Name, type, size)
{
    @Introspect
        struct $Name
    {
        @HideInEditor
            i32 count;
        @AC_Mem=count
            $type data[$size];
        
        proc $type& operator[](i32 index)
        {
            return data[index];
        }
    };
};


An instance of the template can be generated like this:
generate StaticArray(Array_i32, i32, 32)


When a 'generate' command is encountered, all the code in the 'template' block will be copied and any identifier starting with '$' will be replaced with the arguments passed. The template itself is never output to C++, but the '@OutputAsMacro' tag tells the metaprogram that the template should be output as a C macro. I can now use this in CG and C++ in the same way. Tags such as '@Introspect' are also copied when generating, so the template instance can still be introspected.
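
For clarity, the emitted C macro ends up looking something like this (a hand-written approximation assuming the engine's i32 typedef; the actual generator output will differ in its details):

// Hand-written approximation of the generated _DEFINE_STATIC_ARRAY macro.
// Tags like @Introspect don't appear here; they only affect the reflection
// data the metaprogram emits alongside it.
#define _DEFINE_STATIC_ARRAY(Name, type, size) \
    struct Name                                \
    {                                          \
        i32 count;                             \
        type data[size];                       \
                                               \
        type& operator[](i32 index)            \
        {                                      \
            return data[index];                \
        }                                      \
    };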

An example of my dynamic array type:
@OutputAsMacro=_DEFINE_ARRAY(type)
template Array(type)
{
    struct Array_##$type
    {
        i32 count;
        i32 allocatedCount;
        @AC_Mem=allocatedCount
            $type* data;
        
        @NoIntrospect
            Allocator allocator;
    };
    proc void Clear(Array_##$type* list) 
    { 
        if(list->data) 
        { 
            Assert(Allocator_CanFree(list->allocator)); 
            Allocator_Free(list->allocator, list->data); 
        } 
        list->allocatedCount = 0; 
        list->count = 0; 
        list->data = 0; 
    } 
}


As you can see here, I implemented concatenation using '##', just like C macros. As long as the '$' parameter is generated as an identifier, it will concatenate. I find this really useful when writing this type of code. 'template' blocks can also include functions; they work in exactly the same way.

After this I figured out I could extend this feature to allow me to generate more complex code directly without needing to rely on the metaprogram as much. For example, in Nirion there is an
enum EntityType{...}
which contains an enum entry for every type of entity in the game. This was automatically generated by the metaprogram based on what entity types exist after parsing, and didn't exist in CG directly. I added the concept of "inline templates": if a template uses the 'template_inline' keyword instead of the 'template' keyword, then when an instance is generated, the generated code gets inserted at the template definition rather than at the 'generate' command. This means we can do this for EntityType:

enum EntityType
{
    ET_None,
    template_inline EntityType_Entry(entry)
    {
        $entry,
    }
    ET_Count,
};

...

// Somewhere else in the code

generate EntityType_Entry(ET_Player);
generate EntityType_Entry(ET_Door);


This results in EntityType looking like this in C++:

enum EntityType
{
    ET_None,
    ET_Player,
    ET_Door,
    ET_Count,
};


We can also use this to define a function that maps an entity type name hash to its enum value pretty easily ('template_hash' just tells the metaprogram to hash the template argument):

proc EntityType EntityTypeHashToEnum(u32 hash)
{
    switch(hash)
    {
        template_inline EntityTypeHashToEnum_Entry(template_hash case, value)
        {
            case $case: return $value;
        }
    }
    return ET_None;
}

...

// Somewhere else in the code
generate EntityTypeHashToEnum_Entry("Player", ET_Player);
generate EntityTypeHashToEnum_Entry("Door", ET_Door);


We can also use a template to package these together:

template DefineEntity(template_hash hash, type)
{
    generate EntityType_Entry($type);
    generate EntityTypeHashToEnum_Entry($hash, $type);
}


And now entities can be generated in one call with:

generate DefineEntity("Player", ET_Player);
generate DefineEntity("Door", ET_Door);


Without any custom metaprogram code at all. It's also a lot clearer that the enum and function exist and where they are defined. I think this has some drawbacks, though. It can be a bit harder to know what the final result will look like compared to just reading C++ code that directly generates it. Some things are also much easier to do in C++, so I don't do everything like this. I was on the fence about using templates this way, but ultimately decided that it makes relatively simple cases like this easier to write and clearer.

Introspection

When a type has the "@Introspect" tag, reflection data is generated for that type. The data structure that represents a type/property at runtime is:

struct CGIntroNode
{
    CGIntroNodeType type;
    
    String name;
    u32 nameHash;
    
    CGIntroNode* tags;
    i32 tagCount;
    
    memindex runtimeSize;
    
    union
    {
        // Property
        struct
        {
            memindex memoryOffset;
            CGValueType valueType;
            CGIntroNode* valueDefinition;
            
            i32 staticArrayMax;
            CGIntroReferenceType referenceType;
            
            CGArrayCounter arrayCounter;
            
            b32 hasNoSerializeTag;
            b32 hasBlockSerializeTag;
            b32 noCopyFromPreviousVersion;
            
            i32 enumValue;
            
            b32 isIndirect;
            i32 category;
        };
        
        // Type
        struct
        {
            CGIntroDefType defType;
            
            CGIntroNode* properties;
            i32 propertyCount;
            
            i32 typeId;
            
            CGIntroNode* previousVersion;
            UpdateDataFunction* updateDataFunction;
            u32 version;
        };
        
        // Tag
        struct
        {
            CGValue tagValue;
        };
    };
};


The information for a 'CGIntroNode' is generated as global variables by the metaprogram. An example:

// EyeTurretType
CGIntroNode EyeTurretType_Tags[] = {TagNode("Introspect", (u32)951444544), };
CGIntroNode EyeTurretType_CGProperties[] = {
EnumEntryNode("EyeTurret_RadialFire", (u32)-997209664, 0, 0, EyeTurret_RadialFire, false, 37), 
EnumEntryNode("EyeTurret_StraightFire", (u32)1971805262, 0, 0, EyeTurret_StraightFire, false, 37), 
};
CGIntroNode EyeTurretType_CGType = TypeNode("EyeTurretType", (u32)972719426, EyeTurretType_CGProperties, ArrayCount(EyeTurretType_CGProperties), sizeof(EyeTurretType), EyeTurretType_Tags, ArrayCount(EyeTurretType_Tags), CGIntroDef_Enum, CGTypeID_EyeTurretType, 0, 0, 0);


As I mentioned before, serialization is done at runtime. The "CGIntroNode" type is used to traverse the data structure recursively. A version number and previousVersion are stored on types. If a version number does not match, a conversion function (defined in CG) is called that converts from the stored version to the next version, all the way up to the current version.
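
In code, the upgrade path amounts to walking the previousVersion chain and applying each conversion in order, roughly like this (a simplified sketch with stand-in types and an assumed "old data in, converted data out" signature; the real code also deals with allocation and nested types):

// Sketch of the version-upgrade walk, using a stripped-down stand-in for
// CGIntroNode (only the fields relevant here).
#include <cstdint>

typedef int32_t  i32;
typedef uint32_t u32;

typedef void* UpdateDataFn(void* oldData); // assumed conversion signature

struct VersionNode
{
    u32 version;
    VersionNode* previousVersion;     // description of the next-older layout
    UpdateDataFn* updateDataFunction; // converts previousVersion's layout into this one
};

static void* ConvertToCurrentVersion(VersionNode* currentType, u32 storedVersion, void* data)
{
    // Collect the chain from the current layout back to the version the data was saved with.
    VersionNode* chain[16];
    i32 count = 0;
    for(VersionNode* node = currentType; node && count < 16; node = node->previousVersion)
    {
        chain[count++] = node;
        if(node->version == storedVersion) break;
    }
    // Apply the conversion functions oldest-to-newest until we reach the current layout.
    for(i32 i = count - 1; i > 0; --i)
    {
        data = chain[i - 1]->updateDataFunction(data);
    }
    return data;
}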

Conclusion

I think that covers most of what I worked on in at least some detail. Overall I think this system is working a lot better and is much more robust. It took a very long time to change everything, but it seems like it was worth it. I still have a lot more improvements like this to make to the engine, so I'm not sure when I'll be back to developing the game's content.

Thanks for reading, let me know if you have any questions!

Ben
Since my last blog post I've made some progress across a number of different areas of development. My last post here was a video showing off the music I've created for the game so far, which I got some useful feedback on. If you're interested, check it out and let me know what you think!

I decided to move on after creating some music tracks for the game areas I've created so far. There's obviously still a lot more work to be done on the music, so I'll revisit it in the future. I tend to do a lot of work on one aspect of the game (programming, music, art, level design, etc.) for a relatively short amount of time until it's passable, and then come back and iterate on it later. After completing a reasonably polished demo I decided it was time to move on to the second area of the game.

Lava Caverns

The next area of the game is a lava-themed area called the Lava Caverns. It has the ruins of ancient temple-like structures throughout it, but also more natural-looking cave structures, and even some more advanced technology which, in the lore of the game, has recently appeared. For a couple of weeks I developed a lot of the art for this area as well as some of the gameplay mechanics that will be unique to it.

Here is some of the earlier art as I was developing it:


Some more up to date art:


Like how the mines had the minecart pushing gameplay mechanic, the Lava Caverns has block pushing puzzles. This was fairly easy to implement since it mostly just uses the physics and collision detection system. Blocks can be pushed around freely (not aligned to any sort of grid), can only be pushed around the "sand"-like path that they're on, and can only be pushed, not pulled. Collision entities that only interact with push blocks are spawned at the sand-dirt transition tiles. This keeps the block locked on the sand so that it cannot be moved around the room freely. Personally I think the free and smooth movement of the blocks makes the puzzles a lot more fun to interact with.

There are also different types of blocks. Some that can only be pushed along a certain axis and some that are immovable.


One problem I encountered was how to separate blocks that have been pushed together and therefore can't be separated without a pull mechanic. I wanted to avoid a pulling mechanic since I don't think it would really suit the game and how the player moves around. After trying a few things I decided on having the player separate the blocks by moving "in between" them:



Hopefully this is intuitive enough for someone to try if they're in this situation for the first time. I think it's pretty natural to try to separate them this way if you feel they're stuck, but I guess I'll have to see how it goes once more puzzles are designed and I can get more people testing the game.

I also spent a bit of time on drawing and programming a couple of new enemies.



Meta Programming Update

After I did this work on the new area, I decided to work on cleaning the engine up a bit. I'm not happy with how a lot of the code is right now. Quite a bit of the code started out as just the simplest solution possible to get something working, but I never got around to updating it as the game got bigger. Some of the things on my list are improving the renderer interface, improving the editor (I don't even have a search bar or any sorting, but have 100+ entities!), improving debug code, fixing and expanding undo/redo for the editor, and finishing some aspects of the asset system. Probably the worst offender, though, is the code that parses the project and generates introspection data.

This part of the code basically just scans over every C++ file in the project and looks for special macros like "PROPERTY" or "ENUM", which are used to annotate parts of the code. It then outputs data that can be used for introspection. This is used in a few places, like the editor for displaying entity properties and the serialization code. The problem with the current system is that it's pretty hard to add new features or even new flags to properties, since it's all very hard-coded and doesn't really build a proper representation of the program structure. It just parses what it needs (and crashes if it encounters anything it can't deal with) and uses a bunch of string manipulation to output some C++ code.

I've been working on a better solution for this over the last few days. Inspired by Ryan Fleury's Data Desk, I'm working on parsing a custom text format into an AST that is then manipulated before being translated into C++ code. So far this is turning out to be a huge improvement. It's much simpler to add new features and takes far fewer lines of code to implement. It's still got a long way to go, and replacing the old system in the game code will be a ton of work, but I think it'll definitely be worth the effort in the long run.

The custom format will look something like this:
@Introspect
struct MineCartEntity
{
    MineCartType type;

    @HideInEditor
    i32 nodeCount;

    @ArrayCounter=nodeCount
    @Spline
    v2 nodes[24];

    f32 friction;
    f32 maxSpeed;

    @HideInEditor
    i32 wallSegmentCount;
    @ArrayCounter=wallSegmentCount
    MineCartWallSegment wallSegments[4];

    i32 startingNode;

    MineCartState state;

    f32 tPosition;
    f32 tVelocity;
    f32 trackLength;
    PlayingSoundReference armedSound;
};

TagData EntityDefinition
{
    @hash="MineCart"
    @type="ET_MineCart"
    @func_Default="EntityDefault_MineCart"
    @func_Update="EntityUpdate_MineCart"
    @func_OnDamage="EntityOnDamage_MineCart"
    @func_Constructor="EntityConstructor_MineCart"
    @func_OnBlock="EntityOnBlock_MineCart"
}


What's Next?

After I've finished with the metaprogramming/code generation system I'll probably move on to cleaning up some more parts of the code. I think that will keep me busy for a pretty long time. Once that's done though I'll continue working on more enemies, art, level design, puzzle mechanics and eventually get back into music composition. Thanks for reading! As always, let me know if you have any questions about the game!
Ben
Over the past month I've been doing pretty much nothing except bug fixes and sound design. I'm getting closer to having a section of the game at a level that I'm happy to show off, but it's still probably a couple more weeks away. My process for fixing bugs and polishing the game is just playing through it, writing down stuff that needs to be fixed or improved, fixing those things mostly in the order they were written down, and then crossing them out once they're done. I've also been working on making sure the game works at multiple resolutions (both "internal" and "output" resolutions, internal meaning the resolution the game actually renders at, and output meaning the resolution that the game scales up to in order to match the window/screen size).

I've also done some more rendering optimisations, but the game seems mostly fill rate limited on weaker GPUs at the moment. This is mainly because the game renders strictly back to front (for correct alpha blending) and does no culling or depth testing at all. I definitely need to come up with a better solution to alpha blending than this, but it might have to be enough for the demo. The game still runs acceptably on decent GPUs. I also need to decide how I'm going to handle variable frame rates, as I've mostly been testing the game at a fixed 60fps so far.

So basically there's not much to talk about this month. Instead of going over a massive list of bugs that I've fixed (half of which are probably content bugs and not programming), I thought I'd talk about how the world is built and stored in Nirion.

Overview and World Areas

Nirion has one large interconnected world with no loading screens or "cut" transitions of any kind. Because of this, I've decided that the world should be streamed in and out based on the player's (and camera's) position. The main "unit" of the world is an entity called a world area. A world area can contain many transient and non-transient entities. One type of entity that it can hold is a tilemap. Tilemaps make up most of what can be seen visually in Nirion. Besides tilemaps, any other type of entity can be placed inside a world area to complete the area, such as enemies, destructibles (barrels, crates) and doors. World areas are always rectangular.


The purple outlines show the bounds of a world area in the editor.

When the game first starts, the one and only "map" asset is loaded and all world areas contained in it are spawned. By itself the world area is just a single entity that stores its bounds and state. A world area can be in one of three states: unloaded, loaded, and active. Right now, whether or not a world area is loaded is based entirely on whether the world area's center is within a given radius of the camera. This tends to load way more than is actually needed, and also loads unnecessary entities. I plan on changing this to a connection-based solution eventually, where world areas will load/unload based on whether they're adjacent to an active world area or not. When a world area becomes loaded, it will spawn all its non-transient entities.

A world area can be in an active state if the camera's position is contained within the world area's bounds. When a world area is active, it will spawn all of its transient entities. An active world area must already be loaded, and when the camera leaves the world area's bounds, it is no longer active. Because world areas can overlap, they are assigned a priority value, and areas with higher values will be favored if the camera's position is contained in multiple world area bounds.

Entities are split into transient and non-transient categories so that world areas are easier to reset. When a world area is loaded, it will load two lists of entity delta values into memory, one for transient entities and one for non-transient. It then immediately spawns non-transient entities from the delta values in memory. These are usually entity types such as doors, lamp posts, and lights. When the player enters the world area, it will spawn transient entities from the delta values in memory; these are entity types like enemies, destructibles, gear pickups, and energy pickups. When the player leaves the area, transient entities are destroyed, but the delta values still remain in memory. The delta values are finally destroyed once the world area is completely unloaded.
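
Put together, the streaming rules boil down to something like this (a heavily simplified sketch of the logic described above, with stand-in types; the real code only spawns and destroys entities on state transitions):

// Sketch: load/unload world areas by distance from the camera, and make the
// highest-priority loaded area containing the camera the active one.
#include <vector>
#include <cmath>

struct v2 { float x, y; };

enum WorldAreaState { Area_Unloaded, Area_Loaded, Area_Active };

struct WorldArea
{
    v2 center;
    v2 boundsMin, boundsMax; // world areas are always rectangular
    int priority;            // higher wins when bounds overlap
    WorldAreaState state;
};

static bool Contains(const WorldArea& a, v2 p)
{
    return p.x >= a.boundsMin.x && p.x <= a.boundsMax.x &&
           p.y >= a.boundsMin.y && p.y <= a.boundsMax.y;
}

static void UpdateWorldAreas(std::vector<WorldArea>& areas, v2 camera, float loadRadius)
{
    WorldArea* active = nullptr;
    for(WorldArea& area : areas)
    {
        if(area.state == Area_Active) area.state = Area_Loaded; // re-evaluated below

        float dx = area.center.x - camera.x;
        float dy = area.center.y - camera.y;
        bool inRange = std::sqrt(dx * dx + dy * dy) < loadRadius;

        if(inRange && area.state == Area_Unloaded)
            area.state = Area_Loaded;       // load delta values, spawn non-transient entities
        else if(!inRange && area.state != Area_Unloaded)
            area.state = Area_Unloaded;     // destroy entities and delta values

        if(area.state == Area_Loaded && Contains(area, camera) &&
           (!active || area.priority > active->priority))
            active = &area;
    }
    if(active) active->state = Area_Active; // spawn transient entities here
}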


Before the player enters the area. No transient entities are spawned.


After the player enters the area. Transient entities such as the wasp enemies and rocks are spawned.

Tilemaps

World areas can contain an important entity called a tilemap. As the name suggests, it's a pretty traditional 2D grid of tiles that can be rendered efficiently. In Nirion, a tile is 8x8 pixels in size, and can be a static tile, an auto tile ("baked" down to a static tile at build time) or an animated tile. A tilemap stores a variable number of tilemap layers, which can either be above the layer that entities exist on or, more commonly, below the entity layer. It also stores one layer of data that describes things like flags (is the tile lava, should the player fall through the ground at this point), what type of collision should be used for that tile, and what "height" the tile is in the game.

I have created a tilemap editor to build the tilemaps in Nirion. This tool includes functionality for painting static, auto or animated tiles, painting other data, and creating/removing layers. Tiles are picked from larger "tilesets" which can be selected by this tool as well. Layers can also be marked as emissive for rendering purposes.


A screenshot of the tilemap editor.

I think Nirion tends to look more diverse due to how it uses layers to composite a relatively small number of tiles. A pretty excessive number of layers is usually used. For example, we can see that the transition between the lighter and darker ground is created by layering a rock auto tile on top of them both.



Another example is how the "lava crack" tile is layered on top of the ground. The lava is also on its own layer, so we can mark it as emissive without affecting the rest of the ground. This makes it stand out more.



Another feature which I think greatly contributes to the game looking more diverse is auto tiles. Auto tiles are just regular static tiles which change based on what tiles they're adjacent to. They are mainly drawn in the following sections: edges, inner corners, outer corners and a center:


A grass auto tile

The format is then described in an auto tile asset, which is just a text file:

AutoTile
{
	bitmap = CaveTilemap.png;

	brushSize = (2, 2);

	center 
	{ 
		(64, 160, 2, 2, 1.0)
	}
	right = (80, 160, 1, 2);
	left = (56, 160, 1, 2);
	up = (64, 176, 2, 1);
	down = (64, 152, 2, 1);

	outBL = (56, 152, 1, 1);
	outBR = (80, 152, 1, 1);
	outTL = (56, 176, 1, 1);
	outTR = (80, 176, 1, 1);

	inBL = (0, 144, 1, 1);
	inBR = (8, 144, 1, 1);
	inTL = (0, 152, 1, 1);
	inTR = (8, 152, 1, 1);
}


This just tells the editor where each tile will go in the auto tile. The auto tile can then be selected in the editor and drawn freely:



When an auto tile is placed, it just does a brute force search of all the tiles around it and decides which static tile it should draw based on the asset shown above. This saves me tons of time and makes the environments look much more diverse and interesting. Auto tiles can also have variations which are randomly picked when drawn.
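
The selection itself is essentially a lookup keyed on which neighbours belong to the same auto tile, along these lines (a simplified sketch; the real version also handles inner corners via the diagonal neighbours, plus variations):

// Sketch: choose which section of the auto tile to draw at (x, y) based on
// which of the four cardinal neighbours are part of the same auto tile.
enum AutoTilePart
{
    Part_Center,
    Part_Left, Part_Right, Part_Up, Part_Down,
    Part_OutTL, Part_OutTR, Part_OutBL, Part_OutBR,
    // inner corners (inTL, inTR, ...) additionally need the diagonal neighbours
};

// Callback returning true if the tile at (x, y) belongs to the same auto tile.
typedef bool SameAutoTileFn(int x, int y);

static AutoTilePart PickAutoTilePart(int x, int y, SameAutoTileFn* same)
{
    bool l = same(x - 1, y), r = same(x + 1, y);
    bool d = same(x, y - 1), u = same(x, y + 1);

    if(!l && !u) return Part_OutTL;
    if(!r && !u) return Part_OutTR;
    if(!l && !d) return Part_OutBL;
    if(!r && !d) return Part_OutBR;
    if(!l) return Part_Left;
    if(!r) return Part_Right;
    if(!u) return Part_Up;
    if(!d) return Part_Down;
    return Part_Center;
}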

Bitmap animation tiles can also be drawn if a given animation asset is marked as "tile". This animation can then be selected from the tilemap editor and drawn like any other type of tile. When rendering, all animation frame data for a used animation is uploaded to the GPU and evaluated in the vertex shader. This means we don't have to update vertex buffers for a tilemap every frame just because its animation is updating.



Altogether tilemaps make up most of the world in Nirion. They define environment collisions and most of the visuals in the world. Once they are drawn, they are attached to an entity and placed in the world. Placing multiple tilemaps together gives the illusion of one large world.

Tilemap collision and data

One thing I didn't mention about tilemaps is that they don't just draw static tiles directly. When a static tile is drawn, a special tilemap called a "tileset" is referenced. A tileset is drawn on just like a normal tilemap, except instead of drawing graphics on it, information about a specific tile is drawn. This includes data like collision and jump direction (used to compute where the player should land when they jump off a wall).

This uses the tilemap editor as well.


Painting jump direction on the tileset

When you paint a tile on a normal tilemap, it copies this data over. This adds a level of indirection so that I only have to paint collision, jump direction, etc. for a tileset (all of the Mine area's tiles in this case) instead of on every individual tilemap. This obviously saves me a ton of time and avoids mistakes.

Some other data is computed when the tilemap is saved, such as the height at that tile:


Automatically generated heights.


Entities and layers

The last thing that makes up the world is entities. Entities are used when an object needs additional functionality that a tilemap can't provide. For example, doors are entities because they can be opened and closed based on events in the game. Other entities include enemies, destructible objects, triggers, lights and upgrade pickups.

When entities are placed in the world they can be assigned a layer value. This is an integer value that determines how "high" in the world the entity is placed. Entities with different layer values do not interact physically, and rendering is altered based on an entity's layer relative to the camera's layer. Layers below the camera are dimmed and scaled down, and layers above the camera are scaled up and eventually faded out.
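
As a rough sketch of how that rendering adjustment can be computed (the constants here are illustrative placeholders, not the values the game actually uses):

// Sketch: dim/scale an entity based on its layer relative to the camera's layer.
// (Values would be clamped in practice.)
struct LayerVisual
{
    float scale;
    float brightness;
    float alpha;
};

static LayerVisual ComputeLayerVisual(int entityLayer, int cameraLayer)
{
    LayerVisual v = {1.0f, 1.0f, 1.0f};
    int delta = entityLayer - cameraLayer;
    if(delta < 0)
    {
        // Below the camera: dimmed and scaled down.
        v.scale      = 1.0f + 0.1f * delta;
        v.brightness = 1.0f + 0.25f * delta;
    }
    else if(delta > 0)
    {
        // Above the camera: scaled up and eventually faded out.
        v.scale = 1.0f + 0.1f * delta;
        v.alpha = 1.0f - 0.5f * delta;
    }
    return v;
}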



Conclusion

That basically sums up how I create the world in Nirion. The way this works has been constantly changing all throughout development and is hopefully stabilizing now. Some things will probably have to change as the game scales up(such as the world area loading criteria and how entities are rendered on top of tilemaps), but hopefully most of this will remain the same. I hope this has been interesting, and maybe next time I'll finally be done with a demo.

Let me know if you have any questions!
Ben
This update will be a fair bit shorter than usual due to being busy with Nirion's development, trying to get a polished demo done, and not having a lot of new things to show. Next month should be back to normal. This month I've mostly been working on a little more level design, reworking one of the bosses, the minimap and sound design. In this post I'll go over my approach to Nirion's minimap.

Design

The minimap in Nirion looks like this:


As you can see, it roughly shows the shapes of rooms that have been discovered, uses colours to give the player information about the room (green for save room, flashing white for current room), and shows door connections as narrow red paths. The purpose of the map in Nirion is to help the player get around the world, but without being distracting or too "useful".

Personally I dislike minimaps that are either too detailed or show precise locations of moving objects (like the player), because I tend to look at the minimap in place of the actual game world a lot of the time. At first I did try out having an indicator (a red dot) for the player as they moved around, as well as revealing tiles on the map as the player explored:



I didn't really like how this felt while playing. Revealing the map tile by tile felt like a chore and made you feel like you had to reveal every single one. After trying a few things, I settled on revealing rooms all at once (still hiding secret areas until the player walks into them). I also removed the player indicator and just "pulsed" the current room white instead. I think this feels a lot better, and I'm fairly happy with it as a whole.

Implementation

Implementing the minimap took me a few days. I tried out a few different ways of rendering the rooms, as well as a few different ways of storing the map reveal data. One thing I decided on pretty early (and stuck with) was storing the map data as separate chunks that could go into the asset system.

When the game map is saved (there's only one map in Nirion), the asset preprocessor analyses the world and builds a collection of minimap tiles for each world area. A minimap tile consists of minimal information such as the tile's position, flags (is it a wall, a passage way, etc.), and whether it's in a secret area. Determining if a tile is a wall or not is done by just checking its neighbors. Since a wall is just a white block, no "wall direction" needs to be considered. Passage ways (narrow red paths) are already being determined for gameplay anyway, since any entities in a passage way need to render at a lower layer, so if a tile is part of a passage way (and not a wall), it gets a red colour. A world area is a chunk of the world that is loaded together in Nirion. This is what world areas look like in the editor, outlined in purple:



These tiles are saved as their own asset and can be loaded independently of each other and the actual world data. When a minimap asset is loaded, a vertex buffer is created which just stores position, colour and room ID per tile. The room ID is needed while rendering to fade hidden rooms in and out, since the minimap chunk is rendered in a single draw call. Right now the tiles are just instanced rectangles. This seems to work well enough for now, and gives a look that I'm happy with.
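
The per-tile data involved is tiny, something along these lines (an illustrative sketch of the layout, not the exact structs):

// Sketch: what the asset preprocessor emits per minimap tile, and what ends up
// in the vertex buffer per instanced rectangle.
#include <cstdint>

enum MinimapTileFlags : uint32_t
{
    MinimapTile_Wall       = 1 << 0,
    MinimapTile_PassageWay = 1 << 1,
    MinimapTile_Secret     = 1 << 2,
};

struct MinimapTile       // stored in the minimap asset, one per tile
{
    int16_t  x, y;       // tile position within the world area
    uint32_t flags;
    uint16_t roomId;
};

struct MinimapVertex     // one per instanced rectangle in the vertex buffer
{
    float    x, y;
    uint32_t colour;     // white for walls, red for passage ways, etc.
    uint16_t roomId;     // used by the shader to fade hidden rooms in/out
};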

Saving minimap progress was done very naively while I was still working out how I was going to approach the design. In the save game data, I would store a list of world area reveal data structures, which themselves would have lists of tile reveal data structures. When the player reveals a new tile, I would just do two linear searches over these lists to check if the tile was already recorded. If not, it would be added.

I definitely had some concerns about this, but it became a great deal easier once I removed tile-based revealing. Now all that has to be stored is which world areas have been discovered, and within those, which rooms and secrets have been revealed. All that's required (when updating map progress and rendering each chunk) is a linear search over revealed world areas. Maybe I'll have to make this more sophisticated later on, but it works fine for now.
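
The reveal data then ends up as nothing more than nested lists, roughly like this (a sketch; the engine uses its own containers rather than std::vector):

// Sketch: minimap progress without per-tile reveals. Updating progress or
// rendering a chunk only needs a linear search over the revealed world areas.
#include <vector>
#include <cstdint>

struct WorldAreaReveal
{
    uint32_t worldAreaId;
    std::vector<uint16_t> revealedRooms;
    std::vector<uint16_t> revealedSecrets;
};

struct MinimapProgress
{
    std::vector<WorldAreaReveal> revealedAreas;
};

static WorldAreaReveal* FindRevealedArea(MinimapProgress& progress, uint32_t worldAreaId)
{
    for(WorldAreaReveal& area : progress.revealedAreas)
    {
        if(area.worldAreaId == worldAreaId) return &area;
    }
    return nullptr; // not discovered yet
}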

I tried a few different ways of rendering and storing the minimap tile data. One approach I tried was to create a smoothed concave hull around the vertices of the tiles and then triangulate the hull. I used this algorithm: https://www.researchgate.net/publ...egion_occupied_by_a_set_of_points, which I had already implemented previously for another purpose. I got this working, but it had a few issues. For one, the resulting mesh was much less consistent than a tile-based approach. It looked a little too smooth for the art style of the game, and also behaved differently depending on the room shape that the walls formed. Another issue was with holes:



These were completely left out (which makes sense, since it's just a concave hull and only makes one shape around the entire collection of points). The last issue was with displaying information like passage ways and walls. At the end of the process, the information about whether a tile was a wall or a passage way was lost, and all I had was one giant mesh to work with.

Besides the rendering of the actual map itself, some objects can be seen and displayed on the minimap as a dot. This happens with upgrades. If they're seen, they show up as a white dot on the map, and when they're collected they change to an outlined dot. An entity can simply add itself to the list of minimap objects for the world area that it's in, and this will be rendered on top of the map.



Conclusion

That's basically all there was to the minimap. It was more about trying out different designs until I got something I was happy with. Right now it's just being rendered in the top left corner of the screen, but eventually(maybe after the demo) I'll use it to make a full sized map screen that can be zoomed in/out.



Thanks for reading! Like I said, next month I should be back to doing larger updates. Right now I'm doing a lot of sound design, since that's been somewhat neglected for a while. Since my last blog post I have started posting on my twitter, so for more frequent and mostly visual updates, you can follow here: https://twitter.com/broscoe8

Again, let me know if you have any questions!
Ben
Over the past few weeks, I've been slowly going down my list of things to fix and improve for Nirion. One of the bigger tasks was to create a more generic and easy-to-use particle system solution. This ended up taking a week or two of my time, but so far it seems like it was worth it. Another big focus was on figuring out more interesting puzzle elements and improving level design overall. I believe the last few weeks of development have really gone a long way in making Nirion feel more like a complete game!

Generic Particle System

Up until I reworked this system, all particle effects had unique code written for their setup and simulation. This works great for some more complex effects, like the player evaporating into pixels when saving:



But for simple effects, I found it to be quite a lot of setup and code just to do what is basically changing parameters. My solution was to make the common settings editable in the editor. This is pretty much the solution you would find in something like Unity or Unreal 4. While doing this, though, I needed to keep the ability to define effects in code, since some effects are just easier to program directly.

My approach was to define a bunch of parameters that the editor exposes, based on the things I think I need to be able to change (velocity, colour, rotation, parallax), and apply these parameters to a "generic" effect type. This effect just checks which parameters are enabled and how, and then simulates the particles based on that.

I ended up with a data structure like this one:

struct GenericParticleParameters
{
        GenericParticleParameterF32 lifetime;
        GenericParticleParameterF32 scale;
        GenericParticleParameterF32 velocityX;
        GenericParticleParameterF32 velocityY;
        GenericParticleParameterColour rgb;
        GenericParticleParameterF32 colourA;
        GenericParticleParameterF32 parallax;
        GenericParticleParameterF32 rotationalVelocity;
        GenericParticleParameterF32 animationTime;
};


Where GenericParticleParameterF32 looks like:

enum GenericParticleParameterType
{
        GPP_Off,
        GPP_Constant,
        GPP_LinearOverLife,
        GPP_CurveOverLife,
        GPP_RandomBetween,
        GPP_RandomBetweenCurvesOverLife,
};
struct GenericParticleParameterF32
{
        GenericParticleParameterType type;
        f32 valueA;
        f32 valueB;
        AnimationCurve curveA;
        AnimationCurve curveB;
};


While editing the effect, the user can select the type from a drop-down list and fill in the needed data based on the type. GPP_Off just ignores the parameter, GPP_Constant sets the value to "valueA" on spawn, GPP_LinearOverLife performs a linear interpolation between "valueA" and "valueB" over the particle's total lifetime, GPP_RandomBetween picks a random value between "valueA" and "valueB" on spawn, GPP_CurveOverLife evaluates a curve defined in the editor where the x axis is the particle's life, and GPP_RandomBetweenCurvesOverLife interpolates between two different curves based on a random value that is set on spawn. These settings seem to cover most things I need right now, but it should be pretty easy to add more later if needed. GenericParticleParameterColour is basically just a v3 version of this struct.
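
Evaluating one of these parameters then just dispatches on its type, roughly like this (a sketch; EvaluateCurve and the per-particle random value are stand-ins for the real code, and the spawn/update split is glossed over):

// Sketch: produce a value for one GenericParticleParameterF32 given the
// particle's normalised life t01 (0..1) and a per-particle random value r01
// fixed at spawn.
static f32 EvaluateCurve(const AnimationCurve& curve, f32 t01); // placeholder for the real curve evaluation

static f32 EvaluateGenericParameterF32(const GenericParticleParameterF32& p, f32 t01, f32 r01)
{
    switch(p.type)
    {
        case GPP_Constant:       return p.valueA;
        case GPP_LinearOverLife: return p.valueA + (p.valueB - p.valueA) * t01;
        case GPP_CurveOverLife:  return EvaluateCurve(p.curveA, t01);
        case GPP_RandomBetween:  return p.valueA + (p.valueB - p.valueA) * r01;
        case GPP_RandomBetweenCurvesOverLife:
        {
            f32 a = EvaluateCurve(p.curveA, t01);
            f32 b = EvaluateCurve(p.curveB, t01);
            return a + (b - a) * r01;
        }
        default: return 0.0f; // GPP_Off
    }
}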

Using these parameters when spawning looks like this:

    if(IsGenericParticleParameterEnabledForSpawn(&genericParams->velocityX))
    {
        EffectModule_GenericF32Parameter(system, system->particles.velocityX, spawnStart, spawnEnd, genericParams->velocityX, randomState);
    }
    if(IsGenericParticleParameterEnabledForSpawn(&genericParams->velocityY))
    {
        EffectModule_GenericF32Parameter(system, system->particles.velocityY, spawnStart, spawnEnd, genericParams->velocityY, randomState);
    }


And during actual simulation update:

    if(IsGenericParticleParameterEnabledForUpdate(&genericParams->velocityX))
    {
        EffectModule_GenericF32Parameter(system, system->particles.velocityX, 0, system->activeParticles, genericParams->velocityX, randomState);
    }
    if(IsGenericParticleParameterEnabledForUpdate(&genericParams->velocityY))
    {
        EffectModule_GenericF32Parameter(system, system->particles.velocityY, 0, system->activeParticles, genericParams->velocityY, randomState);
    }


EffectModule_GenericF32Parameter will loop over all the particles and apply the parameter based on its type.

Here's what the editor looks like for a generic particle effect:



Each parameter is listed in the entity properties. Its type can be changed through a drop-down menu, and only the fields that apply to the current type are shown. Multiple emitters can be added to the same entity in order to compose interesting effects.

Here are some effects I've done so far:





Obtain Upgrade Sequence

I've finally gotten around to making something happen once you pick up an upgrade. This is definitely still a work in progress, but shows roughly what should happen when an upgrade is picked up:



It's pretty straightforward: the player touches the upgrade, an effect plays and a popup appears telling the player how to use it or what it does. For this I've drawn some animated tiles and a dialogue box which is probably way too obviously inspired by early Final Fantasy games. Since this is the first time I've had to add player-facing text, I should be using a string table for localisation, but of course I'm putting that off for now.

Editor Improvements

Although the generic particle system itself was pretty straightforward, improving the editor UI to support the various things needed by it took a lot of work.

One early improvement I made was to the entity properties window in general. It used to display properties as one giant tree which would quickly indent very far to the right. I've added the ability to assign properties to categories, and made category headers large and obvious:


I added a basic curve editor which manipulates an array of points with a certain scale. This is pretty basic at the moment and only connects the points together with a straight line. I'm planning on implementing saving/loading for curves, but haven't gotten around to it yet.


I've also implemented a colour picker for easily specifying colours. Until now I've just been typing in RGB manually:


The editor is now a lot nicer to use, but still has a long way to go. I'm really spending a minimum amount of time on it, since I prefer to spend time on the game where I can, but it was good to make some large improvements like this over the course of a week or so. All the editor UI is still being done using the IMGUI code I wrote near the start of the project (over a year ago now) and it's still holding up pretty well. I was feeling like it was too verbose for what I needed, but after using it for some of these things I think it works well.

Entity Templates

Up until this point, entity types could only be defined in code, as I described in my first blog post. This worked completely fine for the most part: if I wanted to make a new type of entity, I could just define it in code, expose whatever properties it needed, and then change properties on the map instance.

The way entity instances work is by saving their differences from the "default" entity for that type. For example, if I define a point light entity:

TYPE()
 struct PointLightEntity
 {
     PROPERTY()
         f32 radius;
     PROPERTY()
         f32 intensity;  
     PROPERTY()
         v4 colour;
 };
ENTITY_DEFAULT_FUNCTION(EntityDefault_PointLight)
{
    ENTITY_EDITOR_NAME("Point Light");
    entity->pointLight.radius = 20.0f;
    entity->pointLight.intensity = 2.3f;
    entity->pointLight.colour = V4(0.8862f, 0.7254f, 0.4549f, 1.0f);
    
    entity->pauseLevel = SimulationLevel_Cutscene;
    
    PushEntityVisualPlaceholder(system, entity, {});
    
    SET_EDITOR_ICON(BID_EditorPointLightIcon);
    return CreateEntityDefault(entity, true, true);
}
DEFINE_ENTITY(hash="PointLight", type=ET_PointLight, func_Default=EntityDefault_PointLight);


The "default" instance of this point light entity will be constructed by running this function when the game starts up. This is used as a base for saving or loading any point light entities in the future. When we place a point light in the map and change it's radius for example, all that will be saved about that point light entity is that it is a point light and that we've changed the radius. If we want to spawn our point light, we simply need to copy the default point light and then apply our property differences on top of this.

Now that I'm making different types of generic particle effects, which are the exact same type as far as the code is concerned, this approach means that it's not actually possible to spawn these effects, since they were created in the editor and there's no way to reference them. My approach to dealing with this problem was to save the entity as a "template" in the same way I would save entities to the map: we save which type the entity is and its differences from the default of that type. This information is saved as an asset and can be referenced through the asset system just like any other asset type.

When spawning an entity, we can now optionally take either a reference to an entity type that was defined in code, or an asset reference to an entity template.

struct EntityDefaultRef
{
    EntityDefaultRefType type;

    u32 templateHash;
    i32 codeType;
};
static EntityDefaultRef CodeRef(EntityType type)
{
    EntityDefaultRef result = {};
    result.type = EntityDefaultRefType_Code;
    result.codeType = type;
    return result;
}
static EntityDefaultRef TemplateRef(u32 hash)
{
    EntityDefaultRef result = {};
    result.type = EntityDefaultRefType_Template;
    result.templateHash = hash;
    return result;
}


After an entity template instance is placed in the map, it loses its reference to the template and acts as if it's just an instance of the underlying code type. I've thought about linking instances to their template type and updating instances whenever the template changes, but it didn't seem necessary to me and seemed to complicate things quite a bit. Overall this works pretty well for referencing editor-created entity types.

Level Design

I've also been trying to improve the level design. I've added more objects to interact with and some areas that take more thought to get through. As I've said before, I don't want puzzles to be the main focus of this game, but I think some of the areas I've made so far were a little too empty. Playing through it, this seems like a massive improvement and like it's heading in the right direction.





Conclusion

That's everything I've been working on lately. I'm still going through the mines area and improving the level design mainly, but besides that it's mostly just bug fixes and minor improvements left before I would want to release a demo. Once I get the first area of the game very solid and to a level I'm happy with, hopefully it will be relatively easy to add more and complete the rest of the game. Thanks for reading!