Grouped Lifetime Thinking
One of the Most Useful Concepts In Programming
Memory management is a critical skill for programmers, even when using garbage-collected languages. Yet it’s often misunderstood, poorly taught, and even maligned.
In this article, we’ll explore the concept of grouped lifetime thinking and its (huge) benefits.
But First, A Short Story…
You’ve decided to learn programming. You choose a systems language like C - it’s small and simple - right?
You pick up a data structures textbook or look at examples online.
You work through the problems, learning a lot about data structures and algorithms to use them - Great!
You’ve probably also learnt to use “dynamic memory allocation” like this:
Node *new_node = (Node *)malloc(sizeof(Node));
if (new_node == NULL) { /* error */ }
The resource you learnt from probably taught you to write a “destroy” function of some kind, traversing the data structure and freeing each Node.
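A sketch of the kind of “destroy” function such books teach, assuming a singly linked list of Node structs with a next pointer (the exact structure varies by book):

```c
#include <stdlib.h>

typedef struct Node {
    int value;
    struct Node *next;
} Node;

// Traverse the list, freeing each Node one at a time.
void list_destroy(Node *head) {
    while (head != NULL) {
        Node *next = head->next; // save the link before freeing
        free(head);
        head = next;
    }
}
```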
You pick up this habit. You allocate one Thing at a time when you need it.
You start to create more complex programs with many moving parts.
Forgetting to free a Thing causes a memory leak somewhere, leading to an eventual hang.
You spend hours debugging, looking for all calls to malloc/free.
You create ThingA that has a pointer to ThingB. ThingB gets freed but the pointer remains.
You inadvertently start writing data to the freed ThingB, whose memory is now used by some other allocation.
You eventually learn how to use tools like ASan and Valgrind. They help… with finding bugs, but you still have to manage the complexity of the allocations.
Eventually, you decide that people are right and manual memory management is too hard for humans.
You switch to a Garbage Collected language. You can now litter new everywhere in your code without a care. Fantastic: ultimate productivity, and no pesky allocation bugs!
One day you send your project off for beta testing with the users. The users don’t have a MacBook M3. The responses come back: “It keeps slowing down”, “It’s laggy”, “Noob dev? Lol”.
You load up the program with a profiling tool and manage to drill down into the issues. You see a saw-tooth memory usage chart. The tips of the pattern line up with lag spikes from high CPU usage. It’s the Garbage Collector.
You do some research, wondering how to remedy this situation. Eventually, you find an article about “Object Pooling”.
Object Pooling allows you to re-use the same memory for each Thing of the same type.
That way it doesn’t get freed by the Garbage Collector and is exempt from the clean-up phase.
You implement the solution and it seems to work very well!
Congratulations, you just implemented Manual Memory Management to circumvent the Garbage Collector.
What I’m covering here is one of the most important concepts when it comes to memory management.
Thinking In Groups
I’ll use a gaming example here. Imagine you are working on a first-person shooter which has collision detection and damage text.
Each update, you want to check the collisions of the player and enemies to make sure they are staying within the world geometry.
These collision results need to live somewhere - but they only need to live until the end of the physics update.
This is a perfect candidate to put into an arena allocator that is “cleared” at the end of each update.
In practice, this means you allocate a chunk of memory that can either grow or has enough space to store all update-lived objects.
When you want to allocate something that will not be alive on the next loop, you allocate into this chunk of memory.
At the end of each update, you free the entire chunk at once. The exact behavior is implementation-dependent: some arenas release the memory, while others just set the position back to 0, ready to overwrite the previous data.
main :: proc() {
world := new(World)
world.update_allocator = arena_allocator_init(UPDATE_ARENA_SIZE)
}
update :: proc(world: ^World) {
player_collisions := detect_collisions(world, &player)
for pc in player_collisions {
// do whatever
}
// sets the memory block position back to zero
free_all(world.update_allocator)
}
detect_collisions :: proc(world: ^World, unit: ^Unit) -> []Hit {
hits := make([dynamic]Hit, world.update_allocator)
// whatever collision detection code
return hits[:]
}
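To make the “set the position back to 0” idea concrete, here is a minimal arena sketch in C (the language from the story above), not the allocator used in the Odin snippet. The names Arena, arena_alloc, and arena_free_all are illustrative:

```c
#include <stddef.h>
#include <stdlib.h>

typedef struct {
    unsigned char *base;
    size_t capacity;
    size_t offset; // bump position; everything before it is "allocated"
} Arena;

int arena_init(Arena *a, size_t capacity) {
    a->base = malloc(capacity);
    a->capacity = capacity;
    a->offset = 0;
    return a->base != NULL;
}

void *arena_alloc(Arena *a, size_t size) {
    // Round the position up to 8 bytes so typical structs are aligned.
    size_t aligned = (a->offset + 7) & ~(size_t)7;
    if (aligned + size > a->capacity) return NULL; // out of space
    void *ptr = a->base + aligned;
    a->offset = aligned + size;
    return ptr;
}

void arena_free_all(Arena *a) {
    a->offset = 0; // one "free" for every allocation this update
}
```

Every per-update allocation becomes a pointer bump, and the single arena_free_all call at the end of the update replaces every individual free.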
What about damage text? Well, it probably needs to float on the screen for a while - maybe fade out or have some kind of animation.
That means not sticking it into the update-arena.
This is a good candidate for a Pool - it’s exactly the same concept as mentioned above.
Implementations vary, but essentially when an instance of damage text is done animating, it’s marked as “unused” and can be reused for the next hit.
The Pool would be allocated at startup and never freed.
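A minimal fixed-size pool sketch in C, using a hypothetical DamageText struct (the article doesn’t define one). Slots are flagged in-use rather than freed, so nothing is allocated after startup:

```c
#include <stdbool.h>
#include <stddef.h>

#define POOL_SIZE 64

typedef struct {
    float x, y;
    float timer;   // seconds left before the text is done animating
    int amount;
    bool in_use;
} DamageText;

// Allocated once, lives for the whole program.
static DamageText pool[POOL_SIZE];

// Find a free slot and mark it in use.
DamageText *damage_text_acquire(void) {
    for (size_t i = 0; i < POOL_SIZE; i++) {
        if (!pool[i].in_use) {
            pool[i].in_use = true;
            return &pool[i];
        }
    }
    return NULL; // pool exhausted; the caller decides what to do
}

// "Free" a slot by marking it reusable for the next hit.
void damage_text_release(DamageText *dt) {
    dt->in_use = false;
}
```

Sizing the pool is the only real decision: pick a count that covers the worst case you care about, and decide what happens (drop the text, recycle the oldest) when it runs dry.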
The benefits of thinking in grouped lifetimes are immense.
You get usage code similar to Garbage Collection, but without the trade-offs.
Or rather, the trade-off is a few lines of code and a tiny bit of thought.
- Less cognitive load
- Fewer bugs
- Faster ship time
- Performance by default
Why Don’t We All Do This?
I’ve listened to Casey Muratori speak about this. He says that we learn programming in stages - and this individual lifetime stage is where everyone starts.
Not everyone makes it to the grouped lifetime stage, but many do.
I feel I’m firmly in the grouped lifetime stage now, and I can’t help but wonder if all the engineering effort that goes into individual lifetime solutions is a sad waste.
For example, Garbage Collection, Smart Pointers, and RAII all seem like unnecessary complications when you already know the lifetime of the data in your programs.
I can say for certain that my programs are easier to implement, understand, and modify since I’ve adopted this style of programming.
I never have to track down malloc/free pairs.
The amount of thought I put into memory management is tiny, and my code is better.
As I discover more about these game programming topics, I’ll continue writing about them - so, sign up for free today to get future issues.
Thank you for reading! If you know anyone who may benefit from reading this, please share it with them.
Cheers, Dylan