
Project Details:
Rager is an online co-op multiplayer party game in which up to four friends crash an American-style house party and cause chaos while avoiding getting caught. The project was a co-production with US partners Gus Johnston and Dave McElfatrick (creator of Cyanide and Happiness), with my studio handling all design and development while incorporating creative input and direction changes from stakeholders. The project has been through three major phases of development, from initial prototype to fully playable vertical slice, and is now on hold while a publisher is sought for full development.
Key Responsibilities:
• Lead Developer: Responsible for the majority of gameplay and systems programming on the project, including third-party tool integration, netcode, physics, and UI development, as well as tasking and unblocking the other programmers.
• Business Development: Developing timelines, budgets, and pitching materials to secure funding.
• Stakeholder Management: Maintaining relationships with funding stakeholders, including publishers, investors, and public bodies.
Technology Stack:
• Unity 2022.3
• C#, Visual Studio
• Universal Render Pipeline
• ShaderGraph
• Netcode for GameObjects
• Unity Game Services: Lobby, Relay
• Perforce
• Miro
• Ora
Third Party Tools Used:
A* Pathfinding Pro, Amplify Shader Editor, Bakery, Obi Fluid, Behavior Designer, Rewired, Various 3D Asset Packs.
Planning

Initial Game Design
The initial game concept was provided by the clients and translated into game mechanics using an MDA (Mechanics, Dynamics, Aesthetics) mapping process to break the game idea down into its core concept, pillars, and the individual features supporting each pillar. The game design sessions were collaborative between all departments, and as the Lead Developer my contributions included:
• Estimating feature complexity and scope
• Collaborative ideation sessions
• Research on other games and technical feasibility
• Designing solutions to game design problems
• Finding development time and cost savings using third party tools and assets
The initial design changed substantially throughout the development process, starting as a survival horror game with a monster chasing players through a mansion and ending as a chaotic party game with a 90s frat house vibe.

Feature Specification
Each feature went through a formal game design and specification process to fully explore the idea and produce documentation that can be referred back to during implementation. These sessions involved collaboration between game design and the programming and art department leads, and the end results were then presented to stakeholders.
Feature specification sheets helped bridge the gap between design and implementation, provided QA signoff lists early in development, and identified potential blockers early. They also ensured we researched existing games with similar mechanics to avoid reinventing the wheel, got the entire team on the same page with every feature, and forced us to challenge our assumptions. They gave us a very accurate prediction of how long a feature would take to implement and which developers would be best suited to it.

Flow-charting
Many features required detailed specifications, such as tables of design values or flow charts showing logical flows and further notes on implementation details. I used the collaborative whiteboard software Miro to create flow charts for features that could then be given to other programmers to implement.
This approach let me explore the full scope of a feature quickly, identify any complexities we had missed in the design process, and provide a reference document throughout the project for the intended behaviour of the feature. The flow charts were presented to the programmer working on the feature and served as a platform for making notes or asking questions about it.

Phase Post-Mortems
Development of Rager came in phases tied to funding milestones and event deliverable deadlines. Development was planned in sprints tied to each milestone, and after each milestone we performed detailed post-mortems with the whole team.
This approach helped us identify pain points, things that didn't go to plan, and things that went well during each development phase and lay down a plan for the next milestone or phase. This also gave us a head-start when entering a new phase of development as we had an existing list of actions and priorities, and helped significantly with stakeholder management and producing plans and budgets for further development.
Development

AI Development
The AI in Rager was built using Behavior Designer with a series of custom actions, and pathfinding handled by A* Pathfinding Pro. A modular approach was used where generic actions and sense systems were created that could be re-used across a variety of different AI types (a sketch of one such sense follows the list):
• A blind horror monster that tracks players using only sound and attacks them
• A janitor enemy that uses a vision cone and sound tracking system together to chase players and drag them back to a room
• A friendly AI that can be made to follow the player, can be escorted to a target location to complete an objective, and runs away scared when it sees an enemy
• A scared AI that runs away from the player and screams when it spots them
• A party guest AI that can be interacted with or given an item and performs an action such as a dance when given the right item
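As an illustration of the modular sense approach, below is a minimal sketch of a reusable hearing conditional, assuming Behavior Designer's Conditional/TaskStatus/SharedVariable API. The noise-from-speed heuristic and thresholds are illustrative rather than the game's actual implementation.

```csharp
using UnityEngine;
using BehaviorDesigner.Runtime;
using BehaviorDesigner.Runtime.Tasks;

// Illustrative reusable "hearing" sense shared by several AI types.
public class CanHearPlayer : Conditional
{
    public SharedFloat hearingRadius;
    public SharedFloat noiseSpeedThreshold;   // players moving faster than this count as "noisy"
    public SharedVector3 lastHeardPosition;   // consumed by a follow-up Seek action

    public override TaskStatus OnUpdate()
    {
        foreach (var hit in Physics.OverlapSphere(transform.position, hearingRadius.Value))
        {
            if (!hit.CompareTag("Player")) continue;

            var body = hit.attachedRigidbody;
            if (body != null && body.velocity.magnitude >= noiseSpeedThreshold.Value)
            {
                lastHeardPosition.Value = hit.transform.position;
                return TaskStatus.Success;
            }
        }
        return TaskStatus.Failure;
    }
}
```

The same conditional could sit in both the blind monster's and the janitor's behaviour trees, with only the thresholds tuned per enemy.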

Modular Character Customisation
I implemented a customisable modular character using an approach similar to how No Man's Sky's creatures were designed. The player model contains all possible hair styles, body types, face customisations, and clothing types that can be switched on and off depending on what's needed.
A customisation system handles turning them on and off and applying different textures to different parts. A hat system attaches hats to the head bone so that new hats can be added without changing the character model.
The online gameplay complicated this, as cosmetic changes have to be transmitted to other players in the game. I developed an efficient serialisable data format for customisation data and transmitted it via Unity's Lobby system to sync changes with all players. A custom script then activates the cosmetics and allows us to change cosmetics mid-match, such as when the player picks up a hat during play.
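As a rough sketch of the approach (not the exact format used in the game), customisation data can be packed into a compact string and pushed into the lobby's per-player data via the Unity Lobby API; the field and key names here are assumptions.

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using Unity.Services.Lobbies;
using Unity.Services.Lobbies.Models;

// Illustrative customisation payload; field names are assumptions.
[System.Serializable]
public struct CosmeticLoadout
{
    public byte bodyType, hairStyle, faceIndex, outfitIndex, hatIndex;

    // Pack into a short string so it fits comfortably in lobby player data.
    public string Serialize() => $"{bodyType}.{hairStyle}.{faceIndex}.{outfitIndex}.{hatIndex}";

    public static CosmeticLoadout Deserialize(string s)
    {
        var p = s.Split('.');
        return new CosmeticLoadout
        {
            bodyType = byte.Parse(p[0]), hairStyle = byte.Parse(p[1]),
            faceIndex = byte.Parse(p[2]), outfitIndex = byte.Parse(p[3]), hatIndex = byte.Parse(p[4])
        };
    }
}

public static class CosmeticSync
{
    // Push the local player's loadout into the lobby's per-player data.
    public static async Task PushAsync(string lobbyId, string playerId, CosmeticLoadout loadout)
    {
        var options = new UpdatePlayerOptions
        {
            Data = new Dictionary<string, PlayerDataObject>
            {
                ["cosmetics"] = new PlayerDataObject(
                    PlayerDataObject.VisibilityOptions.Member, loadout.Serialize())
            }
        };
        await LobbyService.Instance.UpdatePlayerAsync(lobbyId, playerId, options);
    }
}
```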

Custom Character Controller
The game design called for the character to act drunk throughout the game, using consumables to manage a buzz level that influences their movement. We created a custom third person character controller for this purpose, using a physics-based propulsion method (a minimal sketch follows the list):
• Movement speed was determined by the natural physics intersection of the player's acceleration rate and a drag variable, allowing us to tweak the player's controls to slide around more depending on how drunk or buzzed they are.
• As movement is physics based, the player feels like they have momentum or weight. Players running full speed will drift around corners like in a racing game, and players can turn around to come screeching to a stop.
• I added visual and audio cues for hitting certain momentum change thresholds, so a player's sneakers will squeak when they rapidly turn a corner and a puff of dust is seen when screeching to a halt. This added to the comedy effect of the game and the frantic feeling of running away from enemies.
• Created a complex animation controller with blended and masked animations for things like holding items and throwing while running, requiring very few custom animations to be made by the art team.
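A minimal sketch of the core idea, with illustrative field names and tuning values: top speed is never set directly, it falls out of where the applied acceleration and the rigidbody's drag balance, and the buzz level lowers the drag so the player slides more.

```csharp
using UnityEngine;

// Sketch of buzz-driven physics movement; values and names are illustrative.
[RequireComponent(typeof(Rigidbody))]
public class DrunkMovement : MonoBehaviour
{
    [SerializeField] float acceleration = 40f;   // force per unit mass
    [SerializeField] float soberDrag = 6f;       // high drag = tight, responsive controls
    [SerializeField] float drunkDrag = 1.5f;     // low drag = slippery, drifting controls

    [Range(0f, 1f)] public float buzzLevel;      // fed by the consumables system

    Rigidbody body;
    Vector3 moveInput;                           // set from the input layer each frame

    void Awake() => body = GetComponent<Rigidbody>();

    public void SetMoveInput(Vector3 worldDirection) => moveInput = worldDirection;

    void FixedUpdate()
    {
        // The drunker the player, the less drag, so they keep more momentum
        // and drift around corners instead of stopping on a dime.
        body.drag = Mathf.Lerp(soberDrag, drunkDrag, buzzLevel);
        body.AddForce(moveInput * acceleration, ForceMode.Acceleration);
    }
}
```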

Netcode
Rager used Unity's NGO (Netcode for GameObjects) framework for all of its networking. The project predated NGO's distributed authority (serverless) support, so we used a strict client-server model.
• Unity Lobby was used to allow players to create and join lobbies of up to 4 players.
• Whoever hosts the game becomes the server and all AI executes on their machine. All other players have elements such as the AI disabled and then just see the results of the AI's actions through movement, animations, and RPC calls.
• Player position and animations were synchronised using client-authoritative NetworkTransform and NetworkAnimator components.
• Object positions were not synchronised at all; instead, we used deterministic local physics and dispatched events like objects breaking to all clients via RPC.
• In many places we used a paradigm where the client sends data or a command to the server, which then dispatches that data or command to all clients (as sketched after this list). This gave us a point into which anti-cheat checks could be added later if needed.
• Many actions were performed locally on clients instantly for sharp player feedback and then dispatched to all other clients so they see the exact same result. This functionally eliminated the appearance of lag in most scenarios.
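The client-to-server-to-all-clients paradigm mentioned above might look roughly like the following in Netcode for GameObjects; the object-smashing example and method names are illustrative.

```csharp
using Unity.Netcode;
using UnityEngine;

// Sketch of the client -> server -> all-clients dispatch pattern.
public class SmashDispatcher : NetworkBehaviour
{
    // Called locally by the client that smashed the object.
    public void RequestSmash(Vector3 hitPoint)
    {
        PlayBreakEffect(hitPoint);          // instant local feedback
        SmashServerRpc(hitPoint);           // ask the server to fan it out
    }

    [ServerRpc(RequireOwnership = false)]
    void SmashServerRpc(Vector3 hitPoint, ServerRpcParams rpcParams = default)
    {
        // An anti-cheat/validation check could be inserted here later.
        SmashClientRpc(hitPoint, rpcParams.Receive.SenderClientId);
    }

    [ClientRpc]
    void SmashClientRpc(Vector3 hitPoint, ulong instigatorId)
    {
        // The instigator already played the effect locally; everyone else does now.
        if (NetworkManager.Singleton.LocalClientId != instigatorId)
            PlayBreakEffect(hitPoint);
    }

    void PlayBreakEffect(Vector3 hitPoint)
    {
        // Placeholder for spawning debris, sound, etc.
        Debug.Log($"Object smashed at {hitPoint}");
    }
}
```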

Level Design
The level was initially designed using a set of prefab walls, floors, and furniture from various model packs and other sources, and then new assets were created for items such as beer bottles and anything else custom that was required. As Lead Developer, my role in the level design was mostly:
• Creating folders full of processed prefabs of items ready for the level designers to use. All objects to be used needed to have correct colliders, layers, tags, and materials.
• Creating functional versions of assets, such as furniture that triggers the bump system, windows that can be smashed, destructible items, items that can be picked up and thrown, objects that can be wrecked by fluid, working vending machines, and drawers that can be opened to find items.
• Implementing objective targets such as cupboards that spawn keys, the generator that can be plugged in, an escape van that players can use to leave, and triggers for special objectives.
• Implementing a mini-map and zone system where each room has its own name (see the sketch after this list).
• Implementing AI pathing and solving pathing problems
• Adding custom NPCs to the map with different needs and activities
• Implementing doors and switches
• Babyproofing the level with invisible colliders to prevent players being shoved out of the level
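As an illustration of the zone system referenced above, a room can simply be a trigger volume that carries its display name and raises an event for the mini-map/HUD; the event hook and tag check here are assumptions.

```csharp
using System;
using UnityEngine;

// Illustrative room-zone trigger for the mini-map/zone naming: each room gets a
// trigger volume carrying its display name and raises an event the HUD listens to.
public class RoomZone : MonoBehaviour
{
    public static event Action<string> OnPlayerEnteredRoom;

    [SerializeField] string roomName = "Kitchen";

    void OnTriggerEnter(Collider other)
    {
        // The "Player" tag check keeps the zone reacting only to player characters;
        // a real implementation would also confirm it's the locally owned player.
        if (other.CompareTag("Player"))
            OnPlayerEnteredRoom?.Invoke(roomName);
    }
}
```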

Physics and Clutter System
A typical Rager level can contain over 10,000 individual physics objects that can be bumped into, moved, picked up, thrown, or knocked down. This posed a challenge both in terms of performance optimisation and in reconciling networking so that all players see the same physics interactions.
I adopted a paradigm of doing all physics locally on every client's machine rather than server-side, making the physics feel sharp and reactive for all players. Because the local physics simulation is deterministic, small interactions such as walking through empty bottles on the floor were found to produce roughly the same end result on all clients.
Larger interactions such as throwing objects or players ragdolling and falling were implemented using a dispatch system where the controlling player initiating the action sends the details of the interaction to the server, which then dispatches it to all players so that they execute the exact same effect on their end. The end result is a roughly synchronised physics simulation with almost zero network overhead, and in playtests the differences between simulations went unnoticed.
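A sketch of the kind of compact payload such a dispatch might carry, assuming Netcode for GameObjects' INetworkSerializable; the fields, and in particular the object ID scheme, are illustrative.

```csharp
using Unity.Netcode;
using UnityEngine;

// Illustrative payload for a larger physics interaction (e.g. a throw or knock-down).
// Every client receives the same values and replays the same impulse locally.
public struct PhysicsImpulseEvent : INetworkSerializable
{
    public ushort objectId;   // index into a pre-agreed table of clutter objects (assumption)
    public Vector3 origin;
    public Vector3 velocity;

    public void NetworkSerialize<T>(BufferSerializer<T> serializer) where T : IReaderWriter
    {
        serializer.SerializeValue(ref objectId);
        serializer.SerializeValue(ref origin);
        serializer.SerializeValue(ref velocity);
    }
}
```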

Fluid System
One of Rager's core mechanics is a fluid system in which the player pees and pukes on objects and surfaces. This involved:
• Integrating third party asset Obi Fluid into the project
• Creating custom fluid profiles for the various fluids in the game
• Performance optimisation experimentation to find the best settings
• Developing a fluid controller that controls the emission of fluid and dispatches its state through netcode to other players (see the sketch after this list).
• Creating new shaders using ShaderGraph for things like the chunks in the fluid and adjusting the colours
• Adding appropriate fluid colliders to object prefabs, with performance in mind
• Updating the fluid system to use new versions of Obi Fluid as they were released, taking advantage of new optimisations.
• Implementing fluid particle detection systems for objectives
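As one hedged sketch of how the fluid controller's emission state could be replicated (not the project's exact implementation), an owner-writable NetworkVariable keeps every client's local fluid simulation in step; the Obi Fluid emitter itself is stood in for by a placeholder ParticleSystem, since its API isn't covered here.

```csharp
using Unity.Netcode;
using UnityEngine;

// Sketch: replicate the emission on/off state so each client runs the same local simulation.
public class FluidEmissionSync : NetworkBehaviour
{
    [SerializeField] ParticleSystem placeholderEmitter; // stands in for the Obi emitter

    readonly NetworkVariable<bool> isEmitting = new NetworkVariable<bool>(
        false, NetworkVariableReadPermission.Everyone, NetworkVariableWritePermission.Owner);

    public override void OnNetworkSpawn()
    {
        isEmitting.OnValueChanged += (_, emitting) => ApplyEmission(emitting);
        ApplyEmission(isEmitting.Value);
    }

    // Called by the owning player's input/buzz systems.
    public void SetEmitting(bool emitting)
    {
        if (IsOwner) isEmitting.Value = emitting;
    }

    void ApplyEmission(bool emitting)
    {
        var emission = placeholderEmitter.emission;
        emission.enabled = emitting;
    }
}
```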

Throwing System
Players in Rager can pick up and throw objects. This system required several iterations throughout development to get right in terms of game feel, accuracy, predictability, and behaviour over netcode.
Initially throwing used a simple parabolic arc, but this turned out to be difficult to aim accurately. I changed it to use a raycast to first detect the point under the centre of the screen and then adjust the parabolic curve and throwing force so that it always hits that exact point. The start position, angle, and force are then dispatched to all players so that they execute the exact same throw locally and see the same results without having to sync timings between players.
I implemented a smart soft auto-aim feature that attempts to detect possible targets such as smashable glass near the center of the screen and adjusts the throwing angle just enough to hit its edge. This auto-aim is subtle and barely noticeable during play, but in testing it made players feel as if the throwing system was more accurate. It also dramatically improved accuracy on controller.
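The raycast-then-solve throw described above boils down to standard projectile kinematics: find the point under the crosshair, pick a flight time, and solve for the launch velocity that lands exactly there. A minimal sketch, with the flight time and ray distance as illustrative tuning values:

```csharp
using UnityEngine;

// Sketch of the "always hit what's under the crosshair" throw solve.
public class ThrowAimSolver : MonoBehaviour
{
    [SerializeField] Camera aimCamera;
    [SerializeField] float flightTime = 0.6f;
    [SerializeField] LayerMask aimMask = ~0;

    // Returns the initial velocity to apply to the thrown rigidbody, or null if
    // nothing is under the crosshair.
    public Vector3? SolveThrow(Vector3 launchPosition)
    {
        var ray = aimCamera.ViewportPointToRay(new Vector3(0.5f, 0.5f, 0f));
        if (!Physics.Raycast(ray, out RaycastHit hit, 50f, aimMask))
            return null;

        // Kinematics: target = start + v*t + 0.5*g*t^2  =>  v = (target - start - 0.5*g*t^2) / t
        Vector3 displacement = hit.point - launchPosition;
        Vector3 gravity = Physics.gravity;
        return (displacement - 0.5f * gravity * flightTime * flightTime) / flightTime;
    }
}
```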

Performance Optimisation
Toward the end of each phase of development, I did a performance optimisation pass to improve frame rate, memory usage, load times, network latency, and physics performance. Some key optimisations we made:
• Fluid System - The fluid system uses a softbody particle simulation that can be very heavy on CPU. We updated to a new version that used a GPU-based solver and were able to convert everything over to that. I also tweaked the fluid simulation to use fewer particles and remove constraints we didn't need. Finally, I incorporated solid chunks into the vomit fluid by creating a custom shader that draws instanced models on a deterministic pseudo-random subset of the fluid particles rather than creating new particles or objects.
• Rendering - I massively improved the performance of our map rendering by splitting the map into two layers: a background layer that gets rendered once to a texture and a foreground layer that gets rendered in realtime. This allowed a fully detailed map without the cost of redrawing every object each frame.
• Physics - The physics timestep was set in line with the networking timestep, eliminating common physics sync bugs. All physics objects are put to sleep by default when the game starts and only awakened when the player bumps into or interacts with them. A slight physics damping was applied and the threshold for sleeping physics items slightly increased so that items go to sleep more quickly, using far less CPU.
• Lighting - The vast majority of the lighting in Rager is baked. I used Bakery and figured out the optimum settings to bake on the GPU as part of the build pipeline. We then used post-processing to make the game's lights seem to glow slowly up and down over time, making it seem as if the lights were changing in realtime.
• Memory - I limited the final texture size across the board to 1024 to reduce build size and video memory usage.
• Load Times - I implemented a background loading mechanism for the game's main level. Since we only have a single playable level, we load it automatically in the background and keep it frozen until the game starts, making it seem to load rapidly (a sketch of this follows the list). In future we would put this loading process behind the map selector in the Lobby so that all players pre-load the level before they can click ready.
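A minimal sketch of the background pre-load described in the Load Times point, using Unity's standard async scene loading with activation held back; the scene name and hook points are assumptions.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;

// Sketch: load the single playable level in the background while the player is
// in the menu/lobby, and hold it un-activated ("frozen") until the match starts.
public class LevelPreloader : MonoBehaviour
{
    AsyncOperation pendingLoad;

    IEnumerator Start()
    {
        pendingLoad = SceneManager.LoadSceneAsync("PartyHouse"); // illustrative scene name
        pendingLoad.allowSceneActivation = false;                // keep the scene frozen

        // Unity reports ~0.9 progress while activation is held back.
        while (pendingLoad.progress < 0.9f)
            yield return null;
    }

    // Called when all players have readied up and the match starts.
    public void ActivateLevel()
    {
        if (pendingLoad != null)
            pendingLoad.allowSceneActivation = true;
    }
}
```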