Survival Game – Behavior Trees & AI Senses (Section 3)

Reading Time: 6 minutes

The project was originally created as a series of bi-weekly updates and documentation. Since the official wiki has been taken down, the sections are now hosted here and slightly updated.

The source is updated to the latest engine version. Get the latest project source at GitHub.

Introduction

In section three we introduce the first features for our enemy AI. The AI can sense a player both by vision (using line-of-sight checks) and by hearing noise made through footsteps and gunshots. It is set up using C++ and a Behavior Tree, including a custom Behavior Tree task written in C++ that finds an appropriate waypoint for the bot to wander around the level.

For a step-by-step tutorial on setting up Behavior Trees to follow a sensed player (using Blueprint), see the Behavior Tree Quick Start Guide.

Update: I have a more detailed video on Behavior Trees as part of my Udemy course (linked below).

This section will go into the C++ concepts of dealing with PawnSensing, Blackboards, and Behavior Trees in Unreal Engine 4.

What is a Blackboard?

A blackboard is a data container for the Behavior Tree to use for decision making. A few examples of Blackboard data are TargetLocation, NextWaypoint, TargetPlayer and NeedAmmo.
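As a purely illustrative sketch, Blackboard entries are read and written from C++ by key name through a UBlackboardComponent (the "NeedAmmo" key here is a hypothetical example; key names must match the entries created in the Blackboard asset):

```cpp
/* Illustrative only: "NeedAmmo" is a hypothetical key that must exist
   in the Blackboard asset for these calls to do anything. */
bool bNeedsAmmo = BlackboardComp->GetValueAsBool(TEXT("NeedAmmo"));
BlackboardComp->SetValueAsBool(TEXT("NeedAmmo"), false);
```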

What is a Behavior Tree?

A Behavior Tree holds the logical tree to drive motion and decisions for a bot. It can be used to search for ammo, follow a player or hide from the player in case hitpoints are low or the bot has no ammo available. It requires a Blackboard to retrieve and store data.

PawnSensing

UPawnSensingComponent will give eyes and ears to AI bots. For noise sensing an additional UPawnNoiseEmitterComponent is required on our AI character.
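A minimal sketch of how the sensing component might be created and tuned in the AI character's constructor (the member name and the tuning values are assumptions, not taken from the project source):

```cpp
/* Sketch: create and configure the sensing component in the AI
   character's constructor. Member name and values are assumptions. */
ASZombieCharacter::ASZombieCharacter()
{
	PawnSensingComp = CreateDefaultSubobject<UPawnSensingComponent>(TEXT("PawnSensingComp"));
	PawnSensingComp->SetPeripheralVisionAngle(60.0f);   /* half-angle of the vision cone */
	PawnSensingComp->SightRadius = 2000.0f;             /* how far the bot can see */
	PawnSensingComp->HearingThreshold = 600.0f;         /* hearing range without line of sight */
	PawnSensingComp->LOSHearingThreshold = 1200.0f;     /* hearing range with line of sight */
}
```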

Seeing

PawnSensing supports line of sight checks to sense other pawns. There are a couple of variables to tweak including peripheral-vision angle and sight radius. By default only players are sensed, so we don’t need to filter out AI controlled enemies when updating our target to follow.

Hearing

PawnSensing supports hearing of other pawns. This has nothing to do with actual audio playback in the game; it is a separate system that uses UPawnNoiseEmitterComponent and calls to MakeNoise(…) to trigger events related to noise (e.g. footsteps or loud gun noises).

For the AI of this project we implemented footsteps and gun sounds that both call MakeNoise(…). To trigger footstep noise at the appropriate moments of the animation we need a custom AnimNotify as seen below.

The top (yellow) notifies are custom notifies that we bind to our C++ code in the Animation Blueprint to add calls for MakeNoise and to keep track of the last moment a noise was made (to visualize noise in the HUD).
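The handler that the Animation Blueprint event calls into could look roughly like the sketch below (the function name, loudness value, and LastNoiseTime member are assumptions for illustration):

```cpp
/* Sketch: a function the Animation Blueprint could call when the
   footstep notify fires. Name, loudness, and LastNoiseTime are assumed. */
void ASZombieCharacter::OnFootstep()
{
	/* Broadcasts a noise event at our location; 1.0f is full loudness.
	   Requires a UPawnNoiseEmitterComponent on the character to be heard. */
	MakeNoise(1.0f, this, GetActorLocation());

	/* Track when we last made noise so the HUD can visualize it. */
	LastNoiseTime = GetWorld()->GetTimeSeconds();
}
```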

Structure of the Zombie AI

There are many ways of setting up your AI class structure; I will briefly go over the one used in this project to make it easier to dig into the code and follow along with this guide.

SZombieAIController

The AI Controller possesses an AI Character and holds the components for the Blackboard and Behavior Tree. It’s the access point to update and retrieve Blackboard data in C++.
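A sketch of how the two components might be created in the controller's constructor (member names are assumptions; depending on your engine version the behavior component type is UBehaviorTreeComponent):

```cpp
/* Sketch: the AI Controller owns both components. Member names assumed. */
ASZombieAIController::ASZombieAIController()
{
	BehaviorComp = CreateDefaultSubobject<UBehaviorTreeComponent>(TEXT("BehaviorComp"));
	BlackboardComp = CreateDefaultSubobject<UBlackboardComponent>(TEXT("BlackboardComp"));
}
```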

SZombieCharacter

Has the components for pawn and noise sensing. Updates Blackboard data through its AI Controller. Contains a Behavior Tree asset that is initialized by the AI Controller on spawn/initialization.
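The relevant declarations in the character header could look roughly like this (exact specifiers and categories may differ in the project source):

```cpp
/* Sketch of declarations in SZombieCharacter.h; specifiers assumed. */
UPROPERTY(VisibleAnywhere, Category = "AI")
class UPawnSensingComponent* PawnSensingComp;

/* The Behavior Tree asset to run; picked up by the AI Controller
   when it possesses this character. */
UPROPERTY(EditDefaultsOnly, Category = "AI")
class UBehaviorTree* BehaviorTree;
```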

Behavior Tree

Referenced by the AI Character class. Initialized by the AI Controller.

Blackboard

The blackboard is referenced by the Behavior Tree asset.

Setting up the senses in C++

To set up our senses in C++ we need a UPawnSensingComponent in the AI character class.

To react to sense events from this component we bind our delegates (functions that can be hooked up by other classes to trigger events, much like binding mouse and key input to functions in C++) during BeginPlay.

void ASZombieCharacter::BeginPlay()
{
	Super::BeginPlay();

	/* This is the earliest moment we can bind our delegates to the component */
	if (PawnSensingComp)
	{
		PawnSensingComp->OnSeePawn.AddDynamic(this, &ASZombieCharacter::OnSeePlayer);
		PawnSensingComp->OnHearNoise.AddDynamic(this, &ASZombieCharacter::OnHearNoise);
	}
}

When either of these functions are called, they will update the Blackboard with a new move-to target (the sensed player character) through the AI Controller of the AI Character instance.
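One of these handlers could be sketched as follows, with the null checks being the important part (the exact body may differ in the project source; SetTargetEnemy is the controller function covered later in this section):

```cpp
/* Sketch: push the sensed pawn into the Blackboard through the AI
   Controller. Exact body may differ from the project source. */
void ASZombieCharacter::OnSeePlayer(APawn* Pawn)
{
	ASZombieAIController* AIController = Cast<ASZombieAIController>(GetController());
	if (AIController && Pawn)
	{
		/* Updates the TargetEnemy key in the Blackboard. */
		AIController->SetTargetEnemy(Pawn);
	}
}
```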

Do note that to support hearing, the AI requires a UPawnNoiseEmitterComponent to receive data from any MakeNoise(…) calls other Pawns may produce. (We add this component in the SBaseCharacter class.)
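Adding that component is a one-liner in the base character's constructor, sketched here with an assumed member name:

```cpp
/* Sketch: add the noise emitter in the base character so every
   character can report noise to listeners. Member name assumed. */
ASBaseCharacter::ASBaseCharacter()
{
	NoiseEmitterComp = CreateDefaultSubobject<UPawnNoiseEmitterComponent>(TEXT("NoiseEmitterComp"));
}
```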

Setting up the AI Controller

The AI Controller contains the components for the Blackboard and Behavior Tree. (Note that the Behavior Tree asset itself resides in the AI Character, so we may re-use the same AIController class with different bot behaviors.) It is the gateway for updating data in the Blackboard, and it runs whatever Behavior Tree was provided by the AI Character it possesses.

Initializing Blackboard & Behavior Tree

Whenever a bot is initialized or respawned it will be possessed by an AI Controller. This is the moment to initialize the Blackboard and run the Behavior Tree to start the bot decision making.

void ASZombieAIController::Possess(class APawn* InPawn)
{
	Super::Possess(InPawn);

	ASZombieCharacter* ZombieBot = Cast<ASZombieCharacter>(InPawn);
	if (ZombieBot)
	{
		if (ZombieBot->BehaviorTree->BlackboardAsset)
		{
			BlackboardComp->InitializeBlackboard(*ZombieBot->BehaviorTree->BlackboardAsset);

			/* Make sure the Blackboard has the type of bot we possessed */
			BlackboardComp->SetValueAsEnum(BotTypeKeyName, (uint8)ZombieBot->BotType);
		}

		BehaviorComp->StartTree(*ZombieBot->BehaviorTree);
	}
}

Updating Blackboard data

When new sense data is available it must be pushed into the Blackboard for the Behavior Tree to use. For this we need the key name (e.g. “TargetLocation”, as specified in the Blackboard asset) and the Blackboard Component. Below is one example of how we can push this data into the Blackboard.

void ASZombieAIController::SetTargetEnemy(APawn* NewTarget)
{
	if (BlackboardComp)
	{
		BlackboardComp->SetValueAsObject(TargetEnemyKeyName, NewTarget);
	}
}

Breaking down the Behavior Tree

The Behavior Tree steers the decisions and motion of our AI bot. These decisions are based on the available data in the Blackboard asset.

Chasing the player

Whenever a player is sensed and TargetEnemy is updated by the bot class in C++, the “Has Sensed Enemy” check passes and we move on to “Has Target Location”, which is set by the same AI Character C++ class. That should succeed and move us into “Move to Sensed Player”, which finally moves the bot to the TargetLocation.

“Has Sensed Player” has “Aborts Lower Priority” set, so we can immediately cancel any other running behaviors when this value changes. In this particular tree, that is used to cancel the patrol/wandering behavior on the right side of the tree.

Wandering the map

By default the bot is set to Passive (a custom enum we created in STypes.h), which completely disables the wander/patrol part of the tree through the conditional “Should Wander” check.

When enabled, it will try to locate a Waypoint object in the level through the “Find Bot Waypoint” task. This is a custom task we created in C++ to search the map for objects of the SBotWaypoint type. When “Has a Waypoint” succeeds, we continue with another custom task that finds a position on the navigation mesh near the Waypoint object we found previously. And finally we “Move to Waypoint”.

Both “Find X” tasks in the tree update the Blackboard with new data for the other nodes to use (in this case, CurrentWaypoint and PatrolLocation are updated by these tasks).
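The waypoint-finding task could be sketched roughly as below. This is a hedged illustration, not the project's actual implementation: it assumes the task derives from UBTTask_BlackboardBase (so it has a BlackboardKey selectable in the editor) and simply picks a random ASBotWaypoint from the level.

```cpp
/* Hedged sketch of a "Find Bot Waypoint" style task. Assumes the class
   derives from UBTTask_BlackboardBase; the real project task may differ. */
EBTNodeResult::Type UBTTask_FindBotWaypoint::ExecuteTask(UBehaviorTreeComponent& OwnerComp, uint8* NodeMemory)
{
	/* Collect all waypoint actors placed in the level. */
	TArray<AActor*> FoundWaypoints;
	UGameplayStatics::GetAllActorsOfClass(OwnerComp.GetWorld(), ASBotWaypoint::StaticClass(), FoundWaypoints);

	if (FoundWaypoints.Num() > 0)
	{
		/* Pick a random waypoint and store it in the Blackboard key
		   selected on this node, for "Move to Waypoint" to use. */
		AActor* NewWaypoint = FoundWaypoints[FMath::RandRange(0, FoundWaypoints.Num() - 1)];
		OwnerComp.GetBlackboardComponent()->SetValueAsObject(BlackboardKey.SelectedKeyName, NewWaypoint);
		return EBTNodeResult::Succeeded;
	}

	return EBTNodeResult::Failed;
}
```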

This flow will be cancelled as soon as “Has Sensed Enemy” is successful so sensing an enemy takes priority over wandering around the map.

Notes

  • To work with the AI features of the engine we must include the “AIModule” in SurvivalGame.Build.cs. Don’t forget this module if you’re re-creating any of these features in your own project.
  • We have several physically simulated barriers in our level; this requires a dynamic Navigation Mesh. To set this up in your own project go to Edit > Project Settings > Navigation Mesh and enable “Rebuild at runtime”.
  • When using bots, any level must include an encapsulating Nav Mesh Bounds Volume (under Modes > Volumes, see image).
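For reference, the module list in SurvivalGame.Build.cs could look roughly like the fragment below (the surrounding module names are assumptions for a typical project; depending on engine version, “GameplayTasks” may also be needed alongside “AIModule”):

```cs
// Sketch of the dependency list in SurvivalGame.Build.cs; the other
// module names shown are typical defaults, not taken from the project.
PublicDependencyModuleNames.AddRange(new string[] {
	"Core", "CoreUObject", "Engine", "InputCore",
	"AIModule", "GameplayTasks"
});
```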

Closing

In this section we’ve added the basic follow and patrol features for our zombie AI. In upcoming sections we will continue to expand the enemy, adding an attack ability among other things. If you are confused about a particular feature or piece of code, feel free to leave a comment below!
