This is an AI system that aims to replicate the behavior of real-life creatures, using sensors such as hearing, vision, and proximity to create AI behavior that looks as natural as possible.
These sensors then feed into an awareness system built for the AI.
The system can be designed in either Unity or Unreal; the engine is not important, the logic and design of the system are.
- Creating AI sensory system
- Creating AI awareness system
This class represents our AI and holds information such as the ranges for hearing, vision, and proximity. The sensors rely heavily on the information they are given by the AI.
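A sketch of what this class might expose is shown below. The property and method names match the sensor code later in this text; the serialized default values and the eye placement are my assumptions.

```csharp
using UnityEngine;

// Hypothetical sketch of the central AI component: sensors read their
// configuration from here and report detections back through the Report* methods.
public class EnemyAI : MonoBehaviour
{
    [SerializeField] float _HearingRange = 20f;
    [SerializeField] float _VisionConeRange = 30f;
    [SerializeField] float _VisionConeAngle = 60f;        // half-angle in degrees
    [SerializeField] float _ProximityDetectionRange = 3f;

    public float HearingRange            => _HearingRange;
    public float VisionConeRange         => _VisionConeRange;
    public float CosVisionConeAngle      => Mathf.Cos(_VisionConeAngle * Mathf.Deg2Rad);
    public float ProximityDetectionRange => _ProximityDetectionRange;

    public Vector3 EyeLocation  => transform.position;    // assumption: eyes at the transform origin
    public Vector3 EyeDirection => transform.forward;

    public void ReportCanHear(GameObject source, Vector3 location,
                              EHeardSoundCategory category, float intensity) { /* feeds the awareness system */ }
    public void ReportCanSee(DetectableTarget seen)        { /* feeds the awareness system */ }
    public void ReportInProximity(DetectableTarget target) { /* feeds the awareness system */ }
}
```

Caching the cosine of the cone angle means the vision sensor can compare dot products directly instead of computing an angle every frame.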
In order to create sensors for the AI, I first need to identify to my AI what it will find of interest (the player). This is done using a simple DetectableTarget class that functions more like a tag. Thereafter I create a DetectableTargetManager class that stores all detectable targets, so that this information is obtainable by the AI when needed.
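A minimal sketch of these two classes, assuming a singleton manager that targets register themselves with (the register/deregister pattern is my assumption; the class names match the code excerpts below):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Tag-like component: anything this is attached to can be sensed by the AI.
public class DetectableTarget : MonoBehaviour
{
    void Start()
    {
        DetectableTargetManager.Instance.Register(this);
    }

    void OnDestroy()
    {
        if (DetectableTargetManager.Instance != null)
            DetectableTargetManager.Instance.Deregister(this);
    }
}

// Singleton that keeps a list of every detectable target in the scene.
public class DetectableTargetManager : MonoBehaviour
{
    public static DetectableTargetManager Instance { get; private set; }

    public List<DetectableTarget> AllTargets { get; private set; } = new List<DetectableTarget>();

    void Awake()
    {
        Instance = this;
    }

    public void Register(DetectableTarget target)   { AllTargets.Add(target); }
    public void Deregister(DetectableTarget target) { AllTargets.Remove(target); }
}
```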
A sound will be emitted from the player whenever they do anything audible, for example running. If the emitted sound is within range of the enemy AI, it will be heard with a specific intensity value, so that louder noises have a higher intensity; running, for example, is louder than walking.
```csharp
public void OnHeardSound(GameObject source, Vector3 location, EHeardSoundCategory category, float intensity)
{
    // outside of hearing range
    if (Vector3.Distance(location, LinkedAI.EyeLocation) > LinkedAI.HearingRange)
        return;

    LinkedAI.ReportCanHear(source, location, category, intensity);
}
```
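For completeness, the emitting side might look something like this sketch. The EmitSound helper, the category values, and the HearingSensor lookup are my assumptions; only OnHeardSound above is from the project:

```csharp
using UnityEngine;

// Assumed sound categories; louder actions map to higher intensities.
public enum EHeardSoundCategory { EFootstep, EJump, ELand }

// Hypothetical emitter: broadcasts a sound event to every hearing sensor in the
// scene and lets each sensor decide for itself whether the sound is in range.
public static class SoundEmission
{
    public static void EmitSound(GameObject source, Vector3 location,
                                 EHeardSoundCategory category, float intensity)
    {
        foreach (var sensor in Object.FindObjectsOfType<HearingSensor>())
            sensor.OnHeardSound(source, location, category, intensity);
    }
}
```

In a larger scene you would likely route this through the manager rather than FindObjectsOfType, which is expensive to call every frame.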
The player will be seen by the AI if they are within the AI's vision cone. This is done by casting a ray from the AI's eye location and calculating the peripheral vision using the dot product: the closer the player is to the centre of the AI's gaze, the more visible they are.
Several early-out checks let us return before doing any of these calculations, to save on performance. These checks are:
- If the candidate (player) is ourselves
- If the candidate is out of range
- If the candidate is out of the vision cone
```csharp
void Update()
{
    // check all candidates
    for (int index = 0; index < DetectableTargetManager.Instance.AllTargets.Count; ++index)
    {
        var candidateTarget = DetectableTargetManager.Instance.AllTargets[index];

        // skip if the candidate is ourselves
        if (candidateTarget.gameObject == gameObject)
            continue;

        var vectorToTarget = candidateTarget.transform.position - LinkedAI.EyeLocation;

        // if out of range - cannot see
        if (vectorToTarget.sqrMagnitude > (LinkedAI.VisionConeRange * LinkedAI.VisionConeRange))
            continue;

        vectorToTarget.Normalize();

        // if out of vision cone - cannot see
        if (Vector3.Dot(vectorToTarget, LinkedAI.EyeDirection) < LinkedAI.CosVisionConeAngle)
            continue;

        // raycast to target passes?
        RaycastHit hitResult;
        if (Physics.Raycast(LinkedAI.EyeLocation, vectorToTarget, out hitResult,
                            LinkedAI.VisionConeRange, DetectionMask, QueryTriggerInteraction.Collide))
        {
            if (hitResult.collider.GetComponentInParent<DetectableTarget>() == candidateTarget)
                LinkedAI.ReportCanSee(candidateTarget);
        }
    }
}
```
The AI will immediately detect the target (player) if it is very close to the enemy, even if the AI has neither heard nor seen the player. This is done using the proximity sensor, whose range is extremely short. This simulates the natural behavior of feeling that something is too close to you.
```csharp
void Update()
{
    for (int index = 0; index < DetectableTargetManager.Instance.AllTargets.Count; ++index)
    {
        var candidateTarget = DetectableTargetManager.Instance.AllTargets[index];

        // skip if ourselves
        if (candidateTarget.gameObject == gameObject)
            continue;

        if (Vector3.Distance(LinkedAI.EyeLocation, candidateTarget.transform.position) <= LinkedAI.ProximityDetectionRange)
            LinkedAI.ReportInProximity(candidateTarget);
    }
}
```
Finally, with all the AI sensors created, we can make use of them in an awareness system. The awareness system is too complex to explain in full detail here; you can check the code on GitHub if you're interested in every detail.
Simply put, the awareness system is a meter that ranges from 0 to 2. At 0 the AI is completely unaware of any detectable target. From 0 to 1 the AI is suspicious of the detectable target or targets. From 1 to 2 the AI is actively detecting the target, and at 2 the target is fully detected.
This awareness value is driven by the sensors. The proximity sensor raises it to 2 very quickly; the vision sensor raises it more slowly than proximity but faster than hearing; and the hearing sensor is the weakest, impacting the awareness value the least.
When a detectable target (player) stops impacting the awareness system the awareness value will slowly start to decrease. When the awareness value reaches 0 the AI is no longer detecting the detectable target or targets.
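The core of the meter described above can be sketched in plain C#. The class name, method names, and the specific gain and decay rates are my assumptions, chosen only to match the behavior described: proximity gains fastest, then vision, then hearing, with a slow decay when nothing is sensed.

```csharp
// Hypothetical sketch of the awareness meter; rates are illustrative,
// the real project tunes these per sensor.
public class AwarenessMeter
{
    public const float MaxAwareness = 2f;

    // per-second gain rates: proximity > vision > hearing
    const float ProximityGain = 4.0f;
    const float VisionGain    = 1.0f;
    const float HearingGain   = 0.5f;
    const float DecayRate     = 0.25f; // per-second decay when nothing is sensed

    public float Awareness { get; private set; }

    public bool IsSuspicious    => Awareness > 0f && Awareness < 1f;
    public bool HasDetected     => Awareness >= 1f;
    public bool IsFullyDetected => Awareness >= MaxAwareness;

    public void ReportProximity(float deltaTime) => Gain(ProximityGain * deltaTime);
    public void ReportSeen(float deltaTime)      => Gain(VisionGain * deltaTime);
    public void ReportHeard(float deltaTime, float intensity) => Gain(HearingGain * intensity * deltaTime);

    // called on every frame in which no sensor reported the target
    public void Decay(float deltaTime)
    {
        Awareness = System.Math.Max(0f, Awareness - DecayRate * deltaTime);
    }

    void Gain(float amount)
    {
        Awareness = System.Math.Min(MaxAwareness, Awareness + amount);
    }
}
```

Each sensor's Report* call from the code above would feed into one of these methods, with hearing additionally scaled by the sound's intensity.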
This awareness system was very fun to work with and mimics the way humans generally react. The goal was to get natural behavior, and I personally think this was met. The system could obviously be expanded to be much more complex, but I don't think more work is needed for this project.
There were a lot of hardships and new things to learn during this project; the awareness system was definitely the hardest system to build and understand.