Unreal Blueprint documentation: Interactive VR Item
The blueprint I am showcasing in the video (the Unreal Blueprint Implementation video on the 'Portfolio 1' page) is most certainly one that I am incredibly proud of! I made this blueprint to handle interaction with items in a VR space, with basic sound functionality, for my “VR Dungeoneer” game audio demo. It should be easily transferable to a non-VR game as well. The blueprint is simply copy/pasted onto all interactive objects that can be picked up, dropped, thrown, and swung around. When I say “AND,” I am referring to the AND gates in the blueprint. The sounds in this game are all controlled by Wwise. This will be a quick, not overly in-depth description of the blueprint.
The first portion shown covers when the item is picked up. On pickup, a Wwise event is triggered for that specific item's pickup sound (for example, the Ak Event BattleAxe_Pickup is assigned to the battle axe). The second portion down handles when the item is dropped: a Sequence node plays a drop sound followed by a whoosh sound when the actor is dropped.
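The per-item event assignment can be sketched in ordinary C++ (outside Unreal) as a simple lookup. The BattleAxe_Pickup name comes from the example above; the other item names, the map, and the function are hypothetical illustrations of the pattern, not the actual blueprint wiring:

```cpp
#include <map>
#include <string>

// Hypothetical mapping from item type to its Wwise pickup event name,
// mirroring how each copy of the blueprint gets its own Ak Event assigned.
const std::map<std::string, std::string> kPickupEvents = {
    {"BattleAxe", "BattleAxe_Pickup"},  // example given in the text
    {"Sword",     "Sword_Pickup"},      // assumed naming pattern
};

// Returns the event name to post when an item is picked up,
// falling back to a generic event for unmapped items (my assumption).
std::string PickupEventFor(const std::string& item) {
    auto it = kPickupEvents.find(item);
    return it != kPickupEvents.end() ? it->second : "Default_Pickup";
}
```

In the actual blueprint this assignment happens per-object when the blueprint is pasted onto an item; the lookup here just makes the one-event-per-item idea concrete.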
Moving on to the crazy long string of code, we can study the more complicated physics. The second section of the video is dedicated to tracking the velocity of the held object, which controls when an impact sound should occur and when a whoosh sound should occur as an object is flailed around. To make this happen, I created two child actor components, placed them at the highest and lowest regions of each object, and stored them in an array so their locations and velocities can be traced every game tick. Once the game registers that the actor is held, a ForEachLoop runs over the “Velocity Watchers” array. Each iteration uses the velocity formula to determine whether the threshold velocity for a whoosh sound has been reached; this is all so we don't hear a whoosh repeated over and over with every tiny movement. Each “Velocity Watcher's” last location is grabbed and the math is done: Velocity = ((final position) − (initial position)) / time. Our time variable is the delta seconds from the game tick event. This gives us our “speed,” so to speak. We then check whether that speed is greater than 1000. If it is over 1000, AND the whoosh is not busy (not currently being played), AND we are not teleporting (a safeguard so we don't hear a whoosh when we teleport), then the Ak Event for our whoosh sound is performed. While the sound plays, the Boolean whoosh variable is set to “on,” meaning “Whoosh Is Busy,” and after a 300 ms delay it is switched back off. Finally, we must set the last location of each “Velocity Watcher” to its current location for use on the next tick.
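The per-tick speed check above can be sketched in plain C++, without Unreal types. The thresholds (1000 units/s, the busy flag, the teleport safeguard) come from the description; the `Vec3` struct, `VelocityWatcher`, and function names are my own stand-ins for the blueprint nodes:

```cpp
#include <cmath>

// Minimal 3D vector stand-in for Unreal's FVector.
struct Vec3 { double x = 0, y = 0, z = 0; };

double Length(const Vec3& v) {
    return std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
}

// One "Velocity Watcher": stores its last location and computes speed
// each tick as ((final position) - (initial position)) / delta seconds.
struct VelocityWatcher {
    Vec3 lastLocation;

    double UpdateAndGetSpeed(const Vec3& current, double deltaSeconds) {
        Vec3 d{current.x - lastLocation.x,
               current.y - lastLocation.y,
               current.z - lastLocation.z};
        double speed = Length(d) / deltaSeconds;
        lastLocation = current;  // saved for the next tick, as in the blueprint
        return speed;
    }
};

// The AND gate from the blueprint: speed over 1000, whoosh not busy,
// and not teleporting (so a teleport jump doesn't fire a whoosh).
bool ShouldPlayWhoosh(double speed, bool whooshIsBusy, bool isTeleporting) {
    return speed > 1000.0 && !whooshIsBusy && !isTeleporting;
}
```

In the blueprint, a true result posts the whoosh Ak Event and sets “Whoosh Is Busy” for 300 ms; here that delay is left out so the gating logic stays self-contained.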
Our final component of this blueprint binds impact events. The component is line traced, which feeds into a custom event I call “OnActorHit_Event.” A Vector Length is taken from the impact event, returning a floating-point number when the object physically hits a material; I use this vector length to determine impact sensitivity. The number is checked, and if it is greater than 1250 AND no impact sound is currently playing, the Boolean condition is met, triggering the branch's true path. If true, the signal goes through an attenuation scaling factor before the sound is spawned. This attenuation factor is set high so the impact noise cannot be heard from far away, which helps smooth out the volumetric radius I set in Wwise for the Oculus Spatializer plugin. Even without the OSP, this would help localize the sound more accurately. The impact sound is spawned at the impact location; I don't use a “Post Event” node because if I did, the sound would travel with the object instead of staying at the impact location. Finally, the “Impact Is Busy” variable is set after the sound is spawned, and after 100 ms it is switched off, allowing the sound to occur again on a different impact.
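The impact gate and its 100 ms cooldown can be sketched the same way. The 1250 threshold and the 100 ms “Impact Is Busy” window come from the description above; the `ImpactGate` struct and its time-based cooldown (rather than a blueprint Delay node) are my simplification:

```cpp
// Gates impact sounds: the impact's vector length must exceed 1250 and
// "Impact Is Busy" must be false. After a sound spawns, the gate stays
// busy for 100 ms, suppressing rapid re-triggers of the same impact.
struct ImpactGate {
    bool busy = false;
    double busyUntil = 0.0;  // game time (seconds) when the flag clears

    // Called from the hit event with the impact vector length and the
    // current game time. Returns true when a sound should be spawned
    // at the impact location.
    bool TryPlay(double vectorLength, double now) {
        if (now >= busyUntil) busy = false;  // 100 ms cooldown has elapsed
        if (vectorLength > 1250.0 && !busy) {
            busy = true;
            busyUntil = now + 0.1;  // "Impact Is Busy" for 100 ms
            return true;
        }
        return false;
    }
};
```

The attenuation scaling and the spawn-at-location step (instead of a “Post Event” that follows the object) happen after this gate returns true; they are omitted here since they depend on Wwise and Unreal APIs.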