Dungeon Defense Developer Blog

Blog 1: Day One

Hey all,

These are some exciting times for indie developers. Not only has the advent of digital distribution opened up a huge market for all sorts of innovative games by small creative teams, but now we have Epic, with undeniably the most advanced game technology in the industry, opening up their platform to anyone with an urge to become the next Shigeru Miyamoto (or Kim Swift)!

As a long-time Unreal developer, I’ve jumped on this bandwagon, and over the coming months I’m going to be putting together a series of straightforward mini-game demos on the UDK. My goal is to provide the growing community with some open-source, open-content examples of ways to implement various types of gameplay within the UDK. These should be relatively simple chunks of code and content that may be more easily digestible by Unreal newbies than the full-course meal that is Unreal Tournament.

So without further ado, let’s chow down! (mmm now I really am hungry)

When thinking about what kind of game to start with, I noticed that some people seem to be wondering about creating various kinds of third-person games with Unreal, which is an easy thing to do once you know how. Also, I just came off a round of playing PixelJunk Monsters, so I’m in kind of a “Tower Defense” mood. Fighting off hordes of beasties by strategically building stuff just seems like… lots o’ fun.

Therefore, the first mini-game that I’m going to tackle will be called “Dungeon Defense”, and it will be a semi-top-down (more like ¾-perspective) action/tower-defense hybrid. You’ll play as a l’il mage who, à la “The Sorcerer’s Apprentice”, has gotten in over his head defending his mentor’s lair while the master is away. You’ll have to use a variety of “summoning” spells to produce magical defenses throughout the lair, and zap stuff with your magic staff when the defenses happen to get overrun. It should be a nice hybrid of action tactics and resource-management strategy, and with Unreal, implementing all of this craziness is going to be a pleasant breeze.

So over the weekend, I did some design work to plan out my assets and control schemes, and then today I officially started programming. First, I started with implementing my own GameInfo, Pawn/Player, Player Controller, and Camera classes. I’ll describe what they do:

The GameInfo class contains the global rules of the game, and so far in my case I just overwrote its default values to spawn in my own custom PlayerPawn and PlayerController classes (1). The PlayerPawn is of course the physical character in the World, and the Controller is the class in which the user’s input is transformed into gameplay action on that PlayerPawn. For the camera class, I modified the UpdateViewTarget function to position the camera above the Pawn, rather than looking directly out its eyes like UT, and also to dynamically offset a little in the direction that the player is looking, so that it tends to rotate towards your target direction (2). I made use of Epic’s built-in RInterpTo and VInterpTo functions to handle the Rotator and Vector interpolations respectively, which are always handy. This allowed me to lag the Camera a little behind the player’s current location (and rotation), giving it a smoother feeling than if it were locked down exactly on his position.
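In rough UnrealScript, the camera-lag idea looks something like this (a sketch only, not the actual Dungeon Defense source -- CameraHeight, LookAheadDistance, and the interp speeds are assumed config variables):

function UpdateViewTarget(out TViewTarget OutVT, float DeltaTime)
{
   local vector DesiredLocation;
   local rotator DesiredRotation;

   // Hover above the Pawn, offset a bit in the direction he's facing.
   DesiredLocation = OutVT.Target.Location
      + vect(0,0,1) * CameraHeight
      + vector(OutVT.Target.Rotation) * LookAheadDistance;
   DesiredRotation = rotator(OutVT.Target.Location - DesiredLocation);

   // Interpolate toward the desired transform to lag the camera slightly.
   OutVT.POV.Location = VInterpTo(OutVT.POV.Location, DesiredLocation, DeltaTime, LocationInterpSpeed);
   OutVT.POV.Rotation = RInterpTo(OutVT.POV.Rotation, DesiredRotation, DeltaTime, RotationInterpSpeed);
}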

In the PlayerController, I changed the PlayerMove function to only change Rotation Yaw, and no Pitch (so that the character’s orientation is restricted to the 2D plane). At first I was just using mouse delta to try to change the Yaw a little every frame as you scrolled the mouse, but it didn’t feel very natural. It was just too imprecise for the PC. We just want the character to look directly at wherever we’re pointing, right? So I wrote a unique bit of code that gets the results of a “canvas deprojection” of the current mouse position, and then raycasts that 3D vector against the world to find out where in the World the user is pointing at – then makes the character look towards that point. (3)
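The deproject-and-trace approach can be sketched like so (illustrative only -- Canvas.DeProject is the real engine function, but MousePosition and the trace distance here are my assumptions, and the Canvas is only valid during HUD rendering):

function vector GetMouseWorldPoint(Canvas C)
{
   local vector WorldOrigin, WorldDir, HitLocation, HitNormal;
   local Actor HitActor;

   // Turn the 2D mouse position into a 3D world-space ray...
   C.DeProject(MousePosition, WorldOrigin, WorldDir);

   // ...then raycast it against the world to find what we're pointing at.
   HitActor = Trace(HitLocation, HitNormal, WorldOrigin + WorldDir * 10000.0, WorldOrigin, true);
   if (HitActor == None)
      HitLocation = WorldOrigin + WorldDir * 10000.0;

   return HitLocation;
}

The character’s Yaw is then just rotator(HitLocation - Pawn.Location).Yaw, keeping his orientation in the 2D plane.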

Of course, I also transformed the input directions by the camera rotation, so that the inputs result in intuitive “camera relative” movement (remembering the ever-useful TransformVectorByRotation function, equivalent to the “vector>>rotator” operator). The Player-Pawn remained pretty much like the default Pawn, except that I did some fun Animation Tree work to blend between multiple custom full-body animations, so we don’t get any pops even if we’re trying to play several animations in sequence. (4) Working with the Anim Tree system, I was reminded of how much I enjoy the Unreal tool suite… configuring your character’s blending visually with real-time feedback sure beats having to hardcode it!

But that stuff only took a couple hours, so I wasn’t ready to call it a day just yet. Next, I got to work on the Mage’s Staff weapon. I sub-classed Weapon and modified it to support “charging” the weapon while it’s held, and only firing upon release (I want the Staff to support variable-charge attacks) (5) – and then I sub-classed Projectile to support a variable-strength projectile as well, which procedurally scales all of the visuals accordingly. (6)

I should mention that I find it extremely useful to spawn “Archetype” references rather than direct class references (7) -- you can specify an Archetype in the “Spawn” function as the so-called “Actor Template”. If you spawn Archetypes for gameplay, then you can have your entire library of gameplay objects ready to be configured real-time in the Editor, rather than having to go change the “Default Properties” every time you want to tweak a value. It also makes it a lot easier to visually configure values, swap out media, and create multiple variations of the same core class that differ only in properties. I’ll get into more detail about the power of Archetypes later, but suffice it to say that it helps me loads for iteration. And of course the “Remote Control” is another really useful tool for real-time iteration, which you can access by launching the executable with the “-wxwindows” argument. I’ll talk more about the power of the Remote Control in another post!
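As a sketch of what spawning from an Archetype looks like (MyProjectileArchetype is an illustrative property, assigned in the Editor):

var DunDefProjectile MyProjectileArchetype;

function FireProjectile(vector SpawnLoc, rotator SpawnRot)
{
   local DunDefProjectile NewProjectile;

   // Passing the Archetype as the optional "Actor Template" parameter makes
   // the new Actor copy the Archetype's editor-tuned properties, rather than
   // the class defaults.
   NewProjectile = Spawn(MyProjectileArchetype.Class,,, SpawnLoc, SpawnRot, MyProjectileArchetype);
}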

Next, I got to work on the AI Controller for my first enemy, the Goblin. I wrote a bit of “state logic” that picks a target (based on all Actors that implement an “interface” that returns a targeting weight) (8), then decides when to pathfind, when to navigate directly to the target, and when to stop pathfinding/moving & launch an attack (9). I’ll go into more detail about the AI script later on. I also implemented a nifty MeleeAttack state for the Goblin enemy, which uses animation notifications to enable/disable Tracing (a box sweep) each frame between his current and previous “hand socket” locations. (10) This ensures that the area that the Goblin swipes for damage actually depends on the animation and its timing, rather than any hardcoded values. I also made sure that the Goblin only damages each “traced” Actor once per unique swipe, by maintaining a list of all Actors hit during the current swipe and checking against that. (11) When all was done, this melee attack felt really good and accurate to what the animations were conveying.
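The swipe-trace idea can be sketched like this (identifiers like MeleeDamage, the extent size, and ‘HandSocket’ are illustrative; the hit list would be emptied when a new swipe begins, via an anim notify):

var array<Actor> SwipeHitActors;   // actors already damaged during this swipe
var vector PreviousHandLocation;
var int MeleeDamage;

function TickSwipeTrace(SkeletalMeshComponent Mesh)
{
   local vector HandLocation, HitLocation, HitNormal;
   local rotator SocketRotation;
   local Actor HitActor;

   Mesh.GetSocketWorldLocationAndRotation('HandSocket', HandLocation, SocketRotation);

   // Box-sweep between last frame's and this frame's hand positions.
   foreach TraceActors(class'Actor', HitActor, HitLocation, HitNormal,
      HandLocation, PreviousHandLocation, vect(10,10,10))
   {
      // Damage each actor only once per unique swipe.
      if (SwipeHitActors.Find(HitActor) == INDEX_NONE)
      {
         SwipeHitActors.AddItem(HitActor);
         HitActor.TakeDamage(MeleeDamage, self, HitLocation, vect(0,0,0), class'DamageType');
      }
   }

   PreviousHandLocation = HandLocation;
}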

Then, I couldn’t resist and implemented a basic “tower turret” that attacks the enemies. I didn’t bother with an AI Controller for this simple non-moving Actor, and instead just pick targets via a Timer (and remember that you can use State logic for any Object, not just Controllers). (12) I also added a LookAt Bone Controller to this Turret’s Animation Tree, to get the top of the turret to look towards any target that it picks. (13) Once the Animation tree was setup, all that took was one line of code to tell it where to look. Yay.

With the gameplay really starting to take shape, I went ahead and implemented the “Crystal Core” that the enemies will attempt to destroy as their primary objective (14). I used the ‘interface’ I created for any targetable Actor (15), to give the Core an especially high priority, so that the enemies are drawn to it with a greater intensity than the player or towers. ‘Interfaces’ allow you to have Actors of totally different classes share a common set of methods, so that they can interact or be checked in the same way. So even though my “Crystal Core” class is not directly related to my “Player Pawn” by hierarchy, they both implement the same targeting-weight function provided by the shared interface, which the enemy AI generically accesses to determine which entity is a more important target. Cool!
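A minimal sketch of such an interface (the real one lives in DunDefTargetableInterface.uc; these exact names are my illustration):

// In its own file, e.g. TargetableInterface.uc:
interface TargetableInterface;

// Higher weight = more attractive target for the enemy AI.
function float GetTargetingWeight();

Any class, anywhere in the hierarchy, can then opt in with “class CrystalCore extends Actor implements(TargetableInterface);”, and the AI can generically test “TargetableInterface(SomeActor) != None” to find and rank valid targets.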

And then finally, the project’s lead artist, Morgan Roberts, put together a test level that represents the Mage’s Lair quite nicely, and so by setting up a bit of Kismet I created some waves of enemies to repeatedly spawn-in and proceed to attack the core. (16) So there we had it, an essentially playable prototype in about a day.

Well the gameplay’s already too challenging for me, so in the coming days I have some serious balancing work to do, along with of course implementing a bunch of additional mechanics and refining what’s there. Thanks to the great tools of Unreal it’s just so much fun to implement this stuff!

In the next posts I’ll go into more detail about many of the subjects briefly addressed above, and also start to review bits of the code or functionality that I think you’ll find interesting or particularly useful. And once we have some visuals for you to see, we’ll get some screenies up too. Cheers, I’m looking forward to you all playing this little game soon enough.

Now to get that pizza I was thinking about earlier...

- Jeremy

Blog 1: File References

Information discussed in this blog comes from the files listed below, which are part of the Dungeon Defense source code. Line numbers separated by a comma indicate multiple individual lines within the file. Line numbers separated by a hyphen indicate a range of lines within the file.

  1. Main.uc: 618, 638
  2. DunDefPlayerCamera.uc: 240 - 248
  3. DunDefPlayerController.uc: 1561 - 1652
  4. DunDefPawn.uc: 222
  5. DunDefWeapon_MagicStaff.uc: 111
  6. DunDefProjectile_MagicBolt.uc: 24
  7. DunDefInventoryManager.uc: 13
  8. DunDefEnemyController.uc: 222
  9. DunDefEnemyController.uc: 764
  10. DunDefGoblinController.uc: 52
  11. DunDefGoblinController.uc: 32
  12. DunDefTower_ProjectileType.uc: 99
  13. DunDefTower_ProjectileType.uc: 95
  14. DunDefCrystalCore.uc: 19
  15. DunDefTargetableInterface.uc: 15
  16. DunDef_SeqAct_BasicActorSpawner.uc: 11

Blog 2: Day Three

In days 2 and 3 of working on Dungeon Defense, I focused on improving the aiming mechanic, AI navigation, and adding in the basis for an intuitive system to place down defensive “Towers” in the world. Let me tell you a bit about each implementation!

For the aiming, my initial mouse-based scheme simply had the player character’s Yaw looking at wherever you were pointing with the mouse. This was fine, except when it came to 3D aiming – the character would only turn with Yaw, because there is no practical way to intuitively derive Pitch input from a (mostly) top-down perspective. This meant that when enemies were below or above the player, you annoyingly couldn’t hit them. So I implemented two distinct fixes, one for the mouse-based scheme and another for the gamepad scheme, that worked quite well. For the mouse-based scheme, I first calculated the character’s Pitch to aim at whatever point the mouse screen coordinate’s “unprojection” ray collided with.(1)

A brief digression: “unprojection” means to transform from 2D screen space into 3D world space, like casting a line out from the screen, whereas “projection” is to transform from world space into screen space. Both can be achieved in Unreal via corresponding Canvas functions accessible through the Player’s HUD.(2)

In any case, I then fed this Pitch value to a Bone Controller set up in the Character’s Animation Tree, so that he bends at the waist to look up or down.(3) This provided accuracy that felt natural for a PC game.

However, I had to do something about the character tending to always look down when you are pointing near him – in which case he’s basically aiming at his feet, because that’s actually what you’re pointing at. I decided to add a condition that if you’re pointing close to the player and there’s not much height difference relative to what you’re aiming at, then he just looks forward.(4) This eliminated the player running around looking at his feet when you’ve got the cursor close to him. Of course, I used interpolation on the Yaw and Pitch set on the Bone Controller, so that if there is a rapid change in aim point, the character doesn’t snap harshly. Furthermore, I stored the actual aim point within my Pawn,(5) and my weapon looks it up to explicitly aim the projectile at that point, simply by setting its Rotation to Rotator(TargetPoint-ProjectileSpawnPoint), and its Velocity to Speed*Normal(TargetPoint-ProjectileSpawnPoint).(6) This resulted in pinpoint shooting that looked and felt proper for a PC game, but still kept that simple, arcadey top-down perspective I wanted.
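In code, that projectile aiming boils down to a couple of lines (a sketch; NewProjectile, TargetPoint, and ProjectileSpawnPoint are illustrative names):

local vector AimDirection;

// Point and propel the projectile straight at the stored aim point.
AimDirection = Normal(TargetPoint - ProjectileSpawnPoint);
NewProjectile.SetRotation(rotator(AimDirection));
NewProjectile.Velocity = NewProjectile.Speed * AimDirection;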

For the gamepad control scheme, I had to do something a little different. Because the player doesn’t have a fast, precise pointing device, looking at an exact location that the user is pointing at was out of the question. But we still needed some form of 3D aiming! So I decided to implement an auto-aim function, which determines the best target (if any) to aim at within a maximum range, and sets the aim point to that target’s location.(7) Because auto-aim still uses the aim-point system, it fit in with the existing “look at” methods that I created for the PC’s control scheme – the only difference is HOW the aim point is selected.

So to pick the best auto-aim target, I start with an OverlappingActors check from the Player to gather all ‘enemy’-type Actors within the auto-aim range, and then iterate through them to see which potential target is both closest to me AND nearest to my look direction (which is calculated as “Normal(TargetLocation – EyeLocation) dot Vector(Rotation)”). The candidate that scores best on those two criteria, within a degree of allowance and a weighting for each, becomes my ideal target. Once I tweaked the allowances and weights, this auto-aim selection method worked well, and now with the gamepad you get vertical auto-aiming at any target you’re (more or less) looking at. I also added a small bit of extra Yaw rotation on the character’s Spine bone towards the target, so that the bullets don’t appear to fly slightly sideways due to the dot-product allowance on the auto-aim.(8) The gamepad control scheme was now up to par with the mouse!
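The scoring described above might be sketched like this in a Pawn (AutoAimRange, MinimumAlignment, and the two weights are made-up tuning variables, not the actual source):

function Actor PickAutoAimTarget()
{
   local DunDefEnemy PotentialTarget;
   local Actor BestTarget;
   local float Alignment, Score, BestScore;

   BestScore = -1.0;
   foreach OverlappingActors(class'DunDefEnemy', PotentialTarget, AutoAimRange)
   {
      // How well does this target line up with where we're facing?
      Alignment = Normal(PotentialTarget.Location - GetPawnViewLocation()) dot vector(Rotation);
      if (Alignment < MinimumAlignment)
         continue;   // outside the allowed aiming cone

      // Weight alignment against proximity; closer + more centered wins.
      Score = Alignment * AlignmentWeight
         + (1.0 - VSize(PotentialTarget.Location - Location) / AutoAimRange) * DistanceWeight;
      if (Score > BestScore)
      {
         BestScore = Score;
         BestTarget = PotentialTarget;
      }
   }

   return BestTarget;
}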

On a side note, I used DebugSpheres drawn at my auto-aim target, which helped identify how well the selection method was working. In fact, I employ DebugSpheres, DebugLines, and DebugBoxes all the time for analysis – I highly recommend making use of them during your prototyping phases, and even leaving the code to draw them in your class. Just toggle them off with a custom “bDebug” Boolean, so you can turn them back on later if you encounter a problem or want to do further tweaking. Being able to visualize the 3D operations your code performs in the World is a great little bit of functionality for gameplay programmers.

Next up, I decided to change my AI pathfinding routine from using Unreal’s longtime Waypoint-Pathnode navigation system (‘them apples’) to the new Navigation Mesh system. Holy @#$#! This was super easy to do and the results it yielded were pretty incredible to my jaded eyes. You just plunk down a Pylon Actor in your level, build paths, and it does all of the heavy lifting automatically. As soon as the (surprisingly fast) calculation is done, you’ve got a fully realized pathing network for your environment, as here:

ddblog2.jpg

No recreating pathnode setups as the level architecture changes: essentially, you just click “rebuild paths” and the Pylons you’ve placed will do the work of recalculating all valid paths!

As for the code to actually use the Mesh Navigation system, it couldn’t be simpler (and this comes from experience using other mesh-based navigation technologies). These few bits of code are essentially all it takes for an AI Controller to walk through navigation results from a Nav Mesh:

function InitNavigationHandle()
{
   if( NavigationHandleClass != None && NavigationHandle == none )
      NavigationHandle = new(self) class'NavigationHandle';
}

event vector GeneratePathToActor( Actor Goal, optional float WithinDistance, optional bool bAllowPartialPath )
{
   local vector NextDest;

   //set our return value equal to our destination Actor’s Location.
   //In case it’s directly reachable or pathfinding fails, we’ll just return this.

   NextDest = Goal.Location;

   if ( NavigationHandle == None )
      InitNavigationHandle();

   //if the Actor isn’t directly reachable, then try to find the next navigation point towards it.
   //Otherwise we’ll just return its location to go there directly.

   if(!NavActorReachable(Goal))
   {
      class'NavMeshPath_Toward'.static.TowardGoal( NavigationHandle, Goal );
      class'NavMeshGoal_At'.static.AtActor( NavigationHandle, Goal, WithinDistance, true );
      if ( NavigationHandle.FindPath() )
         NavigationHandle.GetNextMoveLocation(NextDest, 60);
   }

   NavigationHandle.ClearConstraints();

   return NextDest;
}

Then, in state code:

//WithinRange just checks a small distance from the TargetActor,
//otherwise we just keep moving wherever GeneratePath tells us to go.

while(!WithinRange(TargetActor))
{
  MoveTo(GeneratePathToActor(TargetActor),none,30,false);
  Sleep(0);
}

Once I added something like that to my AI Controller(9), my AI characters were navigating around the entire level without fail, and with a high degree of movement efficiency. And I barely had to do a thing, thanks Epic. :)

Though I’ve only started using them for pathfinding, there’s evidently a lot more that you can use Navigation Meshes for. I am told they can provide the AI with lots of additional information about the environment around them for things like custom movement (e.g. mantling over a ledge), dynamic obstacle avoidance (e.g. moving around physics objects), or riding on dynamic moving Nav Mesh sections (such as an elevator lift or train). I’m looking forward to exploring lots more of their capabilities in the future, and as far as drag & drop navigation capability is concerned, they’re golden.

With newly-robust pathfinding taken care of, I was able to move onto my final task for the day, which was to design & implement an intuitive Tower placement mechanic. This was especially important because as a tower defense mini-game, DD could live or die based on whether it’s natural & “fun” to plunk these suckers down in the World.

First, I decided to create a TowerPlacementHandler Actor (TPH) to encapsulate all of the rendering and logical functionality associated with placing a Tower in the World. I altered my Player Controller’s PostBeginPlay() to initialize a TPH for itself, as an Actor that it owns. The TPH will actually have no physical representation, but it will have Components that will become visible specifically when you are in Placing-Tower mode.

I added a State to my PlayerController called PlacingTower(10) (Pushed onto the State Stack), which locks all standard gameplay input (or rather, simply ignores it with a largely empty PlayerMove() ) and passes only the pertinent input events along to the TPH for processing. I also added corresponding PlacingTower States to the TPH(11) and to my PlayerCamera class(12); the PlayerController is responsible for setting these two child Actors into their own PlacingTower States, when the PlayerController enters the Placing-Tower mode.

When in Placing-Tower mode, I wanted the PlayerCamera to interpolate towards a more extreme top-down perspective, so to achieve this I store the previous camera transformation when entering the Placing-Tower camera state, and then interpolate (VLerp, RLerp) from those previous values into my new “Placing-Tower” camera values.(13)

Now for the big enchilada, the TowerPlacementHandler class itself. I created a new struct called TowerPlacementEntry, which contains the representation details for a Tower Type that will be in the game (such as its placement Mesh, its collision testing extents, the Archetype that will actually be spawned when the Tower is finally placed, etc). The TPH has an array of these structs, defining each Tower that it will be able to place.(14) For visual representation of the Tower you are attempting to place, the TPH has a SkeletalMeshComponent, for which I dynamically set the Mesh based on the corresponding TowerPlacementEntry when you enter the PlacingTower state (and unhide it, simply by unhiding the TPH Actor itself).(15)

To make this Tower-Placement Mesh Component follow the mouse, I set its Translation to the mouse screen coordinate’s unprojection intersection point, similar to how I did it for the Player’s pitch calculation -- if no true collision point is found, I just use the mathematical intersection onto the Player Actor’s plane.(16) I also limit the range which the Tower-Placement position will move to a radius around the Player Actor.(17)

I wanted to convey this range in the World in a visually compelling way, so I decided to represent it as a Material-animated decal projected onto the geometry beneath the Player. I added a DecalActorMoveable (the spawnable version) to the TPH: it spawns one as a child, manages its visibility based on whether the TPH is in the PlacingTower state, and positions the DecalActor with the Player when this state is entered. This DecalActor uses a Material as a circular indicator of the effective Tower Placement range.(18)

I also implemented some collision checks of the current Tower-Placement position. I needed to make sure that there is enough free room at that location to spawn in a new tower, after all. So I added a series of Extent Traces (using the size values from the particular ‘TowerPlacementEntry’ that was selected) around the placement location, to the left/right, front/back, and if it hits anything, I don’t allow placement.(19)
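Those placement checks can be sketched as a handful of extent traces (the offsets and the general shape here are illustrative, not the actual source):

function bool IsPlacementLocationClear(vector Center, vector Extent)
{
   local vector HitLocation, HitNormal;
   local vector Offsets[4];
   local int i;

   // Probe to the left/right and front/back of the desired placement point.
   Offsets[0] = vect(1,0,0) * Extent.X;
   Offsets[1] = vect(-1,0,0) * Extent.X;
   Offsets[2] = vect(0,1,0) * Extent.Y;
   Offsets[3] = vect(0,-1,0) * Extent.Y;

   for (i = 0; i < 4; i++)
   {
      // If any extent trace hits something, the spot is blocked.
      if (Trace(HitLocation, HitNormal, Center + Offsets[i], Center, true, Extent) != None)
         return false;
   }

   return true;
}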

Next, I wanted a visual indicator of whether it was valid to place a Tower at your current mouse position. Namely, I wanted the representative Tower mesh to appear Green if it was valid, and Red if invalid. I chose to do this with a Material Instance Constant (MIC). MICs are the system you use to dynamically change Material values during gameplay, using “Parameters” (which generally come in the form of Scalar Parameters, Vector Parameters, and Texture Parameters).

In this case, I added a Vector Parameter to my Tower’s Material, to alter its Color. In the Editor, I then created a Material Instance Constant from my base Material, and in code you can use Mesh.CreateAndSetMaterialInstanceConstant() to assign a UNIQUE copy of the MIC to your Mesh instance (otherwise, if you change a value in the MIC, it will affect all Meshes which use that particular MIC – which may or may not be an issue depending on your usage).(20) With this done, I just set the Vector Parameter (ingeniously named ‘Color’) to Green or Red depending on whether I considered the current placement position “valid” or not.(21) And there was the basic visual feedback I wanted!
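A sketch of the MIC setup (TowerMesh is an illustrative component name; CreateAndSetMaterialInstanceConstant, SetVectorParameterValue, and MakeLinearColor are the real engine functions):

var MaterialInstanceConstant PlacementMIC;

function InitPlacementMaterial()
{
   // Give material element 0 of the placement mesh its own MIC copy,
   // so changing it won't affect other meshes sharing the base Material.
   PlacementMIC = TowerMesh.CreateAndSetMaterialInstanceConstant(0);
}

function UpdatePlacementColor(bool bValidPlacement)
{
   if (bValidPlacement)
      PlacementMIC.SetVectorParameterValue('Color', MakeLinearColor(0, 1, 0, 1));   // green: OK to place
   else
      PlacementMIC.SetVectorParameterValue('Color', MakeLinearColor(1, 0, 0, 1));   // red: blocked
}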

Finally, I just had to tie it all together with a bit more State logic in the TPH, to handle the mouse-clicks / button-presses passed on by the Player Controller to actually accept confirmation of placing a Tower. This was as simple as overriding the respective input ‘exec’ functions within the PlacingTower state of my Player Controller, and forwarding those to corresponding events to my TPH’s PlacingTower state.(22) State overriding of functions is a good way to encapsulate state-specific functionality, without clogging up your more general-case global versions of those functions. I decided that once the player confirms where to place a tower, he’d be given the option to set its rotation.

I thus made a state that would be pushed on top of PlacingTower, called PlacingTowerRotation,(23) containing an overridden update/input method to process the rotation; it simply rotates the Tower towards the mouse-projection position.(24) A confirmation processed in the PlacingTowerRotation state is what will actually spawn the Tower in question,(25) and complete the whole sequence (sending word back to the PlayerController, which then Pops its PlacingTower state and tells the Camera to do the same, giving the player back his regular control).(26)

Thus ended another fun day doing game development with the UDK. Whether I was working with the mouse & gamepad input to implement a sweet new aiming scheme, playing around with Epic’s latest and greatest pathfinding solution, doing some VFX/Material iteration in Editor, or setting up user-input-driven state hierarchies, I got to focus on what matters most – the gameplay – and had a blast doing it.

Blog 2: File References

Information discussed in this blog comes from the files listed below, which are part of the Dungeon Defense source code. Line numbers separated by a comma indicate multiple individual lines within the file. Line numbers separated by a hyphen indicate a range of lines within the file.

  1. DunDefPlayerController.uc: 1575
  2. DunDefHUD.uc: 122
  3. DunDefPlayer.uc: 304, 329
  4. DunDefPlayerController.uc: 1611
  5. DunDefPlayer.uc: 304
  6. DunDefWeapon.uc: 134, 157
  7. DunDefPlayer.uc: 241
  8. DunDefPlayer.uc: 315
  9. DunDefEnemyController.uc: 938
  10. DunDefPlayerController.uc: 577
  11. DunDefTowerPlacementHandler.uc: 340
  12. DunDefPlayerCamera.uc: 109
  13. DunDefPlayerCamera.uc: 163-174
  14. DunDefTowerPlacementHandler.uc: 89-135
  15. DunDefTowerPlacementHandler.uc: 304
  16. DunDefTowerPlacementHandler.uc: 438, 474
  17. DunDefTowerPlacementHandler.uc: 468
  18. DunDefTowerPlacementHandler.uc: 219-238, 396-404
  19. DunDefTowerPlacementHandler.uc: 492-516
  20. DunDefTowerPlacementHandler.uc: 240-249
  21. DunDefTowerPlacementHandler.uc: 524-525
  22. DunDefPlayerController.uc: 706
  23. DunDefTowerPlacementHandler.uc: 691
  24. DunDefTowerPlacementHandler.uc: 786, 795
  25. DunDefTowerPlacementHandler.uc: 801
  26. DunDefPlayerController.uc: 602

Blog 3: Day Seven

Hey all,

The Dungeon Defense team has covered so much ground since the last blog entry, it’s almost daunting to figure out where to begin writing about it! But let’s take an overview of what we’ve achieved over the last couple days since the previous post, and then I’ll go into more detail on each one of these topics:

  • Added rigid body “Mana Tokens” that enemies drop which are vacuum-attracted by nearby Players – these are the expendable resource of the game used to summon towers and cast other spells.
  • Added a system to upgrade your “Magic Staff” weapon through a series of archetypes in an editable array of structs (a data-driven system)
  • Added split-screen support and dynamic local-player joining.
  • Added a custom UI class to support an Editor-driven animation system (building on top of Epic’s existing UI animation infrastructure)
  • Added a bunch of functional placeholder UI scenes: a main menu, pause menu, game over UI, individual Player HUDs, shared global game info HUD, and a loading screen.
  • Set up our game logic to support asynchronous loading (“Seamless Travel”), so that we can show an animated transition screen while the level loads in the background.
  • Added a new character animation node (a BlendBySpeed variant, with an option to specify which Physics States are considered “movement” as well as cap the speed multiplier), and upper-body blending support to our player-character Animation Tree.
  • AI improvements: made the AIs stop pathfinding when they’ve determined they have a direct line-of-sight to the target, made them periodically re-evaluate their ideal target, and put in some fail-safes to detect if they get “stuck” and attempt to move back onto the navigation system.
  • Added a bunch of Kismet actions to support a full “Tower Defense” gameplay cycle, among others:
  1. A cool latent (meaning “over time”) ‘Wave Spawner’ action that will spawn enemies over time from an arbitrary array of structs representing groups of enemies, with appropriate output links for when the player has killed a particular wave.
  2. An action to dynamically scale the number of enemies and intervals between waves, so that the game can get procedurally more difficult over time.
  3. Various actions to open UI’s while passing in custom info.
  4. An event to detect a “lose condition” (core destruction) shortly before the Core actually dies, so that I can trigger a cutscene early in this case.

So let’s take a closer look at some of the topics above, starting with those Rigid Body Mana Tokens.

This is pretty simple: by inheriting from the Epic-provided KActorSpawnable class, I took advantage of a class that is already set up to apply rigid-body physics to an Actor based on its StaticMesh Component (which has convex collision set on it).(1)

In my child class’ default properties, I simply overwrote its “bWakeOnLevelStart=true” (so that it immediately drops), and set its bBlockActors to false so that the player can actually move through the object without getting stuck. I gave this ‘ManaToken’ a tiny gem-like Static Mesh (in its Archetype), and then had my “DunDefEnemy” spawn a variable number of these (from Archetype) in its Died() function.(2) I also applied a scaled VRand() (random direction vector) impulse to each dropped token, to get them to fly outwards from the enemy. I check for nearby Mana Tokens from within the Player, and “collect” them if any are found (namely, destroy the token and add its ‘mana’ value to our Player Controller’s total).(3) Finally, in order to avoid the necessity of actually touching each token to collect it, I added a periodic OverlappingActors test in the Player (not in every token!) to find all nearby tokens, and flag them to apply a Force towards the player to suck them in. I also apply a slight inverse force when a token’s velocity is not in the direction of the player, which basically acts as an “anisotropic friction” that helps get the tokens to the player faster.(4) All in all, it resulted in a satisfying vacuum effect as the tokens started flying around.
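The attraction logic, sketched from the Player’s side (DunDefManaToken, TokenAttractRadius, and the force scales are illustrative stand-ins):

function AttractNearbyTokens()
{
   local DunDefManaToken Token;
   local vector ToPlayer;

   foreach OverlappingActors(class'DunDefManaToken', Token, TokenAttractRadius)
   {
      ToPlayer = Normal(Location - Token.Location);

      // Pull the token toward the player...
      Token.CollisionComponent.AddForce(ToPlayer * AttractForce);

      // ...and damp velocity that isn't heading our way ("anisotropic friction").
      if ((Normal(Token.Velocity) dot ToPlayer) < 0)
         Token.CollisionComponent.AddForce(-Token.Velocity * FrictionScale);
   }
}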

ddblog3-1.jpg

Now, to support upgrading weapons during gameplay, I extended my “Summoning Tower” state in my PlayerController (which basically locked input and had the player character play a summoning animation).(5) I called this child state “UpgradingWeapon”, and simply had it play a different animation and visual effect by overriding a couple corresponding functions.(6) In this way, I was able to make use of all of the functionality of my original State while implementing just the new functionality I was interested in. State hierarchies are a super useful concept for gameplay programming, and one which, from a language standpoint, is pretty unique to UnrealScript! So I had my player entering a state to play a unique animation once I pressed the “Upgrade” button, but now I needed to actually do something with the weapon.

I added an array of structs called “Weapon Upgrade Entries”, which contains information about each upgrade level: the mana cost, a description, the time the upgrade takes, and most importantly the actual Weapon Archetype reference for the weapon which will be spawned and given to the player once the upgrade is complete.(7) Why did I use a struct (which solely contains values) and not a class? Well, structs can be created dynamically within the Editor’s property editor, and thus I could set up my ‘Weapon Upgrade Entries’ values within the Editor, and keep the whole system data-driven.

Next I added an “enum” containing an entry for each supported upgrade level (up to 5). Each time the player upgrades, I simply pick the next enum value (current enum + 1) and use it as an index into the “Weapon Upgrade Entry” struct array. Within my PlayerController I then simply wait in the “Upgrading Weapon” state (playing the looping upgrade animation) for as long as the upgrade struct entry specifies, and once that time has expired I spawn the archetype for the new weapon (and, heh, destroy the old one).(8) It all worked well, and the fact that all of the values are contained in the PlayerController Archetype’s “Weapon Upgrade Entries” struct array means that iteration to fine-tune the associated weapon-upgrade costs and times can be done in the Editor, in real-time via the Remote Control. Now that’s efficiency for ya!
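
Sketched out, the data-driven upgrade table might look like this. The struct field names, enum values, and TryStartWeaponUpgrade() helper are illustrative, not the actual names from DunDefPlayerController.uc:

```unrealscript
// Hedged sketch of the upgrade data; names are illustrative.
struct WeaponUpgradeEntry
{
    var() int ManaCost;
    var() string Description;
    var() float UpgradeTime;              // seconds spent in the UpgradingWeapon state
    var() DunDefWeapon WeaponArchetype;   // weapon spawned when the upgrade completes
};

enum EWeaponUpgradeLevel
{
    WUL_Level0,
    WUL_Level1,
    WUL_Level2,
    WUL_Level3,
    WUL_Level4,
    WUL_Level5
};

var() array<WeaponUpgradeEntry> WeaponUpgradeEntries;  // edited in the Archetype
var EWeaponUpgradeLevel CurrentUpgradeLevel;

function bool TryStartWeaponUpgrade()
{
    local int NextLevel;

    // The next enum value doubles as the index into the struct array.
    NextLevel = int(CurrentUpgradeLevel) + 1;
    if (NextLevel >= WeaponUpgradeEntries.Length)
        return false;  // already fully upgraded

    CurrentUpgradeLevel = EWeaponUpgradeLevel(NextLevel);
    GotoState('UpgradingWeapon');  // waits UpgradeTime, then spawns WeaponArchetype
    return true;
}
```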

ddblog3-2.jpg

I also wanted to support split-screen, because if the game is starting to get really fun by one’s self, it should be 4x as fun with Four Players! (or something like that ;)

Supporting split-screen multiplayer is really simple, again thanks to the powerful framework that Epic has provided. I just needed to handle the “Press Start” input for any controller that doesn’t yet have a player associated with it, and then call the “CreatePlayer” function with that new controller ID. I handled the “Press Start” input for gamepads that don’t have players in a subclass of Input, in its InputKey function. When the player presses Start on the gamepad, this key name is passed to the InputKey function, and there I call CreatePlayer with the corresponding ControllerID.(9) I added this new Input class to the “Interaction” list of my ViewportClient class using the InsertInteraction() function, and that was it.(10) Player #2 pressed Start, and in popped a second PlayerController and associated player Pawn – and the viewports automatically split accordingly (if you don’t want split-screen, you can override the UpdateActiveSplitscreenType()(11) function in your ViewportClient class – in which case the first player’s camera perspective is what gets drawn). What was once a single-player experience can now dynamically be enjoyed by multiple local players! Online multiplayer takes more doing, using the Actor Replication system – though not much more, thanks to the existing framework Epic has provided – and we’ll cover that in subsequent installments.
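
A minimal sketch of that Input subclass follows. The class name matches the blog’s file references, but the body is my own illustration, assuming the Start button key name used on Xbox-style gamepads:

```unrealscript
// Hedged sketch; the real version lives in DunDefViewportInput.uc.
class DunDefViewportInput extends Input;

function bool InputKey(int ControllerId, name Key, EInputEvent Event,
                       optional float AmountDepressed = 1.f, optional bool bGamepad)
{
    local Engine Eng;
    local string Error;
    local int i;

    if (Key == 'XboxTypeS_Start' && Event == IE_Pressed)
    {
        Eng = class'Engine'.static.GetEngine();

        // Bail if a player already exists for this controller.
        for (i = 0; i < Eng.GamePlayers.Length; i++)
            if (Eng.GamePlayers[i].ControllerId == ControllerId)
                return false;

        // Otherwise sign this controller in; the viewports re-split automatically.
        Eng.GameViewport.CreatePlayer(ControllerId, Error, true);
        return true;
    }
    return false;
}
```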

ddblog3-3.jpg

Next, I wanted to tackle some basic functional User Interfaces for the game, so that it could start to function as a fully playable system, all the way from main menu to completed victory, and not just a single level. I took a look at the UI animation system, which was quite powerful but could only be edited through DefaultProperties. So making use of the power of UnrealScript, I wrapped the values of these UI animation classes into structs and made them editable within my extended UIScene class.(12) Upon activation of my custom UIScene, I copy these struct values into the dynamically-created UI Animation objects.(13) Thus I got the advantages of being able to edit and experiment with animation values in the Editor, while still using the existing UI animation system that Epic created.

With this new functionality in place, I created a bunch of simple placeholder UIs. Some of these, such as the Player HUD UI(14) (opened by my HUD class), are meant to be drawn in each player’s viewport, while others are global, full-screen, and not owned by any one player. I wrote some functions into my GameInfo class to display these global UIs directly based on the persistent game’s state (such as how much time is left in the build phase, how many enemies are left in the combat phase, etc).(15) I also created some decent little (placeholder) Open and Close animations for the UIs (tweaked from within the Editor).

ddblog3-4.jpg

Once I was satisfied, I decided that I wanted my loading UI to be animated(16), so that we have a nice jaunty transition from the main menu (which is actually a level that opens the main menu UI) to the gameplay level – a ‘never a dull moment’ kind of desire. This is possible using Epic’s SeamlessTravel functionality, which loads a level in the background while using another level as a temporary “transition” map. In my case, the transition map is what opens my Loading Screen UI scene – and that displays until the transition map is closed, once the destination level has been fully loaded in the background. All you have to do is call WorldInfo.SeamlessTravel(17), and the Transition Map specified in your INI will be entered while the final destination level is loaded in the background. Simple and powerful.

ddblog3-5.jpg

Of course, you also have the capability to do what is called “level streaming”: streaming in parts of a level dynamically while gameplay is ongoing (such as the interior of a building when you enter the first room), or unloading old, outdated parts (such as the exterior world when you enter a building interior). Particularly useful for large-world games, this is a different process, handled through Kismet and the World Editor itself, and thoroughly documented by Epic here: Level Streaming Guide

Next, I noticed that my characters’ movement animation rates could really benefit from being dynamically adjusted with their movement speeds. Epic already has an Animation Tree node for this, called BlendBySpeed, but I wanted to add a bit of functionality to it: I wanted mine to only scale for speed when the player was in specific physics states (namely, walking on the ground) and to have a maximum cap on the rate scale – so that if the player happened to move really quickly for some reason (such as by getting a large momentum boost from an explosion) the movement animations wouldn’t look wacky as a result. Thankfully this was simple: I just inherited my new animation node class from Epic’s own “AnimNodeScalePlayRate”, added a Tick function to it, and in that Tick function check the current speed of its Owner Skeletal Mesh’s Actor (doing the clamping and physics check that I was interested in).(18) I created a TickableAnimNode interface to support this new Tick function(19), and registered the node with my Pawn class in its OnBecomeRelevant() function (de-registering in OnCeaseRelevant()) so that the Pawn knows to Tick the node. Extending the Engine’s base classes with your own, and adding new functionality to them with UnrealScript, gets you the most power out of the framework – something that also becomes clear when you start adding Kismet functionality. Which is what I did next!
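
The node might look roughly like the following. The class name matches the blog’s file references, but the property names, the clamp logic details, and the SetRate() wrapper are illustrative assumptions:

```unrealscript
// Hedged sketch; TickableAnimNode is the custom interface mentioned above,
// and the property/helper names here are illustrative.
class DunDef_AnimNodeScaleRateBySpeed extends AnimNodeScalePlayRate
    implements(TickableAnimNode);

var() float SpeedToRateScale;   // converts velocity into a play-rate multiplier
var() float MaxRateScale;       // cap so explosion-launched pawns don't animate wildly

// Called by the owning Pawn each frame (it registered us in OnBecomeRelevant()).
function TickNode(float DeltaTime)
{
    local Actor OwnerActor;
    local float NewRate;

    if (SkelComponent == None || SkelComponent.Owner == None)
        return;
    OwnerActor = SkelComponent.Owner;

    // Only scale with speed while walking on the ground.
    if (OwnerActor.Physics == PHYS_Walking)
        NewRate = FMin(VSize(OwnerActor.Velocity) * SpeedToRateScale, MaxRateScale);
    else
        NewRate = 1.f;

    SetRate(NewRate);  // hypothetical wrapper around the parent node's rate property
}
```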

(I also added a CustomAnimation node that is filtered to only play on the character’s upper body, using the ‘AnimNodeBlendPerBone’ as its parent, set to filter from the ‘Spine’ bone upward. So my character can play reactive animations while still moving his legs independently.)(20)

So with local multiplayer, the functional UIs, and the basic resource and weapon-upgrading systems all taken care of, I wanted to put it all together into a playable “tower defense” game cycle, start to finish. This would require a bit of level scripting to do nicely (I could hard-code it, but that would be so lame, and not extendable to more gametypes and levels!). Therefore, I looked at using Kismet to drive my “Build and Combat” cycle, which is in essence: give the player some time to build (notifying him of this time via the UI), then spawn waves of enemies (notifying him how many via the UI), and then repeat the cycle while procedurally scaling the number/intervals of enemies and the time-to-build, so that the game gets more and more difficult – until it becomes essentially overwhelming. Yay! :)

To begin with, I wanted to use a latent Kismet action to spawn waves of enemies – an action that doesn’t complete/output immediately, but updates internally over time and only completes when some internal logic says so. I created my own “Enemy Wave Spawner” action (extending SeqAct_Latent to get the ‘Update()’ function(21)), which has an array of structs, each struct defining a wave of enemies that will appear at a certain time after the action has begun.(22) Only when all of these waves of enemies have been killed will the “Wave Spawner” Kismet action complete and activate its final output.(23)

Here’s where things got particularly interesting. While I could have just made the ‘Wave Entries’ struct array editable directly within the Action’s properties, I knew that I wanted to pass these ‘Wave Entries’ around between multiple spawners, procedurally scale their values, and have them be processed as information by the UI (“how many enemies to kill”)(24). So I decided to make a new Kismet Variable class, SeqVar_EnemyWaveEntries, which simply contains the struct within itself(25). THIS Kismet Variable object is what is taken as variable input into the Wave Spawner Kismet Action, which then copies the struct for its own use.(26)

By using a Kismet Variable object to wrap the Wave Entries struct, rather than just a directly editable value within the Wave Spawner action, I was able to pass the Wave Entries around Kismet visually. This enabled me to link the ‘Wave Entry’ Variables in Kismet to another action I wrote, ‘ScaleEnemyWave’, which takes a Wave Entry and a float as input, the float determining how much to scale the Wave’s enemy count and interval time by.(27) By altering these float scales with a ‘Multiply Float’ Kismet action after each combat cycle, I was able to make the game become procedurally more difficult each round. I plan on doing even more with this system soon, such as allowing the waves to have random archetype values (so you’re never quite sure what group of enemies you’ll face), and using a RandomFloat variable for the scales so that the quantity and pace of enemies spawned always varies slightly.
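
The scaling action could be sketched like this. The variable-link names (“Count Scale”, “Waves”) and the wave struct’s field names are illustrative assumptions, not the actual contents of DunDef_SeqAct_ScaleEnemyWave.uc:

```unrealscript
// Hedged sketch of ScaleEnemyWave; names are illustrative.
class DunDef_SeqAct_ScaleEnemyWave extends SequenceAction;

event Activated()
{
    local SeqVar_Float FloatVar;
    local DunDef_SeqVar_EnemyWaveEntries WaveVar;
    local float CountScale;
    local int i;

    CountScale = 1.f;

    // Read the scale factor from the linked Float variable...
    foreach LinkedVariables(class'SeqVar_Float', FloatVar, "Count Scale")
        CountScale = FloatVar.FloatValue;

    // ...then scale every entry in the linked Wave Entries variable in place.
    foreach LinkedVariables(class'DunDef_SeqVar_EnemyWaveEntries', WaveVar, "Waves")
    {
        for (i = 0; i < WaveVar.WaveEntries.Length; i++)
            WaveVar.WaveEntries[i].NumEnemies =
                Round(WaveVar.WaveEntries[i].NumEnemies * CountScale);
    }
}
```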

ddblog3-6.jpg

The bottom line is, thanks to Kismet the level can be balanced with Play-In-Editor iteration, and we can construct more unique sequences, such as adding additional events in between milestone wave numbers (say, every 5 waves you get to fight a Super Enemy – especially simple because our gameplay objects are Archetypes). It’s going to be a very fun designer-driven experience to tweak up the Build-Combat-Wave cycles through Kismet in the coming days. Until next time… keep on creating!

Blog 3: File References

Information discussed in this blog comes from the files listed below, which are part of the Dungeon Defense source code. Line numbers separated by a comma indicate multiple individual lines within the file. Line numbers separated by a hyphen indicate a range of lines within the file.

  1. DunDefManaToken.uc: 8
  2. DunDefEnemy.uc: 208
  3. DunDefPlayer.uc: 351
  4. DunDefManaToken.uc: 62
  5. DunDefPlayerController.uc: 812
  6. DunDefPlayerController.uc: 1266
  7. DunDefPlayerController.uc: 69
  8. DunDefPlayerController.uc: 1316-1320, 1280
  9. DunDefViewportInput.uc: 15
  10. DunDefViewportClient.uc: 474
  11. DunDefViewportClient.uc: 226
  12. DunDefUIScene.uc: 11
  13. DunDefUIScene.uc: 36
  14. DunDefHUD.uc: 27
  15. Main.uc: 223, 333, 482, 132
  16. Main.uc: 482
  17. Main.uc: 488
  18. DunDef_AnimNodeScaleRateBySpeed.uc: 17
  19. DunDefPawn.uc: 285
  20. DunDefPlayer.uc: 163
  21. DunDef_SeqAct_EnemyWaveSpawner.uc: 162
  22. DunDef_SeqAct_EnemyWaveSpawner.uc: 14
  23. DunDef_SeqAct_EnemyWaveSpawner.uc: 198, 231
  24. DunDef_SeqAct_OpenKillCountUI.uc: 31
  25. DunDef_SeqVar_EnemyWaveEntries.uc: 10
  26. DunDef_SeqAct_EnemyWaveSpawner.uc: 176
  27. DunDef_SeqAct_ScaleEnemyWave.uc: 53

Blog 4: Day Ten

Hello again my intrepid Unreal acolytes!

In the past several days since the previous blog, our little team has made a ton of progress, even through the holidays. Let me give you an overview, and then we’ll dive into some detail about each topic:

  • We have the first artwork (skeletal meshes) for our cool character designs! Yay for cheeky fantasy clichés! Now we just need to rig them for animation, and then we’ll be replacing the UT robots of doom -- which should sharpen the game’s own sense of style. In addition to further work on the environment, we also got our base “Mage Staff” weapon model, which works well even with the temporary visual effects.
  • I couldn’t help myself, and implemented some nifty Render Target usage on the main menu. Namely, the main menu now displays an animated image of each “player character” for 1-4 players (which will eventually be a color-swapped version of the main character), to indicate who is “signed in” to the game. You can press “Start” on any connected gamepad while in this menu and that player will become signed in for the subsequent gameplay, and this is reflected in an ‘active’ animation that these “render to texture” characters play in response (they are greyed-out ‘idle’ when unselected). Neat!
  • Further playing around with the main menu, I created a little Canvas particle system to emit particles from the cursor’s location. This system can be used for other UI effects as well down the line.
  • I got my hands dirty with Matinee and implemented gameplay-intro and game-over cinematics, setting up the appropriate input-blocking states on the Player Controllers so that they can’t move/shoot while in a cinematic. I also implemented a custom solution for cinematic skipping when a player presses Start/Escape.
  • I did a bunch of Player HUD work – implementing the whole thing, really – including: a Material-based Health/Progress Bar (custom UI Control), state-reactive Spell Icons, an animated Mana Token indicator, etc. I also implemented HUD overlays (dynamic Canvas draws) for floating health bars above the Towers/Core, as well as a rotating waypoint that points towards the Core when it is under attack. These all play correctly in 2-4 player splitscreen, woot.
  • Implemented impact decals for our weapon, including making use of Epic’s very-powerful “Material Instance Time Varying” (mouthful) system.
  • Implemented the basic functionality for a ranged-attack enemy to serve as our “Archer” unit. State inheritance made this a breeze, though I also made some AI tweaks to get exactly the behavior I wanted (including aim-prediction, deliberate inaccuracy, and something I like to call projectile direction “fudge-factor”).

So first up, let me tell you about these Render Targets on the main menu. I wanted a nice sign-in UI where you could see who was in the game, and everyone could get signed in at the main menu, so they’d be ready the moment gameplay begins (of course, you can also add a new player dynamically during gameplay just by pressing the Start button). I also wanted nifty 3D characters to visually reflect who was in the game. So I began by adding 4 Skeletal Meshes to my menu level’s environment, off in a distant void where only they would be rendered, with no background elements. I put a “SceneCapture2DActor” in front of each of them – a camera-like Actor that renders the scene into a Texture from its view – and assigned each of these to use a unique Render Target Texture.

ddblog4-1.jpg

Then, I created a Material that used one of these Textures. I then created a Material Instance Constant of this Material so that I could simply swap out the Texture parameter for the other Render Targets without having to create 4 unique base Materials. Material Instance Constants are literally “instances” of a Material that can each have unique “Parameters” (usually Scalar, Vector, or Texture parameters) to swap out values on a per-instance basis without having to each contain an entire copy of the Material. This allows you to alter such parameters dynamically while the game is running, for dynamically-reactive Material effects, and it also saves memory while making it easier to manage Material assets (you don’t need to copy a whole Material just to change a few values).

ddblog4-2.jpg

In the Material itself, I passed the Render Target texture to the Emissive output, but I also clipped the Opacity Mask by the “Green” background color that I set my camera to use – so that only the character would be visible in the resulting image, and not the background. That is to say, using the Material I’d only see my character’s mesh, pixel for pixel, and not a square image of the entire Render Target Texture. I also added a Material Scalar Parameter to control the brightness of the resulting image, so that I could dim the character when it was unselected. Each of the UI Image Controls I then added to my selection UI was given a unique MIC (for Players # 1 through 4).

Note that I actually made these character meshes use a custom class inheriting from Epic’s SkeletalMeshActorMAT (a good dynamic skeletal mesh actor class if you want to play animations on demand but don’t want a full Pawn). The reason I used a custom class is so that I could play blended animations easily through code(1) – it actually would have been possible to rig all this up using Kismet and Matinees, but that would have been more difficult to interface with the immediate player sign-in event.

In fact, rather than using a ‘UI Image’ control directly for the player character sign-in image, I created a custom control that extended ‘UI Image’. This allowed me to add the necessary functions to it for manipulating its corresponding Character Mesh and MICs in response to a player sign-in, rather than having to do it elsewhere.(2) Creating custom UI controls that inherit from Epic’s wealth of base classes is a very useful approach when you need to add specific functionality. In working on these custom UIs, I did this a lot :)

Anyhow, the last step was to capture the “Press Start” button event in code and create a new player in response to it. This was as simple as assigning my own function to my Player-Select UI scene class’ “OnRawInputKey” delegate (remember to use custom UI Scene Classes if you want them to have custom UnrealScript functionality!). Then in that function, I just check if the ‘InputKeyName’ parameter was one of the Start buttons that the game uses, and if so, I create a new player (using the ViewPortClient’s CreatePlayer function) with that Controller ID – if one doesn’t already exist for that controller ID, of course.(3) My Custom UI Scene then tells my custom Player-Select Image Controls to update themselves(4), and they each take care of updating their Material and playing the animation on their corresponding 3D character(5) if they find a local player corresponding to their specified index. Presto, animated on-screen characters that respond to local-player joining!
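
That sign-in flow could be sketched as follows. The class name matches the blog’s UI_CharacterSelect.uc reference, but the handler body, the assumed Start-button key name, and the PlayerExistsForController()/UpdatePlayerSelectImages() helpers are illustrative:

```unrealscript
// Hedged sketch of the UIScene sign-in handling; helper names are illustrative.
class UI_CharacterSelect extends UIScene;

event PostInitialize()
{
    Super.PostInitialize();
    // Route raw key events to our own handler.
    OnRawInputKey = HandleRawInputKey;
}

function bool HandleRawInputKey(const out InputEventParameters EventParms)
{
    local string Error;

    if (EventParms.EventType == IE_Pressed && EventParms.InputKeyName == 'XboxTypeS_Start')
    {
        // Create a player for this controller if one isn't signed in already.
        if (!PlayerExistsForController(EventParms.ControllerId))
        {
            class'Engine'.static.GetEngine().GameViewport.CreatePlayer(
                EventParms.ControllerId, Error, true);
            UpdatePlayerSelectImages();  // hypothetical: refresh the custom Image controls
            return true;
        }
    }
    return false;
}
```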

ddblog4-3.jpg

With this done, I couldn’t resist sprucing up the main menu a little bit more with some magic particles emitting from the cursor (which I’m going to replace with a custom texture soon…). For this, I overrode the PostRender() function in my game’s ViewPortClient class, and I check my game’s MapInfo to see if the Current World is flagged as a “menu” level. If it is a menu level, I then use Canvas.DrawMaterialTile to manually draw “particles” defined by an array of particle structs(6). Each struct contains info about a particle – its location, size, velocity, acceleration, and lifespan(7). I update the particles in the ViewPortClient’s Tick() function, incrementing their Position with their Velocity, applying their Acceleration, decreasing their lifespan, and resetting them when they expire.(8)
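
The bookkeeping for those canvas particles might look like this sketch. The struct layout, field names, and ResetParticleAtCursor() helper are illustrative, though the update-then-draw split mirrors the Tick()/PostRender() division described above:

```unrealscript
// Hedged sketch of the canvas particle system in the ViewportClient.
struct CanvasParticle
{
    var vector2d Position, Velocity, Acceleration;
    var float Size, Lifespan;
};
var array<CanvasParticle> CursorParticles;
var Material ParticleMaterial;

event Tick(float DeltaTime)
{
    local int i;

    for (i = 0; i < CursorParticles.Length; i++)
    {
        // Integrate acceleration into velocity, velocity into position.
        CursorParticles[i].Velocity.X += CursorParticles[i].Acceleration.X * DeltaTime;
        CursorParticles[i].Velocity.Y += CursorParticles[i].Acceleration.Y * DeltaTime;
        CursorParticles[i].Position.X += CursorParticles[i].Velocity.X * DeltaTime;
        CursorParticles[i].Position.Y += CursorParticles[i].Velocity.Y * DeltaTime;

        CursorParticles[i].Lifespan -= DeltaTime;
        if (CursorParticles[i].Lifespan <= 0.f)
            ResetParticleAtCursor(i);  // hypothetical: respawn at the cursor position
    }
}

event PostRender(Canvas Canvas)
{
    local int i;

    // Manually draw each "particle" as a material tile.
    for (i = 0; i < CursorParticles.Length; i++)
    {
        Canvas.SetPos(CursorParticles[i].Position.X, CursorParticles[i].Position.Y);
        Canvas.DrawMaterialTile(ParticleMaterial,
                                CursorParticles[i].Size, CursorParticles[i].Size);
    }
}
```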

Later I’m planning on extending the Canvas Particle System framework to support predefined “defaultproperty” animation values, so that I can have UI particles that respond to input events, etc., to make clicking through the menus that much more satisfying. I like how Epic’s framework is flexible enough to support custom solutions to things like this – not only does UE3 have such powerful tools, but when I want to “roll my own” and create my own little particle system method, I can use the flexibility of UnrealScript to do so!

Moving back into gameplay, I wanted to make the beginning of the level more exciting, and provide the player with an overview of the environment he was tasked with defending. Matinee would be the obvious tool for creating such cinematics, and thanks to the framework this was really simple to get going. Unreal’s Player Controller class will by default apply the view of any Matinee that contains a “Director” track (considered to be a ‘cinematic’ matinee, as opposed to just some scripted gameplay sequence).

However, you do have to block input correctly to prevent the player from walking around during the cinematic (and maybe you also want to hide his character if it’s not pertinent to the sequence). This is as simple as overriding the “NotifyDirectorControl” event in your PlayerController, which has a parameter indicating whether the player is entering or leaving the cinematic sequence – if he’s entering director control, I push a state called “InCinematic” onto the player controller, which ignores most input functions (and can optionally hide the Controller’s Pawn Actor), and when he’s leaving director control, I pop that state. And there we go – no more moving around during the “cutscene”. (9)

However, I also wanted the player to be able to skip the cinematic (‘games with unskippable cutscenes’ – the bane of gamers everywhere!). To achieve this, I overrode the “StartFire” exec function (and an input function that corresponds to pressing Escape in my game). Instead of shooting the weapon (which would be pointless during a cinematic, of course), I iterate through all SeqAct_Interp actions (matinees) in the level’s Kismet, find the one whose Director track is currently controlling our Player Controller, and then set that Matinee’s PlayRate to 10000, which causes it to end in a single frame (restoring the PlayRate to its original value afterwards).(10) Thus, instantly ended :)
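
Put together, the blocking and skipping might be sketched like so. NotifyDirectorControl, PushState/PopState, and SeqAct_Interp’s PlayRate are real engine features; the state name and the IsDirectingOurView() check are illustrative stand-ins for the real code in DunDefPlayerController.uc:

```unrealscript
// Hedged sketch of cinematic input-blocking and skipping in the PlayerController.
event NotifyDirectorControl(bool bNowViewing)
{
    Super.NotifyDirectorControl(bNowViewing);
    if (bNowViewing)
        PushState('InCinematic');
    else
        PopState();
}

state InCinematic
{
    // Movement/firing input is ignored in this state; StartFire is
    // repurposed to skip the cinematic instead.
    exec function StartFire(optional byte FireModeNum)
    {
        SkipCurrentCinematic();
    }
}

function SkipCurrentCinematic()
{
    local array<SequenceObject> InterpActions;
    local SeqAct_Interp InterpAct;
    local int i;

    WorldInfo.GetGameSequence().FindSeqObjectsByClass(class'SeqAct_Interp', true, InterpActions);
    for (i = 0; i < InterpActions.Length; i++)
    {
        InterpAct = SeqAct_Interp(InterpActions[i]);
        // IsDirectingOurView is a hypothetical check for "this matinee's
        // Director track currently controls our camera".
        if (InterpAct.bIsPlaying && IsDirectingOurView(InterpAct))
        {
            InterpAct.PlayRate = 10000.f;  // ends in one frame; restore it afterwards
            break;
        }
    }
}
```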

With my game now using some nice cinematics, it was time to focus on the task of improving the player’s HUD – namely, moving it from some DrawText calls in my HUD class’ PostRender, to an actual UI Scene that would employ animated graphics in response to gameplay events. The first thing I wanted on my new uber-HUD was a good progress-bar style graphic for indicating Health percent and spell-casting time. I decided to roll my own, using a Material that I cooked up which masks a progress-layer corresponding to a “Percent” scalar parameter (that is, you tell the Material Instance what percent you want, and it only shows “that much” of the progress-bar image within an overall frame). Because this was a Material, I could use it both in a UI Control within a UI Scene (for the Player HUD)(11), AND for dynamic Canvas overlays using DrawMaterialTile(12) (for floating health indicators on top of my tower actors). Sweetness.

The only thing to be careful about is that when you use a Material Instance Constant for such a purpose, you must create a UNIQUE Material Instance Constant for each bar you want to draw (with the original MIC as each one’s “Parent”). Otherwise, when you set an MIC Parameter like my “Percent” value, it would affect each of the bars (because they’d all be sharing the same MIC). So in my case I simply had each UI HealthBar instance(13) (or Actor which implemented the “Damageable” interface(14)) initialize its own “new” MaterialInstanceConstant, and set the original MIC as its Parent. Thus I got a health/progress bar system flexible enough to be used in UI Scenes and in floating HUDs, as well as support nifty Material animation so that it looks “magical” (i.e. it has multiple scrolling blended layers, haha).
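
The per-bar initialization amounts to a couple of lines. This is a hedged sketch with illustrative variable names; the key point is that each bar news its own MaterialInstanceConstant parented to the shared template:

```unrealscript
// Hedged sketch: one private MIC per bar, so setting "Percent" only
// affects this bar and not every bar sharing the template.
var MaterialInstanceConstant BarTemplateMIC;  // the shared progress-bar MIC
var MaterialInstanceConstant MyBarMIC;        // this bar's private instance

function InitProgressBar()
{
    MyBarMIC = new(self) class'MaterialInstanceConstant';
    MyBarMIC.SetParent(BarTemplateMIC);
}

function SetBarPercent(float Percent)
{
    MyBarMIC.SetScalarParameterValue('Percent', Percent);
}
```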

Next I wanted Spell Icons that update their visual state according to what the player is currently doing – indicating states such as whether the spell can be afforded, is currently applicable, is in the process of being cast, or can be cast. I also wanted these spell icons to dynamically display the input button corresponding to casting that spell. For this I inherited my “Spell Icon” from UI Image, and added a couple of parameters so that it could contain references to two extra UI Labels: one for displaying how much “mana” the spell costs, and another for displaying the text button (or gamepad button image-icon) to press to use that spell.

The “Spell Icon” control would update upon every Player State change, query the Player Controller for its corresponding spell’s usability(15), and based on the result would update its visuals (color/opacity) to indicate each unique state, as well as the values within the corresponding labels when necessary(16). Two things made this easy: (1) That a UI Control can contain a reference to other UI Controls within the Scene(17) and (2) that a UI Control knows about its Parent UI Scene, and the Parent UI Scene knows about its Player Owner. Therefore, the results could be specific to the player which owns the UI HUD, without needing to add any extra references to get at this information.

I also found that all graphics aligned nicely in split-screen when I used the “Percentage Owner” positioning calculation, along with “Scale Height” Aspect Ratio size adjustment option for images that needed to retain their original aspect ratio (such as button icons!). This split-screen verification was made really easy by the ability to toggle between all split-screen UI view modes directly within the UI editor – powerful as heck. Finally, I made further use of the “UI Animation” system that I exposed to the Editor a few days ago, and created animation sequences for each of these controls to react to state changes – so they bounce around to reflect changes in state (such as when they become disabled or re-enabled), and the color changes occur over time, rather than being instantaneous. This added a nice subtle sense of feedback, and I enjoyed the process as it was entirely data-driven iteration within the UI Editor/PIE.

ddblog4-4.jpg

Finally, I wanted the player to know when the “Lair Core” was under attack (defending it is your primary objective!), and correspondingly display a floating HUD waypoint indicator pointing towards the Core. For this I implemented my “HUD Overlay” interface on the Crystal Core (my Player HUD, in its PostRender, iterates through all dynamic Actors that implement this interface, and calls a DrawOverlay function on them). In its newly-implemented DrawOverlay function, the Crystal Core checks whether it was attacked recently. If so, the Core’s DrawOverlay calls DrawRotatedMaterialTile() with a waypoint icon, and the Rotation is calculated as the direction (from the canvas center) towards the projected screen-position of the Core. And for the position passed to “Rotated Material Tile”, I calculated an offset from the center of the screen in the direction of that Rotation – thus the waypoint moves in a circle around the center, similar to a compass.(18) Since I used a Material (rather than just a Texture), I was able to animate the Waypoint Icon, in the Material, to bounce around and flash, so that it draws attention!
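
The compass math could be sketched roughly as below. This is not the shipped DunDefCrystalCore.uc code: the variable names, the “recently attacked” timing check, and the exact layout constants are illustrative assumptions:

```unrealscript
// Hedged sketch of the waypoint overlay; names and constants are illustrative.
var Material WaypointMaterial;
var float WaypointRadius, IconSize, LastDamagedTime;

function DrawOverlay(HUD H)
{
    local vector ScreenPos, Dir;
    local rotator IconRot;

    // Only show the waypoint if the Core was attacked recently (illustrative window).
    if (WorldInfo.TimeSeconds - LastDamagedTime > 3.f)
        return;

    // Direction from the canvas center towards the Core's projected position.
    ScreenPos = H.Canvas.Project(Location);
    Dir.X = ScreenPos.X - H.Canvas.ClipX * 0.5;
    Dir.Y = ScreenPos.Y - H.Canvas.ClipY * 0.5;
    Dir = Normal(Dir);

    IconRot.Yaw = Atan(Dir.Y, Dir.X) * RadToUnrRot;

    // Offset from the center along that direction: the icon orbits like a compass needle.
    H.Canvas.SetPos(H.Canvas.ClipX * 0.5 + Dir.X * WaypointRadius - IconSize * 0.5,
                    H.Canvas.ClipY * 0.5 + Dir.Y * WaypointRadius - IconSize * 0.5);
    H.Canvas.DrawRotatedMaterialTile(WaypointMaterial, IconRot, IconSize, IconSize);
}
```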

Next I wanted to add some more “Oomf” to the impact of the Magic Staff’s projectiles. I already had a nice particle emitter (with a light component) spawning at the impact point, so what more could I add? Decals, baby, decals! And not just any ol’ static decals either – no, I wanted to make use of Epic’s uber-powerful and uber-named “Material Instance Time Varying” system so that these decals would artistically animate.

Spawning a decal upon the Projectile Impact was simple: in my projectile’s “Explode” function, I simply called WorldInfo.MyDecalManager.SpawnDecal(), passing in the HitLocation and negative HitNormal that my projectile collided with, along with corresponding values for decal size and lifespan. Of course, into SpawnDecal() I also pass a “Material Interface” from my Projectile Archetype – this generic reference can be either a static “Decal Material” or a dynamic “MITV”. In the case that it’s an MITV (which I check using GenericMaterialReference.IsA(‘MaterialInstanceTimeVarying’)), I create a “new” MaterialInstanceTimeVarying from it. This is necessary so that it can animate uniquely for each decal that I spawn. I set the MITV’s duration to GetMaxDurationFromAllParameters() (so that it lasts as long as its Material animation tells it to), pass it to the SpawnDecal() function, and boom – animated decals! (19)
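
A condensed sketch of that decal path follows. GetMaxDurationFromAllParameters() and WorldInfo.MyDecalManager are from the blog itself; the variable names and the SpawnDecal() parameter list shown here are illustrative (I’ve abbreviated the optional arguments):

```unrealscript
// Hedged sketch of the projectile's impact-decal logic; names are illustrative.
var MaterialInterface ImpactDecalMaterial;   // static decal Material OR an MITV template
var float DecalWidth, DecalHeight, DecalThickness;

simulated function SpawnImpactDecal(vector HitLocation, vector HitNormal)
{
    local MaterialInstanceTimeVarying MITV;
    local MaterialInterface DecalMat;

    if (ImpactDecalMaterial == None)
        return;

    DecalMat = ImpactDecalMaterial;
    if (ImpactDecalMaterial.IsA('MaterialInstanceTimeVarying'))
    {
        // Each decal needs its own MITV so its animation runs independently.
        MITV = new(self) class'MaterialInstanceTimeVarying';
        MITV.SetParent(ImpactDecalMaterial);
        MITV.SetDuration(MITV.GetMaxDurationFromAllParameters());
        DecalMat = MITV;
    }

    // Orient the decal into the surface via the negative hit normal.
    WorldInfo.MyDecalManager.SpawnDecal(DecalMat, HitLocation, rotator(-HitNormal),
                                        DecalWidth, DecalHeight, DecalThickness, false);
}
```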

ddblog4-5.jpg

MITVs allow for easy artist-driven fading decals (just add some keyframes to interpolate an ‘Opacity’ parameter from 1 to 0 over some seconds), but they can be used for so much more: e.g. bullet holes that glow red hot before fading to black, or non-decal usage such as pulsating materials on a complex loop, or even materials that animate dynamically in response to gameplay events. Once I set up the simple code to spawn a decal in response to projectile impact, the system can be driven by artists to do pretty much whatever they can imagine. I’m looking forward to Dungeon Defense’s VFX artist having some fun with this in the days ahead.

Finally, I got back to implementing more core gameplay content, in this case adding a ranged-attack enemy to complement my melee dudes. The “Ranged” enemy pawn itself was nearly an empty class inheriting from my base enemy pawn, but its AI Controller had a bit of unique functionality. In addition to having a large “attack range” at which it stops to shoot projectiles (of course), I extended the generic “Attacking” state from my base Enemy AI Controller to, in this case, spawn a ranged projectile at the correct time during an attack (specifically, upon an animation event passed up from the Pawn when playing the shoot animation).(20) I liked the fact that my generic Attacking state handles the more abstract logic – such as popping the state if the enemy takes damage while attacking, and setting the “Last Attacked Time” value for interval checking – while the specific “RangedAttacking” state does the specific act of firing the projectile (whereas my Melee enemy’s “MeleeAttacking” state, mentioned in a previous blog post, does a melee trace).

State inheritance, which I also used extensively for the player’s spell-casting system, allows you to put more general functionality in abstract states, and then implement child versions of these states that are more specific to individual cases. As I’ve mentioned before, this is such a powerful system that is almost entirely exclusive to UnrealScript’s language design, and can really boost productivity and enhance code design. In any case, for actually spawning the projectile, I did a bit of fun math that predicts where the enemy should aim his shot considering the player’s current velocity(21) (OK I believe Epic’s own SuggestTossVelocity function does this as well, but I wanted to brush up on my Kinematics 101 myself!).

However, I didn’t want this enemy’s aim to be perfect – that’d be no fun – so I added a tiny bit of deliberate error to the shoot angle (transforming the shoot-target position by a small random rotator).(22) Finally, I clamped the projectile shoot direction to a 15-degree angle limit around the Pawn’s current Rotation, so that there wouldn’t be any cheap shots where the enemy shoots projectiles sideways at you.(23) I allowed those 15 degrees of freedom, however, as my “projectile fudge factor”, so the enemies don’t have to be perfectly oriented towards you in order to still pose a threat with decent aim. It works well, and “shhhh”, no one will notice. :)
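
The whole aim pipeline – prediction, inaccuracy, and the cone clamp – might be sketched like this. The constants, the ClampDirToCone() helper, and the exact way error is applied are illustrative assumptions, not the real DunDefDarkElfController.uc code:

```unrealscript
// Hedged sketch of the ranged AI's aim; helper names and constants are illustrative.
var float ProjectileSpeed;
var float InaccuracyDegrees;
var float MaxShootConeDegrees;  // the 15-degree "fudge factor"

function vector GetRangedShootDirection()
{
    local vector PredictedTarget, ShootDir, Facing;
    local rotator ErrorRot;
    local float TravelTime;

    // Kinematics 101: lead the target by its velocity over the projectile travel time.
    TravelTime = VSize(Enemy.Location - Pawn.Location) / ProjectileSpeed;
    PredictedTarget = Enemy.Location + Enemy.Velocity * TravelTime;
    ShootDir = Normal(PredictedTarget - Pawn.GetPawnViewLocation());

    // Deliberate inaccuracy: rotate the aim by a small random rotator.
    ErrorRot.Yaw   = (FRand() - 0.5) * 2.f * InaccuracyDegrees * DegToUnrRot;
    ErrorRot.Pitch = (FRand() - 0.5) * 2.f * InaccuracyDegrees * DegToUnrRot;
    ShootDir = Normal(ShootDir >> ErrorRot);

    // Clamp to a cone around the pawn's facing: no sideways cheap shots.
    Facing = vector(Pawn.Rotation);
    if ((Facing Dot ShootDir) < Cos(MaxShootConeDegrees * DegToRad))
        ShootDir = ClampDirToCone(ShootDir, Facing, MaxShootConeDegrees);  // hypothetical helper

    return ShootDir;
}
```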

So folks, that about sums up the past couple days. We’re making rapid (ok – insanely rapid) game development progress on the back of the powerful Unreal tech, and soon we’re going to have a nice demo for y’all. We’ll keep you posted, and ‘till next time… keep on developing!

Blog 4: File References

Information discussed in this blog comes from the files listed below, which are part of the Dungeon Defense source code. Line numbers separated by a comma indicate multiple individual lines within the file. Line numbers separated by a hyphen indicate a range of lines within the file.

  1. DunDefPlayerSelectUICharacter.uc: 24
  2. UIImage_PlayerSelect.uc: 38, 48
  3. UI_CharacterSelect.uc: 97
  4. UI_CharacterSelect.uc: 65
  5. UIImage_PlayerSelect.uc: 38, 48
  6. DunDefViewportClient.uc: 108
  7. DunDefViewportClient.uc: 14
  8. DunDefViewportClient.uc: 122
  9. DunDefPlayerController.uc: 1713, 1726
  10. DunDefPlayerController.uc: 1832, 1809
  11. UIImage_HealthBar.uc: 55
  12. DunDefDamageableTarget.uc: 137
  13. UIImage_HealthBar.uc: 16
  14. DunDefDamageableTarget.uc: 154
  15. UIImage_SpellIcon.uc: 81
  16. UIImage_SpellIcon.uc: 90
  17. UIImage_SpellIcon.uc: 11-14
  18. DunDefCrystalCore.uc: 41-69
  19. DunDefProjectile.uc: 78-103
  20. DunDefDarkElfController.uc: 85-107, DunDefDarkElf.uc: 56
  21. DunDefDarkElfController.uc: 31, 71
  22. DunDefDarkElfController.uc: 62, 63, 71
  23. DunDefDarkElfController.uc: 74

Blog 5: Day Thirteen

Hey everyone,

It’s been a busy couple days of development! We got our 2-week milestone build prepared, cooked it with the Frontend tool, and packaged it for release to our army of ace testers (ok, our friends and family :)). With the mechanics firmly in place, we could finally sit back and enjoy the gameplay, while taking some notes for difficulty tuning. It’s looking good and it’s a pretty fun little mini-game, and what’s most exciting for me is that you’ll get to play it yourself in a couple weeks!

On the implementation side of things, here’s an overview of the major areas of progression, most related to getting that ‘milestone’ ready:

  • Added some more Kismet functionality: a latent action to toggle arbitrary postprocess effects with parameter interpolation (so you can get a smooth fade-in/out of an effect any time you want). I also added an action to scale any Kismet float variable by the number of players in the game (with an array of multipliers for the number of players) – this was important for multiplayer difficulty balancing, to ensure that there are more enemies and less time between waves when more players are present, but not necessarily on a fixed linear scale.
  • I added a new Tower type: the “Blockade”, which acts as a barricade that enemies are prone to either destroy or navigate around (using some rudimentary dynamic pathfinding). I also created an Attacker Registration system so that any targetable object can specify how many enemies are allowed to attack it at once (the rest will move on and look for other targets of opportunity). Thus when enemies encounter a Blockade, a few of them will attack it to bring it down, while the rest will proceed to navigate around it!
  • Added a game-optional Friendly Fire check, which UT already had, but since I’m not using the UT classes I wrote my own. I also added some additional logic to the player-start location choosing: the game now cycles through all valid start locations for each newly-joining player, so that 4 instantly-joining players are guaranteed to get their unique start spots.
  • I modified the UI Skin to add in my own ‘awesome’ buttons and fonts, and I added music and SFX to all of the game sequences and menus. For the SFX, I added an AudioComponent to my standard Effect Actor, so that all of my spawned Visual Effects instantly supported audio (and given that they were all archetypes already, this made adding audio to most of the game’s events a breeze – the remaining few sound effects were generally tied to Animations via the Anim Sequence Browser).
  • I cooked and packaged the thing, baked like a cake.

So now that you’ve got the big picture, let’s review each of these aspects in more detail.

Kismet Functionality

One of the first things I noticed in multiplayer was that the game was significantly easier – too easy, in fact, as having 4 players made it much simpler to take on larger groups of enemies. So in order to maintain a reasonably consistent level of difficulty in different-sized games, I decided to scale the number of enemies and decrease the time between wave groups based on the number of current players. However, I didn’t just want a linear scale; I wanted custom scales that I could tweak per wave and per number of players, which would ensure that I could find the ideal “magic numbers” in due time. Therefore I wrote a Sequence Action to “Scale Float for Player Count”, which has a user-defined dynamic array of floats that are values to scale by (each index in the array corresponding to a number of players minus 1).(1) The action also takes as input the Float Sequence Variable that will actually be scaled. Once activated, the Action uses the value of GetNumPlayers() (from the Current World’s GameInfo), and then looks up the corresponding scaling-float-array index (clamping the index to the length of the array minus 1). It multiplies the input float variable’s value by this scaling value, and then activates the Kismet output link.(2) Done, and it worked like a charm once rigged up to my master Kismet variables that controlled the number of enemies and time between group-spawns; now I could actually balance the game for any number of players :)
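
Here’s a minimal sketch of what such a sequence action could look like; the class name and link name are assumptions, not the shipping DunDef_SeqAct_ScaleFloatForPlayerCount source:

```unrealscript
class SeqAct_ScaleFloatSketch extends SequenceAction;

// One multiplier per player count (index 0 = 1 player, index 1 = 2 players...).
var() array<float> PlayerCountScales;

event Activated()
{
    local SeqVar_Float FloatVar;
    local int Idx;

    if (PlayerCountScales.Length == 0)
        return;

    // Look up the scale for the current player count, clamping the
    // index to the last entry of the array.
    Idx = Clamp(GetWorldInfo().Game.GetNumPlayers() - 1, 0,
                PlayerCountScales.Length - 1);

    // Scale every attached float variable in place.
    foreach LinkedVariables(class'SeqVar_Float', FloatVar, "Float To Scale")
    {
        FloatVar.FloatValue *= PlayerCountScales[Idx];
    }
}

defaultproperties
{
    ObjName="Scale Float For Player Count (Sketch)"
    VariableLinks(0)=(ExpectedType=class'SeqVar_Float',LinkDesc="Float To Scale",bWriteable=true)
}
```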

Also, I noticed on the forum that someone was wondering how to alter arbitrary PostProcess Materials based on volumes. PostProcessVolumes currently only support the pre-created Epic PostProcess Effects, but it is possible to add control over arbitrary user Material Effects through a bit of Kismet (or by creating your own custom Volume class and having your PlayerController call an event on it when it is touched/untouched). I chose the Kismet route, because I wanted to toggle my custom PostProcess effects based on arbitrary game criteria; in my case, whenever a cinematic was playing I wanted to activate a film grain material effect I created. There were a couple steps to achieving this, and the simplest approach was to turn the effect on and off immediately via the bShowInGame property on a PostProcessEffect. To get at the PostProcessEffect object, I iterated through all LocalPlayerControllers in the Current World Info (or used the Instigator if one was passed in), and used their LocalPlayer’s PlayerPostProcess chain to Find the PostProcessEffect that I was looking for by name. Once found, I toggle bShowInGame as desired.(3)
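
A sketch of that lookup, assuming the standard UE3 LocalPlayer / PostProcessChain API; the helper function itself and its name are illustrative:

```unrealscript
// Assumed to live on the Kismet action (hence GetWorldInfo()).
function TogglePPEffectByName(name EffectName, bool bShow)
{
    local PlayerController PC;
    local LocalPlayer LP;
    local PostProcessEffect Effect;

    foreach GetWorldInfo().AllControllers(class'PlayerController', PC)
    {
        LP = LocalPlayer(PC.Player);
        if (LP == None)
            continue; // not a local player (e.g. a remote client)

        // Search this player's post-process chain for the named effect.
        Effect = LP.PlayerPostProcess.FindPostProcessEffect(EffectName);
        if (Effect != None)
        {
            // Immediate on/off toggle of the effect.
            Effect.bShowInGame = bShow;
        }
    }
}
```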

However, once I achieved this, I decided I wanted to actually support interpolating some of the effect’s values up & down over time, to create a smooth transition of enabling/disabling, rather than just an immediate on/off. This was a little more challenging, but still entirely do-able. The first step was to make the action a SeqAct_Latent, rather than just a SequenceAction, so that it would get an Update() function to adjust the postprocess values over time.(4) Then, I opted to store the PostProcessEffects that were currently being “interpolated” in one of two arrays: a “FadeUp” array for effects that were in the process of being shown, and a “FadeDown” array for effects in the process of being hidden. In the Update function, I simply iterated through each of these arrays, and added or subtracted to named Material Scalar Parameters each frame until the target scalar value was reached, at which point I remove the PostProcessEffect entry (actually a struct also containing the interpolation values) from the Up/Down array.(5) Once there are no more entries in either array, I return false in the Update() function, which completes the latent action.(6) This worked nicely and got the smooth fade-up/fade-down that I wanted, but there were two special cases I had to handle:

  1. If a PostProcessEffect is already on a FadeUp or FadeDown array when attempting to add a FadeUp/Down for that effect, then it needs to be removed from the old array before being added anew. Otherwise we’d have two interpolations proceeding at the same time, which would mess things up. This can occur when rapidly triggering FadeUp/FadeDown inputs on the Kismet action (say, when dancing around the edge of a trigger volume that is linked to the action via Touched/Untouched).(7)
  2. For proper split-screen support, the Action does support taking an Instigator and only altering its PlayerController.LocalPlayer’s effects (as an option, it can also alter everyone’s). However, Material Effects need unique MIC’s in order for them to have unique parameter values, and this is not currently possible to specify directly within the Postprocess Chain Editor itself. Therefore, I wrote a function to check if the Material Effect’s Material is unique (this is achieved by checking that current Material on the Effect is either not an MIC, or an MIC whose Parent is not an MIC – either case means, for me, that it has not been made unique yet). If it is not unique, I make a new MIC for it, set its Parent to the original Material, and apply that new MIC to the Material Effect.(8) Thus we get proper split-screen unique PostProcess alteration. Phew!
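
The fade bookkeeping described above might look something like this fragment of a SeqAct_Latent subclass; the struct layout and names are guesses at the general shape, not the actual source:

```unrealscript
// Fragment of a SeqAct_Latent subclass; all names are assumptions.
struct FadeEntry
{
    var PostProcessEffect Effect;
    var MaterialInstanceConstant MIC;  // unique MIC driving the effect
    var name ParamName;                // scalar parameter being faded
    var float Current, Target, Rate;
};
var array<FadeEntry> FadeUpList, FadeDownList;

event bool Update(float DeltaTime)
{
    local int i;

    // Walk backwards so entries can be removed mid-loop.
    for (i = FadeUpList.Length - 1; i >= 0; i--)
    {
        FadeUpList[i].Current = FMin(FadeUpList[i].Target,
            FadeUpList[i].Current + FadeUpList[i].Rate * DeltaTime);
        FadeUpList[i].MIC.SetScalarParameterValue(FadeUpList[i].ParamName,
            FadeUpList[i].Current);
        if (FadeUpList[i].Current >= FadeUpList[i].Target)
            FadeUpList.Remove(i, 1); // target reached, done interpolating
    }

    // ...FadeDownList is handled the same way, subtracting instead...

    // Returning false completes the latent action once both lists are empty.
    return (FadeUpList.Length > 0 || FadeDownList.Length > 0);
}
```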

“Dynamic Blockade” object

Returning to gameplay, I wanted to add another “Tower” type to increase gameplay depth. I decided to add a “Blockade” Tower that would slow down enemies by requiring them to navigate around it (or sometimes attack it). Thus I needed to support dynamic navigation around an obstacle. My solution for this isn’t necessarily the most robust possible, but for my purposes it worked quite well. First, I created my Blockade object, which is simply one of my damageable objects that implements the targetable interface (so enemies can attack it). The one special thing about this Blockade Actor is that it overrides the “Bump” event. When it is bumped, the Blockade checks if the Other Actor is an ‘Enemy’ type, and if so calls a “MoveAroundBlockade” function on its EnemyController, passing in itself and some parameters (its collision component and the impact normal).(9) That’s all the Blockade does. The real magic is in the Enemy Controller’s response to this “MoveAroundBlockade” notification.

But first, a side-note: Using the Bump event for this purpose has the advantage of not requiring any active logic, but the disadvantage of only triggering when the enemy actually touches the object. For my purposes this was ok, but if I didn’t want the visual drawback of enemies actually touching the blockade object before deciding to navigate around it, I could use a Trace check in the enemy controller’s movement logic to check for any Blockade-type actors along the forward movement direction. This would have the advantage of looking better (enemies would navigate around blockades before colliding with them), but the disadvantage of being slightly slower.
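
For reference, the Bump-based notification could be sketched like this inside the Blockade class; the exact MoveAroundBlockade signature is an assumption based on the description above:

```unrealscript
// Inside the Blockade Actor class.
event Bump(Actor Other, PrimitiveComponent OtherComp, vector HitNormal)
{
    local DunDefEnemyController EnemyAI;

    Super.Bump(Other, OtherComp, HitNormal);

    // Only react to bumps from enemy pawns.
    if (Pawn(Other) != None)
        EnemyAI = DunDefEnemyController(Pawn(Other).Controller);

    if (EnemyAI != None)
    {
        // Hand the AI everything it needs to pick a side to go around:
        // ourselves, our collision component, and the impact normal.
        // (This parameter list is an assumption, not the actual source.)
        EnemyAI.MoveAroundBlockade(self, CollisionComponent, HitNormal);
    }
}
```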

Now in the Enemy Controller’s MoveAroundBlockade function that I created, I first run some logic to check that the enemy isn’t currently Targeting this Blockade Actor, or already attempting to navigate around it, and bail out if so (it would be silly to navigate around it in either situation).(10) Furthermore, I check that the Blockade Actor is actually in between the enemy and its target destination (or next movement position when using pathfinding) with a Trace. If not, I bail out as well.(11) This is to avoid the enemy attempting to navigate around a blockade that has been touched with a glancing side-collision; we’re only interested in Blockades that are on our direct movement path.

If the MoveAroundBlockade function made it past those checks, I then pick an object avoidance point to get to, determined to be on the left or right side of the collision normal’s direction, moved out the distance of the Blockade’s collision extent. Whether left or right is determined by the enemy’s current “object avoidance direction” (which always starts ‘right’ first).(12) The “object avoidance direction” will switch if the enemy can’t find a valid spot to move to. Validity is checked by seeing if there’s ground underneath that spot (using a Trace)(13), and also by seeing if Epic’s FindSpot() command returns true (FindSpot is an Actor function which attempts to find a valid, non-geometry-intersected position at or near a desired position).(14)
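
A hedged sketch of that avoidance-point selection, as a function on the enemy controller; names like PickAvoidancePoint and bAvoidRight are illustrative:

```unrealscript
// Inside the enemy AI Controller; names are assumptions.
var bool bAvoidRight; // current avoidance side; starts 'right'

function bool PickAvoidancePoint(Actor Blockade, vector HitNormal, out vector OutPoint)
{
    local float BlockRadius, BlockHeight;
    local vector SideDir, HitLoc, HitNorm;

    Blockade.GetBoundingCylinder(BlockRadius, BlockHeight);

    // Pick a direction perpendicular to the impact normal, flipped
    // by the current avoidance side.
    SideDir = Normal(HitNormal cross vect(0,0,1));
    if (!bAvoidRight)
        SideDir = SideDir * -1.0;

    OutPoint = Blockade.Location + SideDir * (BlockRadius * 1.5);

    // Valid only if there is ground underneath the candidate point...
    if (Trace(HitLoc, HitNorm, OutPoint - vect(0,0,500), OutPoint) == None)
        return false;

    // ...and Epic's FindSpot can fit our collision cylinder there.
    return Pawn.FindSpot(Pawn.GetCollisionExtent(), OutPoint);
}
```

If this returns false, the caller would flip bAvoidRight and try once more, per the description above.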

If a valid object avoidance point was found, then I put the Controller into a “MovingAroundBlockade” state which simply has it navigating directly to this target position. If we’re not valid, I switch directions and try again(15) (if the other direction fails, I just bail… hopefully some other enemy destroys that Blockade soon!).(16)

The MovingAroundBlockade state simply uses “MoveToDirectNonPathPos” to reach the object-avoidance point(17) (assumes that no pathfinding is necessary since the object-avoidance distance is very small and we already checked to ensure there wasn’t any collision geometry around it, and that there is ground underneath it, etc). I set a cancellation timer of 4 seconds just in case the object-avoidance point couldn’t be reached for some reason.(18) When the enemy reaches that point, it returns to the Seeking state, which will put it back on pathfinding towards its original Target. Of course, it might then hit ANOTHER Blockade (if you, for example, had two adjacent to each other), which will cause it to enter “MovingAroundBlockade” for the next obstacle – continuing to travel in the same direction will hopefully eventually get the enemy free of your Blockades.

This approach, while quite basic, works decently in most cases. Even when there’s a fairly complex layout of Blockades, the enemy will effectively get a “hand-off” from one blockade to the next as he moves around each one in the same direction – if he hits an impassable barrier, he’ll switch directions and try the other way. The primary case where this fails is when the network of Blockades is complex and long enough that the enemy’s original target sits at some arbitrary position within a concave network of Blockades; depending on the circumstances, the enemy may never reach that point. In other words, it’s not true pathfinding, but rather simple object-avoidance behavior without any overall idea of the pathing structure.

While sufficient for the time being (especially since the Blockades are destructible and thus an enemy is just as likely to attack them), a nice improvement will be to use Nav Mesh dynamic outline navigation, which is something I will be exploring in the coming weeks.

Attacker Registration

In fact, without some method to limit how many enemies will attack nearby blockades, the blockade-avoidance system would be useless because the enemies would always just attack the blockades, rather than attempting to navigate around them. To get the behavior that I wanted, which is for a few enemies to attack a Blockade while others navigate around, I needed to implement an “Attacker Registration” system to limit how many enemies could simultaneously target a given Blockade (or any targetable Actor). I achieved this simply enough by adding another couple functions to my “Targetable” interface, RegisterAttacker and UnregisterAttacker.(19)

Doing this, of course, requires implementing the new functions on all classes which use the interface. Luckily, the objects that I actually cared about Registration for (namely the Blockade and possibly other placeable Towers) were all within my damageable object class hierarchy, so I only had to do my true implementation of these new functions in one place (my Pawns also use the Targetable interface, but they don’t currently care about Attacker registration, so to them I only added stub versions of these new interface functions).

In my DamageableActor class, the RegisterAttacker function simply adds the Attacker to an Attackers array (if not already present on the array), and UnregisterAttacker removes it.(20) Furthermore, I changed the Targeting Desirability function to return -1 desirability (meaning not targetable) if the length of the Attackers array is >= the maximum number of Attackers (a variable I added and specified in the defaultproperties).(21)
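
In sketch form, the registration plus the desirability check might look like this inside the damageable class; MaxAttackers, BaseDesirability, and the function names are illustrative stand-ins:

```unrealscript
// Fragment of a damageable Actor class; names are assumptions.
var array<Controller> Attackers;
var() int MaxAttackers;      // set in defaultproperties, e.g. 2 for Blockades
var float BaseDesirability;  // stand-in for the normal targeting rating

function RegisterAttacker(Controller NewAttacker)
{
    // Only add each attacker once.
    if (Attackers.Find(NewAttacker) == INDEX_NONE)
        Attackers.AddItem(NewAttacker);
}

function UnregisterAttacker(Controller OldAttacker)
{
    Attackers.RemoveItem(OldAttacker);
}

function float GetTargetingDesirability(Controller Asker)
{
    // Full up: report -1 so additional enemies look elsewhere,
    // unless the asker is already one of our registered attackers.
    if (Attackers.Length >= MaxAttackers && Attackers.Find(Asker) == INDEX_NONE)
        return -1;

    return BaseDesirability;
}
```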

Finally in my EnemyController, I added a call to RegisterAttacker/UnregisterAttacker on the new and old Targets respectively in my SetTarget function (which is called whenever the enemy AI picks a new Target)(22), and set the Target to None if the Controller is Destroyed, so that it calls UnregisterAttacker if the Enemy is killed.(23) And that was it!

Now my damageable Actors could specify how many Enemies could Target them simultaneously, and additional Enemies would ignore those Targets and move onwards. It worked well for the Blockades especially, which I limited to two Attackers. The rest of the Enemies around a Blockade would then attempt to navigate around it, which was what I wanted to see – making the Blockades an effective technique to slow down enemies or protect critical locations, but not invulnerable.

Friendly Fire & Start Point Selection

When I started to do more multiplayer testing, I noticed that friendly fire was a problem – we were mopping the floor with each other, when we were meant to be harming only the enemies (ok… it was a little fun to zap your “friend” in the back… which is why I left it as an option). The UT classes have a built-in friendly-fire checking capability, done in that game by querying its GameInfo class to adjust the damage based on the victim and the attacker; GameInfo then checks their teams and reduces the damage accordingly if they are on the same team. However, I wanted to use my own method, so instead of having GameInfo adjust the damage, I simply added a function to my Targetable interface called “IgnoreFriendlyFireDamage” which takes the Instigator as input, checks the teams, and returns true if the damage should be ignored (i.e. they are on the same team and the GameInfo isn’t currently allowing Friendly Fire).(24)

I then modified the few TakeDamage declarations in my base classes to check the result of “IgnoreFriendlyFireDamage”, and bail out (not apply damage) if true.(25) The nice thing about this approach is that by overriding the “IgnoreFriendlyFireDamage” function, certain classes may choose to always disable friendly fire regardless of the overall game setting -- for example enemies will never ever damage each other. An unfair advantage to them, heheh.
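
Here’s a rough sketch of how that check can be wired into TakeDamage; the DunDefGameInfo cast and the bAllowFriendlyFire flag are assumed names, not confirmed source:

```unrealscript
// Fragment of a damageable base class.
function bool IgnoreFriendlyFireDamage(Controller DamageInstigator)
{
    // Ignore the damage when we're on the same team and the game
    // isn't currently allowing friendly fire. (bAllowFriendlyFire on
    // a DunDefGameInfo class is an illustrative assumption.)
    return (DamageInstigator != None
        && DamageInstigator.GetTeamNum() == GetTeamNum()
        && !DunDefGameInfo(WorldInfo.Game).bAllowFriendlyFire);
}

event TakeDamage(int Damage, Controller InstigatedBy, vector HitLocation,
                 vector Momentum, class<DamageType> DamageType,
                 optional TraceHitInfo HitInfo, optional Actor DamageCauser)
{
    // Bail out before applying any damage.
    if (IgnoreFriendlyFireDamage(InstigatedBy))
        return;

    Super.TakeDamage(Damage, InstigatedBy, HitLocation, Momentum,
                     DamageType, HitInfo, DamageCauser);
}
```

Overriding IgnoreFriendlyFireDamage in a subclass (e.g. to always return true for enemy-on-enemy damage) then gives the per-class behavior described below.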

Next I noticed that when 4 players were spawning, they would all choose the same (valid) spawnpoint and thus appear on top of each other as the game began. I wanted my game to cycle spawn points, so that each player could be guaranteed to spawn at a unique location as long as there were enough spawn points. To achieve this I simply overrode ChoosePlayerStart() in my GameInfo class, and checked if the current Spawn Point being tested was on a “Used Spawn Point” array (if so, ignore it and check the next one). When a spawn point was chosen I added it to the “Used Spawn Point” array, and if no spawn points were found, then I cleared the Used Spawn Point array and checked one more time.(26) Bingo, spawn point cycling worked.
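
The spawn-cycling idea can be sketched like this in a GameInfo subclass; I’ve written it as a standalone helper rather than the actual ChoosePlayerStart() override, and UsedStarts is an illustrative name:

```unrealscript
// Fragment of a GameInfo subclass.
var array<PlayerStart> UsedStarts;

function PlayerStart PickUnusedStart()
{
    local PlayerStart P, Best;

    foreach AllActors(class'PlayerStart', P)
    {
        // Skip starts already handed out to an earlier player.
        if (P.bEnabled && UsedStarts.Find(P) == INDEX_NONE)
        {
            Best = P;
            break;
        }
    }

    // Everything has been used: clear the list and try once more.
    if (Best == None && UsedStarts.Length > 0)
    {
        UsedStarts.Length = 0;
        return PickUnusedStart();
    }

    if (Best != None)
        UsedStarts.AddItem(Best);

    return Best;
}
```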

Of course, another way to achieve this would be to create a new Spawn Point (actually ‘PlayerStart’) class that had a bool for whether it had been used or not, and then to clear all these bools if no PlayerStarts were found, or even to write a custom spawn point selection system and not use any of the built-in functionality. One of the beauties of Epic’s class framework is that while there’s usually an “easiest” approach to a given task, there’s often an infinite number of customized ways to handle it, and the exact method you choose is up to you.

UI Skinning

Now it was getting time to release my little milestone build, but the UIs (in particular my buttons) were still using the DefaultSkin with its moon-rock-crater default texture. I wanted to give my game a basic thematic styling, so it was time to create my own UI Skin. First, I duplicated Epic’s DefaultUISkin resource into my own package (called it DunDefSkin.DefaultSkin), and then changed DefaultUI.ini’s “UISkinName” key to point to that. Furthermore, I added DunDefSkin to the StartupPackages list in DefaultEngine.ini, which is important to ensure that it’s loaded as the application starts up. Now that I had specified a custom UI Skin for my game, I fired up UnrealEd’s UI Skin Editor to replace that moon-rock with my own button texture (simply by swapping the Default Image Style -> Button Background Style’s ‘Texture’ value for all of its states). I also added a SoundCue reference for the “Clicked” event, so now all of my buttons had some nice audible feedback when you clicked them. I also imported a custom True Type Font and set it as the font used in the Default Text Style. Now that the UIs looked at least a little theme-y, it was time to focus on Sounds before preparing the build!

Sounds and Music

No game is complete without sounds to provide good feedback for the various gameplay events. And music too, to get players in the right frame of mind for… whatever it is you’re going to have them do! Thankfully, both sound and music are easily integrated using Unreal’s pipeline. There are various ways to get sounds in your game, and I employed several (though not all) of them in this bit of work. First, I imported my Wavs and had the Editor auto-create Cues for them (very convenient), taking care that all of the sounds I wanted to play in 3D space were created with “Attenuation” nodes, so that they had 3D positioning & falloff. Next, I added an AudioComponent to my visual effect (Emitter) Actor, which is spawned from various Archetypes for all of my key gameplay events(27). This would allow each of these visual effects to have a corresponding sound designated in its Archetype. I went through each Visual Effect Archetype and set the desired cue into its AudioComponent’s properties, and boom, all of my visual effects had 3D sounds. In the few cases where no visual effect was used, I either set the SoundCue to play in the animation (by using an “AnimNotify_Sound” animsequence notification), or just played it via code (using Actor.PlaySound(), which also has some nice FadeIn and FadeOut options). I also created a PlaySound Kismet UI action(28), which simply calls WorldInfo.PlaySound() in case a UI needs to play a custom Sound not defined in the overall Skin’s sound list. Within 30 minutes, my whole game had 3D sounds playing for all key events, and that certainly added to the gameplay feedback.

Next I wanted music. Soaring, sweeping, dramatic music! Or something like that. Whatever the music, with Unreal it was as simple as importing the Wav (if it’s a long music track, you can use a lower compression quality so that it will be more highly compressed), and then playing it via Kismet’s “Play Music Track” action, which has fade values as well. Via code it’s as simple as calling WorldInfo.UpdateMusicTrack()(29) with the MusicTrackStruct parameter containing the new music values. I employed both approaches for different purposes, and now my game had nice fading music corresponding to all key events within the level. With this final bit of emotional manipulation added, I was ready to package for release, woohoo.

Cooking and Packaging

Being a release-chef is simple: You open up the UnrealFrontEnd and tell it all the maps that your game uses (in my case, also specifying the Entry Level, Menu Level, & my Transition Level used for the seamless travel). I also had to add my primary “UDKGame” script package to the list of StartupPackages specified in my DefaultEngine.ini, since I was relying on this script package for game initialization rather than the UT scripts (note you can have multiple script packages and they can individually be referenced by particular levels – but whatever package contains your custom GameInfo should be in the StartupPackages). Then you click “Cook”, and the Frontend+UDK generate all of the optimized-release content files. Finally you click “Package”, and it asks you what your game name is, for the Installer’s user interface. Click OK, the installer is built for you (packing in any redistributables required for the game), and when it’s finished you get your installer EXE put into your main directory. Pass this baby along to your friends/family/beta-testers/publisher and you’re good to go. ABC-123 indeed.

Thus ended another couple productive days blazing through the development process with Unreal. The game’s really taken shape now, and we’re pretty much in a content addition & tweaking phase, so it’s reached a point where I feel confident showing it to people. Stay tuned and soon enough you’ll be able to give me your own feedback!

-Jeremy

Blog 5: File References

Information discussed in this blog comes from the files listed below, which are part of the Dungeon Defense source code. Line numbers separated by a comma indicate multiple individual lines within the file. Line numbers separated by a hyphen indicate a range of lines within the file.

  1. DunDef_SeqAct_ScaleFloatForPlayerCount.uc: 10
  2. DunDef_SeqAct_ScaleFloatForPlayerCount.uc: 22-26
  3. DunDef_SeqAct_TogglePostProcessEffects.uc: 70-79
  4. DunDef_SeqAct_TogglePostProcessEffects.uc: 240
  5. DunDef_SeqAct_TogglePostProcessEffects.uc: 171-233
  6. DunDef_SeqAct_TogglePostProcessEffects.uc: 236, 248
  7. DunDef_SeqAct_TogglePostProcessEffects.uc: 139-155
  8. DunDef_SeqAct_TogglePostProcessEffects.uc: 137, 117
  9. DunDefTower_Blockade.uc: 21
  10. DunDefEnemyController.uc: 456-460
  11. DunDefEnemyController.uc: 492-509
  12. DunDefEnemyController.uc: 522-530
  13. DunDefEnemyController.uc: 536
  14. DunDefEnemyController.uc: 563
  15. DunDefEnemyController.uc: 538, 545
  16. DunDefEnemyController.uc: 549
  17. DunDefEnemyController.uc: 656
  18. DunDefEnemyController.uc: 648, 602
  19. DunDefTargetableInterface.uc: 26, 29
  20. DunDefDamageableTarget.uc: 65, 73
  21. DunDefDamageableTarget.uc: 61, 93
  22. DunDefEnemyController.uc: 259, 272
  23. DunDefEnemyController.uc: 217, 283
  24. DunDefTargetableInterface.uc: 21, DunDefPawn.uc: 67
  25. DunDefPawn.uc: 148
  26. Main.uc: 274, 286
  27. DunDefEmitterSpawnable.uc: 154
  28. DunDef_UIAction_PlaySound.uc: 13
  29. Main.uc: 156

Blog 6: Day Seventeen

Hello once again, fellow UDK developers!

As we’re entering the final stage of Dungeon Defense’s development, for the past couple days I’ve focused on adding some polish to the AI behaviors, and then turned my attention towards a couple of the meta-game systems that will provide replay value. Additionally, I integrated a bunch of the artwork that was recently completed; one of the most exciting parts of the development process is when you get to see your gameplay spring to life with great art, and so nicely rendered by Unreal tech.

ddblog6-1.jpg

Anyhow, let me give you an overview of what was done on the code side of things, and then I’ll go into more detail about each topic:

  • I improved the enemy AI by adding periodic “stuck” checks in case the enemy is knocked off the navigation network (i.e. from momentum imparted by damage) or can’t reach its target for some reason, in which case it will attempt to find a new target while dynamically getting back onto the navigation path. I also added an “Aggro” system (familiar to MMO users) whereby enemies will become more aggressive towards targets that have caused them damage, based on how recently and how much damage was dealt, while also weighing in all other targeting factors. This makes the enemies more lifelike, while also providing additional tactical depth (i.e. you may be able to lure a powerful enemy away from an important defensive point by distracting it with damage).
  • I added a basic Global UI Notification system, so that mission objectives and other game information can be conveyed on a full-screen UI even during a split-screen game. Multiple notifications can be stacked and will gradually fade off-screen over time.
  • I added a “Score” system whereby players earn points for killing enemies, completing waves, etc – each score addition can have an “Award Name” attached to it, which pops up a little text saying why you earned that Score (for bonuses, and special point awards).
  • I also tied the Scoring system together with a new High-Score system, which saves the top-ten scores and displays them on the main menu and game-over screens. When you earn one of the top-ten scores during a game, you are prompted to enter your name for your High Score entry upon ending the game. This works for multiple players too!
  • I added a basic options UI, which saves options, and also allows you to change them during gameplay.
  • Did much gameplay tweaking and balancing, plus various fine-grained bits of polish that will hopefully make it more enjoyable for you all :)

So to begin with, regarding the AI: the NavMesh system is extremely useful, but it can’t automatically solve every edge case. Namely, if the character is knocked off the navigable area, pathfinding will fail, so what to do? Well, it’s pretty simple: when NavigationHandle.FindPath() returns false but your target is still not directly reachable, have your AI Controller switch to a nearby target (or any navigation node) that IS in direct line of sight, then walk directly towards that target (using MoveToDirectNonPathPos(), which doesn’t employ pathfinding) until NavigationHandle.FindPath() returns a valid path again. That simple solution solved it for our game – now enemies could be knocked just about anywhere and would still return to the playable area!(1)
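
As a sketch, the fallback might sit in the enemy’s seeking state like this; CurrentTarget, the NavigationHandle goal setup (assumed to happen elsewhere), and the state layout are all illustrative:

```unrealscript
// Fragment of an enemy AI Controller; names are assumptions.
var Actor CurrentTarget;
var vector NextMovePos;

state Seeking
{
Begin:
    // (NavigationHandle goal constraints are assumed to be set up elsewhere.)
    if (NavigationHandle.FindPath() &&
        NavigationHandle.GetNextMoveLocation(NextMovePos, Pawn.GetCollisionRadius()))
    {
        // Normal case: walk the next leg of the computed path.
        MoveToDirectNonPathPos(NextMovePos);
    }
    else if (CurrentTarget != None &&
             FastTrace(CurrentTarget.Location, Pawn.Location))
    {
        // Off the navmesh: head straight at a directly-visible target
        // until FindPath() starts succeeding again.
        MoveToDirectNonPathPos(CurrentTarget.Location);
    }
    // If neither works, the caller would pick a different, visible target.
    Sleep(0.1);
    Goto('Begin');
}
```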

Furthermore, I added a timer check to see if the Enemy has moved an appreciable distance every second, and if not, I move him directly to the left or right before continuing with pathfinding.(2) This solved the situation of enemies getting clogged up when they’re all heading towards the same target; now they would effectively move around each other.

With these pathfinding problem scenarios dealt with, I wanted to make the enemies a little more reactive to objects that were attacking them, rather than solely basing their targeting decisions on distance and static desirability. Put simply, I wanted a basic “Aggro” system that would make them weigh targets higher that had recently damaged the enemy (and no, I don’t play WoW :).

This was not difficult: it’s just a bit of array management. First, I added a Dynamic Array of “Recent Attackers” to my Enemy Controller (and added an Attacker to that array via the Controller’s NotifyTakeHit event).(3) However, instead of only storing a direct reference to the Attacker, I made this an array of so-called “AggroEntry” structs containing additional information about the Attacker – namely, when the Attacker last caused damage to our enemy, and the current “Aggro Factor” for that Attacker.(4) The “Aggro Factor” is the sum of all the recent damage (as a percent of overall Enemy Health) that the Attacker had caused to the enemy. By iterating through the “Aggro Entry” array every frame when in the enemy’s targeting state, I would decrease each entry’s “Aggro Factor” over time (when it reaches 0, I remove that entry from the list).(5)

Meanwhile, when picking targets, I would see if any potential Target had an “Aggro Entry” for it, and if so, increase the potential-Target’s desirability rating based on the current size of its entry’s “Aggro Factor”.(6) I also added a fixed time period (10 seconds) before the “Aggro Factor” starts decreasing on any new entry, in order to prevent the Enemy from ping-ponging back and forth between Targets. And that was it, the enemies would now tend towards attacking things which had recently attacked them, yet still factoring in overall target-weights of distance and target-desirability. Increased gameplay depth for everyone!
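
The aggro bookkeeping described in the last few paragraphs can be sketched like so; the struct layout, decay constants, and function names are assumptions:

```unrealscript
// Fragment of an enemy AI Controller; names are assumptions.
struct AggroEntry
{
    var Actor Attacker;
    var float AggroFactor;    // summed recent damage as a fraction of max health
    var float LastDamageTime; // WorldInfo.TimeSeconds of the last hit
};
var array<AggroEntry> AggroList;
var float AggroHoldTime;  // e.g. 10.0: no decay until this long after last hit
var float AggroDecayRate; // aggro lost per second once decaying

// Called from the controller's damage notification.
function AddAggro(Actor Attacker, float DamageFrac)
{
    local int i;
    local AggroEntry NewEntry;

    for (i = 0; i < AggroList.Length; i++)
    {
        if (AggroList[i].Attacker == Attacker)
        {
            AggroList[i].AggroFactor += DamageFrac;
            AggroList[i].LastDamageTime = WorldInfo.TimeSeconds;
            return;
        }
    }
    NewEntry.Attacker = Attacker;
    NewEntry.AggroFactor = DamageFrac;
    NewEntry.LastDamageTime = WorldInfo.TimeSeconds;
    AggroList.AddItem(NewEntry);
}

// Called every frame while in the targeting state.
function DecayAggro(float DeltaTime)
{
    local int i;

    for (i = AggroList.Length - 1; i >= 0; i--)
    {
        // Hold the factor steady for a while so the enemy doesn't
        // ping-pong between targets, then decay it toward zero.
        if (WorldInfo.TimeSeconds - AggroList[i].LastDamageTime > AggroHoldTime)
        {
            AggroList[i].AggroFactor -= AggroDecayRate * DeltaTime;
            if (AggroList[i].AggroFactor <= 0)
                AggroList.Remove(i, 1);
        }
    }
}
```

Target selection would then simply add each candidate’s AggroFactor (if it has an entry) onto its normal desirability rating.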

Next, I realized that I wanted overall screen notifications of key objectives, rather than everything being split-screen. Therefore, I added a “Global UI” which is set to FullScreen render mode, rather than per “player viewport”, and I open only one of these UIs upon gameplay start, in GameInfo.PostBeginPlay(). Now that I had my guaranteed full-screen UI, I wanted to be able to queue up multiple notifications, have them seamlessly fade into each other, and add them through Kismet. To achieve this, I added an array of UI Labels to my Global UI scene, and set their references into an editable array within my Global UI Scene’s variables (via the Scene Editor).(7) I then created a function in my custom UI Scene class called “ShowMajorNotification” which sets the text (and plays a “pop-in” animation) on the next Label widget in that array (storing the index of the last Label utilized, and incrementing it each time).(8) Thus the game would be able to display as many simultaneous messages as I created Labels for (in my case, three). This worked well, and now all players would be notified of major events on the full screen – once I added a Kismet sequence action that simply called the “ShowMajorNotification” function on my Global UI Scene(9) (accessed from Kismet via a reference I stored to that UI Scene in my GameInfo).(10)

ddblog6-2.jpg

I actually implemented the same system within my per-player HUD UI Scenes as well(11), so I would have the option of displaying UI messages either globally for everyone, or per-player, depending on whether an “Instigator” (a Pawn with a PlayerController) was passed to the ShowNotification Kismet action.(12) Sweet :)

ddblog6-3.jpg

After that, I opted to implement a basic Score system to provide incentive for replay and bragging rights. This was simple enough: I stored the Score value in my Controller class and added to it whenever an Enemy was killed (using the “Killer” reference in the Enemy’s “Died” function to determine who got the kill)(13), as well as upon a few other events within the game (such as completing a Wave of enemies).(14) However, I wanted Scores in this game to be a little showy, so I implemented a custom UI Label class for the Score indicator, which “counts up” (over time) to the true Score that the player actually has, like one of those old-school cash registers.(15) Furthermore, I play a little “ding-ding” color and position animation(16) on the UI Label while it’s counting up, which draws a bit of attention to itself and just looks nifty.
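Here’s a quick Python sketch of that “cash register” count-up behavior (the count rate is an assumed tuning value; the real logic lives in UILabel_ScoreIndicator.uc):

```python
class ScoreLabel:
    """Sketch of a score label that counts up toward the true score."""
    COUNT_RATE = 500.0    # points per second (assumed tuning value)

    def __init__(self):
        self.true_score = 0     # the score the player actually has
        self.displayed = 0.0    # the value currently shown on the label

    def add_score(self, amount):
        self.true_score += amount

    def tick(self, dt):
        # Count the displayed value up toward the true score over time;
        # the "ding-ding" animation would play while these differ.
        if self.displayed < self.true_score:
            self.displayed = min(self.true_score,
                                 self.displayed + self.COUNT_RATE * dt)
        return int(self.displayed)
```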

Also, I wanted Score additions to potentially have “Bonus Names” tied to them (such as “COMBO KILL x2”). To achieve this, much like the major notifications above, I created another set of UI Labels to display a queue of texts and scroll them / fade them away over time via animation.(17) I stored an array of references to these UI Labels in my custom Score Label class(18), giving the Score Label responsibility for setting the “Bonus Name” text (if any) on the next UI Label whenever Score is added.(19) All in all, this worked well and provided satisfying visual feedback for earning points. The key with UI Scenes and UI Controls is that if you want to do something a little funky/unique, just subclass one of Epic’s controls and create whatever custom logic you want!

Now that I had my Scores, it was obvious that I needed a way to store the top ten values, to compellingly display them to players in various UIs, and to allow the player to add his entry with a custom name when a new high score is earned. For this, I made use of Epic’s handy “SaveConfig()” functionality, which is useful for basic game-data saving if you’re not concerned about security. The official word from Epic is that the next UDK release is going to have DLL binding functionality, so we UDK developers will be able to use whatever saving schemes we can cook up in C++ (an infinite world of native data processing possibilities), but that would be overkill for something as basic as this.

So, I created a “Data_HighScores” class (deriving from Object, not Actor), defined a “HighScoreEntry” struct in it to contain my High Score info (score, player name, wave reached), and created a ‘config’ array of “HighScoreEntry”.(20) Using the ‘config’ keyword in the variable declaration means that the value of this array is defined in the INI specified in my class declaration. So in my “DefaultHighScores.ini”, I defined ten default entries for the High Scores array (guessing at what would be good values… I need to play more to find out what my top scores will be!)

These entries are automatically loaded by Unreal whenever the Data_HighScores class is instanced as an object, which I did within my custom GameViewportClient class’ Init function. Modifying the Data_HighScores’ config variables (in this case, the High Score Entries), and then calling SaveConfig() on it writes the data back into the INI, and thus saves it. (21)

Keep in mind that rather than storing only such simple information, you could store checkpoint data, or any other general game-save information you may need (so long as you don’t mind your end-user taking a peek at it). You can also use the “PerObjectConfig” keyword to save data per object instance, useful in case the user can create multiple saves or dynamically add more save-able objects; you can then iterate through these using ‘GetPerObjectConfigSections’ to find out what save data entries are available for loading.

In any case, now that I had my High Scores Loading and Saving, I needed to actually determine when a Player’s Score should be added onto the list, and display the High Scores through the UI.

For adding to the list, when players get “Game Over” (which, in Dungeon Defense, will always happen eventually :)), I check whether each player’s Score is larger than any entry in the High Score array.(22) If so, I open a UI including an Edit Box (once for each player who achieved a High Score) requesting their “name” for the new entry. When they click “OK” on that UI, I insert this player’s new “HighScoreEntry” struct (with the player name they entered) at the appropriate index in the High Scores array, reduce the array size back down to just ten entries, and finally call SaveConfig() so that the new values are written to the INI.(23)
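The insert-and-trim step can be sketched like so in Python (plain tuples stand in for the “HighScoreEntry” struct, and the list is kept sorted highest-first):

```python
MAX_ENTRIES = 10

def insert_high_score(entries, score, name, wave):
    """Insert a new entry at the appropriate index in a highest-first
    list of (score, name, wave) tuples, then trim back to ten entries."""
    for i, (existing_score, _, _) in enumerate(entries):
        if score > existing_score:
            entries.insert(i, (score, name, wave))
            break
    else:
        entries.append((score, name, wave))
    del entries[MAX_ENTRIES:]   # reduce the array back down to ten
    return entries
```

In the real game, SaveConfig() is then called so the updated array is written back to the INI.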

ddblog6-4.jpg

The last thing I wanted to do with the High Scores was display them on the UI, which I achieved by creating a custom UI Panel class (a “HighScoresPanel”), and giving it an array of references to ten UI Labels, one for each High Score Entry. I call a custom “OnCreate” function for my “HighScoresPanel”, in which I set the string value of each of those referenced UI Labels to the corresponding-index High Score Entry’s data.(24) So now I had a reusable control that would always display the High Scores, which I could put into any UI Scene I wanted; good for me, because I planned to show the High Scores on both the Game Over UI and the Main Menu.

ddblog6-5.jpg

Finally, employing the same config method of saving data as the High Scores, I created a very basic Options UI that allows the user to change certain game settings. These settings are simply some Boolean values contained in my GameInfo class,(25) which I represented in the Options UI with checkboxes. In the Options UI class’ SceneActivated() event, I set the checkboxes’ checked values to the corresponding variables in my GameInfo class, and when the user clicks OK, I copy the checkbox values back into the GameInfo’s variables and call SaveConfig() on the GameInfo class (clicking Cancel simply closes the UI without copying/saving).(26) Simple and effective.

ddblog6-6.jpg

So with this bit of AI and UI work completed for some polish and replayability, we’re nearing the final phase of development. I plan to implement more content before release, including a nifty trap that the player can trigger to slow down enemies, but altogether it’s nearly finished. I can’t wait for you all to play this little game, and I’ll continue to keep you posted over the next couple days as we wrap it up!

Blog 6: File References

Information discussed in this blog comes from the files listed below, which are part of the Dungeon Defense source code. Line numbers separated by a comma indicate multiple individual lines within the file. Line numbers separated by a hyphen indicate a range of lines within the file.

  1. DunDefEnemyController.uc: 972
  2. DunDefEnemyController.uc: 715
  3. DunDefEnemyController.uc: 915
  4. DunDefEnemyController.uc: 11
  5. DunDefEnemyController.uc: 109
  6. DunDefEnemyController.uc: 201, 167
  7. UI_GlobalHUD.uc: 11
  8. UI_GlobalHUD.uc: 27
  9. DunDef_SeqAct_ShowNotification.uc: 41
  10. Main.uc: 233
  11. UI_PlayerHUD.uc: 51
  12. DunDef_SeqAct_ShowNotification.uc: 36
  13. DunDefPlayerController.uc: 309
  14. Main.uc: 345
  15. UILabel_ScoreIndicator.uc: 53, 112
  16. UILabel_ScoreIndicator.uc: 59, 115
  17. UILabel_ScoreIndicator.uc: 103
  18. UILabel_ScoreIndicator.uc: 32
  19. UILabel_ScoreIndicator.uc: 65, 103
  20. Data_HighScores.uc: 11-19
  21. Data_HighScores.uc: 35-64
  22. UI_GameOver.uc: 75, Main.uc: 186-209
  23. UI_AddingHighScore.uc: 18-30, Data_HighScores.uc: 35-64
  24. UIPanel_HighScores.uc: 16-30
  25. UI_Options.uc: 31-43
  26. UI_Options.uc: 164-165, 55

Blog 7: Day Twenty-Three

Hello everyone!

It’s been a super busy half-week since the last blog entry, and so much has been accomplished! We’re in the process of wrapping up our little game-demo, applying final spit & polish to everything that you’re soon going to see for yourself. Consequently, a lot of the changes have been oriented towards balancing and content integration; it’s always a thrill to see so many aspects spring to life with great media. Of course, I’ve also made a ton of functionality additions, so let me review the major ones with you:

  • Through Kismet, I changed the lava pits, which were previously just a Physics Volume that applied a ton of damage, to instead teleport the player back to safety while applying a small amount of damage – while outright KILLING enemies. This required the addition of a new Kismet condition to test the class type of whatever Touched the trigger volume.
  • Added nice camera tracing, with interpolation, so that the camera won’t go through walls, and changed the camera rotation method (in the mouse control scheme) to simply involve moving the mouse to the edge of the screen. Also scaled the camera FoV dynamically with the viewport aspect ratio, to ensure that players can see just as far when using widescreen resolutions / horizontal splitscreen.
  • Added a floating particle-effect underneath the mouse cursor that indicates where you’re going to shoot, and changed its color via a Particle Color Parameter to turn red when over an enemy. Only the owner Player sees this Particle Effect in his view, using the ‘bOnlyOwnerSee’ Actor/Component option.
  • Added a “gas trap”, which causes enemies to Cough until the gas dissipates – a handy technique to slow them down while the towers wreak havoc.
  • Added the ability to sell towers, reusing the states from the tower-placement system. Of course, you only get back a percentage of what you spent on them, based on their current health – ‘depreciation’ ala Economics 101.
  • Made each subsequent joining player get a unique character Material colorization, so you can easily tell each other apart.
  • Added a bunch of options to the main menu: gamma, SFX, and music volume sliders, a resolution selector, and fullscreen/postprocessing toggles.
  • Totally changed the look of the game, haha! I basically amped up the cartoon aesthetic by employing a geometry-outlining postprocessing material and contrast adjustment. Gives Dungeon Defense a distinct feel that expresses the playful vibe I’m aiming to convey.

Ok then, let’s get started with how this stuff works. First, let’s review the kismet for ‘respawning’ the player upon falling into lava, yet killing enemies when they touch it. Here’s a picture of how it works, and then I’ll explain a couple things about it:

ddblog7-1.jpg

So as you can see, there are two Touched events (one for each lava volume), which then check whether the ‘Instigator’ of the Touch is an enemy class, and if so apply massive damage to it. Otherwise, if the Instigator is a player class, they pick a random spawnpoint from an object list, teleport the player to that spawnpoint, give him a little damage, and spawn a teleportation visual effect at his new location. That’s it!

The only concerns to keep in mind are that I disabled bPlayerOnly on the Touched events, so that enemies could trigger them, and also set their MaxTriggerCount and RefireDelay to 0 so they can trigger over and over again as rapidly as necessary. The “Is Of Class” condition that I wrote simply has an editable variable ‘name ClassName’, and uses the result of “TheObject.IsA(ClassName)” to activate the True or False output.(1) All done! (Note this will work fine with any other Volume type, of course; I only used PhysicsVolume because I had already placed them in the level over the lava, heh.)
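For reference, the condition’s test boils down to a simple class-hierarchy check. Here’s an approximate Python equivalent (IsA matches by class name anywhere up the hierarchy, which I mimic here with __mro__):

```python
def is_of_class(the_object, class_name):
    """Sketch of the 'Is Of Class' Kismet condition: returns True if
    the_object's class, or any ancestor class, matches class_name
    (given as a string, mirroring UnrealScript's name type)."""
    return any(cls.__name__ == class_name
               for cls in type(the_object).__mro__)
```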

Next, I decided that I didn’t like my camera going through walls, so I added a trace to check for collisions against world geometry, and interpolate to the results of that trace so that the camera movement would be decently smooth when sliding across geometry. I did this in my Camera class’ UpdateViewTarget() function, where the camera position is calculated, by calling a “CheckForCollision” function that I wrote.(2) In CheckForCollision, I do a trace from the camera’s ideal location to the view-target’s (Pawn’s) location, testing against world geometry only.(3) If the trace hits anything, I take the hit position’s offset from the original (ideal) camera location, and start interpolating to that. I do this every frame, so the interpolation is constantly updating, as is the target offset from subsequent traces. This has the effect of preventing the camera position from going through walls and, based on the VLerp speed, provides a smooth transition between camera-collided locations. I also add a bit of the “Hit Normal” results to the collided camera position, so it’s moved out slightly in front of the collided surface, as well as offset a little in the Z direction, to ensure the camera is always minimally above the player character (this being a top-down game, after all).(5)
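The per-frame flow can be sketched in Python like this (the trace function, push-out distance, and Z lift are stand-ins for the real UnrealScript trace and tuning values):

```python
def vlerp(a, b, alpha):
    """Component-wise linear interpolation between two 3-vectors."""
    return tuple(x + (y - x) * alpha for x, y in zip(a, b))

def check_for_collision(ideal_pos, target_pos, current_pos,
                        trace, lerp_speed, dt,
                        normal_push=20.0, z_lift=15.0):
    """Sketch of the per-frame camera collision logic described above.
    `trace(start, end)` stands in for a world-geometry-only trace and
    returns (hit_pos, hit_normal) or None if nothing was hit."""
    hit = trace(ideal_pos, target_pos)
    if hit is None:
        desired = ideal_pos
    else:
        hit_pos, hit_normal = hit
        # Push the camera slightly out in front of the collided surface,
        # and lift it a little in Z so it stays above the player.
        desired = (hit_pos[0] + hit_normal[0] * normal_push,
                   hit_pos[1] + hit_normal[1] * normal_push,
                   hit_pos[2] + hit_normal[2] * normal_push + z_lift)
    # Interpolate toward the desired position every frame for smoothness.
    return vlerp(current_pos, desired, min(1.0, lerp_speed * dt))
```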

Next, I noticed that when I was playing in widescreen, or when the viewport was horizontally split in a 2-player game, I could not see as far, and my killer gaming skillz decreased as a result. Thus I decided to dynamically adjust the target FoV based on the current player’s viewport aspect ratio, as compared to a standard 4:3 ratio. I did this in the camera class as well: rather than directly setting the FOV to the DefaultFOV in UpdateViewTarget(6), I wrote an “AdjustFOV” function that gets the PCOwner’s (PlayerController’s) HUD resolution, and hence aspect ratio, and scales the output FOV by how that HUD aspect ratio compares to 4:3.(7) However, I didn’t do linear scaling, as that would be too extreme – I raised the aspect-ratio scale factor to the 0.4 power, which yielded a more gradual increase in FoV as the screen became wider, without getting too fish-bowled. I clamped the scalar to a min/max of 0.75/1.5 just in case. Now the game was a better experience in uber-widescreen or horizontal splitscreen.
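That scaling math is compact enough to show directly – a Python sketch using the same 0.4 power and 0.75/1.5 clamp described above:

```python
def adjust_fov(default_fov, width, height):
    """Sketch of aspect-ratio FOV scaling: compare the viewport aspect
    ratio to 4:3, soften the scale with a 0.4 power, and clamp."""
    aspect_scale = (width / height) / (4.0 / 3.0)
    scalar = aspect_scale ** 0.4          # gentler than linear scaling
    scalar = max(0.75, min(1.5, scalar))  # clamp just in case
    return default_fov * scalar
```

At exactly 4:3 the FOV is unchanged; a 16:9 viewport widens it by roughly 12% rather than the full 33% a linear scale would give.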

Finally, I decided my previous control scheme of holding down the Right Mouse Button to rotate the view was not so elegant, and instead wanted to go with the more traditional method of “move the mouse to the edge of the screen to rotate in that direction”. To achieve this, I went into my PlayerController’s PlayerMove function and added a check of whether the current Mouse Position was within 3% of the Left or Right side of the screen (by comparing the mouse’s position against 3% of the HUD’s X resolution).(8)

If the mouse was indeed at the left/right edge of the screen, and the sign of the Mouse Delta was in the direction of that edge of the screen, I then applied the current Mouse Delta X to my “Rotate Camera” function. So scrolling the mouse left while at the right edge of the screen won’t rotate the camera left. I think this turned out to be considerably more natural than having to use the RMB to rotate the view, and it frees up the RMB for later usage – done and done!
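A Python sketch of that edge check (screen coordinates and the 3% margin as described above; the exact comparisons in the real code may differ):

```python
EDGE_FRACTION = 0.03   # within 3% of the left/right screen edge

def edge_rotation_delta(mouse_x, mouse_dx, screen_width):
    """Return the rotation delta to apply, or 0 if the mouse isn't at
    a screen edge, or is moving away from the edge it's at."""
    edge = screen_width * EDGE_FRACTION
    at_left = mouse_x <= edge
    at_right = mouse_x >= screen_width - edge
    if at_left and mouse_dx < 0:
        return mouse_dx   # pushing into the left edge: rotate left
    if at_right and mouse_dx > 0:
        return mouse_dx   # pushing into the right edge: rotate right
    return 0.0
```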

Next, I wanted to give the player a better indication of where he was aiming, and whether he was targeting an enemy. I decided to go with a Particle Effect in the world to indicate this, rather than a purely UI or Canvas effect, so that it would be scaled in 3D space. I added a ParticleSystemComponent to my Player class(9), and gave it a swirly-vortex particle template in my Player archetype. I set bOnlyOwnerSee on this component so that it would only be visible in the view of that particular Player. I also set the Scene Depth Priority Group of this Component to “Foreground”, so that it would not get obscured by any geometry and would instead appear more like a UI element. And I set AbsoluteTranslation to true on the component, so that specifying its Translation would be in world space, rather than Actor space. Finally, I passed a reference to this component to my PlayerController, which would be responsible for positioning it at the pointer-target location.(10)

Then, my PlayerController class simply sets the Translation of this component to the screen-raytested location noted in prior blogs, and that was it: a cool particle effect indicator of where I was pointing.(11) However, I wanted this indicator to change color when pointing at an enemy. I had already checked for this case by testing whether the screen raytest hit an Enemy class in PlayerMove. So to get the particle system to change color, I added a “Color Parameter” to the ParticleSystem’s sub-emitters. I set the name of this Color Parameter Module to ‘Colorizer’, and then from code called ParticleSystemComponent.SetColorParameter(‘Colorizer’, NewColor) to modify the color in-game. I changed the color value to become red when over an enemy, and white (multiplied by the particle’s inherent color) when not over an enemy.(12) Note that the Particle Color Parameter module only affects NEW particles that are subsequently spawned from the emitter, not particles which are already in existence. To achieve immediate colorization of all particles within the effect, you’d essentially have to use an MIC to dynamically change the complete colorization of the effect’s material. This wasn’t necessary in my case, because the system was rapidly emitting new particles with short lifetimes.

ddblog7-2.jpg

Next, I realized that in multiplayer it would be nice for each player to have a different appearance – a different color. To handle this, I created 4 Material variations of my base player material, swapping the color channels in each to create variations of the diffuse texture. I then added an array of Materials to my Player archetype(13), and in my PlayerController’s PostBeginPlay function, I checked how many other LocalPlayerControllers currently existed, via the LocalPlayerControllers iterator. Based on how many existed, minus one, I decided which player “number” I was.(14) Then in my PlayerController.Possess function, where the Controller takes over the Pawn, I used this player ‘number’ as an index to grab the desired Material from that array specified in the archetype. Finally, I called Mesh.SetMaterial() to apply this chosen Material onto the character mesh (in my case, on the 0 element, since the mesh only used a single material).(15) And there it was: now each player in the game looked unique!

Getting close to release, I knew that I wanted to add some additional options to the UI, so that changing the Resolution and various other settings wouldn’t necessitate users digging into their INI’s. Specifically, I wanted to add options to switch between common resolutions, fullscreen/windowed, disable the postprocess effects (in case some people don’t like them, sadness, or just have weaksauce video cards), and have sliders to adjust gamma, music volume, and sound effects volume. Let’s go over briefly how I implemented each of these:

  • Resolution Selection & Fullscreen Toggle: For this, I used an array of Checkboxes containing string data for my supported resolutions (“1024x768”, “1280x720”, etc). I set the ButtonClicked delegate on these checkboxes to see if any of the other resolution boxes were checked, and uncheck them if so (so that only one resolution can be selected at a time). I also prevented the regular “unchecking” behavior of the checkbox by setting its value to true upon each click, so you can only enable them, never disable.(16) The fullscreen toggle was just a checkbox with default on/off behavior. Finally, when the player clicks “OK”, I call the “SetRes [resolution][fullscreen]” console command with the currently selected resolution checkbox’s string value and the fullscreen value. The “SetRes” console command takes care of the actual resolution changing, including storing the latest values in the user’s INI.(17)
  • Postprocess Toggle: I also added a checkbox for toggling postprocessing, the value of which I simply store in a config Boolean in my ViewportClient class (since it appears there’s no global postprocessing config value anymore). In my ViewportClient Init function, if the post-processing Boolean is false, I execute the Console Command “show postprocess” to toggle postprocessing off.(18) I do this every time the post-processing Boolean is toggled via the options menu as well, and call SaveConfig() on my ViewportClient class to store my postprocessing bool in the user’s INI.(19)
  • Gamma Slider: I added a Slider to my options menu, set its min and max values to reasonable gamma values, and execute the console command “Gamma [SliderValue]” every frame while in the options menu (because I didn’t feel like adding a script callback for it, ha).(20) I also store the current gamma value as a config variable in my ViewportClient class, since it doesn’t save otherwise, and also set this value as the active Gamma in my ViewportClient initialization (using the console command).(21)
  • Music & SFX Slider: For this, I needed to specify ‘SoundClasses’ on all of my game’s SoundCues, and set a ‘SoundMode’ on the AudioDevice to control the Volume of those respective SoundClasses. I added Epic’s SoundModesAndClasses.upk to my game’s startup packages (so I could use all of their built-in soundclasses), and edited the SoundCue class’ default properties to have the default SoundClass be “SFX”. I specifically set my music cues to be of the “Music” SoundClass in the Editor, and set the ‘Default’ SoundMode on the AudioDevice in my ViewportClient’s initialization. In the Editor, I added two Effects to my ‘Default’ SoundMode’s Effects array, one for the ‘SFX’ soundclass and one for ‘Music’, so that I could individually control their volume levels. I then wrote a Set Volume function to alter the ‘VolumeAdjuster’ values of the Current SoundMode’s Effects array (with the indices corresponding to the “SFX” and “Music” soundclasses I set up).(22) Finally, I added SFX-Volume and Music-Volume config floats to my ViewportClient class, and in the Options menu I set them to the respective UI Slider’s values.(23) While in the Options menu, I call my SetVolume function to have the slider-driven values continually update into the AudioDevice.(24) And presto, real-time adjustable individual audio settings for SFX and Music!
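As an aside, the mutually-exclusive resolution checkboxes from the first bullet reduce to just a few lines. Here’s a Python sketch with dicts standing in for the checkbox widgets:

```python
def on_resolution_clicked(checkboxes, clicked_index):
    """Sketch of the exclusive resolution checkboxes: uncheck every
    other box and force the clicked one to stay checked, so exactly
    one resolution is selected at a time."""
    for i, box in enumerate(checkboxes):
        box["checked"] = (i == clicked_index)
    # Return the selected resolution string, ready for "SetRes".
    return checkboxes[clicked_index]["label"]
```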

ddblog7-3.jpg

As we neared the finish line, I dropped a bombshell on my artist comrade Morgan Roberts: I wanted to toon-ify the game! Not exactly cel shading, which would be extreme for what I wanted; specifically, I intended to outline geometry and reduce the color contrast a bit for a softer look. This was easily achieved by adding two Material effects to our PostProcess chain.

First, to achieve the geometry outline, I sample 8 depths around the current pixel’s screen-position, and then average them. I compare this averaged “nearby” depth to the current pixel’s depth, and if there is a significant difference (greater than a threshold), I return a black color rather than the actual screen color. Thus, black lines are drawn along edges. With some tweaks to the threshold values, I got it looking pretty cool in short order.
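In shader terms this is an edge-detection kernel. Here’s the same idea sketched in Python over a plain 2D depth array (the sample positions and threshold are illustrative – the real version is a postprocess material sampling the depth buffer):

```python
def outline_color(depths, x, y, scene_color, threshold):
    """Average the depths of the 8 neighboring samples and compare
    against the center depth; a large difference means a geometry edge,
    so return black instead of the actual screen color."""
    neighbors = [depths[y + dy][x + dx]
                 for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                 if not (dx == 0 and dy == 0)]
    avg = sum(neighbors) / 8.0
    if abs(avg - depths[y][x]) > threshold:
        return (0.0, 0.0, 0.0)    # draw a black outline pixel
    return scene_color
```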

Next, I wanted to make the game’s overall contrast ratio more brightly toon-themed. Specifically, I decided to bring up the low-intensity colors, while not squashing the high-end colors. To do this, I dotted the screen pixel color value with a value of 0.5, and used the result to lerp a scalar from 1.5 (to brighten up the low-intensities) to 1.0 (not much affecting the high-intensities). I then multiplied this scalar with the original scene color, and saturated it as well to yield an even more colorful view. You can see the results of this below, what a difference a little post-processing can make!
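Here’s that contrast math as a Python sketch (I’m reading “saturate” in the HLSL sense of clamping to [0, 1]; the blog may also mean boosting color saturation, which an MIC parameter could drive instead):

```python
def toonify(color):
    """Dot the pixel color with 0.5, lerp a brightness scalar from 1.5
    (for dark pixels) to 1.0 (for bright pixels), multiply it back into
    the color, and clamp the result to [0, 1]."""
    r, g, b = color
    intensity = r * 0.5 + g * 0.5 + b * 0.5        # dot(color, 0.5)
    alpha = min(1.0, intensity)                    # keep the lerp bounded
    scalar = 1.5 + (1.0 - 1.5) * alpha             # lerp(1.5, 1.0, alpha)
    return tuple(min(1.0, c * scalar) for c in (r, g, b))
```

Dark colors get a 1.5x boost while bright colors pass through nearly untouched, which lifts the shadows without blowing out the highlights.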

ddblog7-4.jpg

ddblog7-5.jpg

With a minimum of artistic changes, then, we transitioned from a sort of gritty-realism, to a toon-esque romp through a colorful fantasy universe. Such is the power of Unreal’s post-processing system, and its capability for real-time parameter adjustment using MIC’s. With these visual adjustments out of the way, we’re now in the final stage of testing and tweaking, before releasing Dungeon Defense to the community. I’ll be in touch soon with the results of that last step!

Blog 7: File References

Information discussed in this blog comes from the files listed below, which are part of the Dungeon Defense source code. Line numbers separated by a comma indicate multiple individual lines within the file. Line numbers separated by a hyphen indicate a range of lines within the file.

  1. DunDef_SeqCond_IsOfClass.uc: 14
  2. DunDefPlayerCamera.uc: 243
  3. DunDefPlayerCamera.uc: 278
  4. DunDefPlayerCamera.uc: 288, 295
  5. DunDefPlayerCamera.uc: 286
  6. DunDefPlayerCamera.uc: 145, 205
  7. DunDefPlayerCamera.uc: 354
  8. DunDefPlayerController.uc: 1550-1557, DunDefHUD.uc: 56, 62
  9. DunDefPlayer.uc: 494-504
  10. DunDefPlayerController.uc: 226
  11. DunDefPlayerController.uc: 1586
  12. DunDefPlayerController.uc: 1594-1596, 1606-1609
  13. DunDefPlayer.uc: 88
  14. DunDefPlayerController.uc: 270
  15. DunDefPlayerController.uc: 224, DunDefPlayer.uc: 114
  16. UI_OptionsMenu.uc: 140-152
  17. UI_OptionsMenu.uc: 71
  18. DunDefViewportClient.uc: 297
  19. UI_OptionsMenu.uc: 68, DunDefViewportClient.uc: 359
  20. UI_OptionsMenu.uc: 94
  21. DunDefViewportClient.uc: 299, 354
  22. DunDefViewportClient.uc: 318
  23. DunDefViewportClient.uc: 51-52, UI_OptionsMenu.uc: 39-40
  24. UI_OptionsMenu.uc: 95

Blog 8: Day Twenty-Six

Hello everyone and welcome to my last blog entry! (for now) It has been an intense four weeks of game development, but also a totally wonderful experience. I’ll let the pictures do the explaining.

We went from this at the end of Week 1:

ddblog8-1.jpg

To this at the end of Week 4:

ddblog8-2.jpg

As expected, Unreal performed beautifully throughout the process, and with its capabilities we were able to achieve so much in such a short timeframe. I hope you all are enjoying the Dungeon Defense demo as you read this (well, not at the same exact moment… unless you’re a really talented multi-tasker!). The team and I are eager to find out what you think of our little game, and to answer any questions you may have about how we put it together. So please join me on the UDK forums and I’ll be there to shed some light on anything that comes to mind.

In the meantime, to cap off this blog series, I’d like to take the opportunity to discuss some of my overall approaches to game and prototype development with Unreal. These are my own particular opinions, and they may not be applicable to every circumstance or universally agreed upon, but I hope they may be of some use in your own creative endeavors.

Let’s call these “Jeremy’s Eight Crazy Rules of Unreal Game Dev Goodness“ (just in time for Hanukkah):

1. Nail Your Core Mechanics Early and Iterate, Iterate, Iterate

One thing I’ve learned over numerous projects is there’s no point in building the house before you’ve laid down solid foundations. In other words, get your gameplay mechanics to be fun and enjoyably working before you go full-bore into artistic production and level development. Such a statement might seem like a no-brainer, but in practice it’s often so exciting to jump right into the thick of full content development that you can end up forgetting to build those foundations first. You’ll save yourself a lot of headaches if you not only know exactly what game you’re creating before you start to produce costly assets (i.e. at least a design treatment), but also have said game very much playable by that point (this does not necessarily mean bug-free or visually compelling).

Furthermore, the earlier you get your gameplay mechanics in place, the more time you’ll have to iterate, which means to refine by repeated passes. Iteration can take place throughout much of your development cycle, but the more of it you can squeeze into your pre-production/conceptual prototyping phase, the better.

Of course as I’ve mentioned before, Unreal has some great tools to support rapid iteration. They include the Remote Control (for changing values in real-time), Archetypes (for having values be essentially data-driven rather than hardcoded), and Play In Editor (for playing within a level that you’re actively modifying in the same application instance). Make use of each and every one of them, indeed simultaneously (i.e. you can have Remote Control open in PIE), and you’ll be iterating faster than you can spell it three times fast. Your gameplay will thank you.

2. Use Placeholder Assets to Prototype Your Gameplay

Ever have a 3D artist model a character, a tech artist rig it, and an animator create a bunch of animations only to find out that none of the media was appropriate for what your gameplay actually needed? No? Then lucky you. I’ve made that mistake before, and needless to say: artists do not like it. And why should they? Programmers and designers need to make sure that before the final artwork is put into the production pipeline, they know exactly how the art is supposed to be constructed for gameplay purposes, and have clearly communicated that to the pertinent artists. I’ve found that the best way to do this is to use PLACEHOLDER assets – simple generalized versions of say, a humanoid character or a weapon, that can be used to represent the final asset. Ideally, the placeholder asset will have roughly the same dimensions, shape, and (in the case of a skeletal mesh) bone structure as the final asset, but depending on the complexity of the mechanic that may not be necessary.

Not only does using placeholder assets, critically, allow you to ‘Nail Your Core Mechanics Early’ (see Rule #1), but it also allows your artists to see the intended behavior functioning in the game before they go off to make the final media; placeholder implementations can be the most effective communication method possible. Furthermore, the artists themselves can usually swap out the placeholder assets for final ones directly, meaning THEY can iterate on the visuals within the context of the actual gameplay, rather than only in an abstract space. Thankfully, Unreal makes it easy to swap out temporary assets with final ones: just change a few references (Mesh, AnimSet, etc) in an Archetype or in DefaultProperties (don’t ever reference directly in code lines!), and you’re good to go. Thus Endeth Commandment #2.

3. Follow The Unreal Way

There are usually many, many ways to achieve gameplay results with Unreal, but often far fewer ideal ways. These ‘Unreal Ways’ usually involve making full usage of the functionality that the UnrealScript interface provides you, capabilities that extend well beyond a basic programming language like C++ or Java. As random examples…

  • Want to have something delay or occur over time? Use latent state functionality (like Sleep or MoveTo) or Timers. Don’t do a bunch of time-based if statements in a Tick!
  • Want to find out about certain Actors around your Player? Don’t do an AllActors and individually check distance from every Actor in the World – use OverlappingActors and give it your radius.
  • Want to set up armor attachments all over your player character? Don’t create an Actor for each one – dynamically create and attach new Mesh Components (or even a custom Component class) to your Pawn!
  • Want to make a bunch of Materials that vary only by one texture? Don’t create unique base Materials for each one – use Material Instance Constants that share a Parent Material and just swap out a Diffuse Texture Parameter.
  • And, offhand, don’t… don’t… don’t… assume that structs are passed around by reference. By default, they’re always deep-copied, unless you use the “out” keyword in your function parameter. Deep-copying structs needlessly can be slow, memory-hogging, AND potentially bug-creating if your logic assumes struct variables are non-unique references. Epic has devoted a whole UDN topic to explaining this. :)

UnrealScript and the Engine framework are built in a particular way to make game implementation a much more robust experience than working in straight C++ (like many, I’ve been there, done that). As you develop with Unreal, you’ll find more inherent functionality than you could have ever expected. If you’re struggling to implement something with Unreal, then you might simply not know about some handy functionality that already exists within the engine – when in doubt, that’s usually the case. Read Epic’s code, look at their samples, stock up on interesting UDN articles, perhaps even look at Dungeon Defense (not that it’s perfect ;), and you’ll identify usage patterns that point the way towards making full use of the super-powered framework at your fingertips. Which leads us right to…

4. Search Epic's Codebase!

The UnrealScript Engine-framework codebase is quite large. It’s also very well documented, but it can be daunting at times to figure out where to start; “reading” it from beginning to end is not particularly advisable unless you’ve got a constant supply of Red Bull hooked up to an I.V. (Though I would at least recommend familiarizing yourself with Actor.uc & Object.uc to begin with).

Ultimately, your best friend for turning a scarily large codebase into a responsive encyclopedia of knowledge? "Find All in Files", a search capability of a code IDE like Visual Studio (which has a free version, Visual Studio Express) or of other text editors. By searching for keywords like, say, "Cursor" or "Force" (or whatever you're generally looking for) within all or some of Epic's code files, you'll usually get a pretty good idea of the pertinent functionality Epic's already provided to handle all sorts of common game needs. A good rule of thumb: before you decide to "roll your own", search Epic's codebase to check whether they've already "rolled it" for you! You may be surprised just how often they've already done what you're looking for… then again, if you've played Gears of War, maybe you won't be surprised.

5. Use nFringe. Period.

Ok, I gotta hand it to the guys at PixelMine: nFringe freaking rocks. Its 'Intellisense' and code parsing are usually spot-on, and its syntax checking is also a huge help in reducing stupid-ass bugs (as opposed to just stupid bugs). Using nFringe will boost your coding productivity massively (it certainly has in my case), and it will also help you explore Epic's codebase (and your own) much more quickly. Through the Intellisense and member lists, you'll be hopping around classes with ease and quickly examining their variables, functions, and states.

I can’t emphasize this enough: if you are new to programming with Unreal, nFringe will help you get a leg up much faster. There’s just one problem at the moment: Unreal has a powerful debugger, but nFringe is locked from accessing it unless you purchase a commercial nFringe license from PixelMine, which is currently only available to pro developers. C’mon PixelMine, make your ungodly awesome tool fully available to the masses and they’ll worship you as (unholy) deities! Or something like that. But yeah, download nFringe now (+ Visual Studio Express if you don’t have it), and start coding like you’ve never coded before!

6. Employ All The Debugging Methods At Your Disposal

With or without nFringe, there are still lots of ways to debug your gameplay with the Unreal framework, and you should make use of all the various techniques to achieve maximum results. Among the methods I favor are:

- Debug Draws (Spheres, Lines, Boxes, etc.): these help you visualize what's occurring in 3D space – useful if you need to see the results of a calculated 3D transformation, or just to get an idea of size.
- Log statements: Ahhh, logs: spammy but so informative – especially when the nFringe debugger doesn't work! With logs you can immediately print pretty much any data type to the output window (which you can summon with the console command showlog), and concatenate multiple strings using the "@" symbol so that you get maximum information onto a single line. Just be careful not to put them in your Tick functions and leave them there – spamming the log can slow down your game. In fact, it's best to have logs active only if your Actor class has its "bDebug" toggle set, an editable variable that you can switch at runtime should the need arise.
- Unlit Mode, Wireframe Mode: in cases where you're working on something graphical but the level lighting is screwed up, just press F2 to change to unlit mode. Or, if you need to see through walls (cheater!), press F1 for wireframe mode. This can be surprisingly helpful for seeing what enemy AI are doing when they can't see you (voyeur!).
- The "Show Collision" console command will visualize all collision in the world, which is good if you seem to be facing a collision-related problem. Can't get through that hallway? Maybe it's not a game bug – maybe your level designer put a big ol' invisible blocker mesh down in front of the entryway and hid it… just to spite you. "Show Collision" will reveal all! (More practically, it's super helpful for seeing what the collision sizes of your Pawns are.)
- Use the Time Dilation setting on the Remote Control to literally slow down time in the game (not in real life… that'd be awesome, but Unreal just isn't that powerful yet). This can be super useful for seeing exactly what's going on with visual effects and animation in microscopic gameplay detail, and is helpful for resolving timing-related issues.
- The Remote Control can also display all dynamic Actors that have been spawned during gameplay if you click the "Clock" icon on the Actor List. This is useful for double-checking that Actors which should be destroyed aren't staying alive (i.e. are your projectiles hanging around after they've collided?).
- All Kismet actions have an "Output Comment To Screen" option, which when enabled will print their comment to the in-game console display. This can be useful to get an idea of what actions are being triggered, and when. Or, for the pro Kismet guru, use the Console Command Kismet Action (to "say"), combined with my Concatenate String Action, to print out any Kismet variables that you want :)
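The first two techniques – debug draws and bDebug-gated logging – might look something like this in practice (a hypothetical snippet, not code from Dungeon Defense):

```
// Hypothetical Actor demonstrating debug draws and toggle-gated logging.
class DebugExample extends Actor
    placeable;

// Editable toggle -- flip it in the editor or at runtime to enable debug output.
var() bool bDebug;

event Tick(float DeltaTime)
{
    Super.Tick(DeltaTime);

    if (bDebug)
    {
        // Visualize a 64-unit radius around this Actor in green.
        DrawDebugSphere(Location, 64.0, 12, 0, 255, 0);

        // "@" concatenates with a space; view the output with "showlog".
        `log("DebugExample at" @ Location @ "speed" @ VSize(Velocity));
    }
}
```

Because the `log calls sit behind bDebug, shipping with the toggle off avoids the log-spam slowdown mentioned above.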

By using these debugging approaches, among many others – and, one day (we hope), an nFringe debugger that everyone can enjoy – you will become a wide-eyed bug-squashing fiend. Which is a good thing.

7. Be Clever With Kismet, But...

Kismet, how do we love thee? You give level designers the ability to implement gameplay, and you give game designers the ability to prototype rapidly. You giveth… and you taketh. Kismet is a fantastic, unrivalled tool for level interactions, and even for prototyping certain level-oriented gameplay mechanics. However, it has its limitations: it’s not particularly object-oriented/inheritable, it’s not as fast as UnrealScript, and it doesn’t have all of the debugging capability of UnrealScript.

Therefore, trying to do everything through Kismet is not a viable approach to construct the final version of most games. By all means if you wish, prototype your gameplay with Kismet where possible (it certainly can do a lot), but bear in mind that you will likely have to rewrite much of it for your ultimate product. If you find yourself struggling to do something through Kismet, then in my view you should consider doing it through UnrealScript (or, consider writing new Kismet actions to extend its capability further).
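As a sketch of that last option – extending Kismet with your own actions – a minimal custom action in UnrealScript might look like the following. The class name, category, and "heal" behavior here are made up for illustration:

```
// Hypothetical custom Kismet action: heals the Pawns linked to "Target".
class SeqAct_HealTarget extends SequenceAction;

// Exposed as an editable property on the action node in the Kismet editor.
var() int HealAmount;

event Activated()
{
    local SeqVar_Object ObjVar;
    local Pawn P;

    // Iterate over the Object variables attached to our "Target" link.
    foreach LinkedVariables(class'SeqVar_Object', ObjVar, "Target")
    {
        P = Pawn(ObjVar.GetObjectValue());
        if (P != None)
        {
            P.Health = Min(P.Health + HealAmount, P.HealthMax);
        }
    }
}

defaultproperties
{
    ObjName="Heal Target"
    ObjCategory="MyGame"
    HealAmount=50

    VariableLinks(0)=(ExpectedType=class'SeqVar_Object',LinkDesc="Target")
}
```

Once compiled, the action shows up in the Kismet editor under its ObjCategory, and level designers can wire it up without touching code – which is exactly the division of labor described above.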

My general rule is: if it’s something that occurs in the persistent design of a level, do it through Kismet. If it’s something that has to be dynamically spawned and relates to a dynamic object’s behavior, rather than the level itself, it’s better done through UnrealScript. This outlook comes from a lot of experience with Kismet, which I love dearly and have pushed to the ends of the earth and back. Don’t get me wrong: you could theoretically write an entire game using dynamically-spawned Prefabs, but past a certain point, my opinion is that it’ll get in your way. But don’t worry dear readers, my love affair with Kismet will never end: For Dungeon Defense, I drive the entire high-level gameplay logic through it.

8. It's All About The Fun

Look guys & gals, we’re indie developers (except for you Cliffy B, if you’re reading this, you’re a big fancy celeb now!). That means, by and large, we’re focused on the Fun and not the, how to put it, multi-million-dollar budgets. Of course, that may be a goal for many of us (you know who you are), and more power to you – Unreal will provide the capability to take you there.

But never lose sight of the fact that when you’re out to prove to the world how brilliant your game is, and why it’s worth everyone’s time, it doesn’t matter how pretty your graphics are (though they help), or how many hours of playtime you’ve got (oh Oblivion, eater of man-months), or how many polygons your main character has in her mammaries. What matters is whether the player is engaging in an interactive experience that gives them good feedback and rewards them in satisfying ways.

Ironically enough, sometimes the designer in the trenches is not always the best day-by-day judge of this. So get your friends, family, co-workers, and pet dog to play your game at key junctures, so that they can tell/bark you their feedback on how it’s handling. It won’t always be pretty, and even constructive criticism can sometimes be a tough pill to swallow, but you and your game will be better for it. Unreal is indeed the most powerful game creation technology in the world, but how you use it – whether you make a well-balanced, fun game or another… hmm… Monster Madness – that’s all up to you!

Take care everyone, and keep developing your dreams into realities :)

-Jeremy Stieglitz