Fragments of Him – Creating The Interactions December 18, 2014
This article covers the interaction tool that we use to build Fragments of Him. The tool has gone through several iterations before arriving at its current state. When we started building Fragments of Him, one of the questions we asked was “How can we easily implement content and gameplay without writing a lot of custom code?”. Back then, the answer was simple: we write a custom interaction tool.
How it worked
When we first started, all we needed was a simple way to trigger dialogue from either a click or walking through an area. That wasn’t very difficult, and a simple script with a couple of variables would do the trick. Then we figured we also needed to play object animations and character animations, play audio, enable the next interaction, apply and drive a player feedback outline, force the player to move out of the way of a character animation, and more. As you can probably see, this became chaotic rather quickly. After patching all of these features into the tool we started with, it ended up looking like this:
Not only did it look very messy, it also required quite a bit of mental overhead: to implement an interaction you had to keep all the tool’s little quirks in mind. In more complex cases, we needed to trigger multiple animations with several pieces of audio, play the accompanying dialogue audio, and then indicate which interaction comes next. Instead of trying to fix the tool we had, we opted to create a new version that would be faster to use and more flexible.
Designing a Modular System
One thing that became clear when developing this new system is that it should be modular. If we need to trigger 40 things after a single click, that should not be a problem. The new system would have a single script (the base) that keeps track of all the modules of an interaction and triggers the functions of each individual module. Each module can then have its own systems for playing and deleting, making it easier to create, maintain, and clean up modules. Here’s roughly how I planned it out:
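As a rough sketch of that base/module split (the class and method names here are my own, not the actual Fragments of Him code), it could look something like this in Unity C#:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical module contract: every module knows how to play itself
// and how to clean itself up when the interaction is done.
public abstract class InteractionModule : MonoBehaviour
{
    public abstract void Play();       // e.g. start an animation or audio clip
    public virtual void CleanUp() { }  // release anything the module created
}

// The base script: keeps track of all modules of one interaction
// and triggers each of them when the interaction fires.
public class InteractionBase : MonoBehaviour
{
    private readonly List<InteractionModule> modules = new List<InteractionModule>();

    void Awake()
    {
        // Collect every module attached to this interaction object.
        modules.AddRange(GetComponentsInChildren<InteractionModule>());
    }

    public void Trigger()
    {
        // Fire all modules at once; 40 modules is no different from one.
        foreach (var module in modules)
            module.Play();
    }
}
```

A concrete module (say, one that plays an AudioSource, or one that sets an Animator trigger) would simply subclass InteractionModule, so the base never needs to know what kinds of modules exist.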
That seemed fairly simple and formed the starting point. We developed a couple of the basic modules to test out the callback system and gave it a whirl. While testing, we still used the inspector to drag in all the scripts and variables. While this isn’t too bad of a solution, it doesn’t give us any overview of the entire system. We figured that developing an interface around the new interaction editor would give us the overview we wanted.
The above screenshot is how the tool currently looks. Whenever we need to add a new module, all we have to do is add a button once it’s completed (and make sure the parameters show up properly) and it’s good to go. When an interaction is created, this is how it will show up in the scene:
To indicate where all these interactions take place we use Unity’s colored icons. This makes it easy to identify where an interaction happens, and easier to find a specific interaction when looking through a scene. We used to forget to add an icon to an interaction, which got confusing pretty quickly, so an icon is now added automatically when an interaction is created. Before this tool, modifying an interaction meant finding it in the scene hierarchy, or clicking its icon in the scene view if we had remembered to add one. That turned out to be frustrating and inefficient when quickly checking multiple interactions in a scene. The solution was to add a list to the interaction editor that keeps track of all the interactions in a scene. The currently selected interaction is coloured brown; the interactions that will be enabled after completing it are marked blue.
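A minimal sketch of such a list, assuming a marker component on every interaction object (the window, component, and colours below are illustrative, not the actual tool):

```csharp
using UnityEditor;
using UnityEngine;

// Assumed marker component that every interaction object carries.
public class Interaction : MonoBehaviour { }

// Hypothetical editor window: lists every interaction in the scene
// and highlights the currently selected one.
public class InteractionListWindow : EditorWindow
{
    [MenuItem("Window/Interaction List")]
    static void Open() { GetWindow<InteractionListWindow>("Interactions"); }

    void OnGUI()
    {
        foreach (var interaction in FindObjectsOfType<Interaction>())
        {
            bool selected = Selection.activeGameObject == interaction.gameObject;
            // Brown for the selected interaction, default colour otherwise.
            GUI.color = selected ? new Color(0.6f, 0.4f, 0.2f) : Color.white;
            if (GUILayout.Button(interaction.name))
                Selection.activeGameObject = interaction.gameObject;
        }
        GUI.color = Color.white;
    }
}
```

Colouring follow-up interactions blue would additionally require the marker component to expose which interactions it enables next.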
With our first (old) interaction tool, we spent a couple of days getting all the interactions into the first scenes (while adding features in the meantime). This resulted in a lot of workarounds needed to get past its limitations. The new system took a week (give or take) to build, and what used to be a day’s worth of implementation now takes us roughly two hours.
If you have any questions or remarks, don’t hesitate to get in touch with me via Twitter.
Fragments of Him Release Date Pushed To 2015 December 10, 2014
I speak for myself and my company when I say that there isn’t much that is more difficult than coming back on a promise. A promise, to us, is like an oath that we will not easily make unless we are very sure that we can live up to it and not disappoint those we have made the promise to.

You may recognize in your own life and ambitions that there are moments where you feel the need to push yourself to achieve, even when the goal is really difficult to pull off. In order to push ourselves, we set and communicated an unrealistic goal that we now need to revisit and set right.
That goal was the release date for Fragments of Him. In hindsight, the release date we had set for ourselves (Winter 2014/2015) was rather optimistic considering the number of unknowns we were dealing with at the time. We laughed in the face of reason and are now embarrassed for not living up to the expectation of meeting this release date.
The biggest question on your mind is probably: why is it taking so long? There are several reasons, and I will try to shed light on them as best as I can.
At the time of estimating a release date for Fragments of Him, the script for the experience had undefined elements in some of the core narrative arcs, which made it difficult, if not impossible, to get a clear indication of when we could get Fragments of Him finished. We now know that the core of the story we want to tell feels whole and solid. With that, we know the number of characters, environments, scenes, and narrative branches we need in order to tell this story.
One of the biggest time sinks, by far, is that we are creating an experience with a lot of unique elements that require artistic authoring to get them the way we want. An example is the 3D models we need to shape the environments and characters, and the interactions that come with them. There are a host of environments in the game for various scenes, spanning a range of times and changing seasons, portrayed by a number of characters that occasionally wear different outfits, all driven by a lot of motion capture data. We are undertaking this challenge with a reasonably small team, and we are paying the price for it.
We are using new tools and techniques to make Fragments of Him look and behave like an experience that is unlikely to come from a small team such as ours. We can also tell you that over the past year we managed to become registered ID@Xbox and Sony developers with help from our fellow friends and developers. What exactly this means for the future we can’t say just yet, but it may give you an idea.
A realistic release date
With all this uncertainty about when this project will finally be ready to show to the world, we have adjusted our estimates and expectations. Our burndown chart tells us that, on our current course, we are 387.5 working days removed from finishing what we have started. According to the almighty interweb, there are 251 working days in 2015. Luckily, this project isn’t being made by a single person, and we can distribute this labor of love over the people who help us realize our vision. That leads us to assume that we should be able to bring you the title you may have been eagerly waiting for somewhere in 2015. We have realized that we are not that good at setting a release date based on feelings, hunches, and wishful thinking. We really want to bring you what we have promised and are hard at work to realize it.
On behalf of the team and myself, I sincerely and humbly apologize for not living up to our promise, and we hope you still choose to support us in realizing Fragments of Him.
The Fragments of Him Interface November 27, 2014
With the release of Unity 4.6 and its new UI system, we thought it was time to go into more detail on how we designed and built the interface of Fragments of Him using Unity 4.6. We have been actively using the Unity 5 beta so far, but most (if not all) of the UI systems should work exactly the same as in 4.6. In this blog post I’ll be going over most of the technical aspects of the menu in Fragments of Him. Let’s start by showing how the interface currently looks:
The menus in Fragments of Him appear relatively simple; they are mainly there to let players change game, audio, and graphics settings. Here’s a flowchart of the menu to give you an idea of how it’s structured:
With the options panel active, each of the following buttons (e.g. Gameplay) opens up a panel with the respective options.
Organising the Scene
Here is what the menu looks like from within Unity 4.6:
Let’s go over what we’re actually looking at.
1 – The bold white rectangle on the left (with the Fragments of Him logo in it) is the interface that is overlaid on the game screen. It is where we have to move objects to if we want them to be visible to the player.
2 – The Menu Panel is the first panel that we show when starting or pausing the game with primary options such as Start/Continue, Load Game, Options, and Exit Game.
3 – The Options Panel is positioned outside the white rectangle; this panel appears when clicking the options button. The menu panel slides out and the options panel slides in.
4 – When an option is clicked, the Options Container slides in, containing all the options such as gameplay, graphics, audio, and controls. Conversely, when leaving the options panel, the options container slides back out, revealing the menu panel once again.
In the scene hierarchy this looks something like this:
To give players the interaction feedback they need, we use the new Button component, which lets us specify how a button should behave in its various states. All of our buttons share the same basic interaction setup, which looks like this in the inspector:
We handle the transitions between button states with mecanim (you can see the structure below). We do this so that the art team has a bit more control over the animations (and our Lead Artist Tino has a thing for smooth transitions between states). Do note that this is the Unity 5.0 interface for mecanim, which probably looks different in Unity 4.6. The list on the left contains the triggers that fire the animation states; on the right you can see that these states can be driven from any situation.
Hooking it all up
Up to this point it was all pretty easy and straightforward. Then we tried to make everything flow seamlessly into each other, accommodating whatever the player wanted to do. We found that we couldn’t animate the entire interface with Mecanim alone, which meant we had to solve some things with code.
At first we opted to let the art team assemble and animate the entire interface using the Unity 4.6 features. We figured that this would give them full control over how things would look and feel. However, back then we didn’t realise that Mecanim relies on the internal timescale to perform its animations. When we set Time.timeScale to 0 in order to pause the game, the interface would not animate as intended.
EDIT: Thanks to everyone tweeting me for a solution to this, apparently there’s a dropdown box in the Animator called “Update Mode” (see image below). Set that to “Unscaled Time” and it’ll work just fine!
To solve this problem, we went over a couple of different solutions. In the end, we had to settle for somewhat of a hack: to pause the game we set the timeScale to 0.00001f, and then set the animation speed of the interface animations to 100000 to make everything in the interface appear to play out in real time.
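A minimal version of that workaround might look like this (the two constants are from the post; the class and method names are illustrative):

```csharp
using UnityEngine;

public class PauseMenu : MonoBehaviour
{
    const float PausedTimeScale = 0.00001f; // "almost zero" instead of a hard 0
    const float MenuSpeedFactor = 100000f;  // cancels out the tiny timescale

    public Animator menuAnimator;           // animator driving the menu panels

    public void Pause()
    {
        Time.timeScale = PausedTimeScale;     // gameplay appears frozen...
        menuAnimator.speed = MenuSpeedFactor; // ...but the menu animates at real-time speed
    }

    public void Resume()
    {
        Time.timeScale = 1f;
        menuAnimator.speed = 1f;
    }
}
```

With the Animator’s “Unscaled Time” update mode mentioned in the edit above, none of this is necessary; the sketch only illustrates the original workaround.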
When you enter the options panel, the code takes over and all the animations are scripted. This is done by using iTween (iTween has a very useful ‘ignoretimescale’ parameter). For those curious as to how that works, this is the code that makes sure that the panel slides into view of the viewport:
iTween.ValueTo(gameObject, iTween.Hash(
    "from", 900,
    "to", 0,                     // target x position of the panel (illustrative value)
    "time", 0.5f,                // tween duration in seconds (illustrative value)
    "ignoretimescale", true,     // keeps the tween running while the game is paused
    "onupdate", "SlidePanel"));  // hypothetical callback that positions the panel each frame
The code handles the sliding in and out of panels and also makes sure that the player gets the correct feedback. Additionally, we accommodate the interface to be used with either a game controller, mouse, or keyboard. Each of these input interfaces also requires its own set of feedback systems.
Step by Step
Whenever the player presses the menu button (usually the Escape key on the keyboard, or Start on a gamepad), the menu should pop up. We simply turn on containers 1 and 2 using GameObject.SetActive(true);. In addition, if the player is using a controller, we want to make sure the first object is highlighted. We do that with EventSystem.current.SetSelectedGameObject(continueButton, null); and we trigger the mecanim animation using Animator.SetTrigger.
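Put together, opening the menu could be sketched like this (field names and the trigger name are illustrative, not the actual game code):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

public class MenuController : MonoBehaviour
{
    public GameObject container1;     // menu panel
    public GameObject container2;     // options panel
    public GameObject continueButton; // first button to highlight for controllers
    public Animator menuAnimator;

    public void OpenMenu()
    {
        // Turn on both containers.
        container1.SetActive(true);
        container2.SetActive(true);

        // Highlight the first button so controller users have a starting point.
        EventSystem.current.SetSelectedGameObject(continueButton, null);

        // Let mecanim slide the menu into view ("ShowMenu" is an assumed trigger name).
        menuAnimator.SetTrigger("ShowMenu");
    }
}
```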
When you’re in container 1 and press the Options button, we use Animator.SetTrigger to move container 1 to the left and container 2 to the right at the same time. Clicking a button in container 2 calls a script that moves container 3 into view. The options container is different from containers 1 and 2 in that it holds multiple panels. Whenever a button is clicked in container 2, the code selects the correct panel to turn on (once again with GameObject.SetActive(true);). This happens off-screen to ensure that the player never sees the menu pop. When the player backs out of the menu, all of the above is done in reverse: first the code plays and puts the player back into container 2, where Mecanim takes over to animate the menus once again.
When we started thinking about the design and functionality of the interface, we wanted dropdown menus for things such as the resolution. When we built the interface, we soon found out that there wasn’t an out-of-the-box solution for dropdowns that suited our needs. Eventually we came up with a solution that makes clever use of mecanim, masks, and buttons. Here is how it works in action:
To get this to work, we first created a list of all the options we needed in the dropdown as buttons, and placed it the way we wanted it. We then mask off all the options we don’t need with a Mask component. Making the dropdown slide in and out is simply a matter of animating the bounds of the mask. Finally, to trigger the dropdown animation, we placed a button on top of the first dropdown option to make it slide down. Selecting any option from the revealed dropdown list makes the interface slide back in again. It might not be elegant, but it works.
Supporting different resolutions
Something we didn’t immediately think of when designing the interface is the various resolutions it needs to support. This is where Unity 4.6 really shines, as we can use the Rect Transform to anchor the various panels, windows, containers, and buttons that make up the interface.
As a result, we can now use resolutions all the way down to 1024px by 768px while still having everything in frame and readable.
We aren’t completely done with the interface yet, but apart from a few small things we don’t expect to change it too drastically. Hopefully this has given you a little insight into how we used the new Unity 4.6 interface features, and we would love to hear how you end up using them.
Unity 5.0 Physically Based Texturing Workflow November 10, 2014
Unity 5.0 gives its rendering engine an overhaul by introducing what is known as Physically Based Rendering. In short, this method of rendering comes with a lot of useful features. One of them is the more accurate representation of dielectric and conductive materials (non-metals and metals). It also enables energy conservation in a material, which basically means that the amount of light leaving a surface will never exceed the amount that hits the surface. A linear workflow is a lot more important in combination with high dynamic range images and tone mapping. I’m not here to write a paper that explains all of this, because many have gone before me in doing so. Please check out the links below to understand PBR a lot better than I could ever explain it.
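Stated slightly more formally (my own summary, not from the post): a BRDF $f_r$ is energy conserving when the light it reflects over the whole hemisphere $\Omega$ never exceeds the light that arrived, for every outgoing direction:

```latex
\int_{\Omega} f_r(\omega_i, \omega_o)\, \cos\theta_i \, \mathrm{d}\omega_i \;\le\; 1
\qquad \text{for all outgoing directions } \omega_o
```

Here $\omega_i$ is the incoming direction and $\theta_i$ its angle to the surface normal; the $\cos\theta_i$ term accounts for light arriving at grazing angles being spread over a larger area.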
Physically Based Rendering theory:
What you can expect to get out of this blog is a method of how I usually create textures for an engine that renders using PBR. Hopefully the following tutorial will make creating these textures a lot less troublesome than it may have been before.
If you have any questions for me or tips on how I can improve on this workflow then you can hit me up on Twitter @Tinovdk.
Thanks for your time and happy texturing.
Gameplay, Interactions, Release Date & Internship Positions October 24, 2014
Let’s talk about how the interactions within Fragments of Him will work. I will go over a couple of the mechanics you’ll see throughout the game in its current form. I’ll refer to the prototype quite often, which can be played over on Kongregate. As always, anything shown in this blog is subject to change or removal as we see fit; at this point we’re trying out as many things as possible to see what works best for the game.
Finding the object
One of the issues with the prototype was that it became difficult to find the interactable objects as you approached the end of a scene. In the full version of Fragments of Him you will not have to remove every object, so finding interactions should be a lot less painful than in the prototype.
The first thing we tried was placing icons on every interaction that would show up once it became available. Below you can see a screenshot of our first prototype of this in an old scene. While this worked, it wasn’t exactly what we envisioned. We might return to it at some point, or add it as an option (for those who have difficulty finding the interactions in a scene).
In the prototype, objects would get a yellow outline when you would hover over them with your reticule (yellow for being close enough to trigger the interaction, red for being too far away). We did like this system, but it proved to be rather difficult to develop a proper outline shader that suited our needs. What we experimented with can be seen below.
Currently, we use an outline method that adapts its width based on the distance from the camera to the object, keeping it consistently visible to the player at varying ranges. We may experiment with additional systems and solutions if issues with this implementation arise. How the colour of the outline provides feedback to the player can be seen below.
Blue outline – The player can interact with this object but is not looking in its general direction.
Red outline – The player can interact with this object, is looking in its general direction, but is too far away.
Yellow outline – The player can interact with this object, is looking in its general direction, and is close enough to interact with this object.
Fragments of Him outline shader
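The three outline states boil down to a small decision, which could be sketched like this (the colour mapping follows the list above; the class, method names, and width formula are my own):

```csharp
using UnityEngine;

public static class OutlineFeedback
{
    // Maps the player's situation to the outline colour described above.
    // Assumes the object is interactable (otherwise there is no outline).
    public static Color GetOutlineColor(bool lookingAtObject, bool inRange)
    {
        if (!lookingAtObject) return Color.blue;  // interactable, not looked at
        if (!inRange)         return Color.red;   // looked at, but too far away
        return Color.yellow;                      // looked at and close enough
    }

    // Width compensation: scale the outline with camera distance so it
    // appears roughly constant on screen at any range.
    public static float GetOutlineWidth(float baseWidth, float distanceToCamera)
    {
        return baseWidth * distanceToCamera;
    }
}
```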
One of the major changes from the prototype is the reticule. In the prototype the reticule was actually a 3D sphere (some of you might even remember that it was originally influenced by the scene lighting) placed very close to the camera. For the full game we’ve completely changed this, making the reticule not only easier to use, but also a source of feedback to the player.
When not hovering over an interactive object, the reticule remains a grey circle (don’t worry, you can resize the reticule in the menu if the default is too large for you). When hovering over an interactive object, it takes on the same colour behaviour as the outline (for instance, red when you are too far away). When you interact with an object, the reticule shatters into fragments. When the reticule forms back into a circle, the interaction is over and the player can click on the next object.
Controlling the camera
If you played the prototype, you probably got frustrated at some point with the movement speed of the camera. This is not only because it uses the default Unity camera controls, but also because sometimes you just want to click on a relatively small object. To fix this we implemented a couple of new features. First of all, you can change the sensitivity of the mouse/controller in the settings. The second feature is something we’re not yet sure of: a slowdown when you’re close to an object (the camera moves slower when the reticule is yellow). This allows for more precision when interacting with objects. I’ve tried to visualise it in the gif below.
Slowing down camera movement
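The slowdown itself amounts to scaling the look sensitivity while the reticule is yellow; a minimal sketch, with illustrative values and names:

```csharp
using UnityEngine;

public class CameraSensitivity : MonoBehaviour
{
    public float baseSensitivity = 2f;  // illustrative values, tuned per game
    public float slowdownFactor = 0.4f;

    // 'reticuleIsYellow' would come from the same check that drives the
    // yellow outline, i.e. the player is in range of an interactable object.
    public float CurrentSensitivity(bool reticuleIsYellow)
    {
        return reticuleIsYellow ? baseSensitivity * slowdownFactor
                                : baseSensitivity;
    }
}
```

The look input each frame is then multiplied by CurrentSensitivity instead of a fixed sensitivity, so precision increases exactly when the player is lined up with an interaction.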
Another thing we’re experimenting with is snapping the reticule to the object when you’re in range. Do note that this is still highly experimental (and very poorly coded, but that’s a different story).
Snapping the reticule to interactions
While this does make it a lot easier to interact with objects, it takes away the smoothness of the camera movement.
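One way to soften that trade-off is to ease the reticule toward the target rather than snapping it instantly; a sketch under that assumption (names and the snap speed are mine):

```csharp
using UnityEngine;

public class ReticuleSnap : MonoBehaviour
{
    public RectTransform reticule; // the on-screen reticule element
    public float snapSpeed = 10f;  // illustrative value; higher snaps harder

    // Eases the reticule toward the screen position of a nearby interactable,
    // keeping some of the smoothness that a hard snap removes.
    public void SnapTo(Vector3 worldTarget, Camera cam)
    {
        Vector3 screenTarget = cam.WorldToScreenPoint(worldTarget);
        reticule.position = Vector3.Lerp(reticule.position,
                                         screenTarget,
                                         snapSpeed * Time.deltaTime);
    }
}
```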
This is where we’d normally announce a release date for Fragments of Him, but instead we’re announcing that we’re delaying the game for another couple of months. I can hear you say “but why would you do such a thing?”. Well, for a fairly simple reason: we’re making this game because we believe in it, and we want it to be the best it can be. The extra time allows us to create a more polished product and to add in all the minor things we thought we would have to leave out of the final release. We will be announcing a couple of amazing things in the near future (give it a month or two), so look forward to the next blog post, which will demonstrate some of the process and results we can now achieve by applying a fancy motion capture system.
If you’re a 3D artist, enjoy creating environments, and are looking for an internship, then we’ve got a position for you! You can read all about it over here: http://sassybot.com/jobs/.