
2. Literature Review

2.1. Procedural Animation

Much of the literature on procedural animation (PA) focuses on its technical side; the literature covered here concerns the theory and context of PA and why it is used in games. One of the first examples of procedural animation in the 3D industry was in Pixar's short film Knick Knack (1989) (Paik & Iwerks, 2007), where PA driven by fluid dynamics was used on the snowflakes and bubbles. Animation techniques have advanced considerably since then, and there are further types of PA beyond fluid dynamics. In general, PA comes in different forms, varying in how procedural they are.

Johansen states, "Believable characters in games must have this awareness and adaptation to context and environment to seem like they belong in the world they inhabit" (Johansen, 2009). This means that for a character to be believable, it needs to feel realistic within that world.

Johansen follows this by saying,

"Interactivity in games and simulations means that the characters' cannot always be determined in advance, and characters need to dynamically respond to the unfolding events in the game, whether controlled by the player, artificial intelligence, or something else." (Johansen, 2009).

To summarise, Johansen is saying that every action in a real-time experience needs animation that adapts to the player's current state. A PA approach can generate these motions dynamically, creating a more immersive experience for the player because the character appears to belong in its world.

Alan Zucconi, a procedural animation blogger, identifies ragdoll physics, rigid body simulation, inverse kinematics (IK) and forward kinematics as methods of carrying out PA. These types of PA simulate real-world physics, and such physically based PA methods are available in game engines such as Unreal and Unity (Zucconi, 2017). Zucconi adds that ragdoll physics is predominantly used in games as a "dying animation," where each limb is given its own rigidbody and degrees of motion appropriate to real-world movement, defining how the limbs should simulate freely, for example when a character falls.

Inverse kinematics (IK) is a type of PA that simplifies animation. It is often used on humanoid characters: a game object is placed at the end of a chain of bones, for example at the hand or foot, and the bones running back up the chain towards the body are posed automatically to satisfy the position of that end object. Forward kinematics (FK) works in the opposite direction: rotating a bone such as the shoulder affects the bones further down the chain. Most traditional keyframe animation is authored using forward kinematics.
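To illustrate the difference, the sketch below shows how a planar two-bone chain (for example an upper and lower arm) might be solved analytically using the law of cosines. This is a minimal, hedged example rather than the method used later in the project; the 2D setup and bone-length parameters are assumptions made for clarity.

```csharp
using UnityEngine;

// Minimal sketch of planar two-bone IK via the law of cosines.
// "upperLength" / "lowerLength" are hypothetical bone lengths; the target is
// clamped so it is always reachable. Forward kinematics would be the reverse:
// apply known joint angles from the root outwards and accumulate positions.
public static class TwoBoneIK
{
    // Returns the root (shoulder/hip) angle and the middle (elbow/knee) bend,
    // both in radians, needed for a chain rooted at the origin to reach "target".
    public static (float rootAngle, float midAngle) Solve(
        Vector2 target, float upperLength, float lowerLength)
    {
        float dist = Mathf.Clamp(target.magnitude,
            Mathf.Abs(upperLength - lowerLength), upperLength + lowerLength);

        // Bend at the middle joint, measured from the fully straight pose.
        float cosInterior = (upperLength * upperLength + lowerLength * lowerLength - dist * dist)
                            / (2f * upperLength * lowerLength);
        float midAngle = Mathf.PI - Mathf.Acos(Mathf.Clamp(cosInterior, -1f, 1f));

        // Root angle = direction to the target minus the offset caused by the bend
        // (this picks one of the two possible bend directions).
        float cosRootOffset = (upperLength * upperLength + dist * dist - lowerLength * lowerLength)
                              / (2f * upperLength * dist);
        float rootAngle = Mathf.Atan2(target.y, target.x)
                          - Mathf.Acos(Mathf.Clamp(cosRootOffset, -1f, 1f));

        return (rootAngle, midAngle);
    }
}
```

Forward kinematics, by contrast, would simply apply the two joint angles from the root outwards and accumulate the resulting bone positions down the chain.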

Two examples of games with PA are Assassin's Creed 3 (Ubisoft, 2012), which has a more realistic art style, and Fall Guys (Mediatonic, 2020), which is stylised. Both games have a PA system, but each uses it in a very different way to achieve a similar outcome. In a GDC talk, Aleissia Laidacker, the lead tech artist, discussed how PA was used in AC3 (Brewer et al., 2013). PA allowed Ubisoft to create many different kinds of motion for their characters while only authoring some of the movement animation they needed, which mattered because the game has many varied movement mechanics such as basic locomotion, climbing, free running and tree traversal. It also makes the experience better for the player: a physics engine calculates the character's motion within the parameters set by the animator, making the result more visually believable because the motion relates to real-life movement.

Another example of the use of procedural animation is Fall Guys, a stylised game which makes particular use of ragdoll physics. Fall Guys is, in general, a very physics-orientated game, and the character is expected to animate according to how it interacts with the obstacles. This makes the stylised environment believable, as the game is grounded in physics similar to the real world.

It is not just humanoid characters that use procedural animation. One of the first uses of PA was in the racing genre. Many racing games, such as Forza Horizon 5 (Playground Games, 2021), use PA on the cars so that they match the varying ground surfaces. This is only a small area, but it is relevant because developers use PA to make their cars more reactive to the track surface and therefore make the racing experience more believable.

2.2. Immersion

The dictionary defines "immersion" as "the fact of becoming completely involved in something" (Cambridge Dictionary, 2023). Brown and Cairns carried out a study into immersion and what it means in games. Even when asked "How immersed do you feel?", the concept of immersion can be misinterpreted as something else (Brown and Cairns, 2004). Immersion is tricky to define and is often subjective to a specific person. The first level of immersion Brown and Cairns describe is engagement: how engaged the player is in the experience. A word often used is "access" (Brown and Cairns, 2004). If the player decides they do not enjoy a particular game or genre, their immersion will often be low; on the other hand, someone learning the controls for the first time will be focused on learning, so they will be immersed in a different way.

The next level discussed is engrossment, which the player feels once they have played the game for a while and feel involved. A "barrier" to engrossment is game construction, the concept that the game's features affect the player's emotions. In their testing, Brown and Cairns found that "Some game features mentioned by participants that form this quality were visuals, interesting tasks, and plot" (Brown and Cairns, 2004). The fact that visuals help players feel engrossed is very relevant to this study, as PA is entirely visual.

The final level is total immersion, which Brown and Cairns define as presence in a virtual environment. Presence is a term closely related to immersion and is often used interchangeably. Participants in their study described the experience as "being cut off from reality and detachment to such an extent that the game was all that mattered" (Brown and Cairns, 2004). To achieve that full sense of presence, the virtual world must have atmosphere and evoke empathy. Empathy can come from many features of a game that create an emotional attachment, while atmosphere is created by all the elements a designer puts into the game to make it immersive, such as sound, graphics and plot. Total immersion is outside the scope of this project, as it relies on many features, such as sound, interactable characters and a larger plot, that would distract from the area of procedural animation being researched; participants would be immersed because of those elements instead. Another definition of "immersion" arises from the word believability, which is prevalent in the literature on game immersion and is investigated in the next section.

Thalmann, Kim, Egges and Garchery use the word "immersion" to introduce believability (Thalmann et al., 2005). They state that a player can "believe that the experience in the virtual world is a real experience if he or she is totally immersed in the virtual environment" (Thalmann et al., 2005). In this definition, the player should believe they are in that world in order to feel immersed. They add that to feel more immersed, the contents of the virtual environment, such as its emotional, individualised and objective elements, need to be "realistic" (Thalmann et al., 2005). An environment with these elements is often said to have presence, and it is worth noting the word presence appearing again, as many research papers use it when describing immersion. In the same paper, believability is discussed in terms of how a virtual world is laid out for the player: if the virtual environment is designed to look as believable as possible, the whole experience becomes much more immersive and believable, even when it is not meant to be realistic.

Another term used is "interaction" (Thalmann et al., 2005). This can be anything with which the player interacts, or how they control the character. Interaction can make the user feel more immersed in what the character is doing, as the player tells their playable character what to do. Thalmann, Kim, Egges and Garchery explain that immersion, presentation and interaction work together to create a higher level of immersion in the experience; in essence, if immersion and presentation are high but the feeling of interaction is low, the user will be less immersed in the space.

Carrying on the concept of believability, Togelius, Yannakakis, Karakovskiy and Shaker discuss believability in a study carried out at the Mario AI Championship at the Asia Game Show 2010. They note that believability does not have a clear and concise meaning: "We have a family of related meanings denoted by the same word" (Togelius et al., 2013). They compare this to the word "intelligence", whose meaning is likewise not straightforward and depends on the person interpreting it. In computer games, they argue, if a player thinks a character is believable, some part of the character has to be believed, and they distinguish two kinds: character believability and player believability. Character believability is how believable the character is in the space, and player believability is how real the character's controls feel. Character believability is harder to create, as it requires a higher level of realism in many aspects, including textures, movement and dialogue. It is usually reserved for big blockbuster films rather than games, although it will become more prevalent in real-time output as technology improves: "Hardly any games, or interactive media of any sort, can reasonably aspire to character believability within this technical generation" (Togelius et al., 2013). The point here is that for a character to be believable in a game, player believability has to come into the experience, and it has the greater effect on that experience. Player believability usually starts from the preconception that the character in the game is not real and is just pixels on a screen; to create a sense of believability, the player is given the reins to control the character.

Togelius, Yannakakis, Karakovskiy and Shaker later discuss how believability changes between a first-person and a third-person perspective. When the player plays a character from a first-person perspective, it is thought that "gameplay and player experience are interconnected" (Togelius et al., 2013), essentially meaning the player experiences the gameplay as if they were right there, which influences how believable the interactions and environment feel. When a player plays with a third-person camera, however, they watch over their character and observe what it is doing rather than feeling that they are the character. Believability then depends more on what is seen of the character, whereas the first-person feeling of being the character is already, without adding anything, considerably more believable.

In summary, immersion is a difficult concept to define. A more comprehensive definition would require research outside the scope of this project, as the concepts covered here only skim the surface of what different researchers think the word means. Ideas from a range of researchers help to explain it; however, as it is a subjective term, people will have their own definitions. Much of the literature in this section nevertheless discusses immersion using similar or overlapping terms. Immersion and believability are closely linked in the literature, especially in games, as believability supports the player's sense of immersion and vice versa, so both need to be considered when discussing "immersion".

2.3. Measuring Immersion

As discussed in the last section, immersion is a challenging concept to define, and testing immersion is also difficult. The most widely used questionnaire for measuring immersion is the Immersive Experience Questionnaire (IEQ), which measures "the subjective experience of being immersed whilst playing a video game" (UCLIC-UCL Interaction Centre, 2008). This questionnaire has been widely used in other research projects to measure immersion, and some of its questions are used in the testing for this project. The questions ask the user, on a scale of 1-7, about their experience in a way that does not mention the word "immersion". This is because, as explained before, immersion is subjective and different people will have different interpretations of the word. Instead, the IEQ measures different factors that contribute to overall immersion. One such factor, control, essentially measures how much control the player has over the character in the space. The questionnaire also uses "spatial" questions to measure how totally immersed the user is in the experience, which is useful for measuring higher levels of immersion, such as in VR or full game experiences with complete visuals, sound and plot.

When measuring believability, which is closely related to immersion, a very similar methodology is used. Togelius, Yannakakis, Karakovskiy and Shaker discuss methods of testing believability and describe three: subjective, objective and gameplay-based data collection. Subjective data collection is the "reporting" (Togelius et al., 2013) of a player's opinions. Objective data collection aims to make the findings more factual and often measures the player's physiological responses. The final method, "gameplay-based", has the game record the player's responses from a gameplay point of view. To test for believability, these methods can be used individually or combined to obtain better-quality results.

Togelius, Yannakakis, Karakovskiy and Shaker go on to explain more about subjective data collection. This is the most straightforward and direct method of measuring believability. It can be an open discussion, where the user talks about their experience in a fairly open-ended way, or it can be questionnaire-based, where the answers are more "forced" (Togelius et al., 2013). Free responses often give more diverse and richer feedback but are difficult to analyse in a way that takes everyone's opinion into account. A questionnaire, by contrast, is usually much narrower in scope, but its results can be evaluated more easily. This is relevant because, bearing in mind the scope of this project, a questionnaire would be much easier than an open discussion: the results could be analysed more readily and no rigorous qualitative analysis techniques would be needed.

Another area Togelius, Yannakakis, Karakovskiy and Shaker discuss is how to ask users about the experience, focusing on the forced questionnaire and how its questions should be structured. They describe three types of questionnaire question: Boolean, ranking and preference. A Boolean question is a simple two-option question that is succinct but does not provide a high degree of quality feedback. A ranking question is similar to those found in the IEQ, where the user is asked to place their answer on a scale. Finally, a preference question is used to compare two areas of a game. There is no globally accepted method of carrying out this type of questionnaire, and each option has advantages and disadvantages. For example, ranking does not take into account how different people interpret the scale, although with enough users that variable averages out. With preference, the subjectiveness is much lower, as the user answers based on a comparison, which is a fairer and more accurate system.

3. Design + Implementation

This section covers the design and development of the research application. It introduces the core elements and design choices and then describes how those design elements were implemented in Unity.

3.1. Design Requirements

The design requirements for the application will include the following:

• A character controller with simple movement: one controller for NPA and one controller for PA.

• A small environment for the character to walk through. This will include different terrain types and opportunities for the character to utilise the PA systems, such as slopes/uneven ground, rocks, and trees.

• An objective for the player to complete. This will encourage the player to explore the environment and essentially makes it a small in-game quest.

• Two scenes with the same environment; one with an NPA controller and the other with a PA controller.

3.2. The Experience

3.2.1. Core Experience

From the design requirements, the core experience of this research application consists of: a story, to encourage the player to move around the space; a small environment in which the player can move around; and two scenes with the same character controller and the same movement mechanics (such as walking) but two different animation types, NPA and PA. A main menu would also be a key part of the core experience, containing the start screen with instructions and help with controlling the game, as well as a link to the questionnaire for ease of access.

These elements are the minimum for testing the application and evaluating the results.

3.2.2. Enhanced Experience

The enhanced experience was defined to show which elements could be added to enrich the application but are not necessary for it to be tested. These elements include extra movement types, such as sprinting, sliding, walking on stairs and vaulting over walls, along with sound effects.

Many of these elements were not implemented in the final research application; however, at this stage it was useful to show the next steps for enhancing the overall experience further.

3.3. Implementation

3.3.1. Design

The level design was the first phase of creating the technical solution. It was developed with the core experience in mind and with a view to how PA could best be demonstrated. The character has the usual basic movements, such as walking in any direction. As the game genre is third-person adventure, the design is based on the environment and on exploring the world. With a third-person camera, the character's animation should interact with the world in a more immersive way than in a first-person game, where you only see the character's hands or, at most, a reflection in a mirror.

The level design and the story went through a few iterations. A hilly, forested environment was initially designed, as seen in Figure 1, with a flow from a forest up to a hill in mind, where the character would interact with uneven ground and sloped areas, along with hand interactions with trees and rocks along the way. This kind of environment also plays a part in the story, the reason the player is interacting with this world in the first place. As this application is about testing the immersion levels of the environment, the player should not feel that this is what is being tested; there should be another reason to walk around the world. For this reason, a small exploration story was created.

This "story" could be anything, as its purpose in this application is to give the player something to do. However, this design, uses old 'settlements' to guide the player around the environment to all the areas the designer wants them to experience. The background for this story is that the player plays as an archaeologist, exploring where historical documents show settlements in the area. The player needs to explore the area and document them, and then return to the base camp or car. The initial design was created in Unity using the terrain tool in Unity 2021.3 LTS This allowed for the terrain to be sculpted and easy changes to be made.

At this stage in the process, a simple character with movement and NPA was implemented to test the scale of the environment and the elements in the scene. The initial game loop was also implemented, so that walking to a settlement updated a simple UI. Testing the scale of the environment showed that the area was unnecessarily large: although all the design requirements were met, there was too much downtime between terrain features and interactive objects, and it took a long time for the player to traverse the environment, around 3-4 minutes for a full traversal. Another consideration was that, because the environment would be explored twice, the user evaluation would take a long time, as each player would be exploring the area twice.

The next iteration of the design did not deviate much from the initial design and story. The Unity terrain tool was used again, but as the concept was now a smaller island, a height map was created to make the process easier. This also meant that, even though the terrain started from a height map, it could still be sculpted in the areas that required it. The new design condensed all the design requirements and the core experience into a smaller environment in the form of an island, as seen in Figure 3, with all the terrain types and interactive objects needed for the PA hand interactions.

Another advantage of setting the environment on an island was that the background could simply be sea and still make sense to the player. This was an issue in the hilly/forest environment, where the surroundings had to be larger than the playing area so the player felt they were in a large, hilly landscape that did not just cut off at its edge.

A similar story was created in which the player has to explore the island and find the remnants of an old way of life. When all the remnants are found, the player has the option to proceed to the next island. As previously discussed, the background of the story is not important, but its purpose is: it lets the player explore the island and encourages them to interact with the designed elements without explicitly knowing what is being tested. Each island playthrough lasts 90-120 seconds, reducing user testing time while still meeting the design requirements.

3.3.2. NPA and PA systems

This section describes how the NPA and PA systems were implemented in Unity. It deals with more in-depth code and animation techniques, so a prior understanding of animation, code and their terminology is recommended. As implementing PA in Unity was an unknown, it was given more focus than the NPA system at this stage. In Unity, PA for any type of character is created with the Animation Rigging package, an add-on that can be installed on a project from the package manager. Animation Rigging is a fairly new package, and documentation and tutorials were difficult to find, especially for humanoid PA such as hand and foot placement.

Animation Rigging works in Unity by using the imported armature of the mesh and building a rig on top of it. Constraints can then be added to empty GameObjects to manipulate the bones of the mesh. A heavily used constraint in this project was the two-bone IK constraint, as seen in Figure 5, which provides inverse kinematics for a part of the body such as an arm or leg.

When this constraint is added, a target and a hint object are created; these are the objects that control the motion of the chain of bones. The target controls the last bone in the chain, and the hint controls how the chain bends. In play mode, when the target is moved, the chain of bones assigned to the root, mid and tip moves with it. At this point, however, there is no way to move the target other than with the mouse, so code needs to be written to control the target as intended.
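As a minimal sketch of what such a script might look like, the example below moves an existing two-bone IK target towards a point of interest and blends the constraint weight in and out. It assumes a TwoBoneIKConstraint already configured on the rig; the names IKTargetDriver, reachPoint and blendSpeed are hypothetical and not taken from the project.

```csharp
using UnityEngine;
using UnityEngine.Animations.Rigging;

// Sketch: drive an IK target from code rather than the mouse.
// "reachPoint" stands in for whatever the project decides the limb should
// reach towards (e.g. a rock or the ground under a foot).
public class IKTargetDriver : MonoBehaviour
{
    [SerializeField] private TwoBoneIKConstraint constraint; // the constraint shown in Figure 5
    [SerializeField] private Transform reachPoint;           // hypothetical point of interest
    [SerializeField] private float blendSpeed = 5f;

    private void Update() // runs before the animation/rig evaluation with the default Animator update mode
    {
        bool wantReach = reachPoint != null;

        // Ease the constraint weight in and out so the limb blends between
        // the keyframed pose and the IK pose rather than snapping.
        float goal = wantReach ? 1f : 0f;
        constraint.weight = Mathf.MoveTowards(constraint.weight, goal, blendSpeed * Time.deltaTime);

        if (wantReach)
        {
            // The target transform created by the constraint follows the reach point.
            constraint.data.target.position = reachPoint.position;
            constraint.data.target.rotation = reachPoint.rotation;
        }
    }
}
```

The foot and hand systems described next are, in essence, more elaborate versions of this idea: something in the world decides where the target should be, and code moves the target and constraint weight accordingly.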

The feet were implemented first in order to understand how to implement a PA system with Animation Rigging, as there was more material available in documentation and tutorials for this area. It was clear that writing individual solutions would not be possible, due to time constraints and the level of Animation Rigging knowledge at this point, so a tutorial with existing code was used for this part. The tutorial was for a four-legged dragon (Leif in the Wind Games, 2021); however, the way the code was written allowed two of the legs to be dropped, as the system used arrays and was modular, as seen in Figure 6.

The script works by casting a ray towards the ground; when the ray hits, the hit normal is read and the target of the IK constraint is rotated to align with it, making the foot adapt procedurally to the ground beneath it at that moment.
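The sketch below illustrates that idea (it is not the tutorial's code): a ray is cast down from above the foot target, and the target is repositioned and tilted to follow the surface it hits. The names footTarget, raycastHeight, surfaceOffset and groundLayer are assumptions made for the example.

```csharp
using UnityEngine;

// Sketch: align a foot IK target with the ground hit by a downward raycast.
public class FootGroundAligner : MonoBehaviour
{
    [SerializeField] private Transform footTarget;    // target of the foot's two-bone IK constraint
    [SerializeField] private float raycastHeight = 1f;
    [SerializeField] private float surfaceOffset = 0.05f;
    [SerializeField] private LayerMask groundLayer;

    private void Update()
    {
        Vector3 origin = footTarget.position + Vector3.up * raycastHeight;

        if (Physics.Raycast(origin, Vector3.down, out RaycastHit hit, raycastHeight * 2f, groundLayer))
        {
            // Place the target just above the surface...
            footTarget.position = hit.point + hit.normal * surfaceOffset;

            // ...and rotate it so the sole follows the slope: the foot's up axis
            // is turned onto the hit normal while keeping its current facing.
            footTarget.rotation = Quaternion.FromToRotation(footTarget.up, hit.normal) * footTarget.rotation;
        }
    }
}
```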

For much of the process this worked well for the requirements of the project; however, it did require an understanding of how the code actually worked, as a few issues cropped up that needed solving. For example, different applications and Asset Store assets come with armatures that have different local rotations; in this case, the foot bone had a different local rotation from the one assumed in the original code, so it needed to be changed to behave as expected. This was an easy fix, changing a few lines of code to use the correct quaternion values so the feet matched the terrain properly. Another feature, elaborated on later in this section, was the use of animation curves on imported animations in Unity, as seen in Figure 7. These allowed a walk animation to follow its keyframes as normal while the foot is off the floor, but to use the IK values and match the foot to the ground during the frames where the foot is planted. As previously stated, combining the feet and hands created issues because of these animation curves, which interfered with the hand placement system; this is elaborated on shortly.
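One common way to wire this up in Unity, shown below as a hedged sketch, is to author a float curve on the imported walk clip with the same name as an Animator float parameter and then copy that value onto the foot constraint's weight each frame. The parameter name "LeftFootIKWeight" and the class name are hypothetical examples rather than the project's actual names.

```csharp
using UnityEngine;
using UnityEngine.Animations.Rigging;

// Sketch: an animation curve authored on the walk clip drives an Animator
// float parameter, which is copied onto the foot's IK constraint weight.
// 1 while the foot is planted (follow the ground), 0 mid-swing (follow keyframes).
public class FootIKWeightFromCurve : MonoBehaviour
{
    [SerializeField] private Animator animator;
    [SerializeField] private TwoBoneIKConstraint leftFootConstraint;

    private void Update()
    {
        leftFootConstraint.weight = animator.GetFloat("LeftFootIKWeight"); // hypothetical parameter name
    }
}
```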

For the hand placements, finding good documentation or tutorials was even more difficult. The material used here was a set of project files from IHeartGameDev (IHeartGameDev, 2023), shared through their Patreon page, covering hand placement. This was helpful, as it contained a full system for lifting the hand towards objects on the correct layer and worked well for what was required. Because these project files were a work in progress in preparation for a YouTube tutorial, the system relied heavily on the character controller from those files. In addition, the hands did not consistently reach out to touch some objects, although they worked most of the time. An attempt was made to transfer the code and logic to a simpler character controller, but it turned out that implementing the feet system on this new character controller was the better option. Along with the new character controller, a few features needed to be added, such as grounding and falling mechanics, so the character could walk on slopes and react to physics more realistically. Overall, this part of the PA did not take much time to implement; it was mostly a matter of ironing out the initial issues of learning a new package.

As previously stated, combining the hand and feet systems was complex and needed a solution, due to the use of animation curves and the combination of different PA systems. After more research and brainstorming, the feature used was the avatar mask, as seen in Figure 8, which allows certain bones to be driven by an animation layer while masking out the bones that should not be affected by it. Essentially, the implementation used two animation layers in the controller: one layer controlling the hand PA and its keyframe animation without any animation curves, and the other controlling the leg PA and its keyframe animation, with the animation curves still applied. Getting the hands working consistently took much longer to iron out, and even then the solution was not completely perfect.
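As a small sketch of how the two masked layers might be enabled at runtime: the avatar masks themselves (which bones each layer is allowed to drive) are assigned on the Animator Controller layers in the editor, and the layer names "Hands IK" and "Legs IK" below are hypothetical.

```csharp
using UnityEngine;

// Sketch: both masked layers fully active; each avatar mask limits which bones
// it affects, so the hand layer cannot disturb the curve-driven leg layer and vice versa.
public class AnimationLayerSetup : MonoBehaviour
{
    [SerializeField] private Animator animator;

    private void Start()
    {
        animator.SetLayerWeight(animator.GetLayerIndex("Hands IK"), 1f); // hypothetical layer name
        animator.SetLayerWeight(animator.GetLayerIndex("Legs IK"), 1f);  // hypothetical layer name
    }
}
```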

For the NPA controller, all that was required was simple keyframe animation alongside the character controller. Animations from Mixamo were used in this project, as can be seen from the list in Appendix 7.2.

3.3.3. Art Direction

The art direction for this application was not an essential feature, as the project is based on PA and NPA, but it was considered for the user's enjoyment and attention. The art style could have been realistic or stylised, as the literature review found that art style alone does not determine how immersive a game is; that depends more on how the physics and controller are implemented. A low-poly style was decided upon for the island for a few reasons: it allowed quicker alterations to the level design if needed, and it allowed production time to be focused on the PA and NPA systems, which took up most of the time. The final reason was the availability of suitable assets on the Unity Asset Store. In this project, all the art elements, including the textures and meshes, came from the Unity Asset Store, as listed in Appendix 7.2.

3.3.4. UI and game systems

The UI and game systems for this application were very simple. A flow chart of the whole experience, as seen in Figure 9, was created on Miro to make building the menu and user experience more seamless.

Following this, UI was created for the quest on each island. As previously explained, the quest's aesthetics and substance were not hugely important; its job was to keep the player interested in exploring the island. With this in mind, the UI, as seen in Figure 10, was kept simple and explained to the player what was required of them.

When the player walked up to the different remnants, scripts tallied how many the player had found. This worked with a simple trigger collider that detected when the player passed through and then updated the UI. A game manager was created to collect all of this information and keep it central in one script, as seen in Figure 11. The script that handled this detection was placed on the player GameObject, as seen in Figure 12, and registered when the player had found an objective.
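A minimal sketch of this flow is shown below; the tag name "Remnant", the singleton-style GameManager and the method names are assumptions for illustration, while Figures 11 and 12 show the project's actual scripts.

```csharp
using UnityEngine;

// Sketch: detector placed on the player (as in Figure 12) reacts to remnant triggers.
public class RemnantDetector : MonoBehaviour
{
    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Remnant")) // hypothetical tag on each remnant's trigger collider
        {
            GameManager.Instance.RegisterRemnantFound();
            other.gameObject.SetActive(false); // ensure each remnant is only counted once
        }
    }
}

// Sketch: central tally kept in one manager (as in Figure 11).
public class GameManager : MonoBehaviour
{
    public static GameManager Instance { get; private set; }

    [SerializeField] private int totalRemnants = 5; // hypothetical count per island
    private int found;

    private void Awake() => Instance = this;

    public void RegisterRemnantFound()
    {
        found++;
        Debug.Log($"Remnants found: {found}/{totalRemnants}"); // the project updates the quest UI here
    }
}
```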

3.3.5. Build

The build for this project targets Windows and was uploaded to itch.io for easy distribution. Using itch.io keeps everything in one place, with links on the page to the external cloud storage holding the information sheet.

4. Evaluation + Results

This section will outline the methodology used to conduct the research and present the outcomes of the user testing.

4.1. Methodology

After playing the experience, users were asked to fill out the questionnaire. The questionnaire was split into three parts: the consent form, answers based on the Island 1 playthrough (NPCC), and answers based on Island 2 (PCC). Both islands were given the same questions, except for one additional question in the Island 2 part of the questionnaire. The questionnaire was carried out using Google Forms. Playing the experience and filling out the questionnaire took around 10-15 minutes in total.

Google Forms was used as the online form site, and information sheets, as seen in Appendix 7.3, could be shared via Google Drive, making the whole experience as seamless as possible for the user. The questionnaire also handled the consent form; if consent was not given, the participant could not complete the questionnaire. Another reason for using a questionnaire is that immersion is entirely subjective and can differ from user to user, so it ensures everyone is asked the same questions regardless of their interpretation.

The questions also had to be carefully selected to elicit the opinions required. They were selected from the IEQ (UCLIC-UCL Interaction Centre, 2008), a well-respected questionnaire for measuring immersion, with the questions answered on a 1-7 scale. The list of questions can be seen in Appendix 7.4. Once all users had finished, an average was calculated for each question. As this research is an honours project, no rigorous statistical evaluation of the responses was required.

The selected questions were those that expressed how engrossed or engaged a player was. The total immersion questions were left out, as they would not add any value to the answers for this research. Both island playthroughs had the same questions so the answers could be compared. At the end of the PA part of the questionnaire, an additional question was added to address the secondary research question.

4.2. Results

This section will present the results of the questionnaire. As previously stated, the questions for Islands 1 and 2 were the same except for the last question in Island 2. All the questions had a scale of 1-7.

The experience was tested with 15 participants, and an average for each question was calculated to give a value for comparing the two islands.

4.2.1. How much effort did you put into the game?

On a scale of 1-7, Island 1 received an average value of 4 and Island 2 an average value of 4.5. There is no significant difference between these values, showing that participants put roughly the same amount of effort into each island.

This question was selected because it shows how engaged the player is in the experience, whether through using the controls or completing the objective. With both islands scoring around the middle of the scale, participants were putting in a moderate amount of effort: engaged, but not finding the experience so hard that extra effort was needed.

4.2.2. To what extent did you feel you were interacting with the environment?

On a scale of 1-7, Island 1 had an average value of 3.9 and Island 2 had an average value of 4.2. These values show no significant difference, but they do suggest participants felt they were interacting with the environment slightly more on Island 2. With both islands scoring towards the lower end of the scale, the environment is, more generally, not particularly interactive.

This question was intended to gauge how the PA affected the player's experience of the environment. The slightly higher average on the PCC island suggests that PA affects a player's immersion in an environment to some extent.

4.2.3. To what extent did you enjoy the graphics and the imagery?

On a scale of 1-7, Island 1 had an average value of 4.3 and Island 2 had an average value of 4.5. The very similar averages show that both islands produced very similar levels of enjoyment of the graphics and imagery.

The similarity of these averages is to be expected, as both islands are deliberately aesthetically identical in order to eliminate the visuals as a variable in creating immersion on each island.

4.2.4. To what extent did you enjoy the character animation?

On a scale of 1-7, Island 1 had an average value of 4.5 and Island 2 had an average value of 4.6. This was a somewhat surprising result: the expectation was that PA would make the character animation more enjoyable, as it interacts with the environment more and makes the character look more realistic within it. As it stands, the averages show the character animation on both islands was found roughly equally enjoyable, at a moderate level.

4.2.5. To what extent did you feel as though you were moving through the game according to your own will?

On a scale of 1-7, Island 1 had an average value of 4.9 and Island 2 had an average value of 5.1. These averages are to be expected, as both islands encourage the player to explore freely and find the remnants in whatever order they like.

4.2.6. To what extent did you find the game easy?

On a scale of 1-7, Island 1 had an average value of 6.3 and Island 2 had an average value of 6.5. The similarity of the values was expected, as the game should feel familiar to anyone with prior third-person controller and quest-system experience. Island 2's slightly higher value may mean participants were becoming familiar with the systems. This was a useful question, as it showed how easy participants found the game, which keeps them interested and immersed in the experience.

4.2.7. To what extent did you feel like you were making progress towards the end of the game?

On a scale of 1-7, Island 1 had an average value of 5.8 and Island 2 had an average value of 5.6. Island 2 having the smaller value is quite a surprising result, as by the time participants played the second island they should already know what the objective is. In addition, the UI states how many remnants the participant still needs to find, so the values should be, and are, towards the high end of the scale.

4.2.8. What animations did you notice more?

This question was only asked after Island 2, as it was the only island with PA. Of the 15 participants, 8 felt they noticed the feet matching the terrain more, 4 noticed the hands rising and touching the objects more, and 3 noticed both PA features equally. As stated before, this question was helpful in answering the secondary research question, and the results are surprising: since the feet matching the terrain is much more subtle, the hands were expected to be noticed more.

5. Conclusion + Future work

This section will summarise the findings of this research, its limitations, and how the research can be improved upon.

All the aims and objectives of the project were successfully carried out. Following the literature review, a small experience was created with one island using the NPCC and another using the PCC, to compare how the PCC affects immersion.

An evaluation was carried out in which participants answered a questionnaire about their experience and how they felt after playing the game. The questionnaire answered both research questions to an extent, but the results are not significant, and more research would be needed in this area to reach a more conclusive and clear result.

For research question 1, it was seen that using PA does affect the user's immersion in a virtual environment; however, based on this research, it does not affect it by much. This could be for a few reasons. One reason could be that the scope of an honours project limits data collection. Even though 15 participants is a relatively large number for an honours project, it is hard to see a visible difference with so few testers compared with larger-scale research projects. Another reason could be the implementation of the PA. When developing the PA systems, it was clear there were some issues, particularly how inconsistently the hand placement worked. This could affect how the character on Island 2 interacted with the environment and influence some answers, such as question 8 for Island 2, as perhaps some participants simply did not see it. Finally, relating back to the literature review, this research reinforces the idea that immersion is subjective to the participant, and not everyone notices differences such as added hand placement or feet matching. The comparison did, however, help achieve some consistency in the testing, as the PA was isolated to get a better idea of immersion levels.

To answer the secondary research question, question 8 on Island 2 was used along with personal experience of creating the application. The first point is that a lot of research had to be carried out to find good tutorials and documentation just to get PA working. This suggests there is limited demand or interest amongst Unity users for PCC: Unity forums are usually full of documentation and information on all areas of Unity, but for the Animation Rigging package there is not much. Another point is that even though the code was third party, it was not perfect, worked quite inconsistently, and took a lot of resources (time and effort) to bring to a level that could be tested. Looking at the answers to question 8 on Island 2, participants noticed the feet matching more than the hand placement, yet the feet matching took less time to develop and produced more noticeable results. Comparing the NPCC and PCC workflows, it is clear that traditional keyframe animation is much easier and quicker to develop than a PA system that may work but yield few results from users, who might not even feel more immersed by it.

The implications of this research are hard to pin down. Unity adding the Animation Rigging package only recently, for example, perhaps indicates a growing need for PA in the wider community. Unity users would, however, want a clear reason to use it and to know whether it is actually worthwhile for their projects. This research suggests it does not add a significant amount of immersion, so the reason would need to be something else. On the other hand, PA is quite a mature technique, and if users had wanted it sooner it would already be used more in Unity, and packages would have been created earlier in Unity's lifecycle.

As previously stated, having more participants would take the research further, as the results might then show a more significant difference. To improve the current application, more terrain types and traversal mechanics such as sprinting, vaulting or climbing could be added to enhance the experience. Another potential area to explore is different game genres and how PA affects immersion in those types of games. The literature review included a small section on immersion in first person, and it would be interesting to see whether further research backs up what the literature says about first-person games. Other genres, such as RPGs, action or shooter games, could also be explored to see what effect PA has on immersion there.

6. References

Andersson, J. et al. (no date) 'A Video Game Using Procedural Animation'. Available at: https://publications.lib.chalmers.se/records/fulltext/251043/251043.pdf (Accessed: 26 January 2023).

Brown, E. and Cairns, P. (2004) 'A grounded investigation of game immersion', in CHI '04 Extended Abstracts on Human Factors in Computing Systems CHI04: CHI 2004 Conference on Human Factors in Computing Systems, Vienna Austria: ACM, pp. 1297–1300. Available at: https://doi.org/10.1145/985921.986048.

Brewer, D. et al. (2013) 'AI Postmortems: Assassin's Creed III, XCOM: Enemy Unknown, and Warframe'. Available at: https://www.gdcvault.com/play/1018058/AI-Postmortems-Assassins-Creed.

Cambridge Dictionary (2023). Immersion. Cambridge Dictionary. Available at: https://dictionary.cambridge.org/dictionary/english/immersion (Accessed: 2 February 2023).

Cheng, K. and Cairns, P.A. (2005) 'Behaviour, Realism and Immersion in Games', in CHI '05 Extended Abstracts on Human Factors in Computing Systems. New York, NY, USA: Association for Computing Machinery (CHI EA '05), pp. 1272–1275. Available at: https://doi.org/10.1145/1056808.1056894.

IHeartGameDev (2023) IHeartGameDev YouTube channel. Available at: https://www.youtube.com/c/iHeartGameDev/videos (Accessed: 3 March 2023).

Isikguner, B. (2014) 'Procedural Animation: Towards Studio Solutions for Believability'. Available at: http://irep.ntu.ac.uk/id/eprint/310/1/220423_Baris_Isikguner_September2014_excl.3rdpartymaterial.pdf (Accessed: 15 January 2023).

Jennett, C. et al. (2008) 'Measuring and defining the experience of immersion in games', International Journal of Human-Computer Studies, 66(9), pp. 641–661. Available at: https://doi.org/10.1016/j.ijhcs.2008.04.004.

Johansen, R.S. (2009) 'Automated semi-procedural animation for character locomotion', Aarhus Universitet, Institut for Informations Medievidenskab [Preprint]. Available at: https://runevision.com/thesis/rune_skovbo_johansen_thesis.pdf (Accessed: 26 January 2023).

"Fall Guys." Microsoft Windows, Nintendo Switch, PlayStation 4, PlayStation 5, Xbox One, Xbox Series X/S. Mediatonic, 2020.

"Forza Horizon 5". Microsoft Windows, Xbox One, Xbox Series X/S. Playground Games (2021)

Leif in the Wind Games (2021) '(RE-UPLOAD FULL) Unity Quadruped IK using Animation Rigging'. Available at: https://www.youtube.com/watch?v=iMojuj0K63s&t=1371s (Accessed: 3 March 2023).

Thalmann, N. et al. (2005) 'Believability and Interaction in Virtual Worlds', in Proceedings of the 11th International Multimedia Modelling Conference (MMM 2005), p. 9. Available at: https://doi.org/10.1109/MMMC.2005.24.

Togelius, J. et al. (2013) 'Assessing Believability', in P. Hingston (ed.) Believable Bots. Berlin, Heidelberg: Springer Berlin Heidelberg, pp. 215–230. Available at: https://doi.org/10.1007/978-3-642-32323-2_9.

"Assassin's Creed 3." Playstation 3, Xbox 360, Wii U, Microsoft Windows. Ubisoft, 2012.

"Assassin’s Creed Unity”. Playstation 3, Xbox 360, , Microsoft Windows. Ubisoft, 2014.

UCLIC-UCL Interaction Centre (2008) Immersion. UCLIC-UCL Interaction Centre. Available at: https://uclic.ucl.ac.uk/people/anna-cox/researchgroup/ieq (Accessed: 5 May 2023).

Zucconi, A. (2017) An Introduction to Procedural Animations. Available at: https://www.alanzucconi.com/2017/04/17/procedural-animations/ (Accessed: 27 January 2023).

7. Appendixes

7.1. Ethics Form

7.2. 3rd party assets

Animations

Idle animation- https://www.mixamo.com/#/?page=1&query=idle

Walk Animation (From this package)- https://assetstore.unity.com/packages/essentials/starter-assetsthird-person-character-controller-196526

Unity Asset Store Assets

Banana Man (character mesh and armature) - https://assetstore.unity.com/packages/3d/characters/humanoids/banana-man-196830

Tropical environment assets- https://assetstore.unity.com/packages/3d/environments/natureenvironment-tropical-island-lite-low-poly-3d-by-justcreat-242437

Low poly rocks- https://assetstore.unity.com/packages/3d/environments/low-poly-rock-pack-57874

Other environment assets- https://assetstore.unity.com/packages/3d/environments/little-low-polyworld-lite-srp-urp-119111

7.3. Information Sheet

7.4. Questionnaire questions

All questions were on a scale of 1-7, with 1 being "Not at all" and 7 being "Very much so", except question 1, where 7 was "A lot", and question 5, where 1 was "Extremely hard" and 7 was "Extremely easy".

The additional question on Island 2 was multiple choice between:

• Hands rising and touching the objects in the environment.

• Feet matching with slopes and uneven ground of the environment.

• Both equally

Island 1 Questions

1. How much effort did you put into the game?

2. To what extent did you feel you were interacting with the environment?

3. To what extent did you feel that the game was something fun you were experiencing, rather than a task you were just doing?

4. To what extent did you feel as though you were moving through the game according to your own will?

5. To what extent did you find the game easy?

6. To what extent did you feel like you were making progress towards the end of the game?

Island 2 Questions

1. How much effort did you put into the game?

2. To what extent did you feel you were interacting with the environment?

3. To what extent did you feel that the game was something fun you were experiencing rather than a task you were just doing?

4. To what extent did you feel as though you were moving through the game according to your own will?

5. To what extent did you find the game easy?

6. To what extent did you feel like you were making progress towards the end of the game?
