Thursday 24 August 2023

Insights & Conclusion

- At the start of this project, my main idea was to make a game based on exploration. The player would sing to move about a world, collecting coins to give a goal to the game, but mainly just exploring the environment through singing.

- As I iteratively worked with a musician to playtest the game, I became aware that this gave too much freedom when the player was offered only visual inspiration. The player didn't feel inspired to sing more notes, and there needed to be more interaction with the world around them.

- With my playtester being a gamer as well as a musician, the project became a lot more like a traditional game, adding challenges with harmony copying or platforming. I feel like the project could have gone in 2 different directions: this more gamified experience that I went down or a more artistic experience. With my skillset and interests, I was more inclined to go down the game route, but I can envision a version of Dynamic Landscapes that is more fitting to its name and what I was originally imagining.

- If going down the more artistic route of exploring an environment, I would have needed 3D modelling skills and a more artistic brain. I can imagine a game where the point is just to walk about a large artistic environment, where the objects react to your sound in many different ways to explore, maybe with achievements to unlock (like those random achievements you get on Steam for knocking on a door) by finding new areas or interacting with objects. I feel like with this version a musician could go into it knowing they are there to sing a piece and use the environment as a score to play with, but with the more gamified route I took, I needed the player to end up singing a piece through playing a game, not necessarily actively choosing to sing.

- I feel the project in its current state needs much more work and development before it can be seen as a digital score. Feedback from my musician shows that it doesn't feel like you are singing when playing this game, more that you are just making noise. They mentioned a need for rhythm and more incentive to change the notes. I had attempted to achieve this by adding the backing music beat and followers with harmonies, but there's no need for the player to listen to this as their goal is just to move about using the length of notes.

- I believe there is potential here: it could become a way to help a player learn to harmonise using the followers, giving more interaction when you sing harmoniously with them or match their harmony. Going down the more gamified route, I feel there need to be more materials leading the player to their song. With the game objective of finding the NPCs and completing the challenges, the player is more focused on that than on what they are singing. With the controls in their current state, there is no need to actually sing music, making the result an almost monotone rhythm of long and short notes.

- I would like the result of playing this game to be the player finding a song in the melodies they have sung: for them to be able to remember parts and patterns they can take inspiration from, or to listen back to what they have sung and pick out particular sections they enjoyed. For this to happen, there needs to be more interaction to make them naturally sing more melodically. Maybe the exploration of a world that reacts back at you, with exploration achievements, could still be added alongside the game mechanics, or maybe there needs to be more complex controls using pitch again (but in a less frustrating way than the old turning system) and dynamics added to the game.

- I enjoyed the addition of the extra challenge of quiet in the second level; it added a completely different feel to the game even though the same base mechanics and goals were there. I think this could be developed further by adding new mechanics to each level, slowly building up what the player can do, partly to stop the game from getting boring. (I always prefer games where you keep unlocking abilities, such as a Metroidvania, or where the mechanics keep changing, like in "It Takes Two", as it keeps the play interesting and new challenges arise automatically through the change in mechanics.) The creepy forest level ended up being too much of a twist in direction between levels: adding only 1 new challenge (keeping quiet) completely altered the genre of the game through the mood shift. I would need to add some levels building up to this so it isn't so juxtaposing, and in those I could add more mechanics to build the challenge. Slowly building up more movement and general mechanics also makes the game less overloading for the player. I was worried about making the controls too complex and hard to take in, but this can be fixed by letting the player get used to one added mechanic at a time, like a game adding new abilities and different uses for the controller buttons as the player levels up.

- Lastly, I feel VR was a good choice of medium for this project. Though it did introduce difficulties, such as the microphone dampening, it was a great medium to work with. The fact the game was all around the player made it a lot more immersive, especially once I add more interesting interactions with the world. It makes the player feel enveloped in the environment - helping create a state of total immersion in the game - and less self-conscious about singing, more subconsciously inspired by the materials around them.

Conclusion

Through my iterative design process, using another musician to test my project, the project was able to turn in a different direction from how I'd first imagined it. This process helped stop my tunnel vision, weed out mechanics that just plain didn't work, and add a new perspective and new ideas to the game. I believe there is potential for this project to become a well-rounded digital score, though it needs more interaction and added mechanics to help the player begin to sing rather than just make noise. The project currently is good groundwork for a digital score, and I'd love to develop it further to reach its true potential.

Wednesday 16 August 2023

Third playtest

Here are the points from the third playtest:

  1. The tester enjoyed how the NPCs ended up following you around and singing their notes.
  2. The second level was really scary, very much a horror game. There could be a choice to turn the jumpscares on or off, as it was quite frightening when the monster growled suddenly.
  3. It was hard to know that you were being too loud; there could be a danger meter to warn the player they are about to go over.
  4. The backing music in the second level was buggy: it would either start off playing all the harmonies, or, when restarting the level after being captured by the monster, it was completely silent with no backing.
  5. The tester thought that the major and minor harmonies worked well with each of the worlds, though the 2 worlds contrasted strongly with each other. It felt like 2 different games, and it would be better to have some kind of middle level between them rather than the sudden juxtaposition of the new level.
  6. The game would accidentally pick up notes the player didn't mean, e.g. when talking. In the repeating NPC part, there could be a way to clear your notes if an accident happens, rather than having to sing all 4 notes each time even when you know the attempt will be wrong.
  7. Notes need to be very accurate. I could add some leeway for non-singers to make it easier. (From this comment I can see that my leeway system may need to be easier and the player doesn't get leeway quick enough.)
  8. Could add in timings to work in tandem with the note matching, so the player has to fit the rhythm too.
  9. The transitions are sudden and need a loading screen or hub to connect them.
  10. Starting movement with the long note is fine, but ending with the long note is inaccurate, as you've already moved past where you want to stop. The tester suggested a lower short note to stop and a higher short note to jump. They noted that you don't currently need the jump when not moving forward, so the short note could be used for something else, or a use could be added for jumping when stationary.

Main points I would address in the next stage of development:
  • Fix the level background noise bug.
  • Add a button to clear your notes and restart in minigames.
  • Add more levels to make the transition more smooth and less jolting.
  • Add a warning sign for players to know they are bordering on being too loud.
  • Alter the leeway system or add difficulty levels for players with different skill levels.
  • Add an option to turn off jumpscares.
  • Alter the movement system so the player can stop faster.
  • Add a new mechanic to the movement system to give the player an incentive to change notes.
  • In later levels, introduce rhythm into the NPC minigames.


Tuesday 15 August 2023

Notes on third stage of development

- For this stage, I have a week of development, so I want to make a new mechanic for the game: NPCs for the player to interact with.

- My plan for these interactions is to have multiple NPCs milling about the level, some up on platforming challenges for the player, others just on the ground. When the player approaches one of these NPCs, it will sing 4 notes to them, and then the player must sing these back. It will keep repeating the notes until the player successfully sings the note pattern. After this, so in awe of the player's singing talent, the NPC becomes a follower. For the rest of the level, it will follow the player around and sing its note pattern in a loop. As the player collects more of these followers, the patterns start to harmonise into a series of chords as backing music.

- My aim with these NPCs is to give the player a challenge to work at, a goal to complete the game, and more inspiration to sing from. Currently, it's very easy for the player to get stuck singing one note, so I hope having the build-up of the harmony behind them will encourage them to either pick out notes to sing along to or add harmonies on top, making the player sing more actively rather than just making noise of certain lengths.

- To start, I had to compose the harmony lines each of the NPCs will sing. I did this in MuseScore to easily export each line to an MP3 to use in the game. To build up the harmonies, I used my a cappella arranging experience: I created a melody line, then hummed on top until I found harmonies that I felt sounded right. The only requirement for these chords was that they fit the feel of the level's landscape, a bright, happy, open field, so I chose major, happy chords. I also added a beat to be the starting sound before any NPCs are collected, so the player doesn't start in silence.


- As I had already made some NPCs for the tutorial, I used the same simple character design for these. I started by making them turn towards the player once they had been triggered, and from that, I created a simple sequence of events for the minigame to follow:

  1. Greet the player and ask them to copy the notes.
  2. Sing the notes, displaying the letters in the speech bubble.
  3. Wait for the player to attempt each note, displaying the note they have just sung so they can see how well they are doing.
  4. Check whether the notes match.
  5. If they don't, repeat from step 2.
  6. If they do, display a congratulatory message.
  7. The main NPC disappears and reappears as a follower behind the player.
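The steps above can be sketched as a simple loop (illustrative Python, not the actual Unity/C# implementation; note values are MIDI numbers and the function names are my own):

```python
def run_minigame(target_notes, get_player_attempt):
    """Loop the copy-the-notes minigame until the player matches the pattern."""
    attempts = 0
    while True:
        attempts += 1
        # Steps 2-3: the NPC sings the pattern, then the player sings back,
        # with each recognised note displayed as they go.
        sung = get_player_attempt()
        # Step 4: check the notes match.
        if sung == target_notes:
            # Steps 6-7: congratulate, then turn the NPC into a follower.
            return attempts
        # Step 5: no match, so repeat from step 2.

# Example: the player gets the pattern right on their second attempt.
tries = iter([[60, 67, 69, 72], [60, 67, 69, 60]])
attempts = run_minigame([60, 67, 69, 60], lambda: next(tries))
assert attempts == 2
```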

- I set up a system so that when the player enters the minigame, the backing music fades out. I did debate matching the notes an NPC sings to the timing of the backing, but decided it may be too hard for a player to sing the harmony along with the others. Also, it was already a challenge getting all the tracks of the backing harmony to play in time with each other, so it would be hard to get this track playing in sync with the backing each time.

- Setting up the note checking was easy enough, as I could use the note recognition system I'd created for the old turning system. Once I'd created the minigame sequence and tried it out, I found myself getting very frustrated, as I would often be one semitone out on a single note and had to keep reattempting until I got the pattern exactly right. So, I decided to add a leeway system. When a player has leeway, it is automatically used to let them get away with a single note being a semitone out: if the pattern was C G A C and the player sang C F# A C#, with 2 leeway they would still successfully complete the minigame. If a note is corrected by leeway, the display shows the corrected note rather than the sung one, so it isn't confusing when the player is told they've got the pattern correct but the note display doesn't match. The amount of leeway a player has is calculated as (number of attempts - 1) / 2, rounded down, so they get their first leeway on their 3rd attempt. There's space here to add difficulty levels in the settings to alter how quickly the player gains leeway.
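The leeway check can be sketched like this (illustrative Python, not the Unity code; notes are MIDI numbers, and the example mirrors the C G A C case above):

```python
def matches_with_leeway(target, sung, leeway):
    """Check a sung pattern against the target, spending one leeway for
    each note that is exactly a semitone out. Notes are MIDI numbers."""
    for t, s in zip(target, sung):
        if t == s:
            continue                # exact match costs nothing
        if abs(t - s) == 1 and leeway > 0:
            leeway -= 1             # correct the near-miss, spend a leeway
        else:
            return False            # too far out, or no leeway left
    return True

# C G A C sung as C F# A C#: F# and C# are each one semitone out.
target = [60, 67, 69, 72]   # C4 G4 A4 C5
sung = [60, 66, 69, 73]     # C4 F#4 A4 C#5
assert matches_with_leeway(target, sung, leeway=2)       # passes with 2 leeway
assert not matches_with_leeway(target, sung, leeway=1)   # fails with only 1
```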

- When the NPCs follow the player, they rotate to stay behind the player as they move forward, so when the player stands still, they can turn around and see their follower collection behind them.

- Next, I created a new environment for the sunny forest level. Here, I needed some obstacles to let the player make use of the jump mechanic, so I added things like a log to get over a river to reach a new section, and some floating platforms to reach an NPC at a greater height. I made sure to add 5 clearly different sections, one for each NPC, for the player to explore. I built this environment from models available for free at OpenGameArt.org. This is a prototype to explore the mechanics of the game, so it doesn't look the best, as I am just assembling a flat pack of models, but it will do for now.

- With the floating platforms, I had to keep testing and altering their heights and lengths to ensure there was enough space for the player to stop and they were the right height to jump up to. Because a long note has to be recognised before the player stops, the platforms have to be long enough to cover the stopping distance.
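The sizing constraint can be expressed directly (a toy Python calculation; the speed and threshold values are illustrative, not the project's actual settings):

```python
# The player keeps moving until their note has been held long enough to
# count as "long", so each platform must cover that stopping distance.
def stopping_distance(move_speed, long_note_threshold):
    """Distance travelled while the stop note is still being recognised."""
    return move_speed * long_note_threshold

# e.g. moving at 2 m/s with a 0.5 s long-note threshold:
assert stopping_distance(2.0, 0.5) == 1.0  # at least 1 m of spare platform
```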



- I also want to test out an extra mechanic for the game, using dynamics and volume. I had the thought of incorporating a Slender-like level in a forest where you must save the NPCs from a monster. This would give the next level a completely new feel and keep the game interesting for the player, adding the challenge of keeping quiet so as to not alert the monster.

- For this level, the harmonies I made were minor and creepy, starting with a slow minor descending pattern instead of the upbeat feel of the last level (seen in the bottom bass line of the music).


- I just dotted the NPCs around the map (with spotlights so the player can see where they are), adding the new note patterns and audio to them for this level. I didn't want to add any platforms or jumping obstacles to this level for the moment, as it's mainly to test whether a change in mood would work and to see this new mechanic in action.

- I gave the player a torch in their hand to let them see around them, letting the area be dark and feel more creepy and ominous. I also made it turn red when the player is too loud. The player has 3 strikes until it's game over: if they are too loud, the torch glows red and a monster growl plays in the headset.
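The three-strikes check can be sketched like this (a minimal Python sketch; the threshold value and function names are my own, not from the project):

```python
def loudness_check(volume, threshold, strikes):
    """One loudness check: over the threshold means the torch glows red,
    a growl plays, and the player loses a strike. Returns the updated
    (torch_red, strikes_left, game_over) state."""
    if volume > threshold:
        strikes -= 1
        return True, strikes, strikes <= 0
    return False, strikes, False

red, left, over = loudness_check(0.9, threshold=0.6, strikes=3)
assert (red, left, over) == (True, 2, False)   # first strike, game continues
_, _, over = loudness_check(0.9, threshold=0.6, strikes=1)
assert over                                    # third strike: game over
```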



- I've also added a starting hut where the player will be told they are safe so they can test out how loud they can be. Here, the torch will glow red if they are too loud, but there is no monster growl, and they won't lose a life.

- I would like to add a warning colour of the torch to warn the player that they are bordering on being too loud and a life indication so they can see how many strikes they have left, but I currently don't have time before the next playtest, so that will come in later development if this mechanic works out.

- Lastly, I added an extra NPC that appears once the player has collected all the followers in level one, telling them about the new quiet rule of the next level and building the story of going to rescue the NPCs. I also added a finishing state, where all the saved NPCs gather around the player to say thank you and sing the major chord pattern from the first level.


- After a bit of time has passed, the game transitions back to the start-up screen where the player can choose to play the game again.

Thursday 3 August 2023

Second playtest

Here are the comments from the second playtest:

  1. It's now easier to move around, and the new controls are clearer.
  2. The inclusion of sound in the game made it feel less like the player was on the spot making all the noise.
  3. The beat and bass worked well as background music, as it gave inspiration for notes to sing but wasn't too melodic to take away from the musician's choice.
  4. The tester always found themselves singing the same note and needed a new incentive to change notes.
  5. They wanted more use of the jump mechanic - which makes sense, because I only coded the mechanic; I didn't change the level before the playtest.
  6. If there were more NPCs, they would like to interact with them: maybe they could be milling about, and when the player sings something at them, they react, maybe singing the note back.
  7. Have both platforming (to use the movement controls more) and some kind of NPC minigame puzzle you travel to and interact with.
  8. Could have NPCs that sing together and make a big ensemble piece - a My Singing Monsters or Wii Fit Plus parade-game vibe.
  9. It's missing the need to sing different notes. Could have a calibration at the start of the game to find the middle of the player's voice and use low and high notes to control something.
  10. The coins are good now as a base to learn the movement mechanic, but they can easily go and be replaced by something else.
  11. Volume wasn't very reactive; it felt random and didn't work well.
  12. Could have different combinations of melodies to use different powers, like in Zelda's Ocarina of Time; learning new melodies could change what the short note does, for example.


Main points I want to address in the next stage of development:
  • Add a reason to change notes.
  • Add NPCs with some kind of mini-game interaction, maybe including collecting them so they start to follow the player while singing.
  • Add a platforming aspect to the game to give use to the jump mechanic.
  • Either refine or remove the volume = speed mechanic.

Notes during second phase of development

- I have 2 days for this next stage of development, so I am focusing on tackling the big issue with the movement.

- I started by taking out the turning-with-2-short-notes mechanic, and plan to change to a simpler system: you move in whatever direction you were looking when you started moving. I decided to do this rather than continuously moving where you are looking because it encourages more singing: you have to stop and start again in a new direction to navigate somewhere, rather than singing once and drifting to the place you want to be.

- I felt the best use of the short note would be to jump. If this worked, I would then be able to add obstacles for the player to jump over rather than just moving forward. Because there is now a new form of movement, loudness also affects the jumping, letting the player jump higher with a louder note.
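The loudness-to-jump idea can be sketched as a simple mapping (illustrative Python; the base/scale numbers are mine, not tuned values from the project):

```python
def jump_velocity(volume, base=3.0, scale=4.0):
    """Map the loudness of the short note to the initial jump velocity,
    so louder notes jump higher. Volume is normalised to 0..1."""
    v = min(max(volume, 0.0), 1.0)   # clamp to the expected range
    return base + scale * v

assert jump_velocity(0.0) == 3.0   # quietest note: minimum jump
assert jump_velocity(0.5) == 5.0
assert jump_velocity(1.0) == 7.0   # loudest note: highest jump
```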

- As the tester mentioned it was hard to understand what was meant by short and long notes, I decided to add a tutorial level. Here I have NPCs that demonstrate the movement controls to the player (with instruction boards): one moving back and forth with long notes and one jumping with short notes. To continue to the game, the player needs to navigate to a coin surrounded by barriers they must successfully jump over to pick it up and proceed past the tutorial. This gives the player space to practise the movement in a non-level environment, and it also makes sure the player understands how to move before proceeding.


- Next, I needed to address the request for a mute button. I had never realised the need for this, as I was testing the game by myself and had no need to make any extra sound, but when it was mentioned I realised it would be preferable to have one. I decided to make it the select button on the left-hand controller: when the player holds it down, they are muted, and their hand turns red in the game to signal this. I've added instructions on how to use it to the tutorial, describing it as holding your left hand in a fist instead of naming the button, as I wouldn't expect a musician to know the button names.

- Just in case the singer needs a recap of how to move, and to give the option to add settings in future development, I added an instructions panel that the player can toggle on or off with the right select button, also described in the tutorial.

- With the changes made to the movement, there is no incentive for the player to sing different notes. The tester also mentioned it was less encouraging to start singing with no sound coming out of the game, so to help with both of these problems, I added some background music to the main level: just a simple beat and looping bass, so the singer can improvise over it and hopefully be encouraged to sing more notes to pair with the backing.

Tuesday 1 August 2023

First playtest

Here are the main notes from the first playtest:

  1. The main finding was that the turning wasn't user-friendly. I believe I was able to use it myself because I knew what the program was doing behind the scenes, so I found it easy to navigate. When the tester was playing, they just found the mechanic really infuriating, as it was hard to be accurate with it.
  2. The tester mentioned that with the VR headset it didn't feel natural to control rotation with your voice; it would make more sense to just move in the direction you are facing and use the short note for something else, maybe a jumping or shooting mechanic.
  3. They asked for some kind of mute button to be created, as they were wanting to talk whilst playing, but it would pick it up as singing.
  4. It was hard for the tester to understand the difference between a short and a long note, and they ended up singing a long note when wanting to turn, as they were unsure how long "long" was.
  5. As the 2 short notes were hard to sing when they weren't intervals people are used to singing, the tester said they would prefer it if specific intervals did something, for example a perfect 5th, so the notes felt more natural to sing.
  6. As there is currently no sound in the game, the tester said they felt a bit of a spotlight on themselves and would like the game to be playing sound, for example some NPCs wandering about making noise that the player could then interact with somehow.

Main points I want to address in the next stage:
  • Change the movement system to get rid of the complicated semitone rotation.
  • Add a mute button.
  • Add a tutorial.
  • Add sound to the game.

Notes during first development stage

- As I haven't used VR before, I am working on a non-VR version first. The most important aspect is the sound analysis, so I decided to get that working first, then convert to VR afterwards.

- After exploring what the VR headset can do and how applications use the technology, I have worries that the movement might make the user a bit motion sick, as they will be turning and moving without moving their body. I personally have never gotten motion sick from VR, so I cannot test it out myself (or if I do get motion sick, I'll know it's bad, haha).

- I've started off testing the simple controls using keyboard inputs, to work out if this way of walking around is usable before implementing the note recognition.

- I found this useful thread about detecting pitch from microphone input https://forum.unity.com/threads/detecting-musical-notes-from-vocal-input.316698/


- I edited code from post #10 to analyse sound from an audio source instead of a microphone; the results were very jumpy.

Note: I am using MuseScore to create test cases of sound.


- Looking at this code here: https://github.com/tbriley/PitchDetector/tree/master

- To create the forward motion, I didn't need pitch accuracy, so I was able to implement this easily from my previous notes, checking the length of a note and moving once it had exceeded 0.5 seconds.
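The length check described above amounts to the following (a minimal Python sketch using the 0.5-second threshold from the notes; names are mine):

```python
LONG_NOTE_THRESHOLD = 0.5  # seconds

def classify_note(duration):
    """Classify a sung note by its length alone; forward motion only
    needs to know whether the note has been held past the threshold."""
    return "long" if duration >= LONG_NOTE_THRESHOLD else "short"

assert classify_note(0.8) == "long"    # held note: start moving forward
assert classify_note(0.2) == "short"
```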


- Next, I decided to try implementing the rotation. Even though the frequency detection wasn't very accurate yet, if I had the mechanics in place, then after improving the frequency detector I could easily slot it in.

- I found this useful website showing how to calculate the difference in semitones of 2 notes: https://www.omnicalculator.com/other/semitone

The equation is:

        n = 12 × log₂(f₂ / f₁)
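A quick worked example of this formula (plain Python, nothing project-specific):

```python
import math

def semitone_difference(f1, f2):
    """n = 12 * log2(f2 / f1): signed interval in semitones from f1 to f2."""
    return 12 * math.log2(f2 / f1)

# A4 (440 Hz) up to A5 (880 Hz) is an octave, i.e. 12 semitones:
assert round(semitone_difference(440.0, 880.0)) == 12
# A4 up to E5 (~659.26 Hz) is a perfect fifth, i.e. 7 semitones:
assert round(semitone_difference(440.0, 659.26)) == 7
```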

- Though I was able to get the frequency data to affect the rotation, picking up when a short note had occurred and when it should listen for the next one to form an interval, the frequencies were so inconsistent that the camera seemed to turn almost randomly, as the data varied so much for just one note.

- I went to look for other solutions for tracking the pitch and found a plug-in called "Pitch Detector" created for Unity (found at: https://github.com/tbriley/PitchDetector). This code converts the frequencies to MIDI notes, so I won't need to calculate the semitone difference from frequencies.
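Assuming the plug-in follows the standard MIDI mapping (A4 = 440 Hz = MIDI note 69), the conversion looks like this, and intervals then reduce to a subtraction:

```python
import math

def frequency_to_midi(freq, a4=440.0):
    """Standard frequency-to-MIDI mapping (A4 = 440 Hz = MIDI note 69).
    With notes as MIDI numbers, an interval in semitones is a subtraction."""
    return round(69 + 12 * math.log2(freq / a4))

assert frequency_to_midi(440.0) == 69                             # A4
assert frequency_to_midi(261.63) == 60                            # middle C
assert frequency_to_midi(880.0) - frequency_to_midi(440.0) == 12  # an octave
```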

- On first look, this code seems a lot more accurate and should enable better movement.

- I was able to add my vocal-based movement to the Unity project, altering the code so that it looked at the MIDI notes rather than the frequencies. I also added breadcrumbs for the camera to drop as it moved, so you could see the path it had taken.

- This plug-in worked a lot better, and I was able to sing music that accurately created paths, like this square:

Music Notation:

Path Taken:

- The note detection isn't perfect (sometimes it is 1 or 2 semitones away from the note it should be), but overall it was much more accurate than the last implementation.

- Though this plug-in helped me work out the movement design, I can't use it for the main project as it doesn't transfer to Unity 2022 without bugs.

- After searching around, I was able to find a different pitch estimator here: https://github.com/nakakq/AudioPitchEstimatorForUnity

- This new code worked really well in Unity 2022, and I was able to start on the VR version of the project.


- I used the Unity VR Core initial setup to help me start the project, seeing as I'd never coded in VR before.

- I started by getting the movement working in VR with recorded sound as my input, altering the system I'd created for the non-VR version to work in VR and with the new pitch estimator.

- From this, I started using the microphone as input and found the controls weren't working as expected:

        - only half the short notes I sang were recognised

        - when a long note was sung, it would quickly stop being recognised

 

- To work out what was happening in Unity, I created a makeshift debug panel that you could see in the Oculus headset to track the variables and see where it was going wrong.

- Firstly, I realised the issue with the short notes was that the system was still tracking the note value during silence, so when it processed the note for rotation, it read the note value as NaN rather than the note that was last sung. I was able to fix this by only storing the note when the volume was above the noise threshold.
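The fix amounts to gating the stored note on volume (illustrative Python; the threshold value and names are mine, not the project's):

```python
import math

NOISE_THRESHOLD = 0.05  # illustrative volume floor

def update_last_note(volume, detected_note, last_note):
    """Only store the detected note while the volume is above the noise
    threshold; during silence the detector reports NaN, which would
    otherwise overwrite the note the player actually sang."""
    if volume > NOISE_THRESHOLD and not math.isnan(detected_note):
        return detected_note
    return last_note  # keep the last real note through the silence

last = float("nan")
last = update_last_note(0.4, 67.0, last)           # player sings a G
last = update_last_note(0.0, float("nan"), last)   # silence: NaN is ignored
assert last == 67.0
```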

 

- Next, the long note issue seemed to be a problem with the Oculus: it employs a noise-dampening process to remove background noise, which is being applied here; once a note goes on for longer than 1.5 seconds, it is blocked out.

- Looking online, I couldn't find any solutions to this, only other people having the same issue. Unless I can find a workaround, I need to work out a new notes-to-movement method to move the player.

- I came up with some alternate movement methods:

  1. Moving forward for a couple of seconds at regular intervals, with the player just controlling the turning
  2. To move, the player sings a long note and starts advancing forward; they continue moving until they sing another long note
  3. The player sings a long note and moves forward for a set number of seconds (volume lets the player control how far they go by setting how fast they move)

- I didn't feel option 1 would work for this project: it removes the need for the player to sing to move, which limits the amount of singing the player will be doing. I want the controls to encourage the player to sing as much as possible, so I am not going to use it.

- I don't want to use option 3 because the player would take a while to get used to the controls (learning how far they will go at a given loudness), and it could easily get repetitive and annoying to have to keep singing the same loud note to cover a long distance.

- I've chosen to do option 2, as I feel it will be the most intuitive for the player while still getting them singing. Though the original method of movement would work best, I need to work within the limitations I have.

- It was quite simple to get this method working with the code I'd previously written, though I am worried the long note at the end won't be responsive enough: as the player has to sing a long note to stop, they only stop once the note is recognised as long, not when they start singing it. I am hoping the player will learn to sing preemptively to stop, but I shall see in the playtest.
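The chosen start/stop scheme is effectively a two-state toggle (illustrative Python sketch, not the Unity code):

```python
def step(moving, note_kind):
    """Movement as a two-state toggle: a long note starts the player
    advancing, and the next long note stops them. Short notes are left
    free for other mechanics such as jumping."""
    if note_kind == "long":
        return not moving
    return moving

state = False
state = step(state, "long")    # first long note: start moving
assert state is True
state = step(state, "short")   # short note: keep moving (e.g. a jump)
assert state is True
state = step(state, "long")    # second long note: stop
assert state is False
```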

 

- Next, I've created a simple world for the player to move about in. I used a free model package from OpenGameArt.org and assembled it into a simple scene: there's a hill to navigate towards and a bridge over a pond in a forest to manoeuvre to. I've also added simple coins the user can collect, to get them moving around the space.





- Now the game is ready for a first playtest.

Tuesday 4 July 2023

Alpha design

 

Description:

For the first version of Dynamic Landscapes, I plan to simply have the musician be able to move about the virtual space with their voice. Controls consist of:
  • Long note: move forward for the duration of the note
  • 2 short notes: rotate the direction of the player dependent on the width of the interval
  • Dynamics: louder notes will result in faster movements, quieter making them slower



Code Structure:

Scripts:
  • Player_Move
    • Set_Speed(int)
      • set up the speed 
    • Move_Forward()
    • Rotate(int)
  • Voice_Input
    • Read_Volume()
      • Read the volume of the input and alter the player speed in Player_Move
    • Interpret_Input()
      • Read input length and pitch to be able to move player
    • Move_Player()
      • Use Interpreted data to move player accordingly
  • Game_Manager
    • Start_Game()
      • reset all variables and player position to start game
    • Stop_Game()
      • exit playing state

Objectives for coding:

  1. Set up a simple landscape using Unity shapes
  2. Move the character around the space using the keyboard
  3. Read microphone input
  4. Recognize long notes and short notes
  5. Move forward with long notes
  6. Link speed to dynamics
  7. Rotate with short note
  8. Change degree of rotation based on interval
  9. Create a more interesting landscape

Websites that seem helpful for coding:








Monday 3 July 2023

Consolidating Project Vision

Project Description

Artistic Vision:

Dynamic Landscapes will be a VR project. The musician will use a VR headset to view a digital landscape, e.g. a sunny field or a dingy forest. Here they will use their voice to move around the space: long notes will move them forward, and pitch intervals will rotate their perspective left or right by a different number of degrees depending on the interval size. The musician will use their voice to navigate the space, exploring whether different settings or game goals create different genres of music.

I will create this using Unity3D with C# scripts analysing the spectrum data to get information about the notes sung.
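A first cut at "analysing the spectrum data" could take the loudest FFT bin from `AudioSource.GetSpectrumData` and convert it to hertz. This is a deliberately crude estimator (real voices may need autocorrelation or peak interpolation), shown only to make the idea concrete; all names here are my own:

```csharp
using UnityEngine;

// Estimate the dominant frequency of an AudioSource from its spectrum.
public class PitchEstimate : MonoBehaviour
{
    const int FftSize = 1024; // must be a power of two for GetSpectrumData
    readonly float[] spectrum = new float[FftSize];

    public float DominantFrequency(AudioSource source)
    {
        source.GetSpectrumData(spectrum, 0, FFTWindow.BlackmanHarris);

        // Find the loudest bin.
        int peak = 0;
        for (int i = 1; i < FftSize; i++)
            if (spectrum[i] > spectrum[peak]) peak = i;

        // Each bin spans (sampleRate / 2) / FftSize hertz.
        float binWidth = AudioSettings.outputSampleRate / 2f / FftSize;
        return peak * binWidth;
    }
}
```

Two successive estimates from this could then feed the interval-width calculation that drives rotation.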

A possible way to perform the score would be in a theatre/performance space, having the musician wearing the VR headset on the stage and a large screen/projector displaying their view of the virtual landscape in real-time behind them.

Transforming Creativity:

This project takes a virtual landscape and uses it as a musical score: the landscape conveys the feeling of the music it wants to produce, while each performance stays different, as the musician chooses which way to go and what sounds to make.

A composer can create a new landscape to communicate a new musical idea, an atmosphere for the piece, while the musician improvises and plays with that theme. I think this can help transform how music is seen as a creative act: rather than the composer setting out the notes and the musician choosing how to emote with them, the musician chooses the notes while the composer sets the atmosphere, which can create new, unique pieces.

Inclusivity:

This score allows anyone to create music, without the barrier of needing a musical education to understand a traditional Western score. Anyone can compose by drawing out a landscape and having it translated into a 3D model of their world for people to perform. And because the input is simply vocal sound, anyone able to make vocal noise can perform the piece without vocal training.

Drawbacks to this project include that it is visually based. Someone with limited/no sight could not perform one of these scores, though they could compose by describing a scene to someone/a generative AI, and hear another musician perform the piece.

Technological novelty and approach:

I will be using Unity to create a VR app, either creating 3D models myself using Blender or obtaining assets online. I will then write scripts to detail object behaviour using C#, taking in audio input and analysing it using the spectrum data to work out the characteristics of the notes and translating them into movement.

As a stretch goal, to give the composer more agency over the piece, I will create goals/mini-games the musician can choose to complete, guiding their exploration and subsequent music creation in a certain direction.

Aims & Objectives

Aim:

- To seek new knowledge in how a VR game can be used by a composer and musician to create a digital music score

Objectives:

  • Create a VR project where the musician can move about the space using their voice.
    • The program will display a virtual world
      • The program will display multiple worlds, or "movements"
      • The player will be moved to a new world after a goal is completed or a certain amount of time has elapsed
    • The program will pick up vocal input.
      • The program will recognize a note’s duration, volume and pitch.
    • The program will move the player based on the vocal input.
      • The program will move the character forward for the duration of long notes.
      • The program will rotate the player based on the interval width of 2 short notes.
      • The program will alter the speed based on the volume of the input.        
  • The player may choose to complete objectives inside the landscapes
  • The score will be iteratively tested on an external musician, collecting feedback to improve
  • A new phase will be designed with the musician’s feedback
  • The score will be performed at the end of development
  • The composer's experience will be recorded
  • The musician's experience will be recorded


Methodology

I will use an agile methodology during this project, continuously bringing in a musician to use the technology, evaluating their experience, and taking feedback to adapt and alter the project. This will bring new points of view to the project, stopping tunnel vision from setting in if I only stick to my own ideas. It will also give the app a user focus from the beginning, not just as an afterthought, making sure it's understandable and letting me analyse how much digital knowledge is needed to interpret the score.

The structure of each phase (alpha, beta, final) will be as follows:

1. Design

a. The design stage starts with (if applicable) looking at feedback from the last demo with the musician. I will analyse what went well and what needs to be improved, translating these into new features or upgrades for the next implementation. These may include ideas to make the score more interesting or interactive, and ways to improve usability so that controls are more understandable or intuitive. In the first phase, rather than deriving features from a previous demo, I will come up with the original features I see fit for the first design.

b. Next, these new features/upgrades will be realised through diagrams and plans. I will plan out how the features will be integrated into the previous design, adding to or redesigning the last iteration's diagrams. After this stage, I will have a full idea of what I want the project to look like at the next demo, the end of this iteration.

c. After that comes the research section. Here I will look into the new techniques I plan to use and find useful resources to help me. I may also look for assets if the new features include editing/creating new digital landscapes.

d. Lastly, I will write up a step-by-step plan of the order in which I will implement the new features.

2. Code

a. The coding stage involves coding the project (surprise, surprise), following each of the steps decided on at the end of the design phase.

b. If all steps are completed, I will start working on stretch goals before the next demo.

3. Demo

a. In the demo stage, the musician is brought in to perform the score. They will use the most recent working version of the project; if new elements are mid-implementation, it will be reverted to the previous commit of working code.

b. The musician will be filmed playing the score and interviewed afterwards to gather feedback for the next design.

c. The interviews will be semi-structured, more of a conversation between the musician and me.

d. They will also be asked to take part in a Stimulated Recall Method (SRM). The SRM involves replaying the piece sung by the musician during the demo while the musician narrates their stream of consciousness throughout the process, recounting the decisions they made and any meaningful experiences they had.

e. This concludes the demo stage, which cycles back to design.

