- As I haven't used VR before, I'm working on a non-VR version to start with; the most important aspect is the sound analysis, so I decided to get that working first and convert to VR afterwards.
- After exploring what the VR headset can do and how applications use the technology, I'm worried the movement might make the user a bit motion sick, as they will be turning and moving without moving their body. I personally have never gotten motion sick from VR, so I can't test it out myself (or if I do get motion sick, I'll know it's bad haha)
- I've started off testing the simple controls using keyboard inputs, to work out whether this way of walking around is usable before implementing the note recognition
- I found this useful thread about detecting pitch from microphone input https://forum.unity.com/threads/detecting-musical-notes-from-vocal-input.316698/
- I edited the code from post #10 to analyze sound from an audio source instead of a microphone; the results were very jumpy
note: I am using MuseScore to create test audio clips
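(For context, the general approach of pulling a dominant frequency out of an AudioSource's spectrum in Unity looks roughly like the sketch below; this is a simplified illustration, not the actual forum code.)

```csharp
using UnityEngine;

// Very rough dominant-frequency estimate from an AudioSource's spectrum.
// Picking the single loudest bin like this is exactly why the results jump around.
public class SpectrumPitch : MonoBehaviour
{
    public AudioSource audioSource;
    private readonly float[] spectrum = new float[1024]; // must be a power of two

    void Update()
    {
        audioSource.GetSpectrumData(spectrum, 0, FFTWindow.BlackmanHarris);

        // find the loudest bin and convert its index to a frequency
        int maxIndex = 0;
        for (int i = 1; i < spectrum.Length; i++)
        {
            if (spectrum[i] > spectrum[maxIndex]) maxIndex = i;
        }

        float binWidth = AudioSettings.outputSampleRate / 2f / spectrum.Length;
        float frequency = maxIndex * binWidth;
        Debug.Log($"Dominant frequency: {frequency} Hz");
    }
}
```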
semitone difference between two frequencies: n = 12 × log₂(f₂ / f₁)
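Applying that formula in code is a one-liner; a minimal sketch:

```csharp
using UnityEngine;

public static class IntervalMath
{
    // Number of semitones between two frequencies: n = 12 * log2(f2 / f1).
    // A positive result means f2 is higher than f1.
    public static float SemitonesBetween(float f1, float f2)
    {
        return 12f * Mathf.Log(f2 / f1, 2f);
    }
}
```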
- though I was able to get the frequency data coming in to affect the rotation, picking up when it was a short note and should be listening for the next one to form an interval, the frequencies were so inconsistent that the camera seemed to be turning almost randomly, as the data was so varied for just one note
- I went to look for other solutions for tracking the pitch and found a plug-in called "pitch detector" created for Unity (found at: https://github.com/tbriley/PitchDetector). This code converts the frequencies to MIDI notes, so I won't need to calculate the semitone difference using frequencies myself
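(The frequency-to-MIDI mapping is standard: MIDI note 69 is A4 at 440 Hz, with 12 notes per octave. A sketch of the conversion, not the plug-in's actual code:)

```csharp
using UnityEngine;

public static class MidiMath
{
    // Standard conversion: MIDI note 69 = A4 = 440 Hz, 12 notes per octave.
    public static float FrequencyToMidi(float frequency)
    {
        return 69f + 12f * Mathf.Log(frequency / 440f, 2f);
    }
}
```

With MIDI numbers, the interval between two notes is just a subtraction.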
- from first looks, this code seems a lot more accurate and should be able to create better movement
- I was able to add it to the Unity project to implement my vocal-based movements, altering the code so that it looked at the MIDI notes rather than the frequencies. I also added breadcrumbs for the camera to drop as it moved, so you could see the path it had taken.
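The breadcrumbs are nothing fancy; roughly something like this, where names like breadcrumbPrefab and dropInterval are illustrative rather than the exact project code:

```csharp
using UnityEngine;

// Drops a marker prefab at a fixed interval so the path taken can be seen afterwards.
public class BreadcrumbDropper : MonoBehaviour
{
    public GameObject breadcrumbPrefab;  // small sphere or similar
    public float dropInterval = 0.5f;    // seconds between drops

    private float timer;

    void Update()
    {
        timer += Time.deltaTime;
        if (timer >= dropInterval)
        {
            timer = 0f;
            Instantiate(breadcrumbPrefab, transform.position, Quaternion.identity);
        }
    }
}
```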
- this plug-in worked a lot better, and I was able to create music that accurately traced paths, like this square
- the note detection isn't perfect; sometimes it's 1 or 2 semitones away from the note it should be, but overall it's much more accurate than the last implementation
- Though this plug-in helped me work out the movement design, I can't use it for the main project as it doesn't transfer to Unity 2022 without bugs
- After searching around, I was able to find a different pitch estimator here: https://github.com/nakakq/AudioPitchEstimatorForUnity
- this new code worked really well in Unity 2022, and I was able to start on the VR version of the project
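From memory of the repo's README, using it is roughly like the sketch below; the exact component and method names should be checked against the repo itself:

```csharp
using UnityEngine;

// Rough usage sketch of AudioPitchEstimatorForUnity (names assumed from its README).
public class PitchReader : MonoBehaviour
{
    public AudioSource audioSource;         // source being analysed
    public AudioPitchEstimator estimator;   // component from the plug-in

    void Update()
    {
        // Estimate() returns the fundamental frequency in Hz,
        // or NaN when no clear pitch is present (e.g. silence).
        float frequency = estimator.Estimate(audioSource);
        if (!float.IsNaN(frequency))
        {
            Debug.Log($"Detected pitch: {frequency} Hz");
        }
    }
}
```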
- I used the Unity VR Core initial set-up to help me start the project, seeing as I'd never coded in VR before
- I started by getting the movement working in VR with recorded sound as my input, altering the system I'd created for the non-VR version to work in VR with the new pitch estimator
- From this I started using the microphone as the input, and found the controls weren't working as expected
- only half the short notes I sang were recognised
- when a long note was sung, it would quickly stop being recognised
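Routing the microphone into an AudioSource is the standard Unity pattern; a minimal sketch (not the exact project code):

```csharp
using UnityEngine;

// Routes the default microphone into an AudioSource so the pitch
// estimator can analyse live input instead of a recorded clip.
public class MicInput : MonoBehaviour
{
    public AudioSource audioSource;

    void Start()
    {
        // null = default microphone device; 10 s looping buffer at 44.1 kHz
        audioSource.clip = Microphone.Start(null, true, 10, 44100);
        audioSource.loop = true;

        // wait until the microphone has actually started recording
        // (this blocks briefly; fine for a sketch)
        while (Microphone.GetPosition(null) <= 0) { }

        audioSource.Play();
    }
}
```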
- to work out what was happening in Unity, I created a makeshift debug panel that you could see in the Oculus headset, to track the variables and see where it was going wrong
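A makeshift panel like that can be as simple as a Text element on a world-space canvas, updated every frame; a rough sketch (the estimator usage is assumed as above):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Makeshift debug panel: a world-space canvas Text element showing
// the current pitch value so it can be read inside the headset.
public class DebugPanel : MonoBehaviour
{
    public Text debugText;                  // Text on a world-space canvas
    public AudioPitchEstimator estimator;   // plug-in component (see earlier sketch)
    public AudioSource audioSource;

    void Update()
    {
        float frequency = estimator.Estimate(audioSource);
        debugText.text = $"freq: {frequency:F1} Hz";
    }
}
```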
- firstly, I realised the issue with the short notes was that it was still tracking the note value while there was silence, so when it was processing the note to be able to rotate, it was reading the note value as NaN rather than the note that was last sung. I was able to fix this by only storing the note when the volume was above the noise threshold
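In sketch form, the idea is roughly this (the noiseThreshold value and names are illustrative, and the estimator usage is assumed as in the earlier sketches):

```csharp
using UnityEngine;

// Only remember the last sung note while the input is actually loud enough;
// during silence the pitch comes back as NaN, which was corrupting the turn.
public class NoteTracker : MonoBehaviour
{
    public AudioSource audioSource;
    public AudioPitchEstimator estimator;   // as in the earlier sketch
    public float noiseThreshold = 0.02f;    // illustrative RMS volume cutoff

    public float lastNote { get; private set; } = float.NaN;

    private readonly float[] samples = new float[1024];

    void Update()
    {
        // rough RMS volume of the current output buffer
        audioSource.GetOutputData(samples, 0);
        float sum = 0f;
        foreach (float s in samples) sum += s * s;
        float rms = Mathf.Sqrt(sum / samples.Length);

        float frequency = estimator.Estimate(audioSource);

        // only store the note when we are confident someone is singing
        if (rms > noiseThreshold && !float.IsNaN(frequency))
        {
            lastNote = 69f + 12f * Mathf.Log(frequency / 440f, 2f); // MIDI value
        }
    }
}
```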
- next, the long-note issue seemed to be a problem with the Oculus itself: it employs a noise-dampening process to take out background noise, which is being applied here, so once a note goes on for longer than 1.5 seconds it gets blocked out
- Looking online, I couldn't find any solutions to this, only other people having the same issue. Unless I can find a workaround, I need to work out a new method of mapping notes to movement to move the player
- I came up with some alternate movement methods:
- Option 1: moving forward automatically for a couple of seconds at regular intervals, so the player only controls the turning
- Option 2: the player sings a long note to start advancing forward, and continues moving until they sing another long note
- Option 3: the player sings a long note to move forward and then moves for a set number of seconds (the volume lets the player control how far they go, by controlling how fast they move)
- I didn't feel option 1 would work best for this project: it removes the need for the player to sing to move, which limits the amount of singing the player will be doing. I want to encourage the player to sing as much as possible with the controls, so I am not going to use this
- I don't want to use option 3 because it would take the player a while to get used to and learn the controls (knowing how far they will go based on how loud they are), and it could easily get repetitive and annoying to have to keep singing the same loud note to cover a long distance
- I've chosen to go with option 2, as I feel it will be the most intuitive for the player while still getting them singing. Though the original method of movement would have worked best, I need to work within the limitations I have.
- It was quite simple to get this method working with the code I'd previously written, though I am worried that the long note at the end won't be responsive enough: as the player has to sing a long note to stop, they will only stop once the note is recognised as long, not the moment they start singing it. I am hoping the player will learn to sing preemptively to stop, but I shall see if that happens in the playtest.
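In sketch form, the start/stop toggle looks roughly like this (longNoteDuration and moveSpeed are illustrative values, and the estimator usage is assumed as in the earlier sketches):

```csharp
using UnityEngine;

// Option 2 movement sketch: a sustained note toggles forward movement on,
// and the next sustained note toggles it off again.
public class LongNoteMovement : MonoBehaviour
{
    public AudioPitchEstimator estimator;   // as in the earlier sketches
    public AudioSource audioSource;
    public float longNoteDuration = 1.0f;   // seconds of continuous pitch
    public float moveSpeed = 1.5f;          // metres per second

    private bool moving;
    private float sungTime;
    private bool toggledThisNote;

    void Update()
    {
        float frequency = estimator.Estimate(audioSource);
        bool singing = !float.IsNaN(frequency);

        if (singing)
        {
            sungTime += Time.deltaTime;
            // once the note has lasted long enough, flip the movement state,
            // but only once per note
            if (sungTime >= longNoteDuration && !toggledThisNote)
            {
                moving = !moving;
                toggledThisNote = true;
            }
        }
        else
        {
            // silence resets the timer so the next note can toggle again
            sungTime = 0f;
            toggledThisNote = false;
        }

        if (moving)
        {
            transform.Translate(Vector3.forward * moveSpeed * Time.deltaTime);
        }
    }
}
```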
- Next, I've created a simple world for the player to move about in. I've used a free model package from OpenGameArt.org and assembled it into a simple scene: there's a hill to navigate towards and a bridge over a pond in a forest to manoeuvre to. I've also added simple coins that the user can collect, to get them to move around the space.
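The coins are just trigger colliders that remove themselves on contact; a minimal sketch, assuming the player rig has a collider and is tagged "Player":

```csharp
using UnityEngine;

// Simple collectable coin: destroys itself when the player walks into it.
// The coin's collider needs "Is Trigger" enabled.
public class Coin : MonoBehaviour
{
    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player"))
        {
            Destroy(gameObject);
        }
    }
}
```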