Rhythming ⏹️
At the end of my last post, I mentioned how I would integrate melody and timbre into the rhythm grid gameplay. However, after going back to the drawing board, I ended up simplifying the rhythm grid instead, to focus on, well, rhythm.
Now, there is a more robust technical structure supporting the rhythm grid gameplay, one that also keeps everything else on the sound-led island in sync. As of now, Rhythmer, the music system for the sound-led island, generates five different rhythms using a custom implementation of the Euclidean algorithm for rhythm generation. I use this algorithm frequently, since it’s easy to implement and produces complex, interesting rhythmic structures. Multiple generators can be combined to create polyrhythms, which players can not only hear but also see, once visualized in some way. The rhythm grid is essentially a visualization of a given rhythm, so if multiple grids use different rhythms, we can see how their movements offset and overlap over time.
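For reference, here is one compact way to implement a Euclidean rhythm generator. This is a minimal Python sketch of the general technique, not the actual Rhythmer code (the game itself runs in UE5):

```python
def euclidean_rhythm(pulses: int, steps: int, rotation: int = 0) -> list[int]:
    """Spread `pulses` onsets as evenly as possible across `steps` slots.

    This Bresenham-style formulation yields the Euclidean rhythm up to
    rotation, which is how these patterns are usually treated anyway.
    """
    return [1 if ((i + rotation) * pulses) % steps < pulses else 0
            for i in range(steps)]

# Two generators with different lengths form a polyrhythm: their onsets
# drift apart and realign over the least common multiple of the lengths.
tresillo = euclidean_rhythm(3, 8)  # [1, 0, 0, 1, 0, 0, 1, 0]
fiver = euclidean_rhythm(2, 5)     # [1, 0, 0, 1, 0]
```

Layering `tresillo` over `fiver` realigns every 40 steps, which is exactly the kind of offset-and-overlap movement the grids visualize.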
To recap, the general gameplay of the rhythm grid is as follows: Adjacent cells of a grid are activated so the player can step on them, forming a path over time. Typically, the player must first observe the sequence in which the cells activate, to get an idea of which cell to step on next. The next 3 cells after the current one are highlighted to make it easier for the player to know where to jump next.
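A rough illustration of how a rhythm pattern could drive the cell path, using hypothetical names rather than the actual implementation: each onset in the looping pattern activates the next cell, and a small lookahead window is what gets highlighted.

```python
def activation_schedule(pattern, path):
    """Assign each cell along the path to the pattern step on which it
    activates: one cell per onset (1) of the looping rhythm pattern."""
    schedule, step = [], 0
    for cell in path:
        while pattern[step % len(pattern)] == 0:  # skip rests
            step += 1
        schedule.append((step, cell))
        step += 1
    return schedule

def highlighted_cells(path, current_index, lookahead=3):
    """The next few cells after the current one, shown to the player."""
    return path[current_index + 1 : current_index + 1 + lookahead]
```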
Adding polyrhythms makes the gameplay significantly more difficult, as it requires players to track two grids that move at different times. After several unsuccessful attempts at traversing the polyrhythmic grid, I ended up reducing the tempo of the island’s music, so that everything would move slower and give players more time to react. On that note, I’m thinking about adjusting the tempo of the music system dynamically, according to the gameplay the player engages in. Or possibly, similar to how players can change the speed of daytime progression in the ecological island, I could add an interactive element that allows adjusting the main tempo of the island’s music system. This would allow players to choose whichever level of tempo-based difficulty they’re comfortable with, without forcing difficult gameplay on them. Since one of the main driving factors behind Soundgarden is to make everything about it accessible - this writing, code, assets, etc. - why not make gameplay accessible as well?
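The interactive tempo element could boil down to a clamped master parameter. A minimal sketch, where all names and the BPM range are made-up placeholders:

```python
class TempoControl:
    """Player-adjustable master tempo, clamped to a comfortable range."""

    def __init__(self, bpm=100.0, min_bpm=60.0, max_bpm=140.0):
        self.min_bpm, self.max_bpm = min_bpm, max_bpm
        self._bpm = min(max(bpm, min_bpm), max_bpm)

    def adjust(self, delta_bpm):
        """Nudge the tempo, e.g. from an in-world interactive element."""
        self._bpm = min(max(self._bpm + delta_bpm, self.min_bpm), self.max_bpm)
        return self._bpm

    @property
    def seconds_per_beat(self):
        """What the rhythm generators actually consume."""
        return 60.0 / self._bpm
```

Clamping matters here: it keeps the accessibility knob from producing tempos the rhythm gameplay was never tuned for.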
Progressing with progress ➡️
With the arrival of the rhythm grid gameplay, we now have something very ludic here. There will be more areas with other music-led gameplay, which makes me think about an overall progression the player can achieve on this island. Similar to how collecting rocks eventually allows players to play with daytime tempo, or opens pathways on the ecological island, completing a music-led gameplay area could lead to unlocking new areas or gameplay variations. Maybe there could be something like control panels that players can unlock. These panels could allow players to change certain parameters. For example, they could adjust the parameters of the Euclidean rhythm generators, leading to changes in all elements that depend upon them, propagating throughout all the different areas on the island.
(Still trying to move) beyond the beat 🤔
Despite my previous ruminations on Kiki and Bouba, thinking about timbre and melody, I haven’t actually come up with a gameplay mechanic that utilizes these ideas about perceptual features. However, I’ve also been meaning to look further into using the spatialization of sounds to inform gameplay. Blind Drive is a rare example of a sound-led game that is not rhythm-based, and it extensively uses panning sounds to provide locational information to the player. While the panning is not necessarily realistic - things that are supposed to be in front of you are hard-panned to either side to signal which direction to evade them in - it suggests a clear course of action. One of the issues with having a player listen closely to sounds is that the visual impact of the game environment, as well as the affordances of navigating it, are often too distracting for such a task. Blind Drive solves this by removing visuals from the equation altogether. So the question remains: How can we facilitate a deeper awareness and understanding of sound within a highly visual environment?
Possibly, we can use visuals to support the sound, pointing out certain aspects of it. Maybe we can help players pay attention to certain characteristics by training them on the connections between specific visual features and sound features. For example, showing a certain color or geometric shape could signal the player to listen for the occurrence of a specific timbre, musical note, or rhythmic pattern. It would make sense here to look into perception research, as well as listening practices and exercises, possibly even meditation styles that foster awareness of aspects of one’s inner world and surroundings. On that note, what if we could teach awareness of sound, as well as listening techniques, through gameplay?
Side note: On performance ⚙️
I received some feedback about periodic stuttering in Soundgarden. After taking a closer look, it appears that Soundgarden has an inconsistent frame rate that slows down almost periodically, 2-3 times per second. This is of course unacceptable for any real-time application, let alone one that features music and sound-based gameplay; any gameplay that relies on rhythm will be heavily affected. I was thinking about this while playing Hi-Fi Rush recently. In that game, everything is synchronized with the music: the player character’s walking animation, environmental animations, and, when fighting enemies, the points at which the player has to press buttons for maximum attack efficiency. For me, the biggest strength of Hi-Fi Rush is the almost instant feeling of flow it creates once I get into the groove and forget that I have to constantly push buttons to fight enemies. Pressing buttons becomes as intuitive as dancing to a DJ set, almost as if the controller disappears and instead creates a direct connection between my musical sense and the action on the screen. The game runs consistently at 60 fps. If there were fluctuations in the frame rate, even if the music playback were unaffected, the inconsistency of the game’s speed would make it much harder to get into the flow so quickly, if at all. When it comes to immersion and flow, stable and consistent performance seems even more important in sound and music-based games than it already is in general. Consequently, ensuring consistently good performance across different hardware should be one of my top priorities for Soundgarden, at least until that is achieved.
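One mitigation worth noting, independent of the optimization work itself: derive rhythm-gameplay timing from the audio clock rather than from accumulated frame deltas, so a hitched frame recomputes the correct musical position instead of drifting. A hedged Python sketch of the idea (in UE5 this would be done in C++ against the audio engine’s clock; the function names here are illustrative):

```python
def beats_elapsed(audio_time_s, bpm):
    """Musical position derived from the audio clock, not frame deltas."""
    return audio_time_s * bpm / 60.0

def on_tick(audio_time_s, bpm, last_emitted_beat):
    """Emit every whole beat since the last tick, so a long frame
    catches up on missed beats instead of silently dropping them."""
    beat = int(beats_elapsed(audio_time_s, bpm))
    pending = list(range(last_emitted_beat + 1, beat + 1))
    return pending, beat
```

With this pattern a stutter still looks bad, but the rhythm grid stays in sync with the music, which is the part rhythm gameplay cannot tolerate losing.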
So what’s next? A deep dive into performance optimization in UE5, as well as implementing a new gameplay mechanic that is not rhythm-based.