Camel 101 is a Portuguese video game company that just released their new game, Those Who Remain. Eduardo Magalhães, responsible for all the sounds in the game along with João Mascarenhas (the other sound designer and composer), talked to us about his process of creating this world through audio and, more specifically, how Sound Particles played a part in it.
Can you tell us a bit about Those Who Remain?
Those Who Remain is a psychological thriller combined with survival and puzzle/problem-solving components, where you cannot fight or harm any of the monsters; the only thing you can do to avoid them is run and stay in the light. The story takes place in the 1950s in the United States, in a city called Dormont. Edward, the main character, is under a huge amount of emotional and mental stress due to a series of traumas caused by the loss of loved ones, betrayals and failed relationships, and he ends up caught inside an imaginary interconnection between the real world and a dark world, where those traumas come back to haunt him in the form of several different spiritual entities. When faced with these metaphorical spirits, Edward has to make decisions that influence how the game proceeds and how it eventually ends.
The game is made by Camel 101, a Portuguese indie videogame company. How many people worked on this project? And how did that affect the whole process?
The team working on it from beginning to end was made up of five people. This enabled a more indie approach and philosophy: decisions were made much more easily, and they stayed very faithful to what we had envisioned for this game. Normally, when indie projects start involving a lot of people, investors and sponsors, it is very easy for decisions to start deviating completely from what the developers initially had in mind. In our case, everyone had a very close and direct relationship with each other, so there was a lot of room to present ideas, discuss them and, fundamentally, to trust each other, which in turn gave everyone a more pragmatic approach to getting things done.
Of course, we all know there are markets and player types we have to cater to, but at the same time we also want to test our own ideas and philosophies, and that is the main effect the indie approach had while we were developing this game.
For instance, as a very personal remark, I often get bored with the sound of a videogame I'm playing because most of them follow the same recipe. I personally prefer the indie perspective, where our sensibility shapes a series of decisions that will sometimes override standards and make a videogame sound unique.
How does an indie company, especially in Portugal, get its videogames released on Xbox or PlayStation?
Nowadays, developing a videogame is not as hard and expensive as it once was. Those Who Remain was developed in Unity, and today anyone can get their hands on Unity for free; they simply have to download it. When I first got interested in how games were made and how game engines worked, the idea of downloading one for free was unthinkable. Today, you can access the Unity Asset Store and very easily and affordably get assets with tremendous graphical quality, good mechanics and easy implementation.
Then, getting a videogame onto consoles is a matter of having a good game (with Nintendo, for instance, they are the ones who decide they want your game; it isn't you who decides you want your game on Nintendo), having the money to hire someone to port your game to the console's API, and paying for the developer kit that makes that port possible.
Another variable that may have to be taken into account is the portfolio: if the game developers already have titles publicly available, it will be easier for the game to get onto the platforms, not only because you are already known but also, and more importantly, because you already know how the whole process works. In our case, we had already developed other videogames and knew how everything was supposed to work, and that helped tremendously.
At which development stage do you, as a videogame developer, start interacting with publishers?
There are many different strategies. In the US and UK, for instance, it is very common for indie videogame companies to capture the interest of publishers at an embryonic stage of the project. If a game has a completely innovative, out-of-the-box mechanics system, it will easily get the attention of potential investors. There are also games that almost automatically turn into classics even before they're released, just because of the concept they establish and the impact they will have on the industry, sometimes literally defining how games are going to be created from that moment on.
In our case, the publishers came on board at a very mature development stage; we already had four fully playable levels. We still had to change a lot of things, including the audio, based on their input, but nothing compared to what it would have been if they had joined us at an earlier stage.
I'd say that if you already have some playable levels, it is only a matter of contacting the right people, such as reviewers and opinion makers, and also going to videogame gatherings to showcase the game. It is also important to make sure those showcase levels don't have any bugs, because bad reviews are really difficult to overcome, and the pressure can be huge since most of the time there is a lot of money and work involved in these projects.
Regarding Sound Particles, can you tell us about your experience with the software?
I was introduced to Sound Particles a few years ago at a sound design conference in Lisbon, and I've been using it ever since, both for commercial and academic purposes. Since the software is free for teachers and students, I've been able to easily introduce Sound Particles to all my students, whether as a sound design tool, a spatialization tool or simply as an exploration/problem-solving approach.
The first project in which I used Sound Particles massively was a theatre play involving dance and video, with a specific speaker layout covering all three audience levels, where the verticality of the sound was thoroughly explored. At the time, I had 28 channels at my disposal, combined to create several different audio zones throughout the venue. So I used Sound Particles to create custom multichannel virtual microphones that mimicked the real speaker layout and then created about 180 different exports (of course, not all of them were used in the end) to be reproduced through all of those speakers, which ended up providing a 3D experience to the audience.
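The idea of virtual microphones matched to a physical speaker layout can be sketched in generic code as well. Below is a minimal Python/numpy illustration, not Sound Particles' actual engine or API: each speaker is treated as a hypothetical directional pickup aimed from the listening position, and a mono source is distributed across a made-up eight-speaker subset of a multi-level layout. All positions, names and the focus parameter here are assumptions for illustration only.

```python
import numpy as np

def unit(v):
    """Normalize a 3D vector."""
    v = np.asarray(v, dtype=float)
    return v / np.linalg.norm(v)

def layout_gains(source_dir, speaker_dirs, focus=4.0):
    """Toy 'virtual microphone' gains: every speaker behaves like a directional
    pickup aimed outward from the listening position, and the source feeds each
    one according to how closely it lines up with it. A higher 'focus' narrows
    the pattern. Purely conceptual; not Sound Particles' actual maths."""
    s = unit(source_dir)
    gains = np.array([max(np.dot(s, unit(d)), 0.0) ** focus for d in speaker_dirs])
    total = gains.sum()
    return gains / total if total > 0 else gains            # keep overall level constant

# Hypothetical 8-speaker subset of a multi-level venue layout (x, y, z positions).
speakers = [
    ( 1,  1, 0), (-1,  1, 0), ( 1, -1, 0), (-1, -1, 0),     # lower ring
    ( 1,  1, 1), (-1,  1, 1), ( 1, -1, 1), (-1, -1, 1),     # upper ring
]

rng = np.random.default_rng(0)
mono = rng.standard_normal(48000) * 0.1                     # 1 s of placeholder audio
gains = layout_gains(source_dir=(0.2, 1.0, 0.5), speaker_dirs=speakers)
multichannel = mono[:, None] * gains[None, :]               # shape: (samples, 8 channels)
```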
How did you use Sound Particles on Those Who Remain?
When I joined the project, I didn't have a particular set of tools in mind, because they were still figuring out and experimenting with some of the sound concepts from the initial levels of the game. So my main concern at first was to review and test everything from the beginning, a complete remake of the sounds and the implementation strategies that took up the first three or four months after I joined. After this process, I started using Sound Particles mostly on the Otherworlds, which are portals to other dimensions where the player has to solve specific mechanics involving inverted worlds and physics. And since these have such peculiar characteristics, we decided to reinvent their sound and approach it in a different manner, using Sound Particles.
I used it mostly with voices. We had a number of actors voicing different characters, some of them voicing more than one character and some doing many different takes. I decided to take these variations, process them separately with many different audio plugins and import them into Sound Particles in groups of 7 or 8 audio files. Inside Sound Particles I set up sessions with quasi-random movements in every direction as the voices were speaking; some of them would fade out with distance attenuation and then come back again; I also did some sessions with the particles going back and forth at different speeds, as if they were crossing the player.
Basically, this is how I use Sound Particles 70% of the time: I process some sounds in various ways, then I import them into Sound Particles and add movement to the particles (acceleration mostly), and finally I render everything with a 3rd Order Ambisonics microphone. Sometimes I'll add static groups of particles with different densities that are emitted occasionally and time-controlled via the random delay audio modifier, or I'll use a very distant AB pair to completely distort the stereo image. I love the virtual microphone concept in Sound Particles, and I explored the possibilities of reprocessing a lot: exporting renders in B-Format, reimporting them as audio files into particle systems and creating new material from them. I actually used this technique for the last level, which had a corridor with ghosts that needed a different sound density from the other voice sound effects I had created before.
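To make the distance attenuation and Ambisonics rendering a little more concrete, here is a minimal Python/numpy sketch of the general idea behind a moving particle and a spherical-harmonic render. It is a conceptual illustration under assumed values (a placeholder signal, a made-up trajectory, and first-order B-format for brevity), not Sound Particles' internals or the actual session settings used in the game.

```python
import numpy as np

def encode_bformat(mono, azimuth, elevation, distance, ref_dist=1.0):
    """Encode one mono 'particle' into traditional first-order B-format
    (FuMa W, X, Y, Z) with simple inverse-distance attenuation.
    azimuth/elevation are per-sample angles in radians, distance in metres.
    First order keeps the sketch short; Sound Particles itself can render
    higher orders, such as the 3rd Order mic mentioned above."""
    gain = ref_dist / np.maximum(distance, ref_dist)       # 1/r attenuation, clamped
    s = mono * gain
    w = s / np.sqrt(2.0)                                    # omnidirectional component
    x = s * np.cos(azimuth) * np.cos(elevation)
    y = s * np.sin(azimuth) * np.cos(elevation)
    z = s * np.sin(elevation)
    return np.stack([w, x, y, z], axis=1)                   # shape: (samples, 4)

# One placeholder particle: a 'voice' drifting quasi-randomly around the listener,
# fading out with distance and coming back again.
sr = 48000
n = 2 * sr
rng = np.random.default_rng(1)
voice = rng.standard_normal(n) * 0.05                       # stand-in for a processed take
t = np.arange(n) / sr
azimuth = 2 * np.pi * 0.1 * t + 0.5 * np.cumsum(rng.standard_normal(n)) / sr
elevation = 0.3 * np.sin(2 * np.pi * 0.25 * t)
distance = 2.0 + 1.5 * np.sin(2 * np.pi * 0.1 * t)          # drifts away, then returns

bformat = encode_bformat(voice, azimuth, elevation, distance)
# Summing many such particles and re-importing the resulting B-format render as new
# source material mirrors the reprocessing loop described above.
```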
Also, at a specific moment in the game I used it to create a 3D reverb for a gunshot, using impulse responses attached to many particles spread across a sphere, which resulted in a very dense and eerie reverberation.
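As a rough illustration of that sphere-of-impulse-responses idea with generic tools (not Sound Particles itself), the sketch below, assuming numpy/scipy and synthetic placeholder signals, convolves a dry "gunshot" with several impulse responses, gives each copy a small random delay and gain as if it radiated from a different point on a sphere, and sums the results into one dense tail.

```python
import numpy as np
from scipy.signal import fftconvolve

def sphere_reverb(dry, impulse_responses, sr=48000, max_extra_delay_s=0.03, seed=0):
    """Conceptual take on the 'IRs on a sphere of particles' idea: convolve the
    dry signal with each impulse response, give every copy a small random delay
    and gain (as if it came from a different point on the sphere), then sum.
    A generic convolution-reverb sketch, not Sound Particles' engine."""
    rng = np.random.default_rng(seed)
    max_delay = int(max_extra_delay_s * sr)
    out_len = len(dry) + max(len(ir) for ir in impulse_responses) + max_delay
    out = np.zeros(out_len)
    for ir in impulse_responses:
        wet = fftconvolve(dry, ir)                          # one "particle" reflection
        delay = rng.integers(0, max_delay)
        gain = 0.5 + 0.5 * rng.random()
        out[delay:delay + len(wet)] += gain * wet
    return out / len(impulse_responses)

# Placeholder signals: a short noise burst as the "gunshot" and synthetic decaying IRs.
sr = 48000
gunshot = np.random.randn(sr // 10) * np.exp(-np.linspace(0, 8, sr // 10))
irs = [np.random.randn(sr) * np.exp(-np.linspace(0, 6 + i, sr)) for i in range(12)]
dense_reverb = sphere_reverb(gunshot, irs, sr=sr)
```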
Here are some audio examples of what Those Who Remain was able to achieve with Sound Particles: