When we arranged to interview Petri Alanko for the Time+Space website, we never imagined the wonderfully thorough and interesting responses we would receive to our questions. The composer behind the soundtrack for Remedy Entertainment's hugely popular video game Quantum Break started his musical journey when he first played piano at his grandmother's friend's house. Since then he has amassed an enviable collection of hardware synths, which he uses alongside his own 'self-made' plugins and software from iZotope, Spitfire Audio, ProjectSAM, Vir2 and the company Petri describes as "magical" - Zynaptiq.
Here Petri tells us about his journey to where he is today, his studio tools, how he created the central themes of Quantum Break through his music and how one track, in particular, resulted in him receiving a speeding fine...
Hi Petri, thanks for agreeing to talk to us today.
It’s a pleasure. I believe that opening up opens doors, too. As a young kid, I felt I couldn’t get music tech help from just about anyone, so I decided back then that if I ever had the chance to share my bag of tricks, I’d do it. Maybe someone facing a composing decision will find enough courage to take the right route.
First off, tell us about the journey to how you got to where you are in your career today.
Oh, man. This is by no means a short story, but… it all began with an idea that I wanted to make music, to somehow “have an orchestra” at my fingertips, to be a conductor, to compose… something. As long as I can remember, I’ve had that in me. Now, I’m not exactly the most extroverted human being there is, and am thus lacking the need for the limelight (although performing feels great), but since my very early years I knew I wanted to be involved with “that stuff” coming out of the TV set speakers when I watched movies, TV series, everything. I started off by playing the piano - I was given lessons after a Maria Callas aria compilation LP had been playing and I’d tried to reach the notes she sang. An acquaintance of my grandmother told her they should really get me a proper musical education, that there was “something truly odd” in me. Also, as a very young kid, I had coloured “sounds” on paper and scared off the kindergarten teacher. In my youth there wasn’t much talk about synaesthesia or other related phenomena, so you can imagine her shock. So, piano lessons, here I come.
When I was 10, I heard something odd on the radio: Kraftwerk. Their material pushed me towards synths, and I put a lot of time and effort into getting more information on the machines they were using. Luckily the radio DJ back then provided a lot of information about them, as if he’d been to their studio (which years later proved to be b*s*), but his knowledge of their gear was surprisingly accurate, as it turned out. So, I was sitting there in front of the radio, writing down the terminology, and the next day walking into a music library trying to find out everything I could, which wasn’t much. There really was no literature about them except for one book, written in Finnish (the library had no English books then): Osmo Lindeman’s “Elektroninen Musiikki”. I was probably the only one borrowing that book constantly from that day onwards. I started saving for a synth, and my father located one second-hand from a Polish restaurant musician who wanted to stay in Finland with his newly-found girlfriend - so he sold his battered Roland SH-5 to my father, who cleaned and fixed it thoroughly. I was so happy with it. Soon after, the electric organ was traded in for a Roland Juno-6, and it didn’t take too long to get my first drum machine, a Boss DR-55. A few years later, after one summer job (cleaning up a metal tech warehouse), I was able to get a TR-606.
I had lots of lessons between the ages of 5 and 19: piano, church organ (I was about to become a concert organist but was too lazy), some cello, some trombone, some guitar… many things - music theory, solfège, harmony and counterpoint, music analysis. I literally spent my youth in three places: school, music lessons and downhill skiing, with the first one suffering most. However, since I’ve always had a good ear, I remembered the school lessons by ear, so I had more time for the remaining two. After I’d graduated rather nicely (to all the teachers’ shock), I moved to study at the University of Jyväskylä in Finland: musicology and theoretical physics. However, that didn’t last long, as I’d put every penny into gear, and soon after the studying started I was broke, so I had to find myself a job. Back then there was a serious depression in Finland and pretty much the only jobs available were shop clerk positions, so I went into a music shop and asked whether they’d need a synth guy there. After a few questions, I got the job.
However, one day a guy walked in and we started talking. It soon turned out we had similar thoughts on music and liked the same artists and bands. The deeper we got into the discussion, the more interested the guy got, and when I revealed my then set of gear, he sort of jumped a little. “What the hell…?” And after a few seconds came the question: “…you don’t happen to need work? I’ve got a lot of studio jobs to do.” A mere 20 minutes after he’d entered the shop, I’d quit my job, and later that day I was already moving my gear into his studio. It all started that very day. After I’d connected the gear together (I had lots: Roland Juno-6 and System-100m with 40 modules, SH-101, MC-202, SVC-350 vocoder, Yamaha CS-15 and CS-30, Korg Wavestation and M1rEx, Yamaha TX16W and Ensoniq Mirage samplers, Roland SH-5… what else? Oh, the damn Roland R8), we started working on what was to become Finland’s first remix album - they really weren’t that common in the 1990s. That album threw us into a constant touring routine, which I got a bit tired of very early on, and thus spent a month or two per year on holiday, just to get rid of the “tour bus smell”. When I quit gigging, I’d done about… I don’t know exactly, maybe 2,000 gigs in 18 years, with three years of no gigging at all. Most of the places weren’t that spectacular - there were even some pizzeria-turned-nightclub gigs - but there were those 25,000+ people festival and arena gigs, too.
Around 1999-2000 I became interested in the IT world and even worked for a short while as a concept designer, trying to figure out new services for the company I was working for back then. I was actually quite happy with it, but my busy days made me miss music - and when an old record company acquaintance of mine called in 2001 and asked whether I was interested in producing a certain album, I first said yes as a hobby, but soon after I was in again, as a profession.
Very shortly after that, the newly-found IT friends started forming companies of their own, specializing in game design (thanks to Nokia’s important role back then), and they all called me. When the small games were done, some people went forward with more ambitious plans, doing bigger games - and they continued to call me for help. One thing led to another, and a friend of mine tipped my name to Remedy, as they were searching for a guy who’d be capable of composing modern orchestral music. It turned out their project was called Alan Wake, and it looked phenomenal right from the start. I got incredible kicks from the very first cinematic draft and made them as good a demo as I humanly could. Obviously they liked the results, as I got the gig… and now, after 11 years, I’m here, having done a lot of stuff for various gaming companies and media houses. Maybe it paid off to hit the keys of a piano at my grandmother’s friend’s house.
Wow, that is certainly some story! We read that you are also involved with sound design and creating sound libraries, were these commercially released libraries or for your own personal use?
Solely for my own personal use. I don’t like scripting Kontakt, and with Reaktor I’m semi-good, but again, the UIs aren’t much good. The same applies to all of Kyma’s software: form follows function, heh! But, strictly speaking, I’ve done a lot of work for the library I’ve got now, and there are some very interesting leftovers - or rather, material that didn’t get used in a certain project. Of course, I do use commercial libraries, too; I’m not all for reinventing the wheel, especially if a certain library does its job nicely. But in most cases, the libraries I create are context-related and rely heavily on the main audio guidelines and brainstorming meetings. For instance, for a certain project I recorded a lot of so-called “spy radios” but removed all tonal content from them, leaving just the dissonant interference noise, and started time-stretching it to ridiculous levels. Some of the sounds I made that way ended up in Quantum Break, too - as pads! I like the happy accidents that happen along the way. Sometimes the rudest sounds provide the most beautiful movement, and I rarely leave everything relying on just one layer, so each element has a lot of movement, albeit subtle. I think one of my 20 TB drives is about full now with processed and unprocessed pieces and snippets.
Talk us through your studio, what key pieces of gear are in your setup?
Right now, what I love most are my eurorack modular and my analogue synths - due to my background, which started before MIDI arrived. But each synth, no matter analog or digital, must have a character. I’m not a fan of synths without personality. I like design flaws better than something that leaves me emotionless. In 2015 I invested quite heavily in the workflow and interfacing my modular into the digital setup using many Expert Sleepers modules as well as their control plugin, and got myself a Roland System-1m which is a breeze to use and it fits in really nicely in my otherwise analogue setup.
I feel like I’ve always belonged to two worlds; I’m a hybrid that way. Analog purists don’t really pass the torch - they’d rather use it themselves, in their own caves. The same applies to most genres: those who cross the borders make the liveliest music, while those who remain niche-centered dry out eventually. I like to keep my mind open and not restrict my gear. If I make restrictions, I’ll make my “dogma” rules for each project. For instance, during Quantum Break’s 4.5-year gig, I didn’t use the Nord G2X or my Access Virus at all. Also, I canned my SH-101 from that production, etc. The roles of my equipment are defined at the very beginning, and I like to stick to those rules throughout the project. However, nothing’s carved in marble - I’m not stubborn that way. I’ve just found that I can get more out of what I have by making restrictions. What I really trusted during Quantum Break were my Prophet-6 (I had the P-5, too, but soon after the newer one arrived, the fiver had to go - it had become too costly to keep alive), the Oberheim Matrix-12, and Native Instruments - mainly Reaktor and Kontakt, but I used their little plugins as well.
Then there were the ARP Odyssey and Minimoog Voyager, Studio Electronics SE-1 and Omega-8… Symbolic Sound Kyma and the Hartmann Neuron VS (running inside Plogue Bidule - I devoted my laptop to the Neuron, running some 12+ instances there), and the Roland V-Synth XT. Those were present in about 80% of all the cues. However, I loaned out the Matrix-12 towards the end of the production - gave it to a session guy - and replaced the Matrix with Arturia’s plugin which, when treated badly in the right way, produced the exact same sound as my Matrix. I’ve also got an Oberheim Xpander, but that had less use, and somehow it sounded a bit different from Arturia’s plugin. My Matrix-12 was thoroughly repaired and overhauled - all caps replaced, new power unit, new keyboard, new filters, new oscillators - whereas the Xpander was as-is. Although their internal architecture was about the same (well, 6 voices against 12), they sounded like two totally different animals. I could immediately tell them apart. All analog stuff goes in through either a Neve preamp or my Neve analog summing mixer. I like character devices, I’d say.
Of course, I used a lot of code, too, a lot of iApps, and my iPads played a surprisingly crucial role throughout the whole Quantum Break production. Lots of granular apps, odd controller applications, Lemur from Liine Software, for instance… and the more I played with non-keyboard interfaces, the more the playing affected the organic feeling, and I guess some of Quantum Break’s feel comes from me sliding my finger on a surface, be it one of my four iPads or Haken Audio Continuum.
And, as a proud owner of three Universal Audio Apollos (two Quads, one Duo for the little setup) I’m willing to say they are The Sh*t. Nice, quick, clean, interfaces with enough DSP, and their Unison technology is… well, incredible. It sounds like I’m their mannequin, but the truth is, I’ve paid dearly for my setup, each piece. No vouchers and freebies, here, mind you.
Congratulations on the superb soundtrack for Quantum Break. How did you get involved with this project?
We’ve got a history of 11+ years now behind us, and I’d like to think it’s beneficial for both. I’ve become accustomed to their functions and personnel and I also know their quality levels quite thoroughly, as well as their story-centric attitude which allows me to write the best out of me. I hope it was mutual love and trust and shared quality pursuits that landed me this gig, but they also know I don’t give up easily, if ever at all. I’m very controlled, always keep to schedules and budgets, and wouldn’t deliver a piece that I wasn’t happy with.
In early 2011, Saku Lehtinen, Remedy’s Art Director, approached me, asking whether I’d be at all interested in their new project… “If you’re intere-” “SHUUUUT UP! I’m in!” And soon after, something I wrote for Beth surfaced. They had a marvellous piece of cinematic then, something they’d planned to use for pitching the project to different publishers, and obviously we did quite a nice job, as Microsoft decided to back us wholeheartedly. It felt really good to be there right from the start, to see everything yourself: how the project formed, how the story was brought alive - and how the business side was run. I love Remedy, they’re open that way. If you want to learn, they’ll enable you.
So, soon after the green light, the production kicked in and I composed a bunch of themes that - to my and everyone else’s surprise - survived the whole development cycle, to the very end, although, of course, some instrumentation issues had to be fixed. I’ve learned to trust my intuition, and it really produced good results there.
You mention on your website that you had to ‘apply my emotional anchors to loss, love, disappointment and payback’ to the composition of Quantum Break. How did you set about translating those four pillars into sound?
It was sound selections, instrumentation, harmony, countermelodies, open chords… I tried to take everything into account, everywhere, all the time, but I didn’t really have to remind myself - it was more like an autopilot, an instinctive process. Easily the hardest of those to write is disappointment. It’s so close to anger and resentment - and actually, all three are related: first comes disappointment, it develops into anger (if not treated properly), which transforms into resentment (again, if not treated), and that last emotion would kill you if you are not careful. Disappointment is something that each of us handles differently, according to something planted in us before birth. I’ve watched my own kids and observed their differences for quite a few years now, and although their upbringing is similar, their reactions to emotions vary incredibly.
Soundwise, it’s easier, the cold hand of disappointment can wipe off any smile from any face, and its echoes ricochet for a long time after the first strike - it returns in waves. The other three pillars are more constant and clearly colored in their own way, but in Quantum Break’s case, payback wasn’t that easy, as I wanted it to be “payback by correcting things” instead of “payback by destruction”, so the payback becomes a positive force, an enduring strength, something that kept the love alive. Usually, I pictured it in Quantum Break as a rising motif, a series of harmonies or a rising theme, and when a step is taken, it’s good to rest a little - a lot like what happens in “Remote Warning”. Another, more forceful payback theme is “Doubt, Despair, Hope”, but it has the sorrows - and the love - in it, too.
“I Kept Waiting” is the emotional equivalent to purest disappointment on the Quantum Break soundtrack, but also there are the relieving tones and the embrace softening the coldness. Just imagine someone who waited years and years for someone, for nothing.
In March, you Tweeted about the plugins that you claimed ‘survived the dogmas’, including NI Komplete 10, iZotope Iris, and your own self-made products. Why did these make the cut, as opposed to other plugins?
Well, there were others, naturally, but those were crucial for the end result, since my library was built around Kontakt and Reaktor samples - a lot of Kontakt processed by Reaktor, by the way. There are some tracks that feature about 75% NI. Iris’ UI is brilliant - they should make the UI into a product of its own and license it to NI and other sample playback manufacturers. It’s a brilliant piece of code. Most feedbacking sounds on the album, by the way, were done with Iris or Iris 2; about 20% were played in Ableton Live. I think their survival was related to their stability - although I had a mysterious issue with NI plugin opening times, they were very stable. Also, the user interfaces are very clear and leave no room for what-is-this-ism.
Also, I must say my selection of plugins is a bit different from “an average pop song production”, as sometimes I put a Zynaptiq PitchMap after a long reverb… in a send. I tried to find as many granular plugins as I could, but most of them were quite buggy and had an “artsy interface” - what’s wrong with people? Not everyone wants to play with a mouse. Back in the Alan Wake days, for instance, I made a Reaktor patch that was granular cloud-based but keyboard-controlled, and I was able to use the modulation wheel as a virtual “bow” to play the sounds back and forth. During Quantum Break I tried to refine it further, thanks to increased processor speeds and RAM, and I still think it’s quite a nice little instrument - a great tool for pads, with the right amount of movement provided by the granularity. But honing that into a more ready-built product requires someone with much sharper Reaktor skills. I’d like to have so much more in it, and I can’t really invest my time into it right now, no matter how much I’d want to.
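Petri's Reaktor patch isn't available anywhere, but the core idea he describes - a cloud of short, windowed grains read from around a play position that the mod wheel drags back and forth like a bow - can be sketched outside Reaktor. This is a toy pure-Python illustration; every name and parameter value here is invented, not taken from his instrument:

```python
import math
import random

def grain_cloud(source, position, grain_len=256, n_grains=8,
                spray=200, seed=1):
    """Overlap-add a cloud of Hann-windowed grains read from near
    `position` (0..1) in `source`. Sweeping `position` back and
    forth acts like the mod-wheel 'bow'; `spray` randomizes each
    grain's start point so the cloud shimmers instead of looping."""
    rng = random.Random(seed)
    hop = grain_len // 2
    out = [0.0] * (hop * (n_grains - 1) + grain_len)
    center = int(position * (len(source) - grain_len))
    for g in range(n_grains):
        start = min(max(center + rng.randint(-spray, spray), 0),
                    len(source) - grain_len)
        for i in range(grain_len):
            # Hann window removes clicks at the grain edges
            w = 0.5 - 0.5 * math.cos(2 * math.pi * i / (grain_len - 1))
            out[g * hop + i] += w * source[start + i]
    return out

# scrub a short test tone from its midpoint
src = [math.sin(2 * math.pi * 0.05 * n) for n in range(4000)]
cloud = grain_cloud(src, position=0.5)
```

Calling this repeatedly while moving `position` up and down is the "bow" gesture; a real instrument would do the same per audio block with interpolation and pitch control on top.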
Could you tell us a little more about the ‘self-made’ products you mentioned?
One I did for Quantum Break was a simple distributing plugin: it sends the signal to different Logic buses and/or sends according to the keys I press down. It’s very simple, but I couldn’t find anything like that on the market, and it’s the one sending effects around here and there in “Disappearance”. I originally got the idea from Kraftwerk’s live effects processing practices. There are some plugins capable of creating sounds inside them according to pressed-down keys (such as The Finger by Tim Exile), but I wanted Logic buses, my own effect chains and the flexibility.
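The distributing plugin itself isn't public, but its routing logic - held keys decide which buses receive the signal - is simple to sketch. The class name, bus labels and note mapping below are all hypothetical, just to illustrate the idea:

```python
class KeyedDistributor:
    """Toy model of a key-controlled signal distributor: each held
    MIDI note selects a named bus (effect chain), and an incoming
    audio block is forwarded to every currently active bus."""

    def __init__(self, note_to_bus):
        self.note_to_bus = note_to_bus  # e.g. {36: "reverb chain"}
        self.held = set()               # notes currently held down

    def note_on(self, note):
        if note in self.note_to_bus:    # unmapped keys are ignored
            self.held.add(note)

    def note_off(self, note):
        self.held.discard(note)

    def route(self, block):
        """Return (bus, block) pairs for each active destination."""
        return [(self.note_to_bus[n], block) for n in sorted(self.held)]

# usage: holding C1 sends the signal to the reverb chain only
d = KeyedDistributor({36: "reverb chain", 38: "granular delay"})
d.note_on(36)
print(d.route("audio block"))   # [('reverb chain', 'audio block')]
```

In a real host the "block" would be a buffer of samples and the buses would be actual aux sends, but the keyboard-as-router idea is exactly this small.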
Also, another one that’s used everywhere on the score was a granular chorus with a barberpole phaser/filter - a great source for sounds that require “something odd” or “a little movement”. Little things like that, something I need to have at a certain point of a production. Now, if I could have a playable convolution engine with a three-way crossover and tempo sync for the convolutions, I’d be happy. But my limits are there. Also, I’d like to have delay feedback in musical time units - say, feedback taking a bar and 3 eighth notes…
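That last wish - delay feedback specified as "a bar and 3 eighth notes" - is plain tempo arithmetic; a host supporting it would just convert musical time to milliseconds. A minimal sketch (the function name and the 4/4 default are my assumptions):

```python
def musical_time_ms(bars, eighths, bpm, beats_per_bar=4):
    """Convert bars + eighth notes into milliseconds at a tempo.
    One beat (quarter note) lasts 60000 / bpm ms; an eighth note
    is half a beat. Assumes the beat unit is a quarter note."""
    beat_ms = 60000.0 / bpm
    return bars * beats_per_bar * beat_ms + eighths * beat_ms / 2.0

# "a bar and 3 eighth notes" of feedback at 120 BPM in 4/4:
print(musical_time_ms(1, 3, 120))   # 2750.0 (ms)
```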
What do you particularly love about Iris and which features do you most frequently use?
The UI, very clearly. I can test things out really, really quickly and remove unwanted particles from a sound. I’ve got a wishlist for the filter and the effects, but the basic engine is the simplest yet the coolest there is. In “Still Waters” the first 50 seconds are Irises and a BreakTweaker - just iZotope, heh. Well, excluding the sends, of course. The ability to layer with ease, each with different loop points… instant sound design food! I think Irises were used in the sound design, too, but I cannot say that for sure.
Do you use any other iZotope plugins?
All except the ProTools-only products. Oh, and I haven’t updated my RX4 Advanced yet, but that’s on my to-do list: to go through the plugins and fix the lot. I’ve been an Ozone user since version… wait, three? I’ve got 7 now, and I love the Vintage EQ/Limiter plugs, very well put together, that. RX is my basic sample editor which is on 24/7, it boots when my computers are booted. Also, I’m a big fan of the Imager plugin, and in some Quantum Break cues that was the tool that made most of the layers fit into each other nicely. BreakTweaker appears here and there, and I’ve got a few upgrade ideas for it but I’m sure I’m the only one needing them…
You used Zynaptiq’s Pitchmap for the QB soundtrack, 603 times in fact, according to your Twitter feed. For those who aren’t familiar with this software, please could you explain what it does and specifically how you used it?
Yes! The Magic Company, Zynaptiq! Their plugins are just magic, doing things I used to spend hours and hours trying to do. PitchMap was easily the most overused plugin, counting everything together. It corrects individual frequencies to certain pitches, if needed, or it can correct a certain individual frequency or note to a certain note, leaving the others intact - so it’s not really an auto-tuning plugin, although it could be used as one. In some cases I made several scales for different sections and switched them on the fly, and even “played” or “reharmonized” the reverb wash so that the notes wouldn’t clutter whilst dying away.
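Zynaptiq's actual polyphonic DSP is proprietary, but the scale-mapping behaviour Petri describes - snapping detected pitches to the nearest note allowed by a user scale while leaving the rest of the signal alone - can be illustrated naively for a single frequency. A toy sketch, not Zynaptiq's algorithm:

```python
import math

C_MAJOR = {0, 2, 4, 5, 7, 9, 11}   # pitch classes, 0 = C ... 11 = B

def snap_to_scale(freq, allowed_pcs):
    """Snap a frequency (Hz) to the nearest equal-tempered pitch
    whose pitch class is in `allowed_pcs`; return the snapped Hz."""
    midi = 69 + 12 * math.log2(freq / 440.0)      # fractional MIDI note
    candidates = [n for n in range(int(midi) - 6, int(midi) + 7)
                  if n % 12 in allowed_pcs]
    best = min(candidates, key=lambda n: abs(n - midi))
    return 440.0 * 2 ** ((best - 69) / 12)

# a sharp-ish 455 Hz tone snapped into C major lands on A4:
print(round(snap_to_scale(455.0, C_MAJOR), 1))   # 440.0
```

Switching scales on the fly, as Petri did per section, would just mean swapping the `allowed_pcs` set while audio runs.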
As I loved to create myriads of convolution tables, I also loved to abuse them before putting them into use, and that’s where their Unveil came in handy. For instance, I recorded myself handling a piece of tinfoil, and ran it through Unveil and Unfilter, then loaded the result into Logic’s Space Designer or NI Reflektor, and…well, I was in a wonderland for a long time. Plugin wonderland. It can take hours, literally, and you just scratch the surface. This time we’re living in…this is marvellous, so full of astonishment and wonder.
Oh, one other thing I must praise: Morph. I was a fan of Prosoniq’s Morph plugin, but ever since Morph 2.0 came out, I was morphed into an audio idiot. I did some cues where a cinematic music piece morphed little by little into a licensed song, but I can’t, unfortunately, tell if they used any of those. For instance, the scene where Jack climbs up from the dock, from the crashing containers, and Beth is waiting for him - they get into the car, and Toto’s Africa begins to play… the preceding cue’s piano morphed into Toto’s intro drum loop, but since I haven’t had time to play the game properly yet, I can’t tell if it’s there or not.
And that really wasn’t the only thing I did with Morph. I felt at some point they were everywhere.
You’ve also stated that ‘all tracks were sketched with plugins, but I quickly replaced everything with synths’. And that you felt it was ‘crucial to print them old school.’ Could you explain why you felt that way and which plugins did you use to sketch out the tracks?
Usually, I start with the absolute minimum, and since I’m using Logic, that means Logic’s built-in synths - but they were all replaced, except Alchemy 2, which is brilliant. I was an Alchemy user back in the Camel Audio days, and it’s awesome - a poor man’s Synclavier. So, the bad-sounding plugins had to go. Let’s face it, the Logic stuff really isn’t that great; it’s more than a bit dated and lacks the depth of, say, U-He plugins, and in most cases, whenever I felt plugins were appropriate or “handy”, it was usually U-He that I ended up using.
If I decided to replace a track with a synth, it was due to the depth and “feel of flesh” that was required. And in some cases, what I had in mind was a bit more complicated than just a three-osc patch from the ES2. In some cases I used a semi-acoustic guitar in drop-C# tuning, played with a toothbrush or a Lego motor or… anything, then mangled the signal with modular synths and recorded it through the Neve preamp - not because ego trips were needed, but because the sound felt right. I have this theory (that I’ve proven several times) that you can basically do everything with plugins, too, but I happen to know my arsenal through and through, so sudden changes in sound, or something I’ve got in mind, are much, much faster to do with a hardware instrument. Also, since I’m using a eurorack monster with enough OSCs and VCFs and so on, the required amount of “beef” or “fat” or whatnot can be accessed in a flash - a matter of knob turns - and each knob move is always predictable, producing predictably coherent mayhem. With plugins, the “sonic error range” is much wider, i.e. the plugin combination that worked yesterday doesn’t work today because the source sound was a wee bit higher. Precise algorithms have their issues, whereas “dinosaurs”… well, they’re nice pets: they behave when needed, and attack when told to do so.
I’ve got quite a huge set of plugins, actually, and it’s a bit black hole-y to go through each and every one. I’ve prevented that by having these play around evenings where I push and shove each instrument or effect plugin to its limits, to become accustomed to it, to find how I can abuse it creatively, to find out what can and cannot be done with it. I’m not much of a fan of presets but with a suitable set you’ll get enough starting points for some quick ideas. I’m not an anti-plug nagger, I’m very much pro them.
Side note: one reason for hardware synths was the scope of the project, by the way. Remedy was using real actors so why wouldn’t I consider using real synths, too, once I had those at hand? “Why did you climb the mountain?” “Well, I got these shoes, and the weather was lovely- heck, it was there.”
Spitfire libraries combined with ProjectSAM provided lots of weight to semi-orchestral material. I didn’t want to keep anything “orchestral” as such since we had our audio/music guidelines discussion in…was it already in 2012? But some things are really nice to do with orchestral sounds and that’s where my hard disk orchestra came in really handy. Combining my own samples with them provided a lot of building blocks for just about anything thrown at me, and using them is a breeze. Predictable results each time, superb sounds, very high-quality stuff, and since I’m a fan of guitar sounds, Vir2’s Acou6tics and Electri6ity came in really handy too. And, just to make things even more complicated, some of the library tracks were mixed through my modular; Vir2 stuff also got processed by some amps I happen to have in my garage and one of them has a really nice triple rectifier…They’re not only handy to test ideas, they’re very capable of producing final mixes, and if some of those libraries were used, they usually remained until the end. No need for replacing stuff.
With Kontakt, sometimes I lowered the Acou6tics tuning by an octave and ran it through my amp setup, to get this huge BRAAAAHM growl underneath the sfz/fff brass/lower string stuff. If I’m ever going to be stranded on an island, give me a solar panel and my laptop+hard disk combo, and I’m not coming back. By the way, Spitfire’s slightly oddball series of the newer, smaller libraries (the Icelandic libraries, for instance) - oh HOLY CRAP how much do I like them. I love when people have a similar attention to detail and quality that we have at Remedy. It’s very much respected as it takes lots of stamina and time to get things right, sounding and feeling right. Anyone can throw together a sound bank but only a very tiny handful of sound library developers can turn the audio files into a playable, believable instrument, and that’s what’s ticking with ProjectSAM, Spitfire and Vir2. They’re not libraries, they’re instruments.
What’s your personal favourite track from the QB soundtrack and why?
I must say, in this order: “Suite for Time And Machines”, “Dodging Bullets” and “A Whisper”. They change in their order but those are my favorites right now. They each contain a lot of good memories and a lot of good vibes and - despite the minor key - positive energy. “Suite for Time and Machines” was a bit of a brat to mix since some plugins don’t respond nicely to tempo or time signature changes and I had to do a lot of odd stuff to make it right. It basically contains almost the whole first act in a musical form, by the way.
“Dodging Bullets” almost got me a super-expensive speeding ticket when I checked the mix in my car. I’ve got a really nice sound system in my car (nothing fancy, it was like that when it came from the factory), through which I’ve listened to probably more music made by other people than by myself, so I’ve trained my ear there. I had three mixes, and “mix3” caused my right foot to go down, down, down… and then I saw the blue lights, shortly afterwards red ones too, and damn, I got pulled over. The officer asked whether I had a good reason to speed, while the other guy kept checking my tires and lights, walking around, then stopped behind the older, more talkative guy. I just said - without thinking, really - “Well, I was listening to a final mix of a song of mine, and, well, I didn’t concentrate on the speed, really, and…” “You do music?” “Yep.” “What kind?” I played a short bit of “Dodging” to them, and I saw the younger one’s face light up. “Whoa, an angry riff!” For a short while I thought they’d let me off, but unfortunately that wasn’t the case, and they took me to the back seat of their car, where I handed over my driver’s license and car papers. Finland has a progressive fining system - “the more you earn, the more you pay” - which is why IT millionaires can get fines of 112,000 euros, which is a huge amount. No such fear here, though. But I got a hefty fine anyway. So, one could say that piece is a dangerous one and can put you in the back seat of a police car.
“A Whisper” should feel like the nervous moments before the first kiss, the first touch, the first embrace. It’s almost like a promise of something, promise of not hurting the loved one, something that soothes you and makes all the bad world go away. The weight on that song is huge. I cannot spoil the game for you, so I need to keep my mouth shut, but when the time comes, you’ll understand what I was after. It would be great if your readers playing Quantum Break would meet that moment as a bit of a surprise, which is why I should stop explaining right now…
Later this year you’ll be performing electronic dance music and trance classics at the Helsinki Festival with the Helsinki Philharmonic Orchestra and Choir, can you tell us more about that?
It was a long time in the making. 10 years ago Club Unity had its 10th anniversary, and to celebrate all the classics, a good friend of mine, DJ Orkidea (Tapio Hakanen, Nokia’s ex-head of audio design) asked whether I could make a little surprise for the partygoers - as the night was going to be a bit more special, with all the nice dresses and suits, it would also have been cool to get a lounge pianist there, playing trance classics with a grand piano, smoking a cigar, sipping single malt and so on. Class!
Very shortly after I’d begun arranging the songs, I realised they might benefit from a slight orchestration, so I made quite a few orchestral tracks to back the piano playing; I was playing to a click - and it worked perfectly. When the night came, I decided to record the piano (it had MIDI out) on top of the versions. And I was nervous as hell, as my laptop at the time (running Logic) wasn’t exactly the most stable, so each time the song pointer approached the right-hand border of the screen I was ready to shout effs. But thankfully, nothing. It all went really well, although I really had no time to smoke the cigar and drink the single malt. After the gig, however… Later on, when Armada Records wanted to release the outcome, I kept the live piano recording as-is, only cleaning up a dozen or so wrong notes… so in essence, Trancelations 1 is a live recording. Last year, Trancelations 2 came out, and a few months later Orkidea started talking about “an orchestra… maybe at the Music House… you at the grand piano… with a choir…”, which I thought was just him brainstorming - but heck, the guy went and talked to Juhlaviikot (Helsinki Festive Weeks), and then one day this January I got a call: “Er, Petri… look, I hope you don’t mind, but we’d like to have you at the grand piano again at Unity’s 20th anniversary… which happens to be a part of the Helsinki Festive Weeks. With the Helsinki Philharmonic… at the Helsinki Music House… Petri? Petri?” “Tapio, what the effing eff. YOU DID WHAT?”
I guess I need to include a couple of surprises there, new stuff. So, in short, it’s going to be a huge concert and I’m already nervous. Better start honing my keyboard chops, then.
What else is coming up for you in 2016?
I started writing an album already in 2015, and since there are quite a few tracks not used in Quantum Break - not really leftovers, but rather victims of storyline changes, and they’re really worth publishing, I’ll make them a part of that album. I still mean to add more beat, more cinematic music, more lavish and lush atmospheres and a few epic moments. But I think it’s going to be a nod towards the trance/club side of my musical roots. I’ve released single tracks here and there, but I’ve managed to avoid the album format. Maybe it’s time, who knows? I’d love to have as much syncable material on the album as possible, but…let’s see. There’s no real deadline for it, not yet.
And finally, what’s the one piece of advice you would give to someone wanting to create music for video games?
Find your own sound, or make your own sound: harmonies, chord inversions - but don’t force it. Each second is your advertising time, so make them count and never use fillers. Either way, it’s crucial to have your own recognizable handwriting. If the music works as is, it definitely works in context, too - but be prepared to create a huge load of stem files for directors, heh… They’ll usually use only about an eighth of your tracks anyway, so make each stem worth a song in itself, just a bit smaller.
I’d suggest you find the storyline’s strong points first and try to define the final style through them. Once you’ve got those in your pocket, it’s much easier to draw almost anything out of there - and remember to think about how the characters would react or feel if they were real persons, and how they got there. And finally: never underestimate even the smallest gig. It may lead to one of the biggest projects of your life; I’m a prime example of that. Just remember one thing: if you make music for a living, don’t ever do anything for free. It’s your time, your electricity bill, your rent, your food you’re paying for - watch your back and get 50% up front, 50% when it’s done. Cab drivers don’t drive for free, and carpenters don’t build houses for nothing.
Great advice! Thank you so much for your time Petri and for your really interesting responses to our questions.