
John Enroth – Composing for Film and TV


September 5, 2023
by Graeme Rawson

John Enroth has been composing music for TV, film, and games for 20 years as part of the team at Mutato Muzika, founded by DEVO’s Mark Mothersbaugh, working on huge international films like Thor: Ragnarok and Apple TV+’s recent future-retro hit Hello Tomorrow!. We got the chance to ask John a few questions about his work and how GForce virtual instruments help him produce high-quality music quickly and consistently.

Hi John, thanks for taking the time to talk to us. Can you tell me a bit about your background and how you came into composition for film and TV?

I was always into music, and from a relatively young age I wanted to write music for TV and film. I played in band and orchestra in school, as well as bands outside of school, and got a degree in Music Composition from Arizona State University. After none of my musical projects worked out, I went back to school for music engineering at The Conservatory of Recording Arts and Sciences so I would be able to become a composer’s assistant. Luckily, at the time a gentleman named John McJunkin was the head of the intern department and a HUGE DEVO fan, so when I said I wanted to intern at Mark Mothersbaugh’s Mutato Muzika he made it his life’s mission to get me in there, and I was determined to stay as long as I could. Still haven’t left after 20 years.

Your CV is hugely varied, from computer games, through indie comedies, to blockbuster movies. Which project did you most enjoy working on and why?

It’s really hard to pick a favorite as they are all so different, but the one that continues to stand out, especially as I still see its impact years after the show has finished, would be Regular Show. It was such a creative show to be a part of, and every episode was basically a brand new score genre, often many different genres per episode, to fit the crazy plots that were presented. I can’t even begin to count the number of genres we did for that show. Everything from Hollywood blockbuster movies to classic rock to hip hop (there’s an episode with Tyler the Creator and Childish Gambino) to synth pop to jazz… it was truly an adventure I’ll never forget. I learned a lot about scoring from doing that one show.

Can you talk us through the process of writing music for a project? What stage in the overall project do you come on board, and how closely do you work with creatives from other departments?

It really is project-dependent, and each project is different. For video games we mostly just deal with the audio lead, who tells us what kind of music they need and how much of it; there might be a meeting or two with the producer or the creative team, but once the palette of the game is established we just deal with the audio lead. For television we usually just deal with the showrunner in post-production, and when we get involved depends on the show. Sometimes it’s while they’re shooting, and we try to come up with the music so the editors have the score to cut to, and sometimes we’re working with locked picture and just start writing. It’s about the same for movies as well, but movies seem to be more thematically driven.

How much of your process is planned and how much stems from on-the-fly experimentation?

When a project first comes up it’s all experimentation, but with heavy input from the creative team involved with the project. Once a project gets going, especially with TV shows, it becomes kind of an auto-pilot thing because the sound of the show and the character themes are established.

Music plays a really important role in world-building. A lot of that, I imagine, is in the sounds you choose to use. Can you tell me more about the musical sound design element of composition? Would you choose sounds early to inform the melody, chords, etc., or the other way around?

This, as always, is project-dependent. We’ve recently done a lot of medieval shows or shows set in the “classical” period, so the score is usually very grounded in acoustic instruments that are period-appropriate. With some of the more modern shows we did, like Dirty John for instance, the music was very synth and sound design driven, but we also tried to stay away from too much sound design so as not to step on the toes of the sound supervisor. In cases like that we get to work hand in hand with them to make sure our roles are clearly defined and we can play off each other’s work, which is a lot of fun.

How has your workflow evolved since starting out?

When I first started out I didn’t use templates, as I didn’t want to use the same sounds over and over again. That was very helpful when we scored more commercials, because it helped make each one more unique, in my opinion, but as I got more into scoring television I quickly got to making templates once the sound of the show was established, so I could get to writing as quickly as possible. As far as overall workflow is concerned, I’ve been almost entirely “in the box” since I started at Mutato. I did a lot more tracking back then for things that needed it, like guitars and woodwinds, because I can play those, but I’ve been using virtual instruments for my entire professional career.

Do you use a lot of virtual instruments in your work? What benefits do virtual instruments give you as a composer in the modern production landscape?

Almost all of the initial writing is done with virtual instruments, as it’s much easier to get ideas down and change them around. We usually keep as much of that as we can because of the flexibility, but since we’ve been doing those medieval and “classical” shows, we’ll track real strings and instruments before playing for the showrunner and then re-record and fix what is needed. The main benefit of virtual instruments is their absolute flexibility, and in some cases a playability that isn’t possible with a real instrument. The downside is that it has shortened deadlines considerably, because there’s no need for tracking until cues are approved.

Which GForce Software instruments do you find yourself reaching for and why?

The first GForce Software instrument I go for is the M-Tron Pro. It’s the first one I ever used, I can’t even remember when because it’s been so long, and it’s still amazing now. Usually when a show asks for “something different”, like Hello Tomorrow! or Summer Camp Island, the first thing I’ll try is M-Tron Pro. In the case of Hello Tomorrow!, they asked for “future retro” sounds to match the show’s aesthetic, and the vintage tape vibe that is so amazingly captured in the M-Tron Pro gives the “retro”, while the onboard effects or other plugins help give it the future vibe. With Summer Camp Island, the direction for the show’s witches was very “Disney Fairy Godmother”, and the M-Tron Pro’s vintage character, with effects added to make it magical, fit the show so perfectly in a way that some traditional libraries just wouldn’t.

What do you think will be commonplace in music technology in 5 or 10 years’ time that isn’t yet?

Hmmm… I actually have no idea. I wish I had an idea so I could start now!!! HAHAHA! My guess is that AI will probably play a more important role in sound design and/or the playability of virtual instruments. The amount of processing power available now is incredible, and it’s sometimes harder to play a sample library than it would be to just learn the instrument you’re attempting to replicate on a controller. That being said, more and more libraries seem to have an “all in one” patch that changes articulations based on the way you play. Some are much better than others, but I can see built-in AI engines that have been fed tremendous amounts of data being able to facilitate easier playing based on how you play your controller, for even more realistic-sounding performances of virtual instruments.