Have you ever heard the expression, 'it's not what you say, but how you say it'? We can apply this principle to the expression of musical instruments: it's not what you play, but how you play it. This and deliberate sound selection are the basic principles behind the concept of instrumentation.
Instrumentation in music brings life and emotion to a written composition, making it an essential part of being a musician, whether you're a singer, producer, or instrumentalist. Below, we'll explain how instrumentation works and share how you can use these classical techniques to give your art enhanced expression to engage your listeners.
Defining Instrumentation in Music
Instrumentation refers to the particular combination of sonic textures and instruments within a piece of music, designed to support the main idea of the song. In its simplest form, instrumentation refers to a producer or composer's specific choice of instruments for a composition.
In addition to sound and instrument selection, instrumentation can also speak to how different elements are expressed. For example, an electric guitar played with vibrato creates a completely different feeling than a guitar played cleanly. Different instruments also fall into different frequency ranges, so instrumentation depends heavily on the audio spectral balance across multiple sonic elements.
Essentially, instrumentation is selecting instruments and dictating the intensity at which they are expressed, with the end goal of producing a balanced, cohesive mix that supports the melody of a song.
Why is Instrumentation Important?
When you think about instrumentation, your mind might jump immediately to orchestral music. Can you imagine if a composer wrote a piece in which the bass voice was 5x stronger than the melody played on the first violins? You could probably guess that this piece would feel unbalanced, making it difficult to follow the main melody of the music.
The principles behind instrumentation give us a framework for building balanced, cohesive music that honors the main idea of the song. Most songs can be distilled down to drums, bassline, and the melody pulling everything together. If we think of instrumentation as the choices designed to support the melody, and not overpower it, we are in great shape.
The Ingredients of Instrumentation
So, how is great instrumentation created? Here are the main components of instrumentation every musician should consider:
Sound Selection
The largest component of instrumentation is sound selection. It's not just about picking out the individual components you like; sound selection boils down to having a working knowledge of where certain instruments sit across the frequency spectrum (don't worry, we'll cover this below), and using that knowledge to inform what you bring into your composition.
You might have to make certain compromises: just because something sounds good on its own doesn't mean it will sound great with the rest of your music. If it's not supporting the idea of the melody, it needs to go.
Texture
Texture speaks to how smooth, rough, warbly, or continuous a sound may feel. While these are more subjective terms, the key is to consider how the different textures of various instruments will interact with one another when performed in a composition.
Dynamics
Dynamics speak to how loud or soft something is, specifically in relation to how loudness and softness shift over time in a song.
Articulation
Articulation speaks to the way in which instrumental or vocal parts are expressed. For example, a vocal could be sung with hard consonants or soft ones; it's simply a matter of chosen articulation.
Timbre
Timbre describes the tonal color or character of a sound: the quality that lets you tell a trumpet from a violin even when they play the same note.
Range
Range speaks to the distance between the lowest elements of the frequency spectrum and the highest ones. Your instrumentation will affect the range of your track and, therefore, the listening experience for your fans.
Blend
Blend or balance speaks to the ratio between one sonic voicing and another in your composition. Great instrumentation produces a consistent blend across the different elements of a song.
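As a rough illustration, blend is something you can actually measure, not just hear. The sketch below (a minimal example assuming NumPy is available; the two "stems" and their amplitudes are made up for illustration) compares the RMS level of a louder melody tone against a quieter supporting pad:

```python
import numpy as np

def rms(signal):
    """Root-mean-square level: a rough proxy for perceived loudness."""
    return float(np.sqrt(np.mean(np.square(signal))))

# Two hypothetical stems rendered as one second of samples at 44.1 kHz.
t = np.arange(44_100) / 44_100
melody = 0.8 * np.sin(2 * np.pi * 440.0 * t)  # louder foreground element
pad = 0.2 * np.sin(2 * np.pi * 110.0 * t)     # quieter supporting element

# The blend ratio between the two voicings; values far from 1.0
# mean one element dominates the other.
print(round(rms(melody) / rms(pad), 2))  # → 4.0
```

A DAW's channel meters give you the same information by eye; the point is simply that blend is a measurable ratio between voicings, not just a feeling.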
What is the difference between instrumentation and orchestration?
Instrumentation and orchestration are very similar, both speaking to the specific sound and instrument selection within the context of a composition. However, orchestration is a slightly more specific term, referring to sound selection when building orchestral music, typically within a live setting.

By contrast, instrumentation is a broader term that incorporates a wider swath of production and compositional curation methods.
For example, you might say that picking certain synthesizers and drum parts within your digital audio workstation is a type of instrumentation. With orchestration, you would also be working with multiple instruments, but in a live setting, like figuring out the sound balance for a string quartet.
Where Do Different Sounds Fall Across the Frequency Spectrum?
In order to craft a balanced musical piece, you need a basic understanding of where instruments and sounds generally sit across the frequency spectrum.
Why?
Without that knowledge, it's easy to go overboard in certain sections, resulting in poor instrumentation. Here is where you can expect instruments to fall across the frequency spectrum for reference:
Drums
Drums are surprisingly versatile in terms of frequency spectrum coverage. Kick drums are the lowest, with the thump resting somewhere around 60-80 Hz. Snares and toms sit primarily in the middle of the frequency spectrum, with the brilliance of hi-hats and cymbals extending out to the upper end of the spectrum, at 5 kHz and above.
Strings
Strings cover the mid-range of the frequency spectrum, though certain string instruments extend lower or higher. Bass guitars and upright basses cover some of the boominess of the low end, while violins and mandolins extend into the upper mids, around 2.5 kHz - 5 kHz.
The electric bass can take on some of the lower end frequencies, while still occupying significant space in the midrange of the spectrum.
Vocals
Individual vocal ranges will vary, but you can expect them to generally fall around 200 Hz to 15 kHz, depending on whether the singer is male or female. Male voices can cover a frequency range between 100 Hz and 8 kHz. Female voices can reach much higher, though they start closer to the middle of the frequency spectrum, ranging between 350 Hz and 17 kHz.
Piano, Keyboards, and Synthesizers
These instruments are all incredibly versatile in terms of range, allowing them to cover most of the frequency spectrum, though they won't always reach the ultra-low sub frequencies or the airiest extremes of the spectrum.
The acoustic piano's range, however, is more limited than that of electronic instruments like keyboards and synthesizers. Just be mindful that these sounds are wide-ranging and, therefore, should be used carefully.
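The rough ranges above can be collected into a simple lookup table. This is a sketch with illustrative, approximate figures (the exact boundaries aren't gospel), but it shows how you might spot which instruments compete for the same band:

```python
# Approximate frequency ranges for common instruments, in Hz.
# These numbers are illustrative, not precise measurements.
INSTRUMENT_RANGES_HZ = {
    "kick drum": (60, 80),
    "bass guitar": (40, 400),
    "snare": (150, 5000),
    "vocals": (100, 17000),
    "violin": (200, 10000),
    "hi-hats": (5000, 16000),
}

def instruments_at(freq_hz):
    """Return the instruments whose rough range covers a given frequency."""
    return sorted(
        name for name, (low, high) in INSTRUMENT_RANGES_HZ.items()
        if low <= freq_hz <= high
    )

# A crowded band is a hint that your instrumentation may clash there.
print(instruments_at(70))    # → ['bass guitar', 'kick drum']
print(instruments_at(3000))  # → ['snare', 'violin', 'vocals']
```

The low end is shared by the kick and bass, while the midrange is usually the busiest region, which is exactly why balance decisions tend to concentrate there.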
Where Do Orchestral Instruments Fall Across the Frequency Spectrum?
If you have dreams of being a conductor, orchestration begins with understanding where specific instruments fall across the frequency spectrum. Familiarize yourself with where the following instruments rest so that you can create excellent balance in a live music setting:
Double Bass
The double bass is where you get the rumble and rich sub frequencies within the orchestra. From 40 - 300 Hz, these instruments provide the foundation for the low end.
Cello
The cello rests above the double bass, adding low to low-mid frequencies up to around 1 kHz. It can support the melody by outlining the root notes of various chords.
Flute
The flute, known for its bright, sometimes shrill sound, rests in the mid-range, but can extend into the higher mids up to around 4 kHz.
Violins
Violins often hold melody and stand out in a mix due to their piercing, clear texture. They can span from 200 Hz up to 10 kHz or higher for those brilliant, high tones.
Piano
Much like in modern production described above, the piano is special in that it can span almost the entire frequency spectrum.
Clarinet
These warm wooden instruments hold their place in the mid-range, occupying around 100 Hz - 2 kHz.
Oboe
The oboe is known for its nasal-like tone, providing an interesting texture within the orchestra. The oboe rests around 250 Hz - 4 kHz, primarily living in the mid-range.
Horns
Horns tend to have a fairly large range, though they most strongly occupy the upper mid-range of the frequency spectrum. Horns are quite expressive and can easily overpower your balance when played too brashly, so proceed with caution.
Drums
Similar to the drums in your DAW, orchestral drum elements occupy the full spectrum when played together, with certain elements occupying higher or lower ranges than others. The same ranges apply to the kick, snare, and cymbals as described above. The timpani tends to sit in the low to mid frequencies, around 50 - 400 Hz.

How to Use Modern Instrumentation in Your Songs
Even if you don't see yourself conducting classical music or using traditional instruments, you can still carry the lessons of instrumentation into the modern music-making landscape. Here is how you can bring instrumentation principles into your digital audio workstation:
Sound Selection and Design
Just as you would with orchestral music or with live musical instruments, the VSTs, instruments, and even samples you select have a huge impact on your final composition. Prioritize the sound that best represents your melodic idea, and then build around it. Any sound outside of your main idea should be selected to emphasize, not overtake, that initial sound.
Sound Layering
You can experiment with stacking the same sound or playing with different sounds, playing similar parts across several octaves. Stacking layers can help individual parts feel more wide and robust and allow your mix to feel more cohesive. Just be careful not to overdo it: your melody should always remain the star of the show.
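As a minimal sketch of octave stacking (assuming NumPy; the pitch and layer gains are arbitrary illustration), the same part can be duplicated one and two octaves up at reduced gain, so the root layer stays dominant:

```python
import numpy as np

SR = 44_100  # sample rate in Hz

def sine(freq_hz, seconds=1.0, amp=1.0):
    """Render a plain sine tone as an array of samples."""
    t = np.arange(int(SR * seconds)) / SR
    return amp * np.sin(2 * np.pi * freq_hz * t)

# Stack the same note across octaves, with quieter upper layers so
# the original pitch stays the star of the show.
root = 220.0  # A3; an arbitrary example pitch
layered = sine(root) + sine(root * 2, amp=0.5) + sine(root * 4, amp=0.25)

# Normalize so the summed layers don't clip a [-1, 1] signal.
layered /= np.max(np.abs(layered))
```

Lowering the gain of each added octave is the code equivalent of pulling the doubling tracks down in your mixer: the layers add width and weight without competing with the original part.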
Automation
Automation builds more of a dynamic experience for your listener, creating an experience that is more akin to how you might experience a live musical instrument. Experiment with automating parameters like volume, filter, and individual components of your instrument over time to program a more realistic performance expression in your songs.
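A minimal sketch of the idea, assuming NumPy (the tone and the curve are made-up examples): a static tone is multiplied by a volume envelope, so its level changes over time instead of sitting flat:

```python
import numpy as np

SR = 44_100
t = np.arange(SR) / SR  # one second of audio
tone = np.sin(2 * np.pi * 440.0 * t)  # a plain, static 440 Hz tone

# A simple volume automation curve: fade in over the first half
# second, then hold at full level. Real DAWs let you draw arbitrary
# curves; a linear ramp is just the simplest case.
envelope = np.minimum(t * 2.0, 1.0)
automated = tone * envelope
```

The same multiply-by-a-curve idea applies to any automatable parameter, whether it's a filter cutoff, a send level, or a synth macro.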
Velocity and Other Expression Parameters
The challenge of producing music via software rather than performing and capturing it is that you can lose some of the imperfect aspects of a performance that make something sound "human".
One example is velocity, the intensity at which notes are expressed. When an instrumentalist performs a part in a song, they typically aren't playing every note evenly, even though an even level is often the default baked into MIDI parts. Take your time to randomize and adjust expression parameters to create less of a robotic feel.
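Here's one way that randomization might look in code: a small helper (hypothetical, not taken from any particular DAW or library) that nudges each MIDI velocity by a random amount while staying inside the valid 1-127 range:

```python
import random

def humanize_velocities(velocities, spread=10, seed=None):
    """Nudge each MIDI velocity by a random amount within +/- spread,
    clamped to the valid MIDI velocity range of 1-127."""
    rng = random.Random(seed)
    return [
        max(1, min(127, v + rng.randint(-spread, spread)))
        for v in velocities
    ]

# A MIDI part where every note defaults to the same velocity:
flat_part = [100] * 8
humanized = humanize_velocities(flat_part, spread=10, seed=42)
print(humanized)
```

Most DAWs offer a built-in "humanize" or randomize function that does exactly this; the point is that even a small, bounded amount of variation breaks up the mechanical evenness of programmed parts.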
Instrumentation Brings Music to Life
Instrumentation boils down to understanding the sonic character of the individual components of your composition and selecting them accordingly, with the end goal of crafting a balanced piece where one voicing does not totally overpower another. The expression that's baked into your selected sounds can certainly shift the overall emotion of your piece, making instrumentation a powerful, dynamic tool.
Hopefully, this guide made it easier for you to master the art of instrumentation. Enjoy building balanced, well-expressed compositions with instrumentation, whether you're a composer, producer, or instrumentalist.