26 Apr 2018
Product

Learning the Tool #4 - How can I scout for music based on mood?

This week we show you how Instrumental’s A&R scouting platform can help you scout for music based on mood.

Utilising Spotify and social API data with applied AI processes, our platform unearths the fastest-growing and most exciting new artists and tracks. Currently in closed beta, we are already powering global A&R teams across the industry.

Learning the Tool: How can I scout for music based on mood?

Our A&R music scouting platform is currently tracking over 180,000 artists and 830,000 tracks on Spotify. However, when scouting for artists and tracks, most of our clients already have a clear idea of what they are searching for. This is why we have made use of Spotify’s track characteristics to help you sort artists based on the mood of their tracks. The image below shows where these are located on the dashboard.

[Image: Learning the Tool #4 – track characteristic filters on the dashboard]
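If you want to explore the same characteristics outside the dashboard, they are exposed through Spotify’s Web API. The sketch below fetches them with the open-source spotipy client; the credentials setup and track IDs are placeholders rather than anything from our platform.

```python
# Minimal sketch: pulling Spotify's track characteristics ("audio features")
# for a batch of tracks with the spotipy client. Assumes Spotify API
# credentials are set in the SPOTIPY_CLIENT_ID / SPOTIPY_CLIENT_SECRET
# environment variables; the track IDs are hypothetical placeholders.
import spotipy
from spotipy.oauth2 import SpotifyClientCredentials

sp = spotipy.Spotify(client_credentials_manager=SpotifyClientCredentials())

track_ids = ["<track-id-1>", "<track-id-2>"]  # placeholders, not real IDs
for f in sp.audio_features(track_ids):
    if f is None:  # Spotify returns None for tracks without an analysis
        continue
    print(f["id"], f["valence"], f["energy"], f["danceability"])
```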

Below, we have broken down each characteristic:

Acousticness - A confidence measure from 0.0 to 1.0 of whether the track is acoustic. 1.0 represents high confidence the track is acoustic.

Danceability - Danceability describes how suitable a track is for dancing based on a combination of musical elements including tempo, rhythm stability, beat strength, and overall regularity. A value of 0.0 is least danceable and 1.0 is most danceable.

Energy - Energy is a measure from 0.0 to 1.0 and represents a perceptual measure of intensity and activity. Typically, energetic tracks feel fast, loud, and noisy. For example, death metal has high energy, while a Bach prelude scores low on the scale.

Instrumentalness - Predicts whether a track contains no vocals. "Ooh" and "aah" sounds are treated as instrumental in this context. Rap or spoken word tracks are clearly "vocal". The closer the instrumentalness value is to 1.0, the greater likelihood the track contains no vocal content. Values above 0.5 are intended to represent instrumental tracks, but confidence is higher as the value approaches 1.0.

Liveness - Detects the presence of an audience in the recording. Higher liveness values represent an increased probability that the track was performed live. A value above 0.8 provides strong likelihood that the track is live.

Speechiness - Speechiness detects the presence of spoken words in a track. The more exclusively speech-like the recording (e.g. talk show, audio book, poetry), the closer to 1.0 the attribute value. Values above 0.66 describe tracks that are probably made entirely of spoken words. Values between 0.33 and 0.66 describe tracks that may contain both music and speech, either in sections or layered, including such cases as rap music. Values below 0.33 most likely represent music and other non-speech-like tracks.

Valence - A measure from 0.0 to 1.0 describing the musical positiveness conveyed by a track. Tracks with high valence sound more positive (e.g. happy, cheerful, euphoric), while tracks with low valence sound more negative (e.g. sad, depressed, angry).
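To make those definitions concrete, here is a small sketch that labels a track using only the thresholds quoted above (instrumentalness above 0.5, liveness above 0.8, speechiness bands at 0.33 and 0.66). The 0.5 split for valence is our own illustrative choice, and the feature dict simply mirrors the shape of Spotify’s audio-features object.

```python
# Sketch: labelling a track from its audio features, using the
# thresholds quoted in the definitions above.
def describe_track(features):
    labels = []
    if features["instrumentalness"] > 0.5:   # intended to represent instrumental tracks
        labels.append("instrumental")
    if features["liveness"] > 0.8:           # strong likelihood the track is live
        labels.append("live recording")
    s = features["speechiness"]
    if s > 0.66:
        labels.append("spoken word")         # probably entirely spoken words
    elif s > 0.33:
        labels.append("mixed music/speech")  # e.g. rap, layered sections
    # 0.5 split is our illustrative choice; Spotify only says high = positive
    labels.append("positive" if features["valence"] >= 0.5 else "negative")
    return labels

example = {"instrumentalness": 0.9, "liveness": 0.1,
           "speechiness": 0.05, "valence": 0.8}  # illustrative values
print(describe_track(example))  # ['instrumental', 'positive']
```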

These characteristics are particularly helpful for sync teams or music supervisors looking to find tracks based on mood. When combined with the BPM and track mode (major/minor) filters, users can increase the granularity of their search.

For example, if you are looking for an upbeat instrumental house track for a sneaker advert, you might look for a track in a major key, with a tempo of around 118-135 BPM, and high values for valence, instrumentalness, danceability and energy.
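As a sketch of how that brief could be expressed against the same feature data, assuming feature dicts shaped like Spotify’s audio-features object (where a mode of 1 means a major key and tempo is in BPM), a filter might look like the following. The 0.6 cut-off for "high" values is an illustrative choice, not a platform default.

```python
# Sketch: filtering candidate tracks for the sneaker-advert brief above.
# Each track is a dict shaped like Spotify's audio-features object;
# mode == 1 means a major key, tempo is in BPM.
HIGH = 0.6  # illustrative cut-off for "high" values

def matches_brief(f):
    return (
        f["mode"] == 1                      # major key
        and 118 <= f["tempo"] <= 135        # upbeat house tempo range
        and f["valence"] >= HIGH            # positive-sounding
        and f["instrumentalness"] >= HIGH   # little or no vocal content
        and f["danceability"] >= HIGH
        and f["energy"] >= HIGH
    )

candidates = [
    {"mode": 1, "tempo": 124.0, "valence": 0.82, "instrumentalness": 0.74,
     "danceability": 0.81, "energy": 0.77},  # illustrative values
    {"mode": 0, "tempo": 96.0, "valence": 0.30, "instrumentalness": 0.05,
     "danceability": 0.55, "energy": 0.40},
]
print([t for t in candidates if matches_brief(t)])  # only the first track matches
```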

Last week, we shared how filters can help you scout for a specific genre of artist on our A&R scouting platform. Check back here next week to read the fifth instalment of our Learning the Tool feature.

If you have any queries about Instrumental’s A&R scouting platform or want to find out how it can power your music scouting efforts, please get in touch via hello@weareinstrumental.com.