How does the Jukebox work?

Built with:

Next.js
Postgres
Spotify API
YouTube API
Tailwind CSS

1. Searching Songs

Song search is handled through the Spotify API's search endpoint.

const urlParameters = new URLSearchParams({
  q: q,
  type: "track",
}).toString();

const requestUrl = `https://api.spotify.com/v1/search?${urlParameters}`;

const response = await fetch(requestUrl, {
  method: "GET",
  headers: {
    Authorization: `Bearer ${accessToken}`,
    "Content-Type": "application/json",
  },
});
const searchResults = await response.json();
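The search response nests the interesting fields fairly deeply. As a minimal sketch, a helper like the following could reduce it to what the jukebox needs (the `tracks.items` shape follows the Spotify Web API; `simplifyResults` and the mock data are illustrative, not part of the actual codebase):

```javascript
// Reduce a Spotify search response to id, title, artists, and duration.
function simplifyResults(searchResponse) {
  return searchResponse.tracks.items.map((track) => ({
    id: track.id,
    name: track.name,
    artists: track.artists.map((artist) => artist.name).join(", "),
    durationMs: track.duration_ms,
  }));
}

// Example with a minimal mock response:
const mock = {
  tracks: {
    items: [
      {
        id: "abc123",
        name: "Example Song",
        artists: [{ name: "Artist A" }, { name: "Artist B" }],
        duration_ms: 215000,
      },
    ],
  },
};

console.log(simplifyResults(mock));
// → [{ id: "abc123", name: "Example Song", artists: "Artist A, Artist B", durationMs: 215000 }]
```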

2. Generating the jump graph

  1. Retrieve base analysis data from the Spotify API, including every beat (start time and duration) and the segments, i.e. timeframes over which loudness, pitches, timbre, and the overall sound are similar
  2. Connect every beat to the (potentially multiple) segments that occur during that beat
  3. Compare all beats with each other to find similar ones; the loudness, pitches, and timbre of each segment are compared, and a weighted sum determines the closeness of two beats
  4. Construct a graph that maps every beat to the other beats that are similar to it
  5. Filter and further process the graph for playing and jumping
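Steps 3 and 4 can be sketched as follows. The weights, the distance threshold, and the simplification of each beat to a single averaged segment are assumptions for illustration; the real Spotify analysis exposes per-segment loudness plus 12-element pitch and timbre vectors:

```javascript
// Illustrative weights for the weighted sum (assumed, not from the source).
const WEIGHTS = { loudness: 1, pitches: 10, timbre: 1 };

// Euclidean distance between two equal-length feature vectors.
function euclidean(a, b) {
  return Math.sqrt(a.reduce((sum, v, i) => sum + (v - b[i]) ** 2, 0));
}

// Closeness of two beats, each represented here by one averaged segment:
// a weighted sum of loudness, pitch, and timbre differences.
function beatDistance(segA, segB) {
  return (
    WEIGHTS.loudness * Math.abs(segA.loudness - segB.loudness) +
    WEIGHTS.pitches * euclidean(segA.pitches, segB.pitches) +
    WEIGHTS.timbre * euclidean(segA.timbre, segB.timbre)
  );
}

// Build the jump graph: for every beat, collect the indices of all other
// beats whose distance falls under the threshold.
function buildJumpGraph(beats, threshold) {
  return beats.map((beat, i) =>
    beats
      .map((other, j) => ({ j, d: beatDistance(beat, other) }))
      .filter(({ j, d }) => j !== i && d < threshold)
      .map(({ j }) => j)
  );
}
```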

3. Getting the audio file

  1. Search YouTube (via the YouTube API) for videos named after the song and the artist
  2. Filter these videos and pick the one whose length best matches the song length
  3. Download the mp4 video from YouTube and extract the audio track
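The duration-matching in step 2 could look like this minimal sketch. The flat `durationSec` field is a simplification: the real YouTube API reports ISO 8601 durations (e.g. "PT3M35S") that would have to be parsed first, and `bestMatch` is a hypothetical helper:

```javascript
// Pick the video whose length is closest to the track length (in seconds).
function bestMatch(videos, trackDurationSec) {
  return videos.reduce((best, video) =>
    Math.abs(video.durationSec - trackDurationSec) <
    Math.abs(best.durationSec - trackDurationSec)
      ? video
      : best
  );
}

const videos = [
  { id: "a", durationSec: 230 },
  { id: "b", durationSec: 215 },
  { id: "c", durationSec: 600 }, // e.g. a full-album upload
];

console.log(bestMatch(videos, 214).id); // → "b"
```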

4. Playing and jumping

Playback uses the browser's AudioContext (Web Audio API) for cross-browser support.
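In the app, beats are scheduled for playback through the AudioContext; the jump decision itself can be sketched as a pure helper. The jump probability and the graph shape (an array of similar-beat indices per beat) are assumptions for illustration:

```javascript
// Decide which beat to play after the current one: with some probability,
// jump to a random similar beat from the jump graph; otherwise advance
// linearly, wrapping at the end so the song never stops.
function nextBeat(current, jumpGraph, jumpProbability, random = Math.random) {
  const targets = jumpGraph[current];
  if (targets.length > 0 && random() < jumpProbability) {
    return targets[Math.floor(random() * targets.length)];
  }
  return (current + 1) % jumpGraph.length;
}
```

Injecting the `random` function keeps the helper deterministic in tests; the real scheduler would call it once per beat while queueing audio buffers on the AudioContext.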