Case Study - Lofi Bytes: AI-Generated Lofi Music Web App
Lofi Bytes is a web app that lets users create their own AI-generated lofi music using transformers and LSTMs.
- Client
- Launchpad (UC Berkeley ML Club) Internal Creative Project
- Year
- Service
- Web development, CMS
Overview
Lofi Bytes is a web application developed by a team of six engineers from UC Berkeley's Launchpad club. It allows users to generate personalized lofi music tracks from MIDI samples, layered with adjustable background sounds such as rain, fire, and cafe ambiance. The project began by exploring MIDI data as a more efficient representation than raw digital audio for generating music quickly. Because suitable lofi MIDI data was scarce, the team compiled its own dataset by converting YouTube lofi piano compilations into MIDI files.
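As a rough illustration of that data pipeline, the sketch below flattens a converted MIDI file into a sorted note sequence using the pretty_midi library. The file name and the exact note representation are assumptions for illustration, not the team's published code.

```python
# Hedged sketch: turn one converted MIDI file into a note sequence for training.
import pretty_midi

def midi_to_notes(path: str):
    """Flatten a MIDI file into a list of (pitch, start, duration, velocity) tuples."""
    midi = pretty_midi.PrettyMIDI(path)
    notes = []
    for instrument in midi.instruments:
        if instrument.is_drum:
            continue  # keep only pitched (piano) parts
        for note in instrument.notes:
            notes.append((note.pitch, note.start, note.end - note.start, note.velocity))
    return sorted(notes, key=lambda n: n[1])  # order by onset time

# Hypothetical file name from the YouTube-derived dataset
notes = midi_to_notes("lofi_piano_compilation_01.mid")
print(f"{len(notes)} notes extracted")
```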
Model training started with a basic LSTM baseline and later moved to the MusicTransformer architecture, which uses self-attention to capture long-range musical relationships. The results were modest, roughly 40% evaluation accuracy, but the generated pieces showed characteristic lofi elements such as sustained chords and new melodies. Noise inherent in the converted dataset limited the model's performance, motivating future work on refining the dataset and improving melody and chord generation.
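A minimal sketch of the kind of LSTM baseline described above, written in PyTorch. The event-token vocabulary size, layer sizes, and sequence length are illustrative assumptions rather than the team's actual configuration.

```python
# Hedged sketch of a next-event LSTM baseline for MIDI token sequences.
import torch
import torch.nn as nn

class LofiLSTM(nn.Module):
    """Predicts the next MIDI event token given the previous tokens."""

    def __init__(self, vocab_size: int = 388, embed_dim: int = 256, hidden_dim: int = 512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (batch, seq_len) integer event IDs
        x = self.embed(tokens)
        out, _ = self.lstm(x)
        return self.head(out)  # (batch, seq_len, vocab_size) logits

model = LofiLSTM()
dummy = torch.randint(0, 388, (4, 128))  # a toy batch of 4 token sequences
logits = model(dummy)
# Shift targets by one position so each step predicts the next token
targets = dummy.roll(-1, dims=1)
loss = nn.CrossEntropyLoss()(logits.view(-1, 388), targets.view(-1))
```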
On the technical side, the backend, built with Flask, exposed the machine learning model as an API endpoint. Through the React-based frontend, users could upload MIDI files to kick off the lofi music generation process. The interface featured sliders for adjusting elements such as drum beats, rain, cafe sounds, and fire, all integrated using Tone.js. Future plans include expanding the range of available instruments and sounds for MIDI music.
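For context, here is a hedged sketch of what a Flask endpoint accepting an uploaded MIDI file and returning a generated track might look like. The route name, form field, and the generate_lofi_midi helper are hypothetical placeholders for the team's actual model call.

```python
# Hedged sketch of a Flask endpoint that wraps model inference (Flask 2.x).
import io
from flask import Flask, request, send_file

app = Flask(__name__)

def generate_lofi_midi(midi_bytes: bytes) -> bytes:
    """Hypothetical placeholder for model inference: MIDI bytes in, generated MIDI bytes out."""
    return midi_bytes  # this sketch simply echoes the input

@app.route("/generate", methods=["POST"])
def generate():
    uploaded = request.files["midi"]  # MIDI file uploaded from the React frontend
    output = generate_lofi_midi(uploaded.read())
    return send_file(
        io.BytesIO(output),
        mimetype="audio/midi",
        as_attachment=True,
        download_name="lofi_output.mid",
    )

if __name__ == "__main__":
    app.run(debug=True)
```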
Read more in the published Medium article: https://callaunchpad.medium.com/building-a-lofi-beats-generator-9cada5fbdf5a
What we did
- Frontend (Next.js)
- Transformer Training/Finetuning
- Flask API
- LSTMs