
Infinite Echoes, the nine-track album by the rapper and music executive Eclipse Nkasi, follows the story of Leo, who is on a quest to find “a new sound that will unite people across cultures and backgrounds.” The tracks are tantalising, echoing the kind of theme song that might accompany a film titled Lost Lovers.

But Infinite Echoes was not, strictly speaking, created by a human being. After three days of working with multiple AI tools, the rapper brought the tracks to life.

How it all started

When AI was still obscure, not in the generally accepted sense of today but in the what-is-that-about sense, Eclipse, then head of promotions at the record label Chocolate City, was already all over it. He had worked on songs by stars of that bygone, boys’-quarters era of the Nigerian music scene: DJ Jimmy Jatt and MI Abaga.

He signed a record deal in 2012 and later wrote a book titled No Rice On Sunday. When social media blew up, he helped musicians promote their songs on and off the internet. Now he has created the first AI-generated afrobeats album.

“I took a personal journey to exploring tech. I was getting more interested in the intersection between creativity and technology,” Eclipse Nkasi said of his early foray into the tech world.

He spent the last few years figuring out which were the core AI tools and which were derivatives – AI tools built by strategically engineering other AI tools. At the end of 2022, when ChatGPT became more popular in Africa, Eclipse was ready to push the boundaries.

“The idea of AI was something that interested me and I kind of made it a personal mission to really dig out the tools, going through as many as I could find, going through videos, testing tools, and that sort of just stuck with me,” he said.

At that point, he was ready for the change that AI would bring to the creative industry. But in traditional music circles, it was easily dismissed as a trend that would pass. It didn’t. In the past few months, top record labels including Sony and Universal Music have been campaigning for a screeching halt to the way AI tools are being used to generate music.

There have been covers of classic tracks sung in the almost indistinguishable likeness of other musicians. We’ve seen Heart on My Sleeve, an AI-generated original purporting to feature ‘Drake and The Weeknd.’ And only this week, Spotify said it had removed thousands of AI-generated songs from its platform.

“I felt and I was saying that the world had changed,” Eclipse said. “But people were slow to realise it. And I think that musicians were maybe arrogant not to think that it was possible for AI to generate something that would be arguably indistinguishable from the human creation. I didn’t have all the answers, but I thought it was possible.”

Eclipse Nkasi made the first AI-generated Afrobeats album; now he wants the music industry to embrace AI

When someone, a veteran music hitmaker no less, who has seen stars rise and fall, said in 2019 that AI was only a phase, it carried so much weight, because people like that have watched epochs come and go in the art. Art has always been about phases and trends. But this is AI, capable of adapting to new trends. What these record labels didn’t know, and what recent events now suggest, is that the rise of AI had only just started.

Equally, they argued that AI content would simply not be as good or as audacious as human content. They said it would lack soul. Alas, Infinite Echoes is nothing but soul.

“When I thought about working on the project, I said, you know what, we will set out a few parameters and we will test what is feasible. And to be clear, I didn’t know what to expect as we went. We christened it ‘the Youreka Experiment,’” Eclipse Nkasi said.

“We gave ourselves the parameters of delivering the album in three days and spending no more than $500 on the AI tools. And for anyone who works in music, $500 is nothing compared to the money you’d normally need to build an album, and you’ll definitely need at least three months. We did it in three days,” he said.

Working with his team, he relied mostly on ChatGPT and Sound Raw.

“I told it I wanted to make it afrobeat-influenced hip-hop, with a little bit of dancehall. By this time, I was describing a project based on the kind of music I would make as an artiste myself. I asked ChatGPT to make the concept note. It made sense. We were testing speech, songwriting, music production, all of that. We generated our track list and really got to work,” he said.

The AI wrote the songs, and then Eclipse Nkasi applied minimal engineering to blend the work and help it make more sense: switching the third verse to the first verse here, some rearranging there, stitching the output of the different AIs together. He shortened some lines and lengthened others. Then came the tedious work of teaching the AI the melodies of the songs.
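For readers curious what the prompting side of such a workflow could look like, here is a minimal sketch in Python. It is not Nkasi’s actual setup: it assumes the official OpenAI Python client, an API key in the environment, and a placeholder model name, and it simply asks the chatbot for a concept note and track list matching the genre brief he describes. The lyrics, instrumentals and vocal melodies would still come from other tools, with a human rearranging and blending the results afterwards.

```python
# Illustrative sketch only -- not Eclipse Nkasi's actual workflow.
# Assumes the official OpenAI Python client (pip install openai) and an
# OPENAI_API_KEY environment variable; the model name is a placeholder.
from openai import OpenAI

client = OpenAI()

# Genre brief roughly mirroring the description quoted in the article.
brief = (
    "Write a concept note for a nine-track concept album about Leo, "
    "who searches for a new sound that unites people across cultures and backgrounds. "
    "Style: afrobeat-influenced hip-hop with a little bit of dancehall. "
    "Then propose a track list with a one-line theme for each song."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model name; swap for whatever is available
    messages=[
        {"role": "system", "content": "You are a songwriting and A&R assistant."},
        {"role": "user", "content": brief},
    ],
)

# The generated concept note and track list, ready for human editing.
print(response.choices[0].message.content)
```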

“This is a test. This is us saying, ‘What can be done?’ And so we created music that is listenable. It’s enjoyable as a project. It may not be my best work, but it sells,” Eclipse Nkasi said.

A new argument for music execs

Now the argument in the music scene tilts more toward the ethical use of AI. Before, it was more about whether AI could be creative at all, since its output is more or less a remake of something it has learnt from the internet.

“If anyone ever tried to write music in their early days, they were sounding like the music they used to hear. Humans will always process information based on existing context,” Eclipse said.

But he acknowledges that there is a dilemma about how to deal with the fact that musicians’ songs were used to train AI without the artistes’ knowledge.

When AI first started disrupting the creative scene, writers and visual artists alike, it went on with little hassle, partly because those communities don’t have the deep pockets to fight the disruption. But Universal, Warner and Sony Music are just as big as the tech companies, with armies of legal experts of their own.

But Eclipse believes that the ethical use of AI tools is where the line will be drawn.

“Generating Kanye’s voice is blatantly wrong. That’s stretching it. But in the way I’ve used it,” he said, that’s the new frontier. Still, he cautions that, in terms of how he has used it, “I don’t think that is what is really going to happen.”

“AI is not built in a way that erodes the human influence but incorporates it. That’s my opinion,” he added.

In the past few months, musicians have been selling their masters. In January, Justin Bieber sold his own for $200 million to the investment company Hipgnosis Songs Capital. Last year, Justin Timberlake sold his entire catalogue to the same company for $100 million.

For Eclipse, this represents a growing shift in the industry. “It heavily disrupts their business model of buying catalogues and masters.”

“The storytelling possibility is now limitless. If we use AI ethically, and people are not bound by their own limitations, they are able to tell better stories, you know. Maybe this may not be as valuable for the music industry as it currently is. But that is life. Time really just changes everything,” he added.

His advice for those seeking to make something of this brewing dispensation in commercial music is to put some of their eggs in the AI basket. “You can’t stop everybody from developing AI,” he said.

“It’s like an avalanche that has started. It’s pretty out of my hands. But what I’m hoping is that just as we found software like FL Studio and Logic Pro that helped us make music faster, I think it is evolving yet again. I’m happy for a version where these tools guide you. They make your process faster. They are sort of like your co-pilot. And not necessarily push one button and a whole album comes out,” he finished.

Source: Dennis Da-ala Mirilla/Technext