The mainstream release of Artificial Intelligence (AI) in late 2022 and early 2023, most famously in the form of ChatGPT, has changed the technological landscape. AI offers unfathomable possibilities, and it doesn’t take long to realize that even the current, imperfect version of the tool can shape the future. However, rapid innovation can open new avenues of exploitation. AI tools have already produced phenomena that raise serious legal and ethical concerns within the media industry. One of the best known, the “AI cover,” uses software to mimic the voice, sound, and tonality of famous artists, leading us to ponder what can be done to stop this blatant and harmful plagiarism, and whether the same solutions can help battle the growing use of AI plagiarism in online schools such as iUP.
The increasing prominence of artificial intelligence in the music industry has raised many concerns about the legality of AI songs as well as the future of music as a whole. AI voice generators, including the well-known “Covers AI,” can impersonate the sound of a musician’s voice. The software listens to audio clips and breaks down the elements of speech, then reconfigures those pieces into a new recording (Gholipour, 2017). This is dangerously easy to do: a multitude of free voice covers are available at the touch of a button, and they can cause an outsized amount of havoc. People can use artificial renditions of an artist’s voice to cover other songs, which not only undermines the authenticity of the music but also threatens the stars themselves in terms of artistic ownership and financial losses. The same technology can be used for even more devious purposes, such as fraud and mass panic.
This innovative software has given some artists reasonable concern and frustration that something as personal as their sound or style of music can be exploited for the benefit of others. The music industry is a tough business to break into. You need something that stands out, whether it’s your voice, skill, lyrics, or the ability to adapt and create new and unique qualities in your music. AI skips that process entirely and allows anyone to create something relatively good in minutes, from the comfort of their home.
As we move further into the use of AI in music, it is important to spread awareness of how to use the technology appropriately, without the risk of legal repercussions or copyright infringement. Software like “Covers AI” makes these voice generators available but lacks originality and raises moral questions. Because it can amount to the theft of an artist’s intellectual property and style, this use of AI needs to be addressed on both a legal and a broader societal level.
Brian Sexton, iUP’s music professor, who has been teaching for the past 30 years, emphasizes that, like any other form of technology, artificial intelligence is ultimately “a tool.” Nowadays, even if anyone “can throw some things together to create a song,” he makes it clear that it won’t automatically be “something of good quality.” He recalls that “going back to the Beatles, they would have to meet at a studio together in one studio, write it, figure out everything, put it all together, and then pay an engineer and studio for all their time, and a record label would foot the bill.” Today, that isn’t the case, as “people can make a good song so much faster with free software in their homes,” which has made it easier than ever for an artist to make a name for themselves. However, it’s not all sunshine and rainbows. As Mr. Sexton put it, if AI innovation in the music industry begins to affect “somebody’s ability to earn money, that’s an issue, and they’re going to have to come to terms with that and figure out how to do it.”
Despite this, Mr. Sexton does use AI “all the time” as part of his job as a teacher, using it to “help build questions for quizzes, to help schedule events in my calendar, and to make agendas. It’s a tool, and tools can be used for both good and evil.” Regarding students using artificial intelligence to do their assignments, Mr. Sexton says that it’s “clear when students do use it in classes”: “It’s clear when someone who’s been writing authentic material all of a sudden turns in something that is using vocabulary that’s like years above where they’re at. There are always signs.”
When it comes to the future of artificial intelligence, he explains that “AI is going to keep evolving, not just in music but in software people use for school.” He ends by saying that “it’s best to use that tool to make your authentic life easier, more productive, or more enjoyable. That’s why I don’t think AI is going to replace the human component; it’s just going to enhance it. If you try to replace it, that’s when bad things will start happening.”
Jayden Mason, a music producer who has been making music for other artists for the past six years, explains how lesser-known producers “use AI to put these big artists on their beats” in order to gain more popularity, because even if it “might not be the real artist singing or rapping over it,” the production is still real. By using artificial intelligence, these producers are “getting more views, more clicks, and more people are buying their products now because of that,” which is why he thinks AI can actually be used in a positive way to get lesser-known producers the attention they deserve.
A perfect example of this is an AI album named “4th Dimension,” which uses the voices of Travis Scott, Frank Ocean, Drake, and other big artists. The album shocked the internet with its quality, and listeners had plenty of positive things to say about it, such as one YouTube comment exclaiming, “This is actually insane. I’m actually a big fan of this. I don’t like the idea of AI replacing artists, but to be honest, it’s like I’m getting more music from my favorite artists. This album was made really well.” Here, the producers, with the help of AI and mainstream artists’ voices, had an opportunity to show the world their production style and build a fanbase, and they took it. The AI artist behind the album, Moving in Silence, enthusiastically said, “I want to genuinely thank every single person supporting us; without y’all, none of this is possible. As we move away from AI and into our own original artistry, the creative vision and passion you have for us will continue to grow and improve. I can’t wait for you all to see what we have in store for the future.” Jayden appreciates any type of music, AI-enhanced or not, including this album, audaciously stating, “So if any music comes out, I’m going to listen to it because it’s still good music, but if they have to face legal action because of what they did, I just have to make sure I download it and save it before it’s taken off.”
Jayden continues by mentioning how “most music is being imitated anyway. Almost everything is sampled these days.” He describes how some producers will say, “‘I made this great beat I spent a week making and I put so many hours into it or so many layers and instruments’ and the beat that sells is the beat that took them 10 minutes to finish using AI software.” However, he is unsure whether AI will negatively impact the music industry, considering how popular it already is. Still, he mentioned that he “could see originality taking even more of a decline than it already has. No matter how optimistic you want to be, there are always people who are going to use things the wrong way, and as it grows, people are going to find different ways to use it.” Jayden keeps a positive outlook, though, saying that if it is used in the right ways, “AI can be a big help to any industry,” even though it could still be used in harmful ways by “people that use it for their wants and desires.”
So what are the possible solutions? Some have suggested adding alerts and notices to all AI content. This could be done through disclaimers, community notes, and tags that clearly inform the consumer that AI was used (although this may be treating the symptom rather than the root problem). Unfortunately, a wide-scale halt to AI-produced content is unlikely given how popular that content has become, which is why widespread warnings are probably the most realistic way to clear up confusion around AI content.
Teachers are also facing the threat of AI in their classrooms, particularly those at iUP. Our digital school environment lends itself to the use of AI in schoolwork, clubs, and more. Many students tend to use ChatGPT and similar tools on essays, projects, and even iHoot articles. However, the teaching community has developed solutions to this problem. Usually, teachers can tell when a student uses AI in their work: the assignment shows a much higher skill level than the student’s previous work would suggest and uses many uncommon or difficult words. In addition, AI output is often riddled with bias and misinformation (Elgersma, 2024). After all, it learns from the internet.
Despite this, there are occasions where plagiarism is not as easy to spot. Perhaps the student is a 10th grader with a 4.0 GPA or a senior in AP English. How do you figure it out then? Well, in addition to plagiarism checkers, teachers now regularly scan students’ assignments for evidence of AI use. These scanners, available online for a small fee, flag AI-generated text and sometimes even AI-paraphrased content (College Transitions, 2023). The suspected text or multimedia is pasted into the checker, the result soon pops out, and educators are free to take whatever action they see fit. With high rates of accuracy, this software has become the go-to solution for teachers everywhere.*
However, teachers are also taking preventive measures. In-class discussions about the reliability of AI and its dubious uses have skyrocketed in recent years. Students are now regularly informed of the flaws of AI, such as its spreading of misinformation, the environmental impacts of using it, and the ethical concerns of where it gets its information. iUP’s student populace was told early in the year that any use of AI in lessons would result in both a failing grade and a meeting about academic honesty.
As AI becomes more widespread, it doesn’t hurt to become more informed about its uses in the context of a classroom. With that in mind, iUP students were asked about the impact of AI.
How do you think AI has affected music?
“I don’t think it should be allowed on the market. To teach music students, yes, but for mass consumption or even on unpaid platforms, it shouldn’t be allowed. I mean, parodies of singers covering other songs are funny, but I think that’s as far as it should go.” – Anonymous
“There will always be controversy about AI but at the moment the AI music I heard is pretty bad, I feel AI could only be used as ghostwriters. AI ghostwriters will probably be a big controversy source in the next few years.” – Anonymous
How do you think AI has affected online schools like iUP?
“I think it’s (AI) great but I don’t like the fact that it is gonna be grading our tests for STAAR.” – Anonymous
“I remember my teacher mentioning that some students may be using AI to write their assignments, as he could recognize the college-level writing style. It concerns me that things like this could become more widespread and lead to more cheating, which would be a significant challenge for schools.” – Anonymous
How do you think the music industry/online school can be protected from AI?
“I am going to ignore the school part and just mention how platforms can discourage the AI touched up and altogether AI composed music and just promote more real and genuinely talented individuals, saving the industry from an AI takeover.” – AJ Ball
“Have an AI checker embedded in Canvas, for each submission.” – Mrs. S Williams
“I think influencing for it to be stopped and educating you and the people around you about the negative impacts and scams that can come with it.” – Anonymous
“The thing about AI in the music space, I think it shouldn’t create new sounds, it should only be used to enhance the preexisting music, which is how it is mostly being used right now.” – Cody Ball
AI has already shown the world what it is capable of over the past year, including the dangers of its misuse. Since AI’s abilities are forecast to advance even further, the problems society is already facing with it are likely to get worse unless the right action is taken. As musicians, teachers, and students have seen firsthand, it’s impossible to ignore what AI can do, for better or for worse. Nobody knows what the future holds, but just as technological advancement has always been a constant in our society, the presence of AI will certainly be one as well.
*iHoot also uses plagiarism and AI checkers on articles published.
Sources:
1. Christine Elgersma, “ChatGPT and Beyond: How to Handle AI in Schools.” Common Sense Education, March 6, 2024. https://www.commonsense.org/education/articles/chatgpt-and-beyond-how-to-handle-ai-in-schools
2. College Transitions Contributors, “10 Best AI Detection Tools for Teachers & Professors.” College Transitions, October 13, 2023. https://www.collegetransitions.com/blog/best-ai-detection-tools-for-teachers-professors/
3. Bahar Gholipour, “New AI Tech Can Mimic Any Voice.” Scientific American, May 2, 2017. https://www.scientificamerican.com/article/new-ai-tech-can-mimic-any-voice/