On this episode of the Rock School Proprietor Podcast, hosts John Kozicki (Michigan Rock School and RockSchoolProprietor.com) and Mandy York (Music Time of Milford) are joined by Sam Reti, founder of Muzie.live. Sam previously visited the show to contemplate how advancements in AI could change how music lessons are delivered. The speculation on that episode has become reality, even quicker than anticipated! Sam and the hosts explore the rapidly advancing world of AI in music education, discussing how a recent update to ChatGPT means that AI can now provide real-time feedback on things like technique when playing guitar.
Reflecting upon a prior conversation with Sam, they delve into the unexpected pace at which AI has evolved, offering insights into its role as a teaching tool rather than a replacement for human instructors. They share practical recommendations on how music teachers and studio owners can leverage AI to enhance their teaching, improve productivity, and attract more students.
Listen in for a thoughtful exploration of how AI can be integrated into music education while preserving the essential human element that makes music so special. From innovative lesson notes to creating engaging marketing content, discover strategies that will help music educators remain relevant in a tech-driven world.
To hear the previous conversation about AI with Sam Reti, check out Episode 17: AI and The Future of Music Lessons.
Join our private Facebook group, “Performance-Based Music Programs and Rock Schools,” a community for like-minded professionals to connect and share insights.
Episode Transcript:
John Kozicki (00:01.404)
Welcome to Rock School Proprietor podcast. I’m John Kozicki.
Mandy York (00:05.43)
and I’m Mandy York.
John Kozicki (00:06.766)
And Mandy, we are welcoming back Sam Reti. He is our first return guest. Sam, how are you doing today?
Sam Reti (00:16.45)
Great, that’s an honor. Thanks so much for having me back. I’m excited.
John Kozicki (00:20.84)
Sam is the founder of Muzie.live, which is an online video lesson platform. And Sam, it wasn't very long ago that we had you on the podcast; the first time was only three months ago. But what we were talking about was this very timely and quickly changing topic of AI and what it's going to do in music lessons.
So we thought we'd have you back on. For anyone who hasn't listened to that, it was episode 17, AI and the Future of Music Lessons, so you can go back and check that one out. Actually, Sam, that's one of our most downloaded episodes at this point. So, to get into why I wanted to have you back so soon, Sam: when we had that conversation,
Sam Reti (01:04.512)
That's awesome.
John Kozicki (01:18.889)
One of the things that we spoke about on that episode specifically: it was a broad conversation about AI and how it's going to impact, and is impacting, music lessons, but I asked you pointedly, how long do you think it is before we're in a situation where we're getting real-time feedback from AI as we're playing? Meaning,
here I am playing a guitar, and ChatGPT or whatever AI is giving me feedback on my technique. Do you remember what you said?
Sam Reti (01:59.328)
I mean, months? Years? Yeah, not the real answer, which turned out to be a few days.
John Kozicki (02:07.597)
Yeah, I think you had said we're still a couple years from that. And I think, if I'm remembering correctly, it was more about... because there are a lot of tools available in music lessons that utilize AI, some of which you're using in Muzie.live already. But this idea of the AI
Mandy York (02:10.401)
Wild.
Sam Reti (02:12.726)
Yeah, it’s probably got a few years, yeah.
John Kozicki (02:36.98)
acting as an actual teacher was what we were speculating about, which is scary to a lot of music instructors. So that was three months ago that we talked about this. The episode came out a couple of weeks later. And then a couple of weeks after that, you sent me a message with a video regarding the ChatGPT update. Do you remember the video you sent me?
Sam Reti (02:43.543)
Yeah, yeah.
Sam Reti (03:05.093)
Yeah, yeah, this is basically... they gave ChatGPT eyes, through the guise of a camera built into your phone or whatever device. In that particular video, it was able to look at something and give instructions on how to use it. I think it was a coffee machine. But then I took that to another level and thought, okay, well, if it can see
this coffee machine and give instructions on how to make pour-over coffee in real time, can it look at someone playing an instrument and give instructions on that? And the answer is, kind of, yeah. So it's not a hundred percent; there's still tons of room for growth and improvement. And it's not like they came out and specifically did this for music lessons or for teaching music or anything like that.
John Kozicki (03:38.103)
Mm-hmm.
John Kozicki (03:58.232)
Sure.
Sam Reti (03:58.998)
So, to kind of put it in perspective, and I think this actually makes it more interesting: the focus was not, can this look at an instrument and give feedback? But the byproduct of the technology is almost exactly that. You can turn on this feature now, it fires up the camera on your phone, and you can have a pretty human conversation, just going back and forth, and it speaks back to you in real time with a voice.
So instead of chatting back and forth, you're now talking back and forth. That's sort of the first layer, right? Even taking out the camera, conversational AI was a big leap within itself, but this took that to a whole other level by providing the camera feed. So the first thing I did is grabbed one of those guitars off the shelf, put the camera on my desk, stepped back a foot, played a chord... and actually,
John Kozicki (04:29.056)
Yeah.
Mm-hmm.
John Kozicki (04:50.561)
Mm-hmm.
Sam Reti (04:56.888)
I didn't even strum the chord. I literally just put my hand in the position of, like, a D major chord on the guitar. And I said, what chord am I playing right now? And the AI thought for a second and said, it looks like you're playing a D major chord. And then it actually followed up with a couple of tips, like: remember, if you're playing a D major chord, you can either wrap your thumb over the top or put your thumb on the back of the guitar to keep your wrist a little further out.
John Kozicki (05:04.301)
Mm-hmm.
John Kozicki (05:25.996)
Wow.
Sam Reti (05:26.744)
It started actually going into detail about how to execute that chord. And it talked about which strings you want to strum versus which ones you might not want to strum in the chord. Now, it wasn't 100% accurate every single time. If you start playing slightly more complicated chords, it fails. It'll think, if you're playing a G major chord,
John Kozicki (05:40.525)
Mm-hmm.
Sam Reti (05:54.84)
that you're playing, like, a C major chord. So it's not 100% accurate. So I do say this with the caveat that it isn't ready to roll out tomorrow afternoon for students to use in place of their teacher. But within weeks of us talking about it, something that I was pretty confident was years away turned out to be weeks away. And again, this wasn't built for what we're talking about.
John Kozicki (06:08.087)
Sure.
John Kozicki (06:19.094)
Yeah.
Sam Reti (06:24.766)
If somebody goes in there and starts training it on music lessons, on data, on teaching techniques and things like that, and builds a model using the backend technology from OpenAI, for example, then you can see a world where, pretty quickly, you've got something that actually has a decent understanding of what it's looking at
and hearing. Like I said, I didn't even strum the chord. I'm sure if you strum the chord and give it the visual information, now it's got two pieces of the pie, and that's going to be a lot easier to put together. Now again, that's not exactly how this works, but it shows the potential for something very soon.
John Kozicki (06:57.324)
Right.
John Kozicki (07:13.537)
Yeah, I can kind of see a parallel between what you're talking about and the YouTube video put out by OpenAI. In that video, again, they were making a pour-over cup of coffee and getting feedback from ChatGPT: how is my technique? And ChatGPT says, you're doing pretty good, you know?
It was wild. And I would argue that, yeah, everyone drinks coffee, right? Narrow that down, though: how many people drink pour-over coffee and make their own pour-over coffee? That's a much more narrow skill, right? So when you're talking about, like, a D major chord on guitar, for instance, that's pretty common.
Sam Reti (08:00.919)
Yeah.
Sam Reti (08:08.95)
Right.
John Kozicki (08:09.537)
But then when you're talking about the other chords that it had some problems with, okay, it stands to reason those are lesser-used chords. I can see how things are getting more and more precise. Now, in your opinion... I haven't played around with it on guitar, but I've been thinking: could it do the same thing for other instruments?
Sam Reti (08:38.678)
Right. Yeah. Piano, to me, is the logical one. I had put my piano away, so it might be worth just dragging it out to give it a shot. Because piano is so finite in, you know, respects like... middle C is middle C, right? Whereas on a guitar there's, you know, five of them. So that...
John Kozicki (08:38.719)
I’m thinking piano would be an easy one.
John Kozicki (08:59.778)
Yeah, yeah.
Sam Reti (09:05.108)
The position of your hands, things like that, on a piano are much more cut and dried than they are on a guitar. And one of the reasons I think it struggled with, like, G chords is that your hand is much more overlapping itself, and things like that make it kind of hard to actually see the fretboard behind your hand. So that is where you can see the visual is not all that's needed; the audio together would probably be
needed in order to make it actually understand what chord you played. But on the flip side, piano is much more visual. I could probably just lay my fingers on the chord, not even play it, and it could understand, based on the location on the piano, what chord that is, you know?
John Kozicki (09:49.292)
Yeah, and I think offering that feedback on technique will be tricky for many instruments. I'm thinking specifically of piano. Yeah, you can get an overhead view and see fingers in the right position and on the right keys. But to recommend changes in technique, you might need a different view. You might need, like, a side view to see
how the wrists look, you know? Yeah.
Sam Reti (10:19.158)
Yep. Are they sitting up correctly? You know, is your body posture right? Yeah, exactly. That's why, like I say, this is the infancy of this technology. But as we kind of said, the fact that it came out so quickly after we were speculating it would take so long was kind of fascinating. But you're right. On the flip side, I can imagine a world where I've got a phone over here and a phone over there, and they're both
Mandy York (10:24.238)
Mm-hmm.
Sam Reti (10:48.696)
streaming, you know, the videos, and then I've got it hooked up to one system. I mean, even in music lessons people use two or three cameras. Like, drum teachers will have one on their foot, one over the hi-hat, one behind their head, one over the ride cymbal. If you provided all of that and had all that information, then... and again, this system does not exist yet. We're talking a hypothetical at this point.
John Kozicki (11:00.0)
Sure.
John Kozicki (11:07.297)
Yes.
Sam Reti (11:16.056)
Like, nobody's built a platform where you can plug all the cameras in and get it to render them together. And that's a lot of work. So this is why I try to caveat all these new things with: the concept is available, the application isn't. So, you know.
John Kozicki (11:31.498)
Sure. And I think we are still kind of in that Wild West phase with this technology, where everyone's trying to figure it out. Those who have the resources and are bold enough to figure out how to leverage it, like yourself with Muzie, are all sort of scrambling to see who's going to be first. Because whoever's first is going to have the advantage. So I think we have to assume
Sam Reti (11:35.768)
Yeah.
Sam Reti (11:46.36)
Yeah.
Sam Reti (11:52.534)
Yeah. Yeah.
John Kozicki (12:00.513)
like, yeah, someone's probably trying to make this stuff happen. Yeah. Now...
Sam Reti (12:04.236)
Yeah, I don’t see why not.
John Kozicki (12:09.099)
Do you think... now again, we speculated on whether it's going to replace music instructors. And I think our general consensus was, well, maybe not. However, it'll change things. Maybe it'll eliminate a couple of music instructors, right? The diehards who are just not willing to update their technology. It still seems more like a tool to me.
Sam Reti (12:27.213)
Yeah.
Sam Reti (12:36.952)
Yes. Yeah. Now, the way I really see this being fundamentally game-changing, and allowing the progression of music to drive forward at a pretty incredible rate, is in more of a practice setting. You still want a teacher. Human feedback is going to be fundamental to music forever, because it's such a human thing. Even with the AI models that write music,
Mandy York (12:53.836)
Yeah.
Sam Reti (13:05.546)
all that's actually doing is mimicking what humans do. So it's still a human thing at the core. So the way I see this is: imagine I take my lesson with you once a week. The other six days of the week, I'm not with you, because I logically couldn't afford to book a teacher every single day of the week. That's kind of an unrealistic expectation. But what if I had a model that's actually trained on your teaching style?
John Kozicki (13:26.966)
Right, right.
Sam Reti (13:35.67)
Right? It's you who fed the model the information. And so now, when I'm at home practicing, I can call up AI John to check in on my practicing with me. So I have, you know, your feedback in these situations where I couldn't realistically get your feedback, you know.
John Kozicki (13:35.969)
Mm-hmm.
John Kozicki (13:55.018)
Yeah, and I can see that being a nice add-on for instructors, and something to set themselves apart from the sea of other instructors. It's very interesting to me.
You know, before we started talking a lot about this, I was personally sort of like, I don't know what's going to happen. Is it going to take all our jobs? Maybe; I don't know. But the more I talk about it, the more I think about it, the more I play around with AI platforms, the more I am starting to get comfortable with the idea of how I can use it to improve what I do,
and be faster at certain tasks. For instance, I was just playing around with ChatGPT today. I have a sheet of paper with, I want to say, 20 to 30 handwritten chord diagrams on it. A full sheet of paper.
You know, the chord diagrams are handwritten too, so the lines are shaky and the dots are all over the place. And I thought, can I feed this to ChatGPT and get it to clean it up for me and put it in a format that I can use? So I gave it very specific instructions. I said, I'm going to give you this hand-drawn sheet of paper that has these chord diagrams on it. I want you to clean them up and format them: straight lines,
Sam Reti (15:34.509)
Right.
John Kozicki (15:47.246)
dots, letters, and numbers. That's step one. Step two, I want you to put it into a sheet for me in EPS format at a minimum of 300 DPI. Yeah, so those are the very specific instructions I gave it. And I was like, I don't know what's gonna happen. You know, I have no idea. And so
it repeated back, like, okay, I'm gonna do this, I'm gonna format it, and when I'm done, I will let you know, right? So it had been about 10 minutes and I'm like, I still don't have anything. And so I asked, any estimate on when this will be done? And it tells me about six to eight hours. Six to eight hours! Do you want me to give you updates, or do you just want me to let you know when it's done? So it hasn't been six or eight hours yet; I don't know.
Sam Reti (16:20.952)
Yeah.
John Kozicki (16:43.231)
I’m like, yeah, I’m dying to see what happens though, you know?
Sam Reti (16:47.436)
Yeah, I mean, that's exactly how I see the use case for AI. The realistic, almost perfect parallel to that is the way we use it in Muzie right now. I was actually just speaking to a class at Berklee a few hours ago about exactly this. And it's interesting hearing the younger kids' thoughts, you know, all the new, fresh-out-of-college, soon-to-be teachers, and how they all see it.
John Kozicki (17:13.42)
Yeah.
Sam Reti (17:16.178)
And when I was going through all the features of Muzie and everything we do, when we hit the AI part, you could see a visible, oh, that's interesting. And the idea behind our AI, which I probably explained before, but just for anyone new listening: you turn on a recorder. It listens to everything you say while you're teaching your lesson. It transcribes that into a text document. And then it takes that text document
John Kozicki (17:25.559)
haha
Sam Reti (17:43.338)
and renders it into a practice routine or an assignment or lesson notes or whatever you choose; there are about a dozen different choices. But the thing that's interesting about it is it writes more detailed instructions than a teacher would ever have time to write in a lesson itself. So the benefits are: the student gets higher-quality instructions than they would normally get. The teacher didn't have to spend any of the lesson time writing those instructions. The teacher can edit them if they feel it's necessary, but
they probably don't need to. The AI will write and mention things that you might have actually forgotten to say as a teacher. In the example I used for the college kids, I just did a really quick, hey, I want you to practice these chords, scales, and blah, blah. And the AI actually wrote at the bottom, you know, it said: I suggest you use a metronome when you're practicing these scales to keep yourself in time,
and don't forget to warm up your hands by stretching before you actually start playing today. I never said anything about a metronome. I never said anything about warming up. But the AI model that we've got is built and designed to be a teaching assistant, and it's getting better and better at that as it goes and as the models improve. So you've got this really detailed description that your student can take home, and it took no time out of the lesson.
John Kozicki (18:44.491)
Mmm.
Sam Reti (19:09.484)
So now you're spending 30 minutes of the 30-minute lesson teaching, instead of five or 10 minutes writing notes. Because you know how it goes: hey, practice that five times, and then you're writing something down while the student loops through the thing a couple of times to give you a minute to write. That's how all my guitar lessons went when I was a kid. This alleviates that, and not only provides a higher-quality output but also saves time and provides all these other
John Kozicki (19:17.601)
Yeah.
John Kozicki (19:31.213)
Mm-hmm.
Sam Reti (19:38.378)
opportunities as a teacher. So now I'm not spending time after my lessons writing notes for students that I didn't get to during class. And like you're saying with your example: you could go and learn how to use Photoshop and learn how to render guitar diagrams yourself. But that's a skill set that you've got to acquire over time and practice. And you're a busy human being who does not have the time to go and learn how to become a graphic designer.
John Kozicki (19:46.903)
Yeah.
John Kozicki (20:01.951)
Yes.
John Kozicki (20:07.479)
right.
Sam Reti (20:07.64)
It's like, if you can offload something like that to the AI, that's a perfect example of when you should be offloading to the AI rather than learning that skill yourself, because you've got a million other things you need to be doing that are much more important and integral to your actual business and job and personal life. So, you know, if you can make yourself more functional without needing to dedicate too much time
John Kozicki (20:30.294)
Right.
Sam Reti (20:36.034)
to learning how to do all of these other skills, I think that's great, because it opens up time to go and learn the set of skills that you actually do want to focus on and do want to do. Yeah, so that's how I see the tool.
John Kozicki (20:45.495)
Mm-hmm.
So, what I wanna do now, because we've talked extensively about AI and what it can do in music lessons in episode 17, is give recommendations for music instructors and maybe studio owners. And I wanna do this in two different ways. So…
Number one: you know, Mandy, you do not use AI currently, correct? And in your classes, your Music Together classes with the toddlers and the parents, it doesn't make sense, right? We know that's not what that's about. But is there anything on the business side of your studio where you feel like, this task,
Mandy York (21:33.79)
Not always,
John Kozicki (21:46.762)
if only I had someone that could do it for me or like take it off my plate. And so what I hope we can do here is if you have that task, maybe Sam has a recommendation, maybe I have a recommendation, and that’ll spark some ideas.
John Kozicki (22:07.841)
So any task that you can think of.
Mandy York (22:10.08)
Yeah, and we do use ChatGPT a little bit, my admin and I. You know, not extensively, but writing emails, coming up with copy for marketing, stuff like that. ChatGPT is the AI I've used.
John Kozicki (22:15.307)
Okay.
Mandy York (22:29.452)
That's it so far. I have just been in stunned silence for this chat right now. I mean, coming away from our last conversation with you, Sam, I felt really great. I had this kind of calming sense of: yeah, AI, this is great. It's a tool, right? That's what I took away from our conversation. AI is a tool. And then the developments since then
Mandy York (22:59.424)
you know, had me twitching a little bit, like, wow, okay. But hearing your description of how it is, again, just a tool, something that learns the instructor and is a practice tool throughout the week when you're not in a lesson... My stance on all of this has been: humans are essential for learning music and making music, always. But as we get deeper into this, I wonder
how long until that practice John is just as good as the real John. And I'm with you: I've never heard any computer-generated, AI music that I enjoy. Music has breath, music has life; AI does not have either of those things. But can't the AI just listen to, you know, all of the top recordings of the Goldberg Variations, and then
John Kozicki (23:34.465)
Mm-hmm.
Mandy York (23:57.09)
give me a really beautiful rendition of the Goldberg Variations? I'm just a little freaked out right now, you guys, just a little bit. I know. And you told me not to be, but...
John Kozicki (24:05.773)
Don't worry, Mandy, that's one of my recommendations. Okay, so... sorry, go ahead.
Mandy York (24:14.115)
But-
Sam Reti (24:16.29)
Yeah.
Mandy York (24:17.326)
That's where my headspace is right now. I'm hearing you, and I'm with you: music is human, and this AI is a tool. I'm just curious about what our next conversation is gonna be, in two weeks or two years. That's all. That's all I'm saying, okay?
Sam Reti (24:33.271)
See you tomorrow afternoon.
John Kozicki (24:33.294)
So I have two thoughts and two recommendations. No, that's okay. Number one, and this would be a recommendation for instructors: you know, some of the ways that I've been experimenting with AI, with varying degrees of success...
Mandy York (24:38.284)
Okay, I'm sorry I derailed you, John.
John Kozicki (25:01.013)
I have a background in writing, so when I ask it to write something, I've never been happy with it. But one thing that I tend to do is handwrite things. You know, when I'm planning out my day, I write things down; I'm a pencil-and-paper kind of person. And I've been using ChatGPT to transcribe my handwriting. So I'll take a picture of it,
and it gives me a document of my handwriting, which has been really cool for me personally, because, again, I like to write things down with pen and paper, and if I then need to type it out, well, obviously that's a whole process. So that's one way I've been using it that's been really helpful. My other recommendation, and I've not used this, but I think, Sam, what
you're doing with Muzie.live and those lesson notes, that is a game changer, right? To have a tool that can sit in, quote unquote, on the lesson and take notes for you and the students, that's amazing. I think that's so, so helpful, and it would be helpful for any instructor.
Mandy York (26:24.334)
I can see that being helpful in training too, training new instructors.
Sam Reti (26:27.532)
Right. Yeah, yeah. You could build an entire course library almost based off of that kind of thing. If you video-recorded yourself teaching lessons and then had the AI write transcripts and notes as you go... I mean, this is something I try to tell teachers to do all the time. You know, if you're teaching all day, this could be for training new employees, this could be for training new students, it could be anything you want. But you can see a world, and this is...
John Kozicki (26:27.791)
my gosh, yeah, yeah.
Sam Reti (26:57.644)
This is today. This isn't hypothetical; you can do this right this minute. Grab a camera or phone, turn it on, record yourself teaching a lesson, and have the AI write the transcript of the lesson. Take the video clip and drop it into something like Opus Clip, which is an AI video editor. It will basically edit the video for you instantly; it just knows what's useful and what's not. Grab that transcript, import it in, and it will put text over the top of the video for you.
John Kozicki (27:16.066)
Mm-hmm.
Sam Reti (27:27.382)
You can, A, make hundreds of shorts doing that in like 20 minutes, and that becomes marketing material right away. You can also build course content: you could fire that up onto Kajabi and start selling a course on how to teach group classes for children. That could be something you actually sell to other teachers. And the effort of putting that course together is 10 times easier than it was, you know, six months ago, when you'd have to put up a camera,
write the transcripts, record everything, and edit it yourself, do all that. With a couple of AI tools, you can put that stuff together quickly, you know?
Mandy York (27:59.342)
Yeah.
John Kozicki (28:04.023)
So Sam, what would be your number one recommendation for an instructor trying to get comfortable with AI and utilizing it?
Sam Reti (28:09.452)
I mean, yeah.
Sam Reti (28:14.464)
Yeah, I mean, to me, the reason we started with the lesson notes is that's the least intrusive thing I think you can do as an instructor. And you don't need to be using Muzie for that. You can just fire up ChatGPT and turn on the listen mode: just hit the little microphone button and then just talk, and it will transcribe what you say. And then just ask ChatGPT, hey, can you turn this into lesson notes or a practice routine or something?
John Kozicki (28:42.818)
Mm-hmm.
Sam Reti (28:43.576)
So that'd be, like, the manual way of doing it. To me, that's a super simple thing that really anyone could be doing in any lesson, in any situation. You don't need to buy fancy software. You don't need to buy the subscription to Muzie. You don't need to do anything. You just open up ChatGPT and do it. That's so simple that I can't see why you wouldn't just try it and see what happens. And then, on the flip side, if you're a teacher,
the number one thing I hear from people is: how do I get more students? How do I grow my business? That's the number one thing I'm hearing. And as I sort of described, marketing content is a huge factor in that sense. I was even talking with a teacher yesterday who's doing exactly this. He records all of his lessons and then puts them into a product called Opus Clip, which makes shorts out of long videos.
John Kozicki (29:24.012)
Yes.
Sam Reti (29:37.208)
And then it slaps the text over the top, so it's like 30-to-60-second snippets of him teaching a lesson. He then puts those up on his social media, and that becomes marketing resources. And that is literally all automated. The recording is automated. All he has to do is download it, put it onto Opus, and click render. It renders 50 different clips; you pick which ones you like and post them to your social media account. It couldn't be more straightforward than that.
John Kozicki (29:44.343)
Mm-hmm.
John Kozicki (30:03.842)
Yeah.
Sam Reti (30:07.096)
And you know, being a good teacher, or enhancing what we already can do as teachers, I think is crucial. And then, of course, making more money and getting more students is obviously the lifeblood of the actual teaching side. So I always kind of look at those as really two sides of the same coin. Yeah.
John Kozicki (30:25.483)
Right.
John Kozicki (30:30.806)
Yeah. Okay, so this is going to be my number one recommendation for any music instructor, any studio owner. This is kind of an overarching strategy that I think we all should be employing at this point, and it's going to be my drive-it-home point for the podcast. And this one's for you, Mandy. We need to focus more
on those aspects of music lessons and what music does for the human experience. Right? We need to change. If our messaging is about our instructors' degrees and, you know, things like that, if our messaging is about what you're going to learn in your lessons, start changing now. Start talking about
social skills, start talking about finding your community of other musicians, start talking about all of these other things that we know music lessons do for us, things that will never be replaced by AI and can't be replicated. Because if we start talking more about those things, then all of this AI stuff is not even going to come into the conversation, right?
Because the people that are coming to us are coming for reasons other than just learning. Now, I know that might be a tricky one for... I mean, Sam, you teach a lot of online lessons, right? So that might be tricky for online instructors. However, figure out a way to craft that in there, right? Because, yeah, that's going to be the thing, I think,
that AI is not going to touch.
Sam Reti (32:31.82)
Yeah, 100%. And the thing is, the way I see it, you gotta remember that at the end of the day, no matter how many AI tools you use and how much AI assistance you have as a teacher, the reason someone's choosing you is because of you and the connection they're gonna have with you while learning an instrument. And even if the teacher were literally entirely AI... now, I know, being teachers, that's not really the ideal outcome for us, but let's just say, hypothetically, there's a world five years from now,
John Kozicki (32:49.654)
Yes.
Sam Reti (33:01.602)
two weeks from now, I'm just kidding, where there's an AI version that teaches everything perfectly and it's amazing and whatever. The thing that doesn't make me upset about that is the outcome: another generation of people learning music, playing music, and creating music. Now, if you were telling me that the next generation is AI teachers teaching AI students, I'm out. That's where I'm out. No interest.
John Kozicki (33:03.766)
Hahaha!
John Kozicki (33:19.979)
Right.
John Kozicki (33:28.183)
Yeah.
Mandy York (33:29.026)
Yes.
Sam Reti (33:31.49)
But if it's humans learning an instrument, and they're learning the instrument to play and create music, that's a net positive in my books. It's kind of like the apps. There are learning apps already that replace teachers, and there's no AI involved in those at all. I mean, you could go get, like, a Simply Piano type deal, or a Yousician type thing where you play a game or whatever. And sure, that replaces a teacher
John Kozicki (33:54.573)
Correct, correct.
Sam Reti (33:58.328)
in the sense that I'm not paying for music lessons; I'm now paying for this app instead. So to me, those are very comparable to this, you know, mythical AI teacher. Again, you can have a million apps, but people are still gonna wanna go back to the human, because music is such a human thing. And to Mandy's point: yes, the AI will be able to render a perfect rendition of your favorite piece of music.
But imagine that as a perfect play-along buddy. What if I live in a place where I don't have access to a physical community, where I may be more isolated, so I don't have friends I can jam with? It would be amazing if I could jam with an artificial band, so I at least get the experience of, quote unquote, playing with musicians in that sense. So yeah, positives are what I see coming.
Mandy York (34:54.83)
Yeah, I agree with you. If the AI instruction is leading to more accessibility, that’s not a bad thing. And in fact, it can lead to more human interaction. If you fall in love with playing an instrument, then you’re going to seek out the other people in your community that do, or you’re going to want to take the next step and say, OK, I want to learn with a real instructor now. So it could lead people to you also.
Sam Reti (35:00.918)
Right. Right.
Sam Reti (35:07.266)
For sure. Yeah.
John Kozicki (35:20.653)
Mm-hmm.
Sam Reti (35:24.93)
Yeah. Yeah. Yeah.
John Kozicki (35:25.033)
Exactly. It could lead, and it should lead. And that's kind of my point. That's why we as studio owners and instructors should be talking about that outcome. We should be talking about what you can do with music: you can connect with people, you can play music with people, you can play music for people, right? That's, again, that human experience that can't be replaced.
Mandy York (35:27.992)
Mm-hmm.
John Kozicki (35:56.043)
All right. Big sigh of relief. Yes, big sigh of relief. Well, Sam, again, thank you so much. I'm sure we're going to be talking again soon, because this is ever-changing; it changes so quickly. So: Sam Reti, Muzie.live. Thank you, Mandy. Thanks. We'll see everyone next time.
Mandy York (35:57.678)
Okay. Yup. That feels better.
Sam Reti (36:00.02)
And then you become that. Full circle.
Sam Reti (36:08.406)
Yeah,
Sam Reti (36:17.976)
Thank you.
Mandy York (36:20.674)
Yeah, thanks guys.
Sam Reti (36:21.944)
Thank you.