
This is the web version of dot.LA’s daily newsletter. Sign up to get the latest news on Southern California’s tech, startup and venture capital scene.
1212 Santa Monica was jam-packed with roughly 700 founders and VCs on Wednesday night. As music pumped through the speakers, attendees of SUPERCHARGE LA: Access to Capital & Cocktails mingled throughout the two-story restaurant tasting sliders and tacos from chef Luca Maita.
The passionate crowd, clustered in the hollowed-out center space, along the balcony lining the walls and spilling onto the promenade, was full of excitement. When Grammy Award-winning singer Miguel took the stage to speak about how collaborations with other musicians and entrepreneurs inform his creative process, the attendees listened closely.
When asked by dot.LA executive chairman Spencer Rascoff to respond to musician Will.i.am’s prediction that more artists will turn to AI, Miguel wasn’t so swayed by the idea.
“[AI] will impact the industry from the business side but it will never have that spirituality,” Miguel said.
That’s not to say that the chart-topping musician has entirely shied away from technological innovation in the music industry. “The business of music will have its use for AI,” he said. “Industrialization, through human history, has always been the thing that we gravitate towards, because we like convenience, and there's a way to make money out of that.”
For the time being, Miguel is primarily focused on how web3 can revolutionize community ownership, which is why, last year, he joined the web3 studio T3MP0 as the company’s global creative director. During the discussion, Miguel said the company, which focuses on connecting fans with creators through metaverse experiences and NFTs, allows him to flex his entrepreneurial muscles.
“It all stems from growing up here in Los Angeles, being Black and Mexican—feeling underrepresented and knowing that people have such a one dimensional way of looking at the world,” Miguel said.
As people leaned over the balcony to snap photos of Miguel, T3MP0 CEO Roger Chabra said tech people are constantly seeking to “create new rules,” as we saw with web3. But when it comes to NFTs’ waning popularity, Chabra was quick to acknowledge that although T3MP0 isn’t “as relevant as we used to be,” the company remains relevant to partners like Miguel, Diplo and Marshawn Lynch, who are still looking for innovative ways to connect with fans.
Other Los Angeles companies are also trying to change the relationship between creators and fans. Prior to Miguel and Chabra’s discussion, Rascoff spoke with LA Chargers running back Austin Ekeler about his community platform Eksperience. Through the platform, fans can connect with Ekeler and other athletes and influencers to video chat, receive signed merchandise or be sent a personalized video. After he amassed over 28,000 followers on Twitch, Ekeler said, he was inspired to find more direct ways to connect with fans.
“I can't sign everyone's jersey here, so let me create an environment where I can control that and actually get those things done,” Ekeler said.
As part of the event, Anna Barber, Qiana Patterson, Elianne Rodriguez and Marcos Gonzalez from PledgeLA also announced their 50 in 5 initiative. Their goal? To direct at least half of LA’s VC funding to female, Black and Latino/a founders in the next five years by creating a pre-seed fund for underrepresented founders and a VC internship.
Before the iconic D-Nice capped off the night with a DJ set, Rascoff reminded the crowd what makes the LA tech scene different from anywhere else in the world.
“All the people I've gotten to know in this community don’t have sharp elbows,” Rascoff said. “Instead we're all trying to lift up the LA tech community to reach its full potential.”
At the headquarters of the FYI app in Hollywood on Tuesday night, founder and CEO will.i.am made a bold prediction:
“I think what's going to happen real soon is that instead of going to the studio and making a song to release on Spotify, or Apple Music or [other] streaming platforms, artists are going to go into the studio to train their model,” he said. “And it's their model that they're going to train, because they're training it to their fingerprint to their creative thumbprint.”
Will.i.am launched FYI (Focus Your Ideas) in late May to help creatives collaborate on projects. The app lets users communicate via voice and video calls, centralize file management and to-dos and, most crucially to will.i.am’s mission, is outfitted with an AI that will suggest new ideas.
It might seem surprising to hear a recording artist be so invested in AI at a time when some music producers and the publishing industry writ large are warning against it. But will.i.am has been advocating for the use of AI since 2010, even going so far as to release a music video for “Imma Be Rockin’ That Body” that featured him showing off an AI music model contained in a briefcase, much to the chagrin of his fellow bandmates.
During the LA Tech Week panel co-hosted by AI LA and moderated by Karen Allen, CEO of Infinite Album, a Los Angeles-based startup that makes AI-generated copyright-free music for Twitch streamers, will.i.am said that instead of buying an off-the-shelf AI to customize based on one or two existing public songs – that’s how we got the fake Drake AI copyright disaster in April – musicians will invest time and energy into training their own LLM on every available demo or take of a track.
How would that work? According to the Grammy Award-winning recording artist, musicians could feed hundreds of thousands of hours of tape into the AI, which will begin to formulate a clear understanding of the artist’s creative process that can then be replicated by the model to spit out finished songs.
This would entail the musician going into the studio and laying down various tracks in different styles until the AI understood the direction the artist hopes to go – cello music one day, samba the next, and then punk, he suggested.
Fellow panelist Matthew Adell, a longtime record executive who now runs AI-generated music company SOMMS.AI, said that he works with musicians and library holders who are eager to use the technology now. “These custom models for specific purposes are happening,” Adell said. “We have people who are creating cover versions, or remixes, new genre versions of functional music libraries [using AI].”
Adell added, “The real use currently is non-real-time generation, where you generate a lot of stuff in bulk… [Then] you have effectively a curated A&R person determine the stuff that's usable and beneficial to whatever their value proposition is, and then with the client, you might have someone go and master it or mix it manually.”
While the use of AI in music is still in its infancy, other musicians are experimenting with it. In April, several musicians told dot.LA they were using it, mainly for idea generation. At that time, Matt Lara, a musician and product specialist at Native Instruments, said he saw real potential for using AI to streamline the complex mixing and mastering process, allowing him to master several tracks at once with relative ease.
For his part, will.i.am said he’s already been experimenting with the technology, noting he recently worked with a music LLM made by Google.
“I heard a song that I’d written that I’d never wrote, a song I’d never produced that I produced,” will.i.am said. “That was so freakin’ fierce. Every sound was crisp. Every synth was like, Yeah, that's the sound. The bass was like, Yo, that's a right bass. Drums are punchy. The lyrics was like Yo, I would have [written] that myself.”
When I woke up this morning, I didn’t expect to end up on the set of the Peabody Award-winning show “The Good Place,” with its countless picnic tables stretching across cobblestone streets set against the backdrop of pastel-colored building props.
On Tuesday, Microsoft’s M12 Venture fund, corporate VC firm Comcast Ventures, and The Mini Fund, a micro VC firm, hosted a panel discussion on the intersection of AI, gaming and entertainment at the Universal Studios backlot.
The panel featured Haiyan Zhang, General Manager of Gaming AI at Xbox; Michael Stewart, Partner at M12; and Stan Vishnevskiy, CTO and Co-Founder of Discord.
One of the hot-topic questions that kept resurfacing: How do we regulate AI use? And perhaps more importantly, how do we regulate AI without sacrificing the freedom to create?
Stewart believes part of the solution is to create “clean datasets,” where the content produced by the AI model is in line with what the creator intended it to be. “Even if you labeled or watermarked AI output from a model, how do you control people's use of that within the boundaries of the owners of the data set that went into it?” Stewart posed to the crowd.
He argued that, at the end of the day, it's not just about “intercepting illegitimate content or harmful content”; the problems of ownership and licensing around AI content have to be solved first. “There are still open questions on do you own what you're generating? So you've got to solve that before you turn generative AI into a useful machine for commerce,” he added.
Amidst concerns around deepfakes and AI’s role in the spread of misinformation, all three panelists explained how its rapid expansion has forced them to evaluate the best ways to implement AI into their products.
“As ChatGPT and these things sprung up, we took a step back and said, ‘What would it mean for us to do it [AI integration] on Discord?’” said Vishnevskiy. “And for us that’s a multiplayer experience where you can spend time with others.”
Vishnevskiy pointed to Midjourney as one example of AI’s rapid expansion in a public space. “The coolest part about generative AI is it kind of democratizes it,” Vishnevskiy said. “You don’t have to be an ML engineer to play with it. And so when we started we were like, ‘Wow, this is people doing stuff together.’ So we worked closely with [Midjourney] to see if we could give them even higher reach than our cap, so at that point you could have more than a million people on a Discord server.”
Zhang said Xbox’s “whole thesis” is helping users create engaging, meaningful experiences, and in an AI-driven world, part of that includes implementing AI tools that can help creators. The gaming executive is particularly enthused about applying AI to enhance the experience of players with disabilities.
“You see AI coming in to level the playing field,” Zhang said. “How can we have adaptive games that adapt to your capability?”
Still, Zhang said the jury is out on how AI is going to shape the future of humanity.
“Life has really changed,” she said. “Is the AI stuff just gonna be here…or is it going to fundamentally change what we do for work, what we do for play, how we realize our ultimate achievement, how we realize our creativity?”