This is the web version of dot.LA’s daily newsletter. Sign up to get the latest news on Southern California’s tech, startup and venture capital scene.
For better or for worse (let’s be real, probably for worse), I grew up on the Internet. But before I entered the wild west that is social media, my friends and I spent our afternoons traipsing through Club Penguin, playing poorly rendered games on Webkinz or constructing ridiculous outfits for online dolls.
Now, Club Penguin is defunct, Webkinz has limited offerings and the death of Adobe Flash effectively killed online dress-up games.
Though kids still flock to other gaming alternatives, young people are turning to social media, and platforms are struggling to balance content appropriate for their youngest users with material aimed at a more mature audience (one that can actually buy things).
TikTok is the latest company to try to separate the two groups: the Culver City-based video-sharing app is currently letting select users restrict their live streams to viewers who are at least 18 years old. The feature is designed to further divide content intended for younger users from content intended for adults. The onus is on creators to mark their content as having “mature themes,” though live streams that violate TikTok’s policies against nudity, sexual activity and violence will still be removed.
TikTok Live has been criticized for fostering an unsafe environment, with teens engaging in sexual behavior for adults to view in streams that “appear to toe the line of child pornography,” according to an April Forbes exposé. But you don’t have to be creating the content to be exposed to sexually inappropriate material on TikTok Live; six days ago, 30-year-old TikTok influencer Kylie Strickland was arrested after flashing underage boys at a pool during a livestream.
Dangerous trends have plagued the app, with a new lawsuit targeting TikTok’s algorithm after two children died while attempting the “Blackout Challenge,” which involves people choking themselves. TikTok’s ability to moderate such content is questionable, as contracted content moderators have filed lawsuits claiming severe psychological distress after viewing videos containing graphic content like child sexual abuse and self-harm.
Its rivals are also grappling with how best to approach the issue. Even YouTube, which cracks down heavily on nudity, hasn't escaped criticism that it pushes some viewers toward far-right extremism and violence, and in 2019 it was forced to change its algorithm to curtail conspiracy videos. Last year, Instagram shelved plans to release a version of its app for children under the age of 13 after parents raised concerns about how it might affect young viewers' mental health. Its parent company, Meta, has also come under fire for not taking sufficient action despite internal research showing it knew Instagram could exacerbate teen girls’ body image issues.
Regulations loom in the near future, though navigating the First Amendment has further complicated the matter. Both TikTok and Snapchat face legal pressures related to child endangerment, and California is currently considering a bill that would let parents sue social media companies for their products' addictive qualities.
On top of everything else, lawmakers are calling on President Biden to further investigate how TikTok handles data privacy.
Children are at the center of much of the debate, and social media companies have become their incredibly lax babysitters. But just how to create safe online spaces for children remains an open question. —Kristin Snyder
How To Startup: How to Find Product-Market Fit
In the sixth installment of his How To Startup series, Spencer Rascoff looks at how to assess the market’s demand for your product or service and whether your product satisfies market needs. That’s where product-market fit comes in.
E3 is Back, and LA Stands to Make Up for Lost Millions
One of the nation's largest gaming conventions is finally returning to Los Angeles in person next year, and this time it's betting that a new event company, ReedPop, can help the struggling show ascend to the peak of pop culture.
Updated: Our TikTok Timeline
See our timeline below for key developments in TikTok's story over the last 10 years, starting with the founding of ByteDance and moving through the app's rise to popularity and the mounting concerns about data privacy and security.
Rivian, Xos and the Volatile EV Startup World
The world of electric vehicle startups remains a rollercoaster of uncertainty. Two of Southern California’s biggest names in the space ended the week with radically different outlooks.
This Week In Raises: Tebra and EVSC's Big Hauls
This week in “Raises”: A Corona del Mar-based healthcare startup hit unicorn status, while a local EV charging company raised funding to support expansion into more than 35 new cities across California and Washington.
LA Tech ‘Moves’: Snap Taps the Secret Service
Former U.S. Secret Service Director James Murray has joined Snapchat to take on a top security job. Meanwhile, flight automation company Skyryse appointed a new CFO and COO. Catch up with all this week's career shifts.
SHYFT Founder Melissa Hibbert on How to Prioritize
On this episode of the Behind Her Empire podcast, marketing agency SHYFT Beauty founder Melissa Hibbert discusses how her lifelong love for beauty led her to ditch the corporate world and follow her passion.
What We’re Reading Elsewhere...
- A look at the culture changes at Netflix as it tightens its belt.
- L.A.-based virtual cardiac-care startup Moving Analytics raises $20 million.
- UCLA engineers create a process for 3-D printing autonomous robots.
- Its logo once graced the L.A. Lakers' jerseys. The NYT looks at the problems plaguing e-commerce startup Wish.
- Downtown-based cybersecurity startup Resecurity opens an office in "Silicon Beach".
- Canadian EV manufacturer GreenPower acquires Torrance-based Lion Truck Body.
----
How Are We Doing? We're working to make the newsletter more informative, with deeper analysis and more news about L.A.'s tech and startup scene. Let us know what you think in our survey, or email us!