On this week's episode of the L.A. Venture podcast, meet Omar Hamoui, a partner at Mucker Capital. Hamoui is the founder of AdMob, a cornerstone of modern mobile advertising. He discusses building one of the first apps in the App Store and his early negotiations with Steve Jobs. Hamoui also talks about how breaking into the venture world was difficult both then and now, despite his early success selling AdMob to Google for $750 million.
After his time with Google, Hamoui became a partner at Sequoia Capital, the venture firm that funded giants like YouTube, Zoom, Instacart and Zappos. He left in 2019 to join Santa Monica's Mucker Capital — a pre-seed and seed stage venture firm that helps early-stage companies scale. In this episode he also discusses why he thinks it's difficult to raise a Series A round outside of the Bay Area.
Hear Hamoui's first-hand account of how he learned to build startups, negotiate, know when to sell and find the right team.
"Sometimes people build businesses that aren't working at their scale. They have to raise money to keep going, but they're really just covering the problem with more money. It's actually not a functional business in the first place." — Omar Hamoui
Omar Hamoui is a partner at Mucker Capital. He currently resides in Santa Monica.
dot.LA Engagement Intern Colleen Tufts contributed to this post.
Facing a congressional panel on Thursday, Facebook's Mark Zuckerberg and Google CEO Sundar Pichai brushed off their platforms' role in the January Capitol insurrection.
It marked the tech giants' first appearance before Congress since hundreds of people fueled by social media messages stormed the building.
"The responsibility here lies with the people who took the actions to break the law and do the insurrection," Mark Zuckerberg told the Democratic-led House Energy and Commerce Committee via videoconference. "And the people who spread that content," he added. "Including the president."
Twitter's Jack Dorsey was the only social media CEO testifying to admit his platform bears responsibility.
Pichai said his company "always feels a deep sense of responsibility." But in this case, he said, "I think we worked hard. This election effort was one of our most substantial efforts."
Congress members pushed the three executives on their platforms' algorithms and their role in spreading false and violent content. The powerful trio is under increased scrutiny as Congress considers revamping Section 230, the law that shields big tech companies such as Facebook, Twitter and Google from liability for content their users post, including misinformation.
Rep. Jan Schakowsky (D-IL) asked directly about a Reuters interview in which Facebook COO Sheryl Sandberg said planning ahead of the siege mostly took place on smaller platforms.
"Certainly there was content on our services," Zuckerberg replied. "From that perspective I think there's further work that we need to do to make our services and moderation more effective."
In his written testimony, Zuckerberg urged Congress to consider dialing back Section 230. His proposal would grant liability protections only to companies with systems in place to identify and remove unlawful content.
And Dorsey acknowledged that Twitter can "do more" when it comes to building and exposing the platform's algorithms that impact the content users see.
But behind their calls for tighter regulation is an army of lobbyists working to keep the nation's most influential — and profitable — companies on top.
A recent report from the nonprofit Public Citizen found that Facebook and Amazon are the two biggest corporate lobbying spenders in the nation. And 94% of Congress members with authority over antitrust and privacy issues have taken money from a big tech lobbyist or PAC.
Some say the proposals would ultimately benefit the tech giants. Small companies, meanwhile, may struggle to build new systems and teams dedicated to overseeing dicey content. Whatever happens is likely to have profound implications for how users experience social media.
In response to concerns about how the platforms moderated harmful and false posts, the tech titans defended their strategies. Zuckerberg cited the tags added to some 150 million posts that misrepresented the 2020 presidential election. Pichai said YouTube removed 13,000 channels for promoting violence and extremism between October and December 2020.
"There's a lot of impressive numbers in there," Carmen Scurato, president of the advocacy group Free Press, said during a livestream event prior to the hearing. "They're grading their own homework."
"Don't get wowed by these statistics," Dr. Joan Donovan from the Shorenstein Center at Harvard said during the YouTube livestream with Scurato.
Coalitions like the Real Facebook Oversight Board want more than numerical evidence.
The group of advocacy organizations and nonprofits formed in late 2020 to tackle the slew of Spanish language posts that violate Facebook's policies but aren't caught by its algorithms.
"What kind of investment is Facebook making on the different languages to make sure that we have more of an accuracy?" asked Rep. Tony Cárdenas, a Democrat representing a heavily Latino portion of the San Fernando Valley.
Zuckerberg pointed to Facebook's international fact-checking program, an initiative he said is "something we invest a lot in and it will be something we continue to invest more in."
But advocates cast doubt.
"We are not convinced one bit by Zuckerberg's empty promises and roundabout answers meant to distract us from the truth," said Brenda Victoria Castillo, president and CEO of the National Hispanic Media Coalition.