Growing up in a South Asian household that had rules around food and eating, Abhilash Patel developed an eating disorder.
Patel never fit the typical profile for someone who has one — most people of color and men don't. But after years of working on a slew of behavioral health companies that dealt with addiction, including Rehabs.com and Recovery.org, Patel said he was ready to tackle eating disorders.
In 2020, Patel met with Dr. Wendy Oliver-Pyatt, an eating disorder specialist of 25 years, and the pair launched Within Health this week. The app pairs patients with dietitians, therapists and psychiatrists to treat a variety of eating disorders such as binge eating disorder, night eating syndrome and orthorexia.
Within Health uses AI and machine learning to track patient patterns, such as when patients struggle to complete treatment or arrive late to a therapy session, in order to predict where they might falter and intervene early. It's a data-harnessing model common among health and wellness apps like Headspace.
"We're able to deliver treatment with more efficacy than it would have been otherwise. It allows us to track outcomes better," Patel said. "It allows [us] to deliver treatment for an extremely good life, that we believe is actually leading to better outcomes."
He said group therapy and other treatments often alienate people who don't fit the public's perception of an eating disorder patient: men (a 2007 study found a quarter of anorexia and bulimia patients were men), people who are overweight (some studies indicate binge eating disorder and obesity are comorbid) and LGBTQ+ people.
Abhilash Patel, co-founder of Within Health.
"That's a lot of people who generally don't talk about it or seek treatment because largely treatment is dominated by people who are female, because that's what we think about," Patel said.
Though Within Health is self-funded, it's launching at a time when venture capital activity in the space is surging. Mental and behavioral health apps raised $549 million in 2020, and nearly that amount in just the first six months of 2021, according to PitchBook.
Noom, which uses psychotherapy principles around stress and anxiety to help people lose weight, raised $540 million in a Series F round in May. Talkspace, a therapy company, went public earlier this year.
Eating disorders are difficult to treat because patients often relapse and face stressful situations. It requires long-term care and a team of professionals to help patients through. It also demands monitoring both food intake and one's mental health.
Those who do seek treatment often find themselves juggling multiple doctors, therapists and other providers, scheduling meetings and sitting on waitlists for eating disorder specialists.

"It's always people going to treatment. And that's why most people don't ever get treatment, because they, for one reason or another, can't or won't go. Instead we make treatment go to people who are suffering."
‘Like Modern-Day Phrenology’: Will a New Slew of Mobile Apps Improve Mental Health or Put Users at Risk?
The pandemic has been a disaster for our mental health.
According to the Kaiser Family Foundation, the share of Americans experiencing depression, anxiety and other mental health issues nearly doubled. Little surprise, then, that the demand for mental health services has skyrocketed.
As the country grapples with its mental health crisis, Apple has a proposal: Turn to your phone.
The tech giant is betting that the smartphone provides a trove of data that could help alleviate common mental health struggles — though not everyone is convinced it's such a good idea.
UCLA researchers have been working with Apple to see if an iPhone can detect depression and anxiety via so-called emotion recognition, the Wall Street Journal reported in September. The project is still in early development stages, and the company has not announced what the end product will look like.
Apple's venture with UCLA will triangulate a slew of data points including facial expressions, audible cues like tone, how often and how fast people are walking, heart rate, speed and accuracy of typing and what they type to determine what signals correspond with certain mental illnesses, according to the outlet. They will also ask participants to fill out surveys about how they feel. (Apple and UCLA did not respond to requests for comment.)
"If it's true that mental health affects people's day-to-day lives, then the question is in what ways does it affect people's day-to-day lives? And in what of those ways is there something measurable?" said UCLA professor Dr. Arash Naeim, who was not involved in the research with Apple.
But activists and experts warn the technology, which would use facial expressions as a data point in determining a user's mental health, could be used to exploit the people who use it.
"We have rampant mental health issues and no good routine way of screening for them. So if you were able to develop an automated way to really understand depression, anxiety, stress levels, that's really an advancement," said Ritika Chaturvedi, a precision medicine expert at the USC Schaeffer Center. "But — as with everything in tech — without guardrails, without regulation and without really understanding what the technology is actually doing, it's rife for abuse."
Facial recognition technology has long been used to match faces to identities, and is a trusted surveillance tool of government and law enforcement agents — often at the expense of activists, journalists and immigrants.
But emotion recognition takes it a step further, attempting to infer a person's emotions from their face, which could heighten tensions in dangerous law enforcement situations or have racist implications (as when Google's AI classified some photos of Black people as gorillas).
"I am a deep skeptic that such technology can actually work in any meaningful way," said Mark McKenna, a privacy law professor at UCLA. (McKenna was not involved in the research with Apple.) "So I think there's a huge risk of error here in ways that could be really damaging."
McKenna points to a variety of potential nefarious uses for this technology. For example, it might be deployed as a more sophisticated lie detector test — one that looks at emotional cues to determine whether someone is anxious. It could also bolster existing technology that claims to use facial recognition to determine someone's likelihood of committing a crime.
"It's kind of like modern-day phrenology done with digital tools," he said.
Doctors Are Relying More and More on Digital Technology
Even before this latest mental health bet, Apple was on the hunt to better utilize its products to improve users' health, as noted by its recent effort to expand the Apple Watch's capabilities as a glucose monitor. It has also added tools on the smartwatch to detect irregular heart rhythms and notifications to limit audio exposure on the iPhone.
These tools can fill in gaps for doctors, who often rely on patients to be accurate historians about their own health, diet and emotions. Arguably, the most intimate connection anyone has is with their smartphone. If patients can't articulate that they're feeling depressed, or aren't even aware they're dealing with a mental health struggle, it's almost impossible to provide them with proper care, and poor mental health can be the root of other health problems.
"There's a lot of potential for technology to be able to leverage more information about the individual, make a contribution in their lives and health," said Naeim, the UCLA professor. "The future is having better data and using the better data in a smarter way to better make an impact on our patients' lives and as well as empowering patients themselves."
Health Privacy in a Digital World
But Apple isn't the only company that is collecting and leveraging highly personal data points. A plethora of period-tracking apps, mental health apps and food-tracking apps have the ability to collect and sell data on their customers.
"Health privacy has always been one of the biggest categories that people have been worried about" in data privacy circles, said McKenna. "We have a regulatory system, but that regulatory system is woefully inadequate for the kinds of tools that we have now, and especially where we live in a world where everybody is creating apps to track all kinds of information."
That regulatory system, the Health Insurance Portability and Accountability Act (HIPAA), was established in 1996 to standardize how doctors store and share patient health information without jeopardizing patients' privacy. But the law only applies to health care institutions, and much of its protection amounts to a disclosure notifying patients how their health information can be used.
"There's a huge industry of apps and health data that aren't regulated by HIPAA because they're not health care providers," McKenna said.
According to the Journal, Apple hopes to be able to warn people they may be struggling with mental health problems and ask them to seek care. But using facial recognition as a data point to determine one's emotion has use cases that reach far beyond the realm of mental health.
Emotion Recognition Is the New Wave of Facial Recognition
A 2019 study by psychologists at Northeastern University found that there are no objective ways to measure how one's face corresponds with emotions. Furrowing eyebrows, upturned corners of the mouth and a scrunched nose are not conclusively tied to a specific emotion. People's faces are simply constructed in different ways; pinched eyebrows on one person could be just close eyebrows on another.
That doesn't mean the technology isn't being used. 4 Little Trees, a Hong Kong company, has found success with a program said to detect emotions in children as they attend school from home. AI-based emotion-detection software has also been used on Uyghur Muslims in China.
Chaturvedi said guardrails need to be put in place so people can enjoy the benefits of this kind of technology without being exploited, and that those guardrails would need to come from public policy leaders and privacy experts.
"Because you need to give up a certain level of privacy, it makes it so that bad actors exploit the same technology," she said. "What we need are the guardrails to protect from harm, rather than the privacy itself being the fundamental issue."
The three-year-old company is one of many in Los Angeles, including MedVector and Science 37, investing in a new model of clinical drug trials that seeks to virtualize and speed up a yearslong, data-intensive process heavily regulated by the Food and Drug Administration. Those trials are key to helping doctors understand the efficacy and side effects of drugs before they reach patients.
The approach has proved attractive during a pandemic that has forced traditional drug trials to slow down as researchers scrambled to safely conduct trials without putting participants in danger of COVID-19.
SEC filings from Monday show the startup raised $40 million from 10 undisclosed investors. Neither Lightship nor its previous investors responded to requests for comment.
But the move comes as the Food and Drug Administration shifts its thinking on clinical trials. In November, the agency updated its standards to accommodate patients participating in clinical trials from home. It also acknowledged the process was largely unfair because it failed to include underrepresented patients.
In issuing the guidance, Commissioner Stephen M. Hahn acknowledged "clinical trials requiring frequent visits to specific sites may place an added burden on participants."
The clinical trial process often excludes underprivileged people who would otherwise be prime candidates for the drug. That's because it requires participants to drive to hospitals or research sites, sometimes during the work day.
Lightship, though tight-lipped about its products, says on its website it aims to construct flexible clinical trial solutions for companies that want to accommodate patients that can't drive to a nearby facility for regular testing.
That could help clinical trials solve a diversity problem that some researchers argue has plagued the industry.
Clinical trials depend on racial and financial diversity because a drug's efficacy isn't based solely on the chemical composition of the drug – it also requires an extensive understanding of different environmental factors, such as someone's quality of sleep, outside stressors and genetics. Those factors can influence a drug's performance and they are part of what the FDA considers when it creates a drug's safety profile for doctors and patients before releasing the drug onto the market.
"If you don't include a diverse population in your clinical trial and really study those differences or similarities, you might end up having a situation where the product is out in the market and it has a different safety and efficacy profile [than what was originally written]," said Dr. Eunjoo Pacifici, a professor at the USC School of Pharmacy.
Lightship previously raised $10 million in a debt financing round in April, and nearly $20 million in venture capital in 2020.

MedVector, another LA-based startup tackling the virtual clinical trial space, raised $630,000 in March for a product that helps clinicians document a participant's vitals without requiring them to travel. Another, Science 37, raised $40 million in 2020 to construct flexible clinical trials.