Year: 2021

16 Feb 2021

Reddit’s transparency report shows a big spam problem and relatively few government requests

Reddit has published its transparency report for 2020, showing various numbers relating to removed content, government requests and other administrative actions. The largest problem by far — in terms of volume, anyway — is spam, which made up nearly all content taken down. Legal requests for content takedown and user information were far fewer, but not trivial, in number.

The full report is quite readable, but a bit long; the main points to understand are summarized below.

Of nearly 3.4 billion pieces of content created on Reddit (which is to say posts, comments, hosted images, etc.), 233 million were removed. These numbers are both up by 20%-30% from 2019. Of those 233 million, 131 million were “proactive” removals by the AutoMod system and 13.6 million were removed after user reports by subreddit moderators.

The remaining 85 million were taken down by Reddit admins; 99.76% of these were spam or “content manipulation” like brigading and astroturfing, with around 50,000 each of harassment, hate and sexualization of minors, and smaller amounts of violent speech, doxing and so on.
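A quick back-of-the-envelope check puts these figures in proportion (numbers as cited in the report; rounding accounts for small gaps between the categories and the total):

```python
# Rough sanity check of the report's removal figures.
total_created = 3_400_000_000   # pieces of content created on Reddit in 2020
total_removed = 233_000_000     # total removals
automod = 131_000_000           # "proactive" AutoMod removals
mod_reports = 13_600_000        # removals by subreddit mods after user reports
admin = 85_000_000              # removals by Reddit admins
spam_share = 0.9976             # share of admin removals that were spam/manipulation

print(f"Removed: {total_removed / total_created:.1%} of all content created")
print(f"Admin removals that were spam or manipulation: {admin * spam_share:,.0f}")
```

In other words, only about 7% of everything posted was removed at all, and virtually all admin action was aimed at spam rather than speech.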

Chart showing that content removal on reddit was largely spam.

Image Credits: Reddit

82,858 subreddits were removed, nearly four times as many as in 2019. The majority of these were for lack of moderation, followed by hate, harassment and ban evasion (e.g., r/bannedsub starts r/bannedsub2).

When it came to removing comments, hate, violence and harassment were much more prevalent. And 92% of private messages removed (of about 25,000 total) were for harassment.

Outside of spam and content manipulation, hate speech resulted in far more bans than any other infraction; more accounts were permanently banned for hate in 2020 than for all causes combined in 2019. (But far fewer for content violations than for spam and ban evasion.)

Government requests to remove content were relatively few. Overall Reddit received a couple hundred requests covering about 5,000 pieces of content or subreddits. For example, 753 subreddits had their access restricted to Pakistani users due to anti-obscenity laws there.

Requests from individuals or companies to remove things numbered in the hundreds, and copyright takedown notices asked for about half a million pieces of content to be removed (375,774 were), more than twice 2019’s. Only a handful of DMCA counter-notices were received.

Law enforcement came to Reddit 611 times for user information, up 50% from last year, and the company granted 424 of those requests. These are mostly subpoenas, court orders and search warrants. Since Reddit isn’t really a social network and accounts can be essentially anonymous or throwaway, it’s hard to say what level of disclosure this actually represents. Emergency disclosure requests numbered about 300 and were mostly complied with — these are supposedly life-or-death situations in which a Reddit account is involved.

Lastly Reddit received somewhere between 0 and 249 secret requests for data, targeting somewhere between 0 and 249 users, same as last year. Sadly, federal law prohibits them from saying any more than this regarding FISA orders and National Security Letters.

Overall the picture painted of Reddit in 2020 is of a growing community plagued by spam and inauthentic activity, plus a significant and growing contingent of hate, harassment and other prohibited content (though last year was surely an exceptional one for this). Lacking much fundamental access to or use of personally identifiable data, Reddit isn’t much of a target for three-letter agencies and law enforcement. And with “free speech”-focused alternatives to Reddit and other platforms popping up, it’s likely that the hate and harassment that were deplatformed will roost elsewhere in 2021.

16 Feb 2021

Imagine a better future for social media at TechCrunch Sessions: Justice

Toxic culture, deadly conspiracies and organized hate have exploded online in recent years. We’ll discuss how much responsibility social networks have in the rise of these phenomena and how to build healthy online communities that make society better, not worse at TechCrunch Sessions: Justice on March 3.

Join us for a wide-ranging discussion with Rashad Robinson, Jesse Lehrich and Naj Austin that explores what needs to change to make social networks more just, healthy environments rather than dangerous echo chambers that amplify society’s ills.

Naj Austin is the founder and CEO of Somewhere Good and Ethel’s Club. She has spent her career building digital and physical products that make the world a more intersectional and equitable space. She was named one of Inc. magazine’s 100 Female Founders transforming America, a HuffPost Culture Shifter of 2020 and Time Out New York’s 2020 list of women making NYC better.

Jesse Lehrich is a co-founder of Accountable Tech. He has a decade of experience in political communications and issue advocacy, including serving as the foreign policy spokesman for Hillary Clinton’s 2016 presidential campaign, where he was part of the team managing the response to Russia’s information warfare operation.

Rashad Robinson is the president of Color Of Change, a leading racial justice organization driven by more than 7.2 million members who are building power for Black communities. Color Of Change uses innovative strategies to bring about systemic change in the industries that affect Black people’s lives: Silicon Valley, Wall Street, Hollywood, Washington, corporate board rooms, local prosecutor offices, state capitol buildings and city halls around the country.

Under Rashad’s leadership, Color Of Change designs and implements winning strategies for racial justice, among them: forcing corporations to stop supporting Trump initiatives and white nationalists; framing net neutrality as a civil rights issue; holding local prosecutors accountable to end mass incarceration, police violence and financial exploitation across the justice system; forcing over 100 corporations to abandon ALEC, the secretive right-wing policy shop; changing representations of race and racism in Hollywood; moving Airbnb, Google and Facebook to implement anti-racist initiatives; and forcing Bill O’Reilly off the air.

Be sure to join us for this conversation and much more at TechCrunch Sessions: Justice on March 3.

16 Feb 2021

Reimagining the path forward for the formerly incarcerated at TechCrunch Sessions: Justice

Reentering society after having been incarcerated by the criminal justice system can be daunting. Advances in technology and the continued, unchecked march of capitalism place obstacles in returning citizens’ paths that can be difficult to overcome.

Fortunately for these returning citizens there are a variety of programs and resources designed to help get them up to speed. One such organization, The Last Mile, aims to help incarcerated folks learn skills so that they have a shot to get jobs after they reenter society. Some companies, like Slack, have committed to hiring returned citizens.

At TechCrunch Sessions: Justice on March 3, we’ll examine the importance of opportunities for returning citizens upon release from incarceration with a panel of people working in this important transition space. Joining us for the virtual discussion will be Aly Tamboura, strategic advisor at the newly formed Justice Accelerator Fund; Jason Jones, remote instruction manager for The Last Mile; and Deepti Rohatgi, head of Slack for Good and Public Affairs.

Aly Tamboura graduated from The Last Mile program while at San Quentin. Until recently, Tamboura was a manager in the Criminal Justice Reform Program at the Chan Zuckerberg Initiative, where he helped to guide the organization toward one of its stated goals, to reform the American criminal justice system. Just last week, the Justice Accelerator Fund announced that Tamboura joined the grant-making organization as its first strategic advisor. Tamboura will work alongside Founder and Executive Director Ana Zamora to “operationalize the fund and launch its first grant-making strategy later this year.”

Jason Jones also graduated from The Last Mile in 2018. Upon his release from San Quentin, he joined the organization as its remote instruction manager. He is a web developer and volunteers at West Oakland’s McClymonds High School teaching coding.

Slack decided to build its own take on programs like The Last Mile with Next Chapter, which helps train up formerly incarcerated individuals for jobs in tech and has hired a few itself. Deepti Rohatgi leads Slack for Good, which developed the program, though other companies have signed on to give it a try.

Join us on March 3 at TC Sessions: Justice to hear from Tamboura, Jones and Rohatgi about how the ability to start from a place of strength can help set folks up for success, as well as what the tech industry can do to help foster this environment. You can get your $5 ticket here.

16 Feb 2021

Krisp nearly triples fundraise with $9M expansion after blockbuster 2020

Krisp, a startup that uses machine learning to remove background noise from audio in real time, has raised $9M as an extension of its $5M A round announced last summer. The extra money followed big traction in 2020 for the Armenian company, which grew its customers and revenue by more than an order of magnitude.

TechCrunch first covered Krisp when it was just emerging from UC Berkeley’s Skydeck accelerator, and co-founder Davit Baghdasaryan was relatively freshly out of his previous role at Twilio. The company’s pitch when I chatted with them in the shared office back then was simple and remains the core of what they offer: isolation of the human voice from any background noise (including other voices) so that audio contains only the former.

It probably comes as no surprise, then, that the company appears to have benefited immensely from the shift to virtual meetings and other trends accelerated by the pandemic. To be specific, Baghdasaryan told me that 2020 brought the company a 20x increase in active users, a 23x increase in enterprise accounts and 13x improvement of annual recurring revenue.

The rise in virtual meetings — often in noisy places like, you know, homes — has led to significant uptake across multiple industries. Krisp now has more than 1,200 enterprise customers, Baghdasaryan said: banks, HR platforms, law firms, call centers — anyone who benefits from having a clear voice on the line (“I guess any company qualifies,” he added). Enterprise-oriented controls like provisioning and central administration have been added to make it easier to integrate.

Illustration of six people using a video chat app.

Image Credits: Krisp

B2B revenue recently eclipsed B2C; the latter was likely popularized by Krisp’s inclusion as an option in popular gaming (and increasingly beyond) chat app Discord, though of course users of a free app being given a bonus product for free aren’t always big converters to “pro” tiers of a product.

But the company hasn’t been standing still, either. While it began with a simple feature set (turning background noise on and off, basically) Krisp has made many upgrades to both its product and infrastructure.

Noise cancellation for high-fidelity voice channels makes the software useful for podcasters and streamers, and acoustic correction (removing room echoes) simplifies those setups quite a bit as well. Considering the number of people doing this and the fact that they’re often willing to pay, this could be a significant source of income.

The company plans to add cross-service call recording and analysis; since it sits between the system’s sound drivers and the application, Krisp can easily save the audio and other useful metadata (How often did person A talk versus person B? What office locations are noisiest?). And the addition of voice cancellation — other people’s voices, that is — could be a huge benefit for people who work, or anticipate returning to work, in crowded offices and call centers.
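Krisp hasn’t detailed how that analysis would work, but sitting between the sound drivers and the application means it sees every voice-activity segment, from which per-speaker metadata falls out as a simple aggregation. A hypothetical sketch (the segment format is an assumption, not Krisp’s actual data model):

```python
# Hypothetical sketch: per-speaker talk time from voice-activity segments.
# Each segment is (speaker, start_sec, end_sec); this format is assumed
# for illustration and is not Krisp's actual interface.
from collections import defaultdict

def talk_time(segments):
    """Sum the seconds of detected speech per speaker."""
    totals = defaultdict(float)
    for speaker, start, end in segments:
        totals[speaker] += end - start
    return dict(totals)

meeting = [("A", 0.0, 12.5), ("B", 12.5, 14.0), ("A", 14.0, 30.0)]
print(talk_time(meeting))  # {'A': 28.5, 'B': 1.5}
```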

Part of Krisp’s allure is the ability to run locally and securely on many platforms with very low overhead. But companies with machine learning-based products can stagnate quickly if they don’t improve their infrastructure or build more efficient training flows — Lengoo, for instance, is taking on giants in the translation industry with better training as more or less its main advantage.

Krisp has been optimizing and reoptimizing its algorithms to run efficiently on both Intel and ARM architectures, and decided to roll out its own servers for training its models instead of renting from the usual suspects.

“AWS, Azure and Google Cloud turned out to be too expensive,” Baghdasaryan said. “We have invested in building a data center with Nvidia’s latest A100s in them. This will make our experimentation faster, which is crucial for ML companies.”

Baghdasaryan was also emphatic in his satisfaction with the team in Armenia, where he and his co-founder Arto Minasyan are from, and where the company has focused its hiring, including the 25-strong research team. “By the end of 2021 it will be a 45-member team, all in Armenia,” he said. “We are super happy with the math, physics and engineering talent pool there.”

The funding amounts to $14 million if you combine the two disparate parts of the A round, the latter of which was agreed to just three months after the first. That’s a lot of money, of course, but may seem relatively modest for a company with a thousand enterprise customers and revenue growing by more than 2,000% year over year.

Baghdasaryan said they just weren’t ready to take on a whole B round, with all that involves. They do plan a new fundraise later this year when they’ve reached $15 million ARR, a goal that seems perfectly reasonable given their current charts.

Of course startups with this kind of growth tend to get snapped up by larger concerns, but despite a few offers Baghdasaryan says he’s in it for the long haul — and a multibillion dollar market.

The rush to embrace the new virtual work economy may have spurred Krisp’s growth spurt, but it’s clear that neither the company nor the environment that let it thrive are going anywhere.

16 Feb 2021

Notable Health seeks to improve COVID-19 vaccine administration through intelligent automation

Efficient and cost-effective vaccine distribution remains one of the biggest challenges of 2021, so it’s no surprise that startup Notable Health wants to use its automation platform to help. Initially started to address the healthcare industry’s nearly $250 billion in annual administrative costs, Notable Health launched in 2017 to use automation to replace time-consuming and repetitive simple tasks in health industry admin. In early January of this year, the company announced plans to use that technology as a way to help manage vaccine distribution.

“As a physician, I saw firsthand that with any patient encounter, there are 90 steps or touch points that need to occur,” said Notable Health Medical Director Muthu Alagappan in an interview. “It’s our hypothesis that the vast majority of those points can be automated.”

Notable Health’s core technology is a platform that uses robotic process automation (RPA), natural language processing (NLP) and machine learning to find eligible patients for the COVID-19 vaccine. Combined with data provided by hospital systems’ electronic health records, the platform helps those qualified to receive the vaccine set up appointments and guides them to other relevant educational resources.

“By leveraging intelligent automation to identify, outreach, educate and triage patients, health systems can develop efficient and equitable vaccine distribution workflows,” said Notable Health strategic advisor and Biden Transition COVID-19 Advisory Board Member Dr. Ezekiel Emanuel, in a press release.

Making vaccine appointments has been especially difficult for older Americans, many of whom have reportedly struggled with navigating scheduling websites. Alagappan sees that as a design problem. “Technology often gets a bad reputation, because it’s hampered by the many bad technology experiences that are out there,” he said.

Instead, he thinks Notable Health has kept the user in mind through a more simplified approach, asking users only for basic and easy-to-remember information through a text message link. “It’s that emphasis on user-centric design that I think has allowed us to still have really good engagement rates even with older populations,” he said.

While the startup’s platform will likely help hospitals and health systems develop a more efficient approach to vaccinations, its use of RPA and NLP holds promise for future optimization in healthcare. Leaders of similar technology in other industries have already gone on to have multibillion dollar valuations and continue to attract investors’ interest.

Artificial intelligence is expected to grow in healthcare over the next several years, but Alagappan argues that combining that with other, more readily available intelligent technologies is also an important step toward improved care. “When we say intelligent automation, we’re really referring to the marriage of two concepts: artificial intelligence — which is knowing what to do — and robotic process automation — which is knowing how to do it,” he said. That dual approach is what he says allows Notable Health to bypass administrative bottlenecks in healthcare, instructing bots to carry out those tasks in an efficient and adaptable way.
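That “what to do / how to do it” split can be pictured as a decision step feeding an automation step. A hypothetical sketch of the pattern (the field names and age-65 rule are illustrative assumptions, not Notable Health’s actual eligibility logic):

```python
# Hypothetical sketch of "intelligent automation": a decision step
# (what to do: who is vaccine-eligible) feeding an automation step
# (how to do it: queue a text-message outreach task). Field names and
# the age-65 rule are illustrative assumptions only.
def eligible(patient, min_age=65):
    """Decision step: eligible if over the age threshold or flagged high-risk."""
    return patient["age"] >= min_age or patient.get("high_risk", False)

def build_outreach_queue(patients):
    """Automation step: queue a scheduling-link text for each eligible patient."""
    return [
        {"patient_id": p["id"], "action": "send_scheduling_link"}
        for p in patients
        if eligible(p)
    ]

records = [
    {"id": 1, "age": 72},
    {"id": 2, "age": 40},
    {"id": 3, "age": 50, "high_risk": True},
]
print(build_outreach_queue(records))
```

The bots then carry out the queued tasks; the “intelligence” is in deciding who gets one.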

So far, Notable Health has worked with several hospital systems across multiple states in using its platform for vaccine distribution and scheduling, and is now using the platform to reach out to tens of thousands of patients per day.
