Year: 2019

09 Aug 2019

Hundreds of exposed Amazon cloud backups found leaking sensitive data

How safe are your secrets? If you use Amazon’s Elastic Block Store, you might want to check your settings.

New research just presented at the Def Con security conference reveals how companies, startups, and governments are inadvertently leaking their own files from the cloud.

You may have heard of exposed S3 buckets — those Amazon-hosted storage servers that are packed with customer data but often misconfigured and inadvertently set to “public” for anyone to access. But you may not have heard of exposed EBS volumes, which pose as much risk, if not more.

These Elastic Block Store (EBS) volumes are the “keys to the kingdom,” said Ben Morris, a senior security analyst at cybersecurity firm Bishop Fox, in a call with TechCrunch ahead of his Def Con talk. EBS volumes store all the data for cloud applications. “They have the secret keys to your applications and they have database access to your customers’ information,” he said.

“When you get rid of the hard disk for your computer, you know, you usually shred it or wipe it completely,” he said. “But these public EBS volumes are just left for anyone to take and start poking at.”

He said that all too often cloud admins don’t choose the correct configuration settings, leaving EBS volumes inadvertently public and unencrypted. “That means anyone on the internet can download your hard disk and boot it up, attach it to a machine they control, and then start rifling through the disk to look for any kind of secrets,” he said.

One of Morris’ Def Con slides noting the types of compromised data found using his research, often known as the “Wall of Sheep.” (Image: Ben Morris/Bishop Fox; supplied)

Morris built a tool using Amazon’s own internal volume search feature to query and scrape publicly exposed EBS volumes, then attach each one, make a copy, and list the volume’s contents on a system he controls.
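The discovery step Morris describes can be sketched with AWS’s own API: EC2’s `describe_snapshots` call accepts a `RestorableByUserIds` filter, and passing `"all"` returns snapshots any AWS account may restore. The sketch below is an illustrative reconstruction, not Morris’ actual tool; the function names are ours, and running the discovery function requires AWS credentials and the `boto3` package.

```python
def find_public_snapshots(region="us-east-1"):
    """Return every EBS snapshot in a region that ANY AWS account may restore."""
    import boto3  # third-party; deferred so the helper below stays import-free
    ec2 = boto3.client("ec2", region_name=region)
    snapshots = []
    # RestorableByUserIds=["all"] restricts results to publicly restorable snapshots.
    for page in ec2.get_paginator("describe_snapshots").paginate(
        RestorableByUserIds=["all"]
    ):
        snapshots.extend(page["Snapshots"])
    return snapshots


def summarize(snapshots):
    """Reduce raw snapshot records to the fields an auditor cares about."""
    return [
        {
            "id": s["SnapshotId"],
            "encrypted": s.get("Encrypted", False),
            "size_gib": s.get("VolumeSize"),
        }
        for s in snapshots
    ]
```

An attacker (or defender auditing their own account) would then create a volume from each snapshot ID and attach it to an instance they control — the “rifling through the disk” step Morris describes.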

“If you expose the disk for even just a couple of minutes, our system will pick it up and make a copy of it,” he said.

It took him two months to build up a database of exposed volumes and just a few hundred dollars spent on Amazon cloud resources. Once he validates each volume, he deletes the data.

Morris found dozens of volumes exposed publicly in one region alone, he said, containing application keys, critical user and administrative credentials, source code, and more. The exposed data belonged to several major companies, including healthcare providers and tech companies.

He also found VPN configurations, which he said could allow him to tunnel into a corporate network. Morris said he did not use any of the credentials or sensitive data, as doing so would be unlawful.

Among the most damaging exposures, Morris said, was a volume belonging to a government contractor, which he did not name, that provided data storage services to federal agencies. “On their website, they brag about holding this data,” he said, referring to material ranging from intelligence collected from messages sent to and from the so-called Islamic State terror group to data on border crossings.

“Those are the kind of things I would definitely not want to be exposed to the public internet,” he said.

He estimates there could be as many as 1,250 exposed volumes across all Amazon cloud regions.

Morris plans to release his proof-of-concept code in the coming weeks.

“I’m giving companies a couple of weeks to go through their own disks and make sure that they don’t have any accidental exposures,” he said.

09 Aug 2019

Daily Crunch: Uber reports big losses and slowing growth

The Daily Crunch is TechCrunch’s roundup of our biggest and most important stories. If you’d like to get this delivered to your inbox every day at around 9am Pacific, you can subscribe here.

1. Uber lost more than $5B last quarter

Uber reported earnings for the second time as a public company, posting its largest-ever quarterly loss. And while revenue grew 14% year-over-year, that also sparked concerns over (relatively) slow growth.

The company’s stock nose-dived 11% in after-hours trading following the news.

2. HarmonyOS is Huawei’s Android alternative for smartphones and smart home devices

After months of conflicting statements from executives, Huawei has officially unveiled HarmonyOS, a distributed operating system developed to power smartphones, laptops and smart home devices as the company attempts to reduce its reliance on American firms.

3. Apple expands its bug bounty, increases maximum payout to $1M

Apple is finally giving security researchers something they’ve wanted for years: a macOS bug bounty.

Hustle and bustle of Indian roads around monument of Charminar in Hyderabad, India.

4. India’s Lendingkart raises $30M to help small businesses access working capital

Lendingkart Finance has issued over 60,000 loans to more than 55,000 small and medium-sized enterprises in 1,300 cities across India. The startup says it will use the fresh capital to widen its lending range and find new clients.

5. This startup is helping food app delivery workers start their own damn delivery companies

Dumpling’s goal is to turn today’s delivery workers into “solopreneurs” who build their own book of clients and keep much more of the money.

6. The smartwatch category is growing, as Apple remains dominant

The good news for smartwatches is that the category continues to grow — but the numbers start to look less rosy when you take Apple out of the equation. (Extra Crunch membership required.)

7. Quantum computing is coming to TC Sessions: Enterprise on Sept. 5

Joining us onstage will be Microsoft’s Krysta Svore, who leads the company’s quantum efforts; IBM’s Jay Gambetta, the principal theoretical scientist behind IBM’s quantum computing effort; and Jim Clarke, the director of quantum hardware at Intel Labs.

09 Aug 2019

Why AI needs more social workers, with Columbia University’s Desmond Patton

Sometimes it does seem the entire tech industry could use someone to talk to, like a good therapist or social worker. That might sound like an insult, but I mean it mostly earnestly: I am a chaplain who has spent 15 years talking with students, faculty, and other leaders at Harvard (and more recently MIT as well), mostly nonreligious and skeptical people like me, about their struggles to figure out what it means to build a meaningful career and a satisfying life, in a world full of insecurity, instability, and divisiveness of every kind.

In related news, I recently took a year-long paid sabbatical from my work at Harvard and MIT, to spend 2019-20 investigating the ethics of technology and business (including by writing this column at TechCrunch). I doubt it will shock you to hear I’ve encountered a lot of amoral behavior in tech, thus far.

A less expected and perhaps more profound finding, however, has been what the introspective founder Priyag Narula of LeadGenius tweeted at me recently: that behind the hubris and Machiavellianism one can find in tech companies is a constant struggle with anxiety and an abiding feeling of inadequacy among tech leaders.

In tech, just like at places like Harvard and MIT, people are stressed. They’re hurting, whether or not they even realize it.

So when Harvard’s Berkman Klein Center for Internet and Society recently posted an article whose headline began, “Why AI Needs Social Workers…,” it caught my eye.

The article, it turns out, was written by Columbia University Professor Desmond Patton. Patton is a Public Interest Technologist and a pioneer in the use of social media and artificial intelligence in the study of gun violence. He is the founding director of Columbia’s SAFElab and an Associate Professor of Social Work, Sociology and Data Science at Columbia University.

Desmond Patton. Image via Desmond Patton / Stern Strategy Group

A trained social worker and decorated social work scholar, Patton has also become a big name in AI circles in recent years. If Big Tech ever decided to hire a Chief Social Work Officer, he’d be a sought-after candidate.

It further turns out that Patton’s expertise — in online violence & its relationship to violent acts in the real world — has been all too “hot” a topic this past week, with mass murderers in both El Paso, Texas and Dayton, Ohio having been deeply immersed in online worlds of hatred which seemingly helped lead to their violent acts.

Fortunately, we have Patton to help us understand all of these issues. Here is my conversation with him: on violence and trauma in tech on and offline, and how social workers could help; on deadly hip-hop beefs and “Internet Banging” (a term Patton coined); hiring formerly gang-involved youth as “domain experts” to improve AI; how to think about the likely growing phenomenon of white supremacists live-streaming barbaric acts; and on the economics of inclusion across tech.

Greg Epstein: How did you end up working in both social work and tech?

Desmond Patton: At the heart of my work is an interest in root causes of community-based violence, so I’ve always identified as a social worker that does violence-based research. [At the University of Chicago] my dissertation focused on how young African American men navigated violence in their community on the west side of the city while remaining active in their school environment.

[From that work] I learned more about the role of social media in their lives. This was around 2011, 2012, and one of the things that kept coming through in interviews with these young men was how social media was an important tool for navigating both safe and unsafe locations, but also an environment that allowed them to project a multitude of selves. To be a school self, to be a community self, to be who they really wanted to be, to try out new identities.

09 Aug 2019

Proterra, the Tesla of electric buses, closes in on $1 billion valuation

Proterra has authorized shares to raise $75 million, a new round of funding that would push the electric bus maker’s valuation past $1 billion, TechCrunch has learned.

The company authorized the sale of 10,857,762 shares at a price of $6.91 in a Series 8 round, according to a securities filing that was obtained by the Prime Unicorn Index, a company that tracks the performance of private U.S. companies, and reviewed by TechCrunch. If all of the shares are issued, the company’s total valuation would be $1.04 billion, pushing it into “unicorn” territory, according to Prime Unicorn Index.
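The filing’s arithmetic checks out: 10,857,762 shares at $6.91 apiece works out to roughly the $75 million being raised. A quick sanity check:

```python
# Figures from the securities filing reported above.
shares_authorized = 10_857_762
price_per_share_usd = 6.91

# Total raise if every authorized share is issued at that price.
raise_usd = shares_authorized * price_per_share_usd
print(f"${raise_usd / 1e6:.1f}M")  # ≈ $75.0M
```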

Proterra declined to comment.

Efforts to raise capital come as the company explores an IPO, according to a report last month by Reuters that said Proterra had hired underwriters from Deutsche Bank, JPMorgan Chase and Morgan Stanley.

Prior to this August 2 filing, Proterra had raised a total of $551.77 million in funding from investors that include G2VP, Kleiner Perkins Caufield & Byers, Constellation Ventures, Mitsui & Co as well as BMW i Ventures, Edison Energy, the Federal Transit Administration, General Motors’ venture arm and Tao Capital Partners.

Proterra’s roots are in producing electric buses for municipal, federal and commercial transit agencies; it has a line of electric buses, hundreds of which have been sold, that can travel 350 miles on a single charge. The Burlingame, Calif. company, which has a number of former Tesla employees in leadership positions including CEO Ryan Popple, has since diversified its business.

Proterra rolled out in April a $200 million credit facility backed by Japanese investment giant Mitsui & Co. to scale up a battery leasing program aimed at lowering the barrier to entry for buying an electric bus.

And just this month, the company announced a new business line called Proterra Powered, which will sell its vehicle battery systems, powertrain technology and charging infrastructure to manufacturers of commercial trucks and heavy-duty vehicles like garbage trucks.

This new business line stems from its previous relationships with companies like Van Hool and Daimler. Proterra announced last year it was working with Daimler to electrify the company’s Thomas Built Buses division, which makes a line of school buses. That relationship comes with some financial backing and an agreement to share technologies.

Daimler co-led a $155 million funding round along with Tao Capital Partners. Proterra is lending its battery and drivetrain expertise; Daimler will show Proterra how to scale its manufacturing business even further.

The partnership has already been fruitful. Thomas Built Buses received certifications from the California Air Resources Board and the Hybrid and Zero-Emission Truck and Bus Voucher Incentive Project for an electric bus, known as the Saf-T-Liner C2 Jouley, that uses Proterra technology. Electric school bus production for demonstration and innovation vehicles begins in 2019 and commercial production begins in 2020.

09 Aug 2019

Last chance for early-bird tickets to TC Sessions: Enterprise 2019

It’s down to the wire, folks. Today’s the last day you can save $100 on your ticket to TC Sessions: Enterprise 2019, which takes place on September 5 at the Yerba Buena Center in San Francisco. The deadline expires in mere hours — at 11:59 p.m. (PT). Get the best possible price and buy your early-bird ticket right now.

We expect more than 1,000 attendees representing the enterprise software community’s best and brightest. We’re talking founders of companies in every stage and CIOs and systems architects from some of the biggest multinationals. And, of course, managing partners from the most influential venture and corporate investment firms.

Take a look at just some of the companies joining us for TC Sessions: Enterprise: Bain & Company, Box, Dell Technologies Capital, Google, Oracle, SAP and SoftBank. Let the networking begin!

You can expect a full day of main-stage interviews and panel discussions, plus break-out sessions and speaker Q&As. TechCrunch editors will dig into the big issues enterprise software companies face today along with emerging trends and technologies.

Data, for example, is a mighty hot topic, and you’ll hear a lot more about it during a session entitled Innovation Break: Data – Who Owns It? Enterprises have historically competed by being closed entities, keeping a closed architecture and innovating internally. When applied to the hottest new commodity, data, this closed approach simply does not work anymore. But as enterprises, startups and public institutions open themselves up, how open is too open? Hear from leaders who explore data ownership and the questions that need to be answered before the data floodgates are opened. (Sponsored by SAP.)

If investment is on your mind, don’t miss the Investor Q&A. Some of the greatest investors in enterprise will be on hand to answer your burning questions. Want to know more? Check out the full agenda.

Maximize your last day of early-bird buying power and take advantage of the group discount. Buy four or more tickets at once and save 20%. Here’s a bonus. Every ticket you buy to TC Sessions: Enterprise includes a free Expo Only pass to TechCrunch Disrupt SF on October 2-4.

It’s now o’clock, startuppers. Your opportunity to save $100 on tickets to TC Sessions: Enterprise ends tonight at precisely 11:59 p.m. (PT). Buy your early-bird tickets now and join us in September!

Is your company interested in sponsoring or exhibiting at TC Sessions: Enterprise? Contact our sponsorship sales team by filling out this form.

09 Aug 2019

Preclusio uses machine learning to comply with GDPR, other privacy regulations

As privacy regulations like GDPR and the California Consumer Privacy Act proliferate, more startups are looking to help companies comply. Enter Preclusio, a member of the Y Combinator Summer 2019 class, which has developed a machine learning-fueled solution to help companies adhere to these privacy regulations.

“We have a platform that is deployed on prem in our customer’s environment, and helps them identify what data they’re collecting, how they’re using it, where it’s being stored and how it should be protected. We help companies put together this broad view of their data, and then we continuously monitor their data infrastructure to ensure that this data continues to be protected,” company co-founder and CEO Heather Wade told TechCrunch.

She says that the company made a deliberate decision to keep the solution on-prem. “We really believe in giving our clients control over their data. We don’t want to be just another third-party SaaS vendor that you have to ship your data to,” Wade explained.

That said, customers can run it wherever they wish, whether that’s on prem or in the cloud in Azure or AWS. Regardless of where it’s stored, the idea is to give customers direct control over their own data. “We are really trying to alert our customers to threats or to potential privacy exceptions that are occurring in their environment in real time, and being in their environment is really the best way to facilitate this,” she said.

The product works by getting read-only access to the data, then begins to identify sensitive data in an automated fashion using machine learning. “Our product automatically looks at the schema and samples of the data, and uses machine learning to identify common protected data,” she said. Once that process is completed, a privacy compliance team can review the findings and adjust these classifications as needed.
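The classification step Wade describes — sampling column values and flagging likely protected data for human review — can be illustrated with a deliberately simple pattern-matching baseline. This is our own hypothetical sketch, not Preclusio’s actual pipeline; a production ML system would generalize beyond hand-written rules like these.

```python
import re

# Hand-written patterns for a few common protected-data types. A real
# system would use learned classifiers; these rules are just a baseline.
PATTERNS = {
    "email": re.compile(r"^[\w.+-]+@[\w-]+\.[\w.]+$"),
    "ssn": re.compile(r"^\d{3}-\d{2}-\d{4}$"),
    "credit_card": re.compile(r"^(?:\d[ -]?){13,16}$"),
}


def classify_column(samples, threshold=0.8):
    """Label a column with a sensitive-data type if most sampled values match.

    Returns the label for human review, or None if no pattern dominates —
    mirroring the 'automated pass, then compliance-team review' workflow.
    """
    for label, pattern in PATTERNS.items():
        hits = sum(1 for value in samples if pattern.match(value))
        if samples and hits / len(samples) >= threshold:
            return label
    return None
```

For example, a sampled column of `["a@b.com", "c@d.org", "x@y.net"]` would be flagged as `email` and surfaced to the compliance team, who can accept or override the classification.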

Wade, who started the company in March, says the idea formed at previous positions where she was responsible for implementing privacy policies and found there weren’t adequate solutions on the market to help. “I had to face the challenges first-hand of dealing with privacy and compliance and seeing how resources were really taken away from our engineering teams and having to allocate these resources to solving these problems internally, especially early on when GDPR was first passed, and there really were not that many tools available in the market,” she said.

Interestingly, Wade’s co-founder is her husband, John. She says they deal with the intensity of being married and startup founders by sticking to their areas of expertise. He’s the marketing person and she’s the technical one.

She says they applied to Y Combinator because they wanted to grow quickly, and that timing is important with more privacy laws coming online soon. She has been impressed with the generosity of the community in helping them reach their goals. “It’s almost indescribable how generous and helpful other folks who’ve been through the YC program are to the incoming batches, and they really do have that spirit of paying it forward,” she said.

09 Aug 2019

Robocall blocking apps caught sending your private data without permission

Robocall-blocking apps promise to rid your life of spoofed and spam phone calls. But are they as trustworthy as they claim to be?

One security researcher said many of these apps can violate your privacy as soon as they are opened.

Dan Hastings, a senior security consultant at cybersecurity firm NCC Group, analyzed some of the most popular robocall-blocking apps — including TrapCall, Truecaller, and Hiya — and found egregious privacy violations.

Robocalls are getting worse, with some people receiving dozens of calls a day. These automated calls demand you “pay the IRS” a fine you don’t owe or pretend to be tech support. They often try to trick you into picking up the phone by spoofing their number to look like a local caller. And as much as the cell networks are trying to cut down on spam, many users are turning to third-party apps to filter their incoming calls.

But many of these apps, said Hastings, send user or device data to third-party data analytics companies — often to monetize your information — without your explicit consent, instead burying the details in their privacy policies.

One app, TrapCall, sent users’ phone numbers to a third-party analytics firm, AppsFlyer, without telling users — either in the app or in its privacy policy.

He also found Truecaller and Hiya uploaded device data — device type, model and software version, among other things — before a user could accept their privacy policies. Those apps, said Hastings, violate Apple’s app guidelines on data use and sharing, which mandate that app makers first obtain permission before using or sending data to third-parties.
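The guideline violation here is one of ordering: the apps transmitted device data before the user had a chance to consent. The pattern Apple’s rules imply — queue locally, transmit only after opt-in — can be sketched as below. This is a hedged illustration of the general design, not code from any of the apps named; `AnalyticsClient` is a hypothetical stand-in for a vendor SDK like AppsFlyer.

```python
class AnalyticsClient:
    """Hypothetical stand-in for a third-party analytics SDK."""

    def __init__(self):
        self.sent = []

    def send(self, event):
        self.sent.append(event)  # in reality, a network call to the vendor


class ConsentGatedAnalytics:
    """Queue events locally; flush to the vendor only after explicit opt-in."""

    def __init__(self, client):
        self.client = client
        self.consented = False
        self._pending = []

    def track(self, event):
        if self.consented:
            self.client.send(event)
        else:
            self._pending.append(event)  # nothing leaves the device yet

    def grant_consent(self):
        self.consented = True
        for event in self._pending:
            self.client.send(event)
        self._pending.clear()
```

Under this design, an app that fires events during startup leaks nothing until the user affirmatively accepts the privacy policy — the behavior Hastings found missing.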

Many of the other apps aren’t much better. Several other apps that Hastings tested immediately sent some data to Facebook as soon as the app loaded.

“Without having a technical background, most end users aren’t able to evaluate what data is actually being collected and sent to third parties,” said Hastings. “Privacy policies are the only way that a non-technical user can evaluate what data is collected about them while using an app.”

None of the companies acted on emails from Hastings warning about the privacy issues, he said. Only after he contacted Apple did TrapCall update its privacy policy.

But he reserved some criticism for Apple, noting that app privacy policies “don’t appear to be monitored” as he discovered with Truecaller and Hiya.

“Privacy policies are great, but apps need to get better about abiding by them,” said Hastings.

“If most people took the time to read and try to understand privacy policies for all the apps they use (and are able to understand them!), they might be surprised to see how much these apps collect,” he said. “Until that day, end-users will have to rely on security researchers performing manual deep dives into how apps handle their private information in practice.”

Spokespeople for TrapCall, Truecaller, and Hiya did not comment when reached prior to publication.
