
17 Dec 2020

Walnut raises $3.5M from Ron Conway and others to stop remote sales pitches from breaking

Amid the rise of remote selling due to the pandemic, Walnut, which bills itself as a ‘Wix for sales teams,’ has raised $3.5 million from SV Angel (Ron and Topher Conway) and A.Capital, the fund run by Ronny Conway and former a16z partners. This brings its total raised to $6 million. Other investors include NFX, Joe Montana, Wix’s CEO, Immad Akhund and Kenny Stone.

Walnut is solving the friction between sales and back-end teams, as well as the problem of sales demos that break during calls. A sales rep logs in to the product they are trying to sell and uses Walnut to choose the pages and features to capture; those are disconnected from the backend and reproduced as front-end pages in Walnut’s cloud, which the rep then edits to match the specific client. It’s like building web pages, but for sales pitches.

It’s currently working with enterprise clients like Varonis and Adobe. While it competes with Gong.io, Walnut is hoping that its more technical approach of disconnecting front-end and back-end will aid its progress.

17 Dec 2020

Google grants $3 million to the CNCF to help it run the Kubernetes infrastructure

Back in 2018, Google announced that it would provide $9 million in Google Cloud Platform credits — divided over three years — to the Cloud Native Computing Foundation (CNCF) to help it run the development and distribution infrastructure for the Kubernetes project. Previously, Google owned and managed those resources for the community. Today, the two organizations announced that Google is adding on to this grant with another $3 million annual donation to the CNCF to “help ensure the long-term health, quality and stability of Kubernetes and its ecosystem.”

As Google notes, the funds will go to the testing and infrastructure of the Kubernetes project, which currently sees over 2,300 monthly pull requests that trigger about 400,000 integration test runs, all of which use about 300,000 core hours on GCP.

“I’m really happy that we’re able to continue to make this investment,” Aparna Sinha, a director of product management at Google and the chairperson of the CNCF governing board, told me. “We know that it is extremely important for the long-term health, quality and stability of Kubernetes and its ecosystem and we’re delighted to be partnering with the Cloud Native Computing Foundation on an ongoing basis. At the end of the day, the real goal of this is to make sure that developers can develop freely and that Kubernetes, which is of course so important to everyone, continues to be an excellent, solid, stable standard for doing that.”

Sinha also noted that Google contributes a lot of code to the project, with 128,000 code contributions in the last twelve months alone. But on top of these technical contributions, the team is also making in-kind contributions through community engagement and mentoring, for example, in addition to the kind of financial contributions the company is announcing today.

“The Kubernetes project has been growing so fast — the releases are just one after the other,” said Priyanka Sharma, the General Manager of the CNCF. “And there are big changes, all of this has to run somewhere. […] This specific contribution of the $3 million, that’s where that comes in. So the Kubernetes project can be stress-free, [knowing] they have enough credits to actually run for a full year. And that security is critical because you don’t want Kubernetes to be wondering where will this run next month. This gives the developers and the contributors to the project the confidence to focus on feature sets, to build better, to make Kubernetes ever-evolving.”

It’s worth noting that while both Google and the CNCF are putting their best foot forward here, there have been some questions around Google’s handling of the Istio service mesh project, which it incubated with IBM a few years ago. At some point in 2017, there was a proposal to bring it under the CNCF umbrella, but that never happened. This year, Istio became one of the founding projects of Open Usage Commons, though that group is mostly concerned with trademarks, not with project governance. And while all of this may seem like a lot of inside baseball — and it is — it led some members of the open-source community to question Google’s commitment to organizations like the CNCF.

“Google contributes to a lot of open-source projects. […] There’s a lot of them, many are with open-source foundations under the Linux Foundation, many of them are otherwise,” Sinha said when I asked her about this. “There’s nothing new, or anything to report about anything else. In particular, this discussion — and our focus very much with the CNCF here is on Kubernetes, which I think — out of everything that we do — is by far the biggest contribution or biggest amount of time and biggest amount of commitment relative to anything else.”

17 Dec 2020

GitHub says goodbye to cookie banners

Microsoft-owned GitHub today announced that it is doing away with all non-essential cookies on its platform. Thanks to this, starting today, GitHub.com and its subdomains will not feature a cookie banner anymore, either. That’s one less cookie banner you’ll have to click away to get your work done.

“No one likes cookie banners,” GitHub CEO Nat Friedman writes in today’s announcement. “But cookie banners are everywhere!”

The reason for that, of course, is regulations like the EU’s GDPR and ePrivacy directive, which give users the right to refuse the use of cookies that reduce their online privacy. The result, even though these regulations have users’ best interests in mind, is the constant barrage of cookie banners you experience today.

“At GitHub, we want to protect developer privacy, and we find cookie banners irritating, so we decided to look for a solution. After a brief search, we found one: just don’t use any non-essential cookies. Pretty simple, really,” Friedman writes.

To be fair, for a service like GitHub, it may be a bit easier to do away with cookies than for most sites — and especially content sites (and yes, I’m well aware that you probably had to click away from a cookie popup when you came to TechCrunch, too. Feel free to tell me about the irony of that in the comments). GitHub, after all, has a paid product and an audience that likely uses extensions to block trackers and unnecessary cookies anyway. Because of this, the tracking data it gathered was probably not all that useful. GitHub is one of the first large sites to make this move, though, and may be able to set a bit of a trend.

17 Dec 2020

Tips for applying an intersectional framework to AI development

By now, most of us in tech know that the inherent bias we possess as humans creates an inherent bias in AI applications — applications that have become so sophisticated they’re able to shape the nature of our everyday lives and even influence our decision-making.

The more prevalent and powerful AI systems become, the sooner the industry must address questions like: What can we do to move away from using AI/ML models that demonstrate unfair bias?

How can we apply an intersectional framework to build AI for all people, knowing that different individuals are affected by and interact with AI in different ways based on the converging identities they hold?

Start with identifying the variety of voices that will interact with your model.

Intersectionality: What it means and why it matters

Before tackling the tough questions, it’s important to take a step back and define “intersectionality.” A term defined by Kimberlé Crenshaw, it’s a framework that empowers us to consider how someone’s distinct identities come together and shape the ways in which they experience and are perceived in the world.

This includes the resulting biases and privileges that are associated with each distinct identity. Many of us may hold more than one marginalized identity and, as a result, we’re familiar with the compounding effect that occurs when these identities are layered on top of one another.

At The Trevor Project, the world’s largest suicide prevention and crisis intervention organization for LGBTQ youth, our chief mission is to provide support to each and every LGBTQ young person who needs it, and we know that those who are transgender and nonbinary and/or Black, Indigenous, and people of color face unique stressors and challenges.

So, when our tech team set out to develop AI to serve and exist within this diverse community — namely to better assess suicide risk and deliver a consistently high quality of care — we had to be conscious of avoiding outcomes that would reinforce existing barriers to mental health resources like a lack of cultural competency or unfair biases like assuming someone’s gender based on the contact information presented.

Though our organization serves a particularly diverse population, underlying biases can exist in any context and negatively impact any group of people. As a result, all tech teams can and should aspire to build fair, intersectional AI models, because intersectionality is the key to fostering inclusive communities and building tools that serve people from all backgrounds more effectively.

Doing so starts with identifying the variety of voices that will interact with your model, in addition to the groups for which these various identities overlap. Defining the problem you’re solving is the first step, because once you understand who is impacted by it, you can identify a solution. Next, map the end-to-end experience journey to learn the points where these people interact with the model. From there, there are strategies every organization, startup and enterprise can apply to weave intersectionality into every phase of AI development — from training to evaluation to feedback.

Datasets and training

The quality of a model’s output relies on the data on which it’s trained. Datasets can contain inherent bias due to the nature of their collection, measurement and annotation — all of which are rooted in human decision-making. For example, a 2019 study found that a healthcare risk-prediction algorithm demonstrated racial bias because it relied on a faulty dataset for determining need. As a result, eligible Black patients received lower risk scores in comparison to white patients, ultimately making them less likely to be selected for high-risk care management.

Fair systems are built by training a model on datasets that reflect the people who will be interacting with the model. It also means recognizing where there are gaps in your data for people who may be underserved. However, there’s a larger conversation to be had about the overall lack of data representing marginalized people — it’s a systemic problem that must be addressed as such, because sparsity of data can obscure both whether systems are fair and whether the needs of underrepresented groups are being met.

To start analyzing this for your organization, consider the size and source of your data to identify what biases, skews or mistakes are built-in and how the data can be improved going forward.

The problem of bias in datasets can also be addressed by amplifying or boosting specific intersectional data inputs, as your organization defines it. Doing this early on will inform your model’s training formula and help your system stay as objective as possible — otherwise, your training formula may be unintentionally optimized to produce irrelevant results.

At The Trevor Project, we may need to amplify signals from demographics that we know disproportionately find it hard to access mental health services, or for demographics that have small sample sizes of data compared to other groups. Without this crucial step, our model could produce outcomes irrelevant to our users.
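In practice, this kind of amplification is often implemented as per-sample weighting during training, so that examples from small intersectional groups contribute proportionally more to the loss. A minimal sketch (the group labels and the inverse-frequency heuristic are illustrative, not The Trevor Project's actual method):

```python
from collections import Counter

def inverse_frequency_weights(group_labels):
    """Weight each sample inversely to its group's frequency so that
    underrepresented intersectional groups are not drowned out."""
    counts = Counter(group_labels)
    num_groups = len(counts)
    total = len(group_labels)
    # The "balanced" heuristic: total / (num_groups * count_of_this_group)
    return [total / (num_groups * counts[g]) for g in group_labels]

# Hypothetical intersectional group labels for five training samples
labels = ["group_a", "group_a", "group_a", "group_b", "group_c"]
weights = inverse_frequency_weights(labels)
# Samples from the two smaller groups receive larger weights
```

Most training frameworks accept weights like these directly (for example, via a `sample_weight` argument), which keeps the amplification step separate from the model itself.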

Evaluation

Model evaluation is an ongoing process that helps organizations respond to ever-changing environments. Evaluating fairness began with looking at a single dimension, like race, gender or ethnicity. The next step for the tech industry is figuring out how best to compare intersectional groupings to evaluate fairness across all identities.

To measure fairness, try defining intersectional groups that could be at a disadvantage and the ones that may have an advantage, and then examine whether certain metrics (for example, false-negative rates) vary among them. What do these inconsistencies tell you? How else can you further examine which groups are underrepresented in a system and why? These are the kinds of questions to ask at this phase of development.
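As a concrete sketch of that first check, computing a false-negative rate per intersectional group and looking for gaps, something like the following works for any per-group metric (the data and group labels here are hypothetical placeholders):

```python
def false_negative_rate(y_true, y_pred):
    """FNR = FN / (FN + TP): the share of true positives the model missed."""
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    return fn / (fn + tp) if (fn + tp) else 0.0

def fnr_by_group(y_true, y_pred, groups):
    """Compute the false-negative rate separately for each group label."""
    rates = {}
    for g in set(groups):
        idx = [i for i, label in enumerate(groups) if label == g]
        rates[g] = false_negative_rate([y_true[i] for i in idx],
                                       [y_pred[i] for i in idx])
    return rates

# Hypothetical data: 1 = high risk; groups are intersectional labels
y_true = [1, 1, 0, 1, 1, 0]
y_pred = [1, 0, 0, 1, 1, 0]
groups = ["group_a", "group_a", "group_a", "group_b", "group_b", "group_b"]
rates = fnr_by_group(y_true, y_pred, groups)
# rates["group_a"] is 0.5 (one missed high-risk case); rates["group_b"] is 0.0
```

A large gap between groups (here 0.5 versus 0.0) is exactly the kind of inconsistency worth investigating; with real data you would also want confidence intervals, since small intersectional groups make point estimates noisy.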

Developing and monitoring a model based on the demographics it serves from the start is the best way for organizations to achieve fairness and alleviate unfair bias. Based on the evaluation outcome, a next step might be to purposefully overserve statistically underrepresented groups to facilitate training a model that minimizes unfair bias. Since algorithms can lack impartiality due to societal conditions, designing for fairness from the outset helps ensure equal treatment of all groups of individuals.

Feedback and collaboration

Teams should also have a diverse group of people involved in developing and reviewing AI products — people who are diverse not only in identities, but also in skill set, exposure to the product, years of experience and more. Consult stakeholders and those who are impacted by the system to identify problems and biases.

Lean on engineers when brainstorming solutions. To define intersectional groupings at The Trevor Project, we worked across the teams closest to our crisis-intervention programs and the people using them — Research, Crisis Services and Technology. And once the system launches, reach back out to stakeholders and the people interacting with it to collect feedback.

Ultimately, there isn’t a “one-size-fits-all” approach to building intersectional AI. At The Trevor Project, our team has outlined a methodology based on what we do, what we know today and the specific communities we serve. This is not a static approach and we remain open to evolving as we learn more. While other organizations may take a different approach to build intersectional AI, we all have a moral responsibility to construct fairer AI systems, because AI has the power to highlight — and worse, magnify — the unfair biases that exist in society.

Depending on the use case and community in which an AI system exists, the magnification of certain biases can result in detrimental outcomes for groups of people who may already face marginalization. At the same time, AI also has the ability to improve quality of life for all people when developed through an intersectional framework. At The Trevor Project, we strongly encourage tech teams, domain experts and decision-makers to think deeply about codifying a set of guiding principles to initiate industry-wide change — and to ensure future AI models reflect the communities they serve.

17 Dec 2020

Look at what’s happening at day two of TC Sessions: Space 2020

Yesterday, day one of TC Sessions: Space 2020, was (brace yourself) out of this world. Bad puns aside, today’s lineup promises even more in-depth interviews, Q&As and breakout sessions with the top experts in the space industry. It also includes a pitch-off and plenty of networking opportunities to connect with the leading movers and shakers across public, private and defense sectors.

It’s hard to imagine a more complex or expensive industry, and you’ll need all the tips, insight, guidance — and funding — you can get to successfully launch your space startup. Here’s a quick look at just some of day two’s presentations and opportunities designed to help you build a successful business. Check out the complete event agenda to plan your day.

Timely tip: The agenda automatically reflects the time zone in which you’re currently located.

Fast Money — Advancing Space Technology with NASA SBIR: Learn about the Small Business Innovation Research (SBIR) and the Small Business Technology Transfer (STTR) programs powered by NASA. Jenn Gustetic (NASA) and Lydia Etters (The Aerospace Corporation)

Starburst x TechCrunch — Pitch Me to the Moon: In which 10 promising early-stage space startups pitch their innovations live to a panel of high-profile judges from across the industry (including reps from Starburst and TechCrunch).

From Idea to Orbit: Rocket Lab has quickly become one of the most sought-after launch providers in the world. Founder and CEO Peter Beck will discuss the company’s approach to making space more accessible, from cheaper, faster launches to its new satellite platform.

Bridging Today and Tomorrow’s Tech: Corporate VC funds are a key source of investment for space startups, in part because they often involve partnerships that help generate revenue as well, and because they understand the timelines involved. We’ll talk about how they fit in with more standard venture to power the ecosystem. Don’t forget to submit your questions for the panel! Meagan Crawford (SpaceFund) and J. Christopher Moran (Lockheed Martin Ventures).

University Showcase — Boldly Innovating in Space, for Space (Part Two): Part one was fascinating, and you don’t want to miss part two of this university research showcase. You’ll hear scientists and academics from USC, MIT, UCLA, ASU and Caltech — all in partnership with The Aerospace Corporation — share insights on their space research and highlight a range of emerging space technologies. Learn how those technologies are evaluated and integrated into government missions with NASA, NOAA and the Air Force. Randy Villahermosa (The Aerospace Corporation), Kerri Cahoy (MIT), Bethany Ehlmann (Caltech), Craig Hardgrove (Arizona State University) and Dr. James M. Weygand (UCLA).

No FOMO: It’s not too late to buy a pass to get access to all of yesterday’s content and connect immediately with the global space community — for as little as $50.

17 Dec 2020

Spryker raises $130M at a $500M+ valuation to provide B2Bs with agile e-commerce tools

Businesses today feel, more than ever before, the imperative to have flexible e-commerce strategies in place, able to connect with would-be customers wherever they might be. That market driver has now led to a significant growth round for a startup that is helping the larger of these businesses, including those targeting the B2B market, build out their digital sales operations with more agile, responsive e-commerce solutions.

Spryker, which provides a full suite of e-commerce tools for businesses — starting with a platform to bring a company’s inventory online, through to tools to analyse and measure how that inventory is selling and where, and then adding on voice commerce, subscriptions, click & collect, IoT commerce, and other new features and channels to improve the mix — has closed a round of $130 million.

It plans to use the funding to expand its own technology tools, as well as grow internationally. The company makes revenues in the mid-8 figures (so, around $50 million annually) and some 10% of its revenues currently come from the U.S. The plan will be to grow that business as part of its wider expansion, tackling a market for e-commerce software that is estimated to be worth some $7 billion annually.

The Series C was led by TCV — the storied investor that has backed giants like Facebook, Airbnb, Netflix, Spotify and Splunk, as well as interesting, up-and-coming e-commerce “plumbing” startups like Spryker, Relex and more. Previous backers One Peak and Project A Ventures also participated.

We understand that this latest funding values Berlin-based Spryker at over $500 million.

Spryker today has around 150 customers, global businesses that run the gamut from recognised fashion brands through to companies that, as Boris Lokschin, who co-founded the company with Alexander Graf (the two share the title of co-CEOs) put it, are “hidden champions, leaders and brands you have never heard about doing things like selling silicone isolations for windows.” The roster includes Metro, Aldi Süd, Toyota and many others.

The plan will be to continue to support and grow its wider business building e-commerce tools for all kinds of larger companies, but in particular Spryker plans to use this tranche of funding to double down specifically on the B2B opportunity, building more agile e-commerce storefronts and in some cases also developing marketplaces around that.

One might assume that in the world of e-commerce, consumer-facing companies need to be the most dynamic and responsive, not least because they are facing a mass market and all the whims and competitive forces that might drive users to abandon shopping carts, look for better deals elsewhere, or simply get distracted by the latest notification of a TikTok video or direct message.

For consumer-facing businesses, making sure they have the latest adtech, marketing tech, and tools to improve discovery and conversion is a must.

It turns out that business-facing businesses are no less immune to their own set of customer distractions and challenges — particularly in the current market, buffeted as it is by the global health pandemic and its economic reverberations. They, too, could benefit from testing out new channels and techniques to attract customers, help them with discovery and more.

“We’ve discovered that the model for success for B2B businesses online is not about different people, and not about money. They just don’t have the tooling,” said Graf. “Those that have proven to be more successful are those that are able to move faster, to test out everything that comes to mind.”

Spryker positions itself as the company to help larger businesses do this, much in the way that smaller merchants have adopted solutions from the likes of Shopify.

In some ways, it almost feels like the case of Walmart versus Amazon playing itself out across multiple verticals, and now in the world of B2B.

“One of our biggest DIY customers [which would have previously served a mainly trade-only clientele] had to build a marketplace because of restrictions in their brick and mortar assortment, and in how it could be accessed,” Lokschin said. “You might ask yourself, who really needs more selection? But there are new providers like Mano Mano and Amazon, both offering millions of products. Older companies then have to become marketplaces themselves to remain competitive.”

It seems that even Spryker itself is not immune from that marketplace trend: part of the funding will be to develop a technology AppStore, where it can itself offer third-party tools to companies to complement what it provides in terms of e-commerce tools.

“We integrate with hundreds of tech providers, including 30-40 payment providers, all of the essential logistics networks,” Lokschin said.

Spryker is part of the category of e-commerce businesses known as “headless” providers — meaning its tools are consumed by way of an API-based architecture and other easy-to-integrate modules, delivered through a PaaS (cloud-based platform-as-a-service) model.

It is not alone in that category: a number of others playing on the same concept have emerged in both Europe and the U.S. They include Commerce Layer in Italy; another startup out of Germany called Commercetools; and Shogun in the U.S.

Spryker’s argument is that, as a newer company (founded in 2018), it has a more up-to-date stack that puts it ahead of both older startups and incumbents like SAP and Oracle.

That is part of what attracted TCV and others in this round, which was closed earlier than Spryker had even planned to raise (it was aiming for Q2 of next year) but came on good terms.

“The commerce infrastructure market has been a high priority for TCV over the years. It is a large market that is growing rapidly on the back of e-commerce growth,” said Muz Ashraf, a principal at TCV, to TechCrunch. “We have invested across other areas of the commerce stack, including payments (Mollie, Klarna), underlying infrastructure (Redis Labs) as well as systems of engagement (ExactTarget, Sitecore). Traditional offline vendors are increasingly rethinking their digital commerce strategy, more so given what we are living through, and that further acts as a market accelerant.

“Having tracked Spryker for a while now, we think their solution meets the needs of enterprises who are increasingly looking for modern solutions that allow them to live in a best-of-breed world, future-proofing their commerce offerings and allowing them to provide innovative experiences to their consumers.”

17 Dec 2020

Is rising usage driving crypto’s recent price boom?

Everything is dumb until it works.

As 2020 comes to a close, the cryptocurrency world is experiencing another late-year surge of consumer interest as prices climb in value. Bitcoin is over $23,000 as I write to you, an all-time high. Ethereum’s cryptocurrency has recovered sharply as well, returning to mid-2018 prices.


The Exchange explores startups, markets and money. Read it every morning on Extra Crunch, or get The Exchange newsletter every Saturday.


These gains have created a huge amount of wealth for crypto holders. According to CoinMarketCap, after falling under $140 billion in mid-March during the market selloff surrounding the beginning of COVID-19’s battering of America, the value of all cryptos has surged to nearly $659 billion.

It still has some way to go before it crests the record of around $830 billion set back in January 2018. But your Twitter feed is once again rife with notes about crypto and some of your friends have become insufferable once again.

The tweets and the friends have something of a point. This morning I went around the Internet with a basket, collecting information about active bitcoin wallets, the distributed app (Dapp) market, the burgeoning decentralized finance (DeFi) space and other aspects to get a picture of what’s going on beyond mere price records.

After all, the price of every damn thing is inflated today, so seeing bitcoin set an all-time-high felt more appropriate than strange. Does the data show that there’s activity behind the valuation hype?

A quick look around the world of crypto

We have a few metrics to peek at, but let’s start with some old bitcoin-flavored favorites.

  • Unique bitcoin addresses used, via Blockchain.info: Modestly bullish.

Per the charting section of Blockchain.info, bitcoin unique addresses used — a proxy for the coin’s popularity — is up some in recent weeks, and up more generally in 2020. It remains below historical highs.
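For readers who want to pull the same series themselves, Blockchain.info exposes its charts as JSON; the payload shape below matches what its charts API appears to return, and the numbers are made up for illustration. A sketch of comparing a recent window against the period average:

```python
import json
from statistics import mean

# Payload shape assumed from Blockchain.com's charts API
# (e.g. /charts/n-unique-addresses?format=json); values are illustrative.
sample = json.loads("""
{"values": [
  {"x": 1600000000, "y": 550000},
  {"x": 1600086400, "y": 560000},
  {"x": 1600172800, "y": 610000},
  {"x": 1600259200, "y": 650000}
]}
""")

series = [point["y"] for point in sample["values"]]
recent = mean(series[-2:])    # average of the last two observations
overall = mean(series)        # average across the whole window

# "Modestly bullish": recent usage sits above the period average
print(recent, overall, recent > overall)
```

Pointing this at a real response from `https://api.blockchain.info/charts/n-unique-addresses?timespan=1year&format=json` (endpoint parameters assumed) would let you check the “modestly bullish” read against live data.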

17 Dec 2020

Google now lets you virtually try on makeup using AR, shop from influencer videos

If you’ve ever used a Snapchat or Instagram filter, you know that one of the popular use cases for AR (augmented reality) is to change up your appearance with virtual makeup — like a different shade of lipstick or eyeshadow, for example. Today, Google is moving into this space as well, with the launch of an AR-powered cosmetics try-on experience on Google Search. The company is working in partnership with top brands like L’Oreal, Estee Lauder, MAC Cosmetics, Black Opal and Charlotte Tilbury to allow consumers to try on makeup shades across a range of models with various skin tones, or even on themselves using the front-facing camera on their mobile devices.

Google created the new feature with help from data partners ModiFace, which offers AR tech to beauty brands, and Perfect Corp, makers of the popular YouCam Makeup app and other AR beauty tech.

Image Credits: Google

Now, when consumers search for a particular lipstick or eyeshadow product on Google — like “L’Oreal’s Infallible Paints Metallic Eyeshadow,” for example — they’ll come across the virtual try-on shopping experience at the top of their search results. From here, they can click through photos of models representing a range of skin tones to compare the shades and find the right product for them.

To see the product on yourself, another new option will enable you to do so with your phone’s camera. In this interface, a variety of shades are shown at the bottom of the camera feed for you to tap through — much like a social media filter would offer. The experience is similar to YouTube’s AR feature for makeup try-on launched last year.

Image Credits: Google

Except in Google’s case, it’s not trying to beautify your image for the purposes of social sharing. Its aim is to connect consumers to brands for the purpose of making sales, as part of its overall investment in online shopping and, of course, its broader online advertising business.

However, the AR try-on experience itself is not considered an ad format, Google tells us, and the participating brands are not paying Google to be a part of the feature. Instead, this is a continuation of Google’s moves to open up the Google Shopping destination to more retailers. In past years, the Shopping tab had been limited to paid product listings. But earlier this year, Google announced it would make the majority of its retail listings on the Shopping tab free.

This move came at a critical time for retailers, whose businesses were being significantly impacted by physical retail store closures due to the pandemic. But Google wasn’t making the shift for altruistic reasons. The reality was that by limiting Shopping to paid ads, Google’s Shopping search results were limited, too. They also often led to out-of-stock items or had other data quality issues. Meanwhile, Amazon was ramping up its advertising business, threatening to cut into Google’s ad revenue.

Plus, many younger consumers today don’t shop on Google at all. They have been finding out about products from social media, then clicking through direct links to retailers to make purchases or even transacting directly on social platforms like Facebook or Instagram without leaving the app.

Google is entering this influencer-driven shopping market today, as well.

In addition to the AR try-on, Google will now show recommendations from beauty, apparel and home and garden enthusiasts and experts, who will talk about their favorite products in videos found in Google Shopping. For example, you can hear from professional makeup artist Jonet about makeup looks or holiday gifts from Homesick Candles.

This feature comes from Shoploop, a product that has now graduated from Area 120, Google’s in-house incubator. It competes with efforts in video-based shopping from Facebook, Instagram and, more recently, TikTok.

The launches arrive at a time when beauty brand sales have been depressed by the pandemic, not only because of store closures and a work-from-home lifestyle, but also because it doesn’t make as much sense to wear makeup when half your face is under a mask.

The new AR try-on and the influencer videos are available in the Google app on iOS and Android.

17 Dec 2020

Europe clears Google-Fitbit with a ten-year ban on using health data for ads

Europe has greenlit Google’s $2.1BN acquisition of fitness wearable maker Fitbit, applying a number of conditions intended to shrink competition concerns over letting it gobble a major cache of health and wellness data following months of regulatory scrutiny of the deal.

While Google announced its plan to buy Fitbit over a year ago, it only notified the transaction to the Commission on June 15, 2020 — meaning it’s taken half a year to be given a caveated go-ahead by Europe. It is also now facing formal antitrust charges on its home turf — from more than one angle (though not related to Fitbit).

Under the terms of the EU’s clearance for ‘Gitbit’, Google has committed not to use the Fitbit data of users in the European Economic Area for ad targeting purposes for a ten-year period.

It says it will maintain a technical separation, keeping health and wellness data collected via Fitbit wearables in a data silo apart from other Google data.

It has also committed to ensuring that regional users have an “effective choice” to grant or deny the use of health and wellness data stored in their Google Account or Fitbit Account by other Google services — such as Google Search, Google Maps, Google Assistant, and YouTube. But it’ll be interesting to see how much dark pattern design gets applied there.

Interestingly, the Commission says it may decide to extend the duration of the decade-long Ads Commitment by up to an additional ten years — if such an extension can be justified.

It further notes that clearance is conditional upon full compliance with all the commitments — the implementation of which will be monitored by a trustee, who must be appointed before the transaction can close.

This yet-to-be-appointed-person will have what the Commission couches as “far-reaching competences” — including access to “Google’s records, personnel, facilities or technical information”.

So EU regulators are taking a ‘trust but verify’ approach to letting this big tech merger steamroller on.

There are further competition focused commitments, too.

Here, Google has agreed to maintain access to Fitbit users’ data via a web API without charging third party developers for the access (of course subject to user consent).

It has also agreed to a number of commitments related to rival wearable makers’ access to Android APIs — saying it will continue free licensing for all core functionality competing devices need to plug into its dominant smartphone OS, Android.

Improvements in device functionality are covered under the agreement, per the Commission, so the intention is that rival wearable makers can keep innovating without the risk that building a better, more capable device gets them shut out of the Android ecosystem.

Google must also maintain API support in the Android Open Source Project version of its mobile platform.

Another concession the Commission has extracted from Google during this half year of investigation and negotiation is a pledge not to circumvent the requirement to support rivals’ devices accessing Android via the API by degrading the user experience (such as by displaying warnings or error messages).

That’s — frankly — a pretty dysfunctional signal for a regulatory clearance to have to send. And it highlights the level of mistrust that has built up about how Google’s business operates.

Which in turn raises the existential question for EU regulators of why they are allowing themselves to bend over and let Google-Fitbit go ahead. Unsurprisingly, then, the Commission’s PR sounds a tad defensive — with EU lawmakers writing that the decision “is without prejudice to the Commission’s efforts to ensure fair and contestable markets in the digital sector, notably through the recently proposed Digital Markets Act” (DMA).

It also notes that the monitoring trustee will be entitled to share reports that it provides to the Commission with Google’s lead data protection supervisor, the Irish Data Protection Commission. (Though given the massive big tech-related case backlog on its desk — including a number of investigations into other elements of Google’s business — that’s unlikely to cause Mountain View any extra sleepless nights.)

The Commission does also say that commitments secured from Google include “a fast track dispute resolution mechanism that can be invoked by third parties”. So it’s clearly trying to go the extra mile to justify greenlighting further consolidation in a consumer digital services space that Google already massively dominates — at a time when US lawmakers are headed in the opposite direction. So, um…

Civil society in Europe (and beyond) has been raising a massive clamour about the Google-Fitbit acquisition ever since it was announced — urging the bloc’s regulators to stop the tech giant from gobbling Fitbit’s cache of health data, unless or until human rights protections can be guaranteed.

Today the Commission has sidestepped those wider rights concerns.

At best it believes it’s kicked the can down the road a decade or, at most, two. And by 2030 (or 2040) it will hope that the rules it’s just proposed to put strictures on digital gatekeepers like Google will be in a position to keep future abuse in check.

The EU’s oft-stated preference is to regulate tech giants, not to break up their empires — or, as it turns out, stand in the way of further empire expansion.

Commenting on the Google-Fitbit clearance in a statement, Vestager said: “We can approve the proposed acquisition of Fitbit by Google because the commitments will ensure that the market for wearables and the nascent digital health space will remain open and competitive. The commitments will determine how Google can use the data collected for ad purposes, how interoperability between competing wearables and Android will be safeguarded and how users can continue to share health and fitness data, if they choose to.”

Taking questions from a European parliament committee last week, Vestager signalled the inexorable looming clearance of Gitbit — saying the US and Europe have a different approach to dealing with market-dominating tech giants. “In Europe we do not have a ban of monopolies,” she told MEPs. “They have a different legal basis in the U.S. We would say you’re more than welcome to be successful but with success comes responsibility — which is why we have article 102 [against abusing a dominant position].”

It’s also why the Commission has felt the need to propose new regulation to strengthen competition enforcement in digital markets — though it’ll most likely be years before the DMA is adopted.

And in the meanwhile EU regulators are letting Google expand its dominance of people’s private information by bagging up Fitbit’s treasure trove of sensitive health data — for full exploitation later.

So plenty will say the Commission is just fiddling round the margins and giving big tech a bypass for competition enforcement.

17 Dec 2020

2020’s top 10 enterprise M&A deals totaled a staggering $165B

While 2020 won’t be remembered fondly by many of us for much of anything, it was a blockbuster year for enterprise M&A with the top 10 deals totaling an astounding $165.2 billion.

This is the third straight year I’ve done this compilation. Last year the number was $40 billion. The year prior it was $87 billion. Those numbers pale in comparison to 2020’s result.

Last year’s biggest deal — Salesforce buying Tableau for $15.7 billion — would have only been good for fifth place on this year’s list. And last year’s fourth largest deal, where VMware bought Pivotal for $2.7 billion, wouldn’t have even made this year’s list at all.

The 2020 number was lifted by four chip company deals totaling $106 billion alone. Consider that the largest of these deals at $40 billion matched last year’s entire list. But let’s not forget the software company acquisitions, which accounted for the remainder, three of which were via private equity deals.

It’s worth noting that the $165.2 billion figure doesn’t include the Oracle-TikTok debacle, which remains for now in regulatory limbo and may never emerge from it. Nor does it include two purely fintech deals — Morgan Stanley acquiring E-Trade for $13 billion or Intuit snagging Credit Karma for $7.1 billion — but we did include the $5.3 billion Visa-Plaid deal because it involved an enterprise-y API company and we felt it fit our criteria.

Keep in mind as you go through this year’s list that it appears to be an outlier year in terms of total deal flow. Most years have maybe one or two megadeals, which I would define as over $10 billion. There were six this year. And there were a host of unlisted deals worth between $1 billion and $3.2 billion, several of which would have made it to the list in quieter years.

Without further ado, here are this year’s top 10 M&A deals, organized from smallest to largest:

10. Vista snags Pluralsight for $3.5B

This deal happened just this week as we were writing the story, vaulting into 10th place past the $3.2 billion Twilio-Segment deal. Vista has been active as always, and it has added Pluralsight, an online education platform for IT pros, with plans to take the company private again. At a time when more people are online, this deal seems like a wise move.

9. KKR acquires Epicor for $4.7B

This was one of those under-the-radar private equity deals, but one with a bushel of money changing hands. Epicor, hardly a household name, is a mature ERP company dating back to the early 1970s. The company has been on a rocky financial road for much of the 21st century. This could be one of those deals where KKR sees a way to squeeze life from maintenance contracts. Otherwise this one is hard to figure.

8. Insight Partners nabs Veeam for $5B

In yet another private equity deal, Insight acquired Veeam, a cloud data backup and recovery startup based in Switzerland, for $5 billion. It was one of the earliest deals of 2020 and set the tone for the year. The firm had previously invested $500 million in Veeam and apparently liked what it saw enough to buy the company outright. Unlike the Epicor deal, Insight probably plans to invest in the company with an end goal of going public or flipping it for a profit at some point.