Author: azeeadmin

07 Sep 2018

Apple Music launches a ‘Top Charts’ playlist series

Apple Music is rolling out a new playlist series that will feature the Top 100 songs on Apple Music globally and for those countries where Apple Music is available. Because they’re playlists, users will be able to add these top charts for their country or the Top 100 Global songs to their library so they can stream them any time, or listen offline.

The feature was first reported by Rolling Stone, which was given a preview of the changes by Apple.

At launch, there are 116 charts in total, including the Top 100 Global and one for each Apple Music market. Many countries will have access to all of these new Top 100 playlists, but availability will vary, we understand.

What’s also interesting about the top chart playlists is that they’ll be updated daily at 12:00 AM PT based on Apple Music streams, which keeps them fresh.

Rolling Stone’s report indicates the release of these charts is due to the growing importance of streaming numbers. Artists and their managers, as well as labels and scouts, tend to reference top streaming charts in the hunt for new talent, it says. And the industry has adapted, too, by weighting paid streaming more heavily than free.

On that front, Apple Music’s dominance in North America means its numbers, in particular, are important to track.

Apple Music, now with 50 million paid subscribers worldwide, is currently ahead of Spotify in the North American market, according to comments made by CEO Tim Cook on the latest earnings call.

“We took the leadership position in North America during the quarter and we have the leadership position in Japan, and in some of the markets that we’ve been in for a long period of time,” he said in July.

Spotify is still ahead on the worldwide stage, with 83 million paid users. 

However, it’s also worth pointing out that these new top charts aren’t just launching as a static section of the Apple Music app – they’re dynamic playlists.

Nor will Apple’s new Top Charts playlists replace the existing Top 200 Songs chart, which remains available today.

Playlists are an important battleground between the major streaming services, with Spotify focusing heavily on personalization with playlists like its flagship Discover Weekly, plus Release Radar, Daily Mixes (and a newer variation, Your Daily Car Mix), Your Summer Rewind, and Time Capsule.

Apple Music, meanwhile, offers users a Favorites playlist, a New Music Mix and a Chill Mix, and is rolling out a Friends Mix in iOS 12.

The feature is available today on Apple Music, where you can check out the new playlists for yourself.

07 Sep 2018

Dozens of popular iPhone apps caught sending user location data to monetization firms

A group of security researchers say dozens of popular iPhone apps are quietly sharing the location data of “tens of millions of mobile devices” with third-party data monetization firms.

Almost all of them, like weather and fitness apps, require access to a user’s location data to work properly, but many share that data as a way to generate revenue for free-to-download apps.

In many cases, the apps send precise locations and other sensitive, identifiable data “at all times, constantly,” and often with “little to no mention” that location data will be shared with third-parties, say security researchers at the GuardianApp project.

“I believe people should be able to use any app they wish on their phone without fear that granting access to sensitive data may mean that this data will be quietly sent off to some entity who they do not know and do not have any desire to do business with,” said Will Strafach, one of the researchers.

Using tools to monitor network traffic, the researchers found 24 popular iPhone apps that were collecting location data — everything from Bluetooth beacons to Wi-Fi network names — to know where a person is and where they visit. These data monetization firms also collect other device data, such as accelerometer readings, battery charge status and cell network names.
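
To make that concrete, here is a minimal sketch, in Python, of the kind of record the researchers describe such an SDK assembling and transmitting. The endpoint and every field name here are hypothetical, invented for illustration; they are not taken from any specific firm’s code.

    import json
    import urllib.request

    # Hypothetical payload: precise coordinates bundled with the ambient
    # device data described above. All names are invented for illustration.
    payload = {
        "lat": 37.7793,                        # precise GPS latitude
        "lon": -122.4193,                      # precise GPS longitude
        "wifi_ssid": "CoffeeShopGuest",        # nearby Wi-Fi network name
        "bt_beacons": ["b9407f30-f5f8-466e"],  # nearby Bluetooth beacon IDs
        "battery_pct": 0.62,                   # battery charge status
        "carrier": "ExampleCell",              # cell network name
        "ad_id": "AAAAAAAA-BBBB-CCCC-DDDD",    # advertising identifier
    }

    req = urllib.request.Request(
        "https://collector.example.com/v1/locations",  # hypothetical endpoint
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    # urllib.request.urlopen(req)  # a real SDK would fire this "at all times, constantly"

Multiply a record like that by every app launch or background wake-up across tens of millions of devices, and you get the databases described below.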

In exchange for the data, these firms often pay app developers to embed their collection code, growing the firms’ databases and, often, powering ads based on a person’s location history.

But although many claim they don’t collect personally identifiable information, Strafach said that latitude and longitude coordinates can pin a person to a house or their work.

To name a few:

ASKfm, a teen-focused anonymous question-and-answer app, has 1,400 ratings on the Apple App Store and touts tens of millions of users. It asks for access to a user’s location that “won’t be shared with anyone.” But the app sends that location data to two data firms, AreaMetrics and Huq. When reached, the app maker said it believes its location collection practices “fit industry standards, and are therefore acceptable for our users.”

NOAA Weather Radar has over 266,000 reviews and has millions of downloads. Access to your location “is used to provide weather info.” But an earlier version of the app from March was sending location data to three firms, Factual, Sense360 and Teemo. The code has since been removed. A spokesperson for Apalon, which built the app, said it “conducted a limited, brief test with a few of these providers” earlier this year.

Homes.com is a popular app that asks you to switch on your location to help “find nearby homes.” But the code, thought to be left over from an older version, still sends precise coordinates to AreaMetrics. The app maker said it used AreaMetrics “for a short period” last year, but that the code has since been deactivated.

Perfect365, an augmented reality beauty app with over 100 million users, asks for location to “customize your experience based on your location and more,” and refers users to its privacy policy for more — which does state that location data will be used for advertising. The app was briefly pulled after a BuzzFeed News story earlier this year highlighted the researchers’ findings, but returned to the App Store days later. The current version of the app contains code for eight separate data monetization firms. The app maker did not return a request for comment.

And the list goes on — including over a hundred Sinclair-owned local news and weather apps, which share location data with Reveal, a data tracking and monetization firm that says it helps the media giant bolster its sales by “providing advertisers with target audiences.”

That can quickly become a lucrative business for developers with popular apps and monetization firms alike, some of which collect billions of locations each day.

Most of the data monetization firms deny any wrongdoing and say that users can opt out at any time. Most also said they require app developers to explicitly state that they are collecting and sending data to third-party firms.

The team’s research shows that those requirements are almost never verified.

Reveal said it requires customers to “state the use cases for location data in their privacy policy” and that users can opt out at any time. Huq, like Reveal, said it carries out “regular checks on our partner apps to ensure that they have implemented” measures that explain the company’s services. AreaMetrics, which primarily collects Bluetooth beacon data from public areas like coffee shops and retail stores, says it has “no interest” in receiving personal data from users.

Sense360 said the data it collects is anonymous and that it requires apps to get explicit consent from their users, though Strafach said few of the apps he examined contained text seeking that consent. The company did not answer a specific question about why it no longer works with certain apps. Wireless Registry said it also requires that apps seek consent from users, but would not comment on the security measures it uses to ensure user privacy. And in remarks, inMarket said it follows advertising standards and guidelines.

Cuebiq claims to use an “advanced cryptography method” to store and transmit data, but Strafach said he found “no evidence” that any data was scrambled. The company says it’s not a “tracker,” and that while some app developers look to monetize their users’ data, most use it for insights. Factual said it uses location data for advertising and analytics, but said it must obtain in-app consent from users.

When reached, Teemo did not answer our questions. SafeGraph, Mobiquity and Fysical did not respond to requests for comment.

“None of these companies appear to be legally accountable for their claims and practices, instead there is some sort of self-regulation they claim to enforce,” said Strafach.

He said there isn’t much users can do, but limiting ad tracking in the iPhone’s privacy settings can make it more difficult for location trackers to identify users.

Apple’s crackdown on apps that don’t have privacy policies kicks in next month. But given how few people read them in the first place, don’t expect apps to change their behavior any time soon.

07 Sep 2018

Commons Clause stops open-source abuse

There’s a dark cloud on the horizon. The behavior of cloud infrastructure providers, such as Amazon, threatens the viability of open source.

During 13 years as a venture investor, I have invested in the companies behind many open-source projects.

Open source has served society, and open-source business models have been successful and lucrative. Life was good.

Amazon’s behavior

I admire Amazon’s execution. In the venture business we are used to the large software incumbents (such as IBM, Oracle, HP, Compuware, CA, EMC, VMware, Citrix and others) being primarily big sales and distribution channels, which need to acquire innovation (i.e. startups) to feed their channel. Not Amazon. In July 2015, The Wall Street Journal quoted me as saying, “Amazon executes too well, almost like a startup. This is scary for everyone in the ecosystem.” That month, I wrote Fear The Amazon Juggernaut on investor site Seeking Alpha. AMZN is up 400 percent since I wrote that article. (I own AMZN indirectly.)

But to anyone other than its customers, Amazon is not a warm and fuzzy company. Numerous articles have detailed its bruising and cutthroat culture. Why would its use of open source be any different?

Go to Amazon Web Services (AWS) and hover over the Products menu at the top. You will see numerous open-source projects that Amazon did not create, but runs as-a-service. These provide Amazon with billions of dollars of revenue per year.

For example, Amazon takes Redis (the most loved database in StackOverflow’s developer survey), gives very little back, and runs it as a service, re-branded as AWS ElastiCache. Many other popular open-source projects, including Elasticsearch, Kafka, Postgres, MySQL, Docker, Hadoop, Spark and more, have similarly been taken and offered as AWS products.

To be clear, this is not illegal. But we think it is wrong, and not conducive to sustainable open-source communities.

Commons Clause

In early 2018, I gathered together creators, CEOs or chief counsels of two dozen at-scale open-source companies, some of them public, to talk about what to do. In March I spoke to GeekWire about this effort. After a lot of constructive discussion the group decided that rather than beat around the bush with mixing and matching open-source licenses to discourage such behavior, we should create a straightforward clause that prohibits the behavior. We engaged respected open-source lawyer Heather Meeker to draft this clause.

In August 2018, Redis Labs announced its decision to add this rider (i.e. one additional paragraph), known as the Commons Clause, to the liberal open-source license covering certain add-on modules. Redis itself would remain on the permissive BSD license — nothing changed there! But the Redis Labs add-on modules now include the Commons Clause rider, which keeps the source code available while removing the ability to “sell” the modules, where “sell” includes offering them as a commercial service. The goal is to explicitly prevent the bad behavior of cloud infrastructure providers.

Anybody else, including enterprises like General Motors or General Electric, can still do all the things they used to be able to do with the software, even with Commons Clause applied to it. They can view and modify the source code and submit pull-requests to get their modifications into the product. They can even offer the software as-a-service internally for employees. What Commons Clause prevents is the running of a commercial service with somebody else’s open-source software in the manner that cloud infrastructure providers do.

This announcement has — unsurprisingly, knowing the open-source community — prompted spirited responses, both favorable and critical. At the risk of oversimplifying: those in favor view this as a logical and positive evolution in open-source licensing that allows open-source companies to run viable businesses while investing in open-source projects. Michael DeHaan, creator of Ansible, in Why Open Source Needs New Licenses, put one part particularly well:

We see people running open source “foundations” and web sites that are essentially talking heads, spewing political arguments about the definition of “open source” as described by something called “The Open Source Initiative”, which contains various names which have attained some level of popularity or following. They attempt to state that such a license where the source code is freely available, but use cases are limited, are “not open source”. Unfortunately, that ship has sailed.

Those neutral or against point out that the Commons Clause makes software not open source (which is accurate), that making parts of the code base proprietary is against the ethos of open source, and that Redis Labs must be desperate and having trouble making money.

First, do not worry about Redis Labs. The company is doing very, very well. And Redis is stronger, more loved and more BSD than ever before.

More importantly, we think it is time to reexamine the ethos of open source in today’s environment. When open source became popular, it was designed for practitioners to experiment with and build on, while contributing back to the community. No company was providing infrastructure as a service. No company was taking an open-source project, re-branding it, running it as a service, keeping the profits and giving very little back.

Our view is that open-source software was never intended for cloud infrastructure companies to take and sell. That is not the original ethos of open source. Commons Clause is reviving the original ethos of open source. Academics, hobbyists or developers wishing to use a popular open-source project to power a component of their application can still do so. But if you want to take substantially the same software that someone else has built, and offer it as a service, for your own profit, that’s not in the spirit of the open-source community.

As it turns out in the case of the Commons Clause, that can make the source code not technically open source. But that is something we must live with, to preserve the original ethos.

Apache + Commons Clause

Redis Labs released certain add-on modules as Apache + Commons Clause. Redis Labs made amply clear that the application of Commons Clause made them not open source, and that Redis itself remains open source and BSD-licensed.

Some rabid open-source wonks accused Redis Labs of trying to trick the community into thinking that modules were open source, because they used the word “Apache.” (They were reported to be foaming at the mouth while making these accusations, but in fairness it could have been just drool.)

There’s no trick. The Commons Clause is a rider that is to be attached to any permissive open-source license. Because various open-source projects use various open-source licenses, when releasing software using Commons Clause, one must specify to which underlying permissive open-source license one is attaching Commons Clause.
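As an illustrative sketch only (not the exact legal text), a notice attaching the rider might identify the software, the underlying license and the licensor along these lines:

    Software: example-module
    License: Apache License 2.0, subject to the "Commons Clause" License Condition v1.0
    Licensor: Example Corp.

The rider travels with the underlying license; the pairing, not the clause alone, defines what recipients may do.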

Why not AGPL?

There are two key reasons not to use AGPL in this scenario. AGPL is an open-source license that says you must release to the public any modifications you make when you run AGPL-licensed code as a service.

First, AGPL makes it inconvenient but does not prevent cloud infrastructure providers from engaging in the abusive behavior described above. It simply says that they must release any modifications they make while engaging in such behavior. Second, AGPL contains language about software patents that is unnecessary and disliked by a number of enterprises.

Many of our portfolio companies with AGPL projects have received requests from large enterprises to move to a more permissive license, since the use of AGPL is against those companies’ policies.

Balance

Cloud infrastructure providers are not bad guys or acting with bad intentions. Open source has always been a balancing act. Many of us believe in our customers and peers seeing our source code, making improvements and sharing back. It’s always a leap of faith to distribute one’s work product for free and to trust that you’ll be able to put food on the table. Sometimes, with some projects, a natural balance occurs without much deliberate effort. But at other times, the natural balance does not occur: We are seeing this more and more with infrastructure open source, especially as cloud infrastructure providers seek to differentiate by moving up the stack from commodity compute and storage to higher level infrastructure services.

Revisions

The Commons Clause as of this writing is at version 1.0. There will be revisions and tweaks in the future to ensure that Commons Clause implements its goals. We’d love your input.

Differences of opinion on Commons Clause that we have seen expressed so far are essentially differences of philosophy. Much criticism has come from open-source wonks who are not in the business of making money with software. They have a different philosophy, but that is not surprising, because their job is to be political activists, not build value in companies.

Some have misconstrued that it prevents people from offering maintenance, support or professional services. This is a misreading of the language. Some have claimed that it conflicts with AGPL. Commons Clause is intended to be used with open-source licenses that are more permissive than AGPL, so that AGPL does not have to be used! Still, even with AGPL, few users of an author’s work would deem it prudent to simply disregard an author’s statement of intent to apply Commons Clause.

Protecting open source 

Some open-source stakeholders are confused. Whose side should they be on? Commons Clause is new, and we expected debate. The people behind this initiative are committed open-source advocates, and our intent is to protect open source from an existential threat. We hope others will rally to the cause, so that open-source companies can make money, open source can be viable and open-source developers can get paid for their contributions.

07 Sep 2018

The reality of quantum computing could be just three years away

Quantum computing has moved out of the realm of theoretical physics and into the real world, but its potential and promise are still years away.

Onstage at TechCrunch Disrupt SF, a powerhouse in the world of quantum research and a young upstart in the field presented visions for the future of the industry that illustrated both how far the industry has come and how far the technology has to go.

For both Dario Gil, the chief operating officer of IBM Research and the company’s vice president of artificial intelligence and quantum computing, and Chad Rigetti, a former IBM researcher who founded Rigetti Computing and serves as its chief executive, the moment that a quantum computer will be able to perform operations better than a classical computer is only three years away.

“[It’s] generating a solution that is better, faster or cheaper than you can do otherwise,” said Rigetti. “Quantum computing has moved out of a field of research into now an engineering discipline and an engineering enterprise.”

Considering the more than 30 years that IBM has been researching the technology and the millions (or billions) that have been poured into developing it, even seeing an end of the road is a victory for researchers and technologists.

Achieving this goal, for all of the brainpower and research hours that have gone into it, is hardly academic.

The Chinese government is building a $10 billion National Laboratory for Quantum Information in Anhui province, west of Shanghai, slated to open in 2020. Meanwhile, U.S. public research funding for quantum computing is running at around $200 million per year.

Source: Patin Informatics via Bloomberg News.

One of the reasons why governments, especially, are so interested in the technology is its potential to completely remake the cybersecurity landscape. Some technologists argue that quantum computers will have the potential to crack any type of encryption technology, opening up all of the networks in the world to potential hacking.

Of course, quantum computing is about so much more than security. It will enable new ways of doing things we can’t even imagine, because we have never had this much pure compute power. Think about artificial intelligence and machine learning, or drug development; any compute-intensive operation could benefit from the enormous increase in compute power that quantum computing will bring.

Security may be the Holy Grail for governments, but both Rigetti and Gil say that the industrial chemicals business will be the first place where a potentially radical transformation of a market appears.

What is quantum computing anyway?

To understand quantum computing it helps to understand the principles of the physics behind it.

As Gil explained onstage (and on our site), quantum computing depends on the principles of superposition, entanglement and interference.

Superposition is the notion that a particle can occupy multiple potential states at once. “If you flip a coin it is one of two states,” said Gil, meaning that there’s a single outcome that can be observed. But if someone were to spin a coin, they’d see a number of potential outcomes.

Once you’ve got one particle in superposition, you can add another and pair the two thanks to a phenomenon called quantum entanglement. “If you have two coins where each one can be in superposition, then you can have measurements taken” of the difference between both.

Finally, there’s interference, where the two particles can be manipulated by an outside force to change them and create different outcomes.

“In classical systems you have these bits of zeros and ones and the logical operations of the ands and the ors and the nots,” said Gil. “The classical computer is able to process the logical operations of bits expressed in zeros and ones.”

“In an algorithm you put the computer in a superposition state,” Gil continued. “You can take the amplitudes and states and interfere them, and the algorithm is the thing that interferes… I can have many, many states representing different pieces of information, and then I can interfere with it to get these data.”
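
To ground those three ideas, here is a minimal, self-contained Python sketch (a toy state-vector simulation written for this article, not IBM’s or Rigetti’s actual software) showing a qubit in superposition, a pair of entangled qubits, and interference steering the outcome back to a definite answer:

    import numpy as np

    zero = np.array([1, 0], dtype=complex)        # the |0> state
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

    # Superposition: like the spinning coin, H|0> puts equal
    # amplitude on |0> and |1>, a 50/50 measurement.
    plus = H @ zero
    print(np.abs(plus) ** 2)                      # [0.5 0.5]

    # Entanglement: a CNOT on the superposed qubit and a fresh |0>
    # yields the Bell state; measurements only ever agree (00 or 11).
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]], dtype=complex)
    bell = CNOT @ np.kron(plus, zero)
    print(np.abs(bell) ** 2)                      # [0.5 0.  0.  0.5]

    # Interference: a second Hadamard makes the two paths to |1>
    # cancel, returning the qubit deterministically to |0>.
    print(np.abs(H @ plus) ** 2)                  # [1. 0.]

An algorithm, in Gil’s terms, choreographs those cancellations so that wrong answers interfere away and the right one remains.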

These operations are incredibly hard to sustain. In the early days of quantum computing research, superconducting devices held a qubit’s state for only about a nanosecond before it decayed into a traditional bit of data. Those coherence times have since increased to between 50 and 100 microseconds, which enabled IBM and Rigetti to open up their platforms to researchers and others for experimentation (more on that later).
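
Why those microseconds matter: the coherence window caps how many gate operations a program can run before the answer degrades. A back-of-the-envelope budget, assuming (purely for illustration) a per-gate time of 50 nanoseconds:

    coherence_s = 100e-6  # ~100 microseconds, the upper figure above
    gate_s = 50e-9        # assumed per-gate time; real values vary by hardware

    print(int(coherence_s / gate_s))  # ~2000 operations before the state decays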

The physical quantum computer

As one can imagine, dealing with quantum particles is a delicate business, so the computing operations have to be carefully controlled. At the base of the machine is what basically amounts to a huge freezer that maintains a temperature in the device of 15 millikelvin — near absolute zero and about 180 times colder than interstellar space, which sits at roughly 2.7 kelvin.

“These qubits are very delicate,” said Gil. “Anything from the outside world can couple to it and destroy its state and one way to protect it is to cool it.”

Wiring for the quantum computer is made of superconducting coaxial cables. The inputs to the computer are microwave pulses that manipulate the particles, creating signals that are then interpreted by the computer’s operators.

Those operators used to require a degree in quantum physics. But both IBM and Rigetti have been working on developing tools that can enable a relative newbie to use the tech.

Quantum computing in the “cloud”

Even as companies like IBM and Rigetti bring the cost of quantum computing down from tens of millions of dollars to roughly $1 million to $2 million, these tools likely will never become commodity hardware that a consumer buys to use as a personal computer.

Rather, as with most other computing these days, quantum computing power will be provided as a service to users.

Indeed, Rigetti announced onstage a new hybrid computing platform that provides computing services to help the industry reach quantum advantage — that tipping point at which quantum is commercially viable — and to let industries explore the technology and acclimatize to the ways typical operations could be disrupted by it.

“A user logs on to their own device and uses our software development kit to write a quantum application,” said Rigetti. “That program is sent to a compiler and kicks off an optimization kit that runs on a quantum and a classical computer… This is the architecture that’s needed to achieve quantum advantage.”
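
That hybrid loop is easy to picture in code: a classical optimizer proposes parameters, the quantum side runs a parameterized program and returns a measured score, and the result drives the next proposal. A toy Python version follows, simulating the quantum step with the same state-vector math as above; it is illustrative only, not Rigetti’s actual SDK:

    import numpy as np

    def quantum_eval(theta):
        # Stand-in for the quantum computer: prepare Ry(theta)|0> and
        # report the probability of measuring |1>. On real hardware this
        # would be a compiled program executed over many shots.
        state = np.array([np.cos(theta / 2), np.sin(theta / 2)])
        return state[1] ** 2

    # Classical side: a crude search for the theta that maximizes P(|1>).
    theta, step = 0.3, 0.1
    for _ in range(200):
        if quantum_eval(theta + step) >= quantum_eval(theta):
            theta += step      # keep climbing
        else:
            step *= -0.5       # overshot: reverse direction, shrink the step
    print(round(theta, 3))     # ~3.142, i.e. theta converges to pi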

Both IBM and Rigetti — and a slew of other competitors — are preparing users for accessing quantum computing opportunities on the cloud.

IBM’s quantum machines have performed millions of quantum operations requested by users in over 100 countries around the world.

“In a cloud-first era I’m not sure the economic forces will be there that will drive us to develop the miniaturized environment in the laptop,” Rigetti said. But the ramifications of the technology’s commercialization will be felt by everyone, everywhere.

“Quantum computing is going to change the world and it’s all going to come in our lifetime, whether that’s two years or five years,” he said. “Quantum computing is going to redefine every industry and touch every market. Every major company will be involved in some capacity in that space.”

07 Sep 2018

Five-camera phones are coming soon, because sure, why not?

Peer into the future — or a future, at least. Having otherwise run out of things to improve on, Nokia brand licensee HMD is apparently bringing a five-camera phone to market (well, six, probably, if you count the other side). Because three or four would be taking the easy way out.

The leaked image of the upcoming phone lines up with sketches and a recentish patent from lens maker Zeiss. The cameras are arranged in a circular array, along with the flash, leapfrogging that new Huawei handset that settled for a mere three.

So, why five cameras? The obvious answer is why not, because whatever, I guess. I mean, we only go around this crazy planet of ours once. The more specific answer is a bit more difficult to suss out from the image alone, but going with that number is one way not to have to choose between things like a zoom lens and a secondary monochrome sensor.

As has been noted, Nokia (the actual Nokia, that is) had a history of innovating on the imaging front, from that 41-megapixel camera phone to the stupidly expensive Ozo VR rig. So at the very least, this could be a nice little homage to the Nokia of yore. How you’ll actually hold the thing without touching the lenses every time, on the other hand, is another question entirely.

07 Sep 2018

Deep-linking startup Branch is raising more than $100M at a unicorn valuation

Branch, the deep-linking startup backed by Andy Rubin’s Playground Ventures, will enter the unicorn club with an upcoming funding round.

The four-year-old company, which helps brands create links between websites and mobile apps, has authorized the sale of $129 million in Series D shares, according to sources and confirmed by PitchBook, which tracks venture capital deals. The infusion of capital values the company at roughly $1 billion.

In an e-mail this morning, Branch CEO Alex Austin declined to comment.

The Redwood City-based startup closed a $60 million Series C led by Playground in April 2017, bringing its total equity raised to $113 million. It’s also backed by NEA, Pear Ventures, Cowboy Ventures and Madrona Ventures. Rubin, for his part, is a co-founder of Android, as well as the founder of Essential, a smartphone company that, though highly valued, has had less success.

Branch’s deep-linking platform helps brands drive app growth, conversions, user engagement and retention.

Deep links are links that take you to a specific piece of web content, rather than a website’s homepage. A link straight to a particular product page, for example, is a deep link; a link to the store’s homepage is not.

Deep links are used to connect web or e-mail content with apps. That way, when you’re doing some online shopping using your phone and you click on a link to an item on Jet.com, you’re taken to the Jet app installed on your phone, instead of Jet’s desktop site, which would provide a much poorer mobile experience.
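
The routing decision behind that behavior is easy to sketch. Here is a toy Python version, illustrative only, with a made-up jetapp:// URI scheme; it is not Branch’s actual implementation, which also handles things like deferred deep linking and attribution:

    def resolve_link(path, app_installed):
        # One shared link, two destinations: the app screen when the app
        # is present, the mobile web page otherwise.
        if app_installed:
            return "jetapp://" + path.lstrip("/")  # e.g. jetapp://product/123
        return "https://jet.example.com" + path    # web fallback

    print(resolve_link("/product/123", app_installed=True))
    print(resolve_link("/product/123", app_installed=False))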

Branch supports 40,000 apps with roughly 3 billion monthly users. The company counts Airbnb, Amazon, Bing, Pinterest, Reddit, Slack, Tinder and several others as customers.

Following its previous round of venture capital funding, Austin told TechCrunch that the company had seen “tremendous growth” ahead of the raise.

“[We] have been fortunate enough to become the clear market leader,” he said. “There’s so much more we can accomplish in deep linking and this money will be used to fund Branch’s continued platform growth.”

07 Sep 2018

Trump wants to just tariff the hell out of China

Another day, another whopper of a tariff. The Trump administration has been busy finalizing the rulemaking process to put 25 percent tariffs on $200 billion of Chinese goods, which will almost certainly affect the prices of many critical technology components and have ongoing repercussions for Silicon Valley supply chains. That followed the implementation of tariffs on $50 billion of goods earlier this year.

Now, President Trump, speaking to reporters aboard Air Force One this morning, has said that he is prepared to triple down on his tariffs strategy, and that he is ready to add tariffs to another $267 billion worth of Chinese goods. Although the president has a flair for the dramatic in many of his policies, the China tariffs are one arena in which his rhetoric has matched the actions of his administration.

Each set of these tariffs has been vociferously opposed by tech industry trade groups, but their concerns seem to have had little effect on the administration’s final thinking. Jose Castaneda, a spokesperson for the Information Technology Industry Council, called this next wave of potential tariffs “grossly irresponsible and possibly illegal.”

Yet, despite the constant threat of more tariffs, CFIUS reforms, and the ZTE debacle, China continues to dominate trade with America. Numbers released by the Department of Commerce this week showed that America’s trade deficit with other nations reached five-year highs in July, surpassing $50 billion for the month, with the China trade goods deficit hitting $36.8 billion. These numbers may well have triggered the president’s latest comments.

They may also have been triggered by the recent anonymous op-ed in The New York Times, in which a Trump “senior administration official” said that “Although he was elected as a Republican, the president shows little affinity for ideals long espoused by conservatives: free minds, free markets and free people…. In addition to his mass-marketing of the notion that the press is the ‘enemy of the people,’ President Trump’s impulses are generally anti-trade and anti-democratic.”

Anti-trade or not, it is clear that the package of tariffs and other policy reforms have done little to dampen the trade deficit or trigger a broad restructuring of the supply chains underpinning American brands.

In my discussions at the Disrupt SF 2018 conference the past few days, one persistent theme has been the ability of certain Chinese cities — particularly, though not exclusively, Shenzhen — to weather these trade storms. Because of the depth of expertise, fast turnaround times, extreme flexibility and low costs of hardware supply chains there, China holds sustainable advantages that the U.S. can’t hope to fight with a couple of measly tariffs — even on $500 billion worth of goods.

Indeed, as one prominent venture capitalist put it to me, hardware investing is now significantly easier for those with the right knowledge of the Chinese ecosystem. Just a few years ago, a couple of million dollars in capital could get a startup a working prototype. Now, startups can in some cases raise $1-2 million and get a working product into sales channels. The Chinese ecosystem around hardware has just continued to improve with alacrity.

For Trump, a much more robust policy will be needed to move the trade numbers in the other direction. Better funding for universities to produce the right talent. Pushing for a region in the U.S. to become the “Shenzhen of America” through a combination of private and public funding. Greater preferential treatment around taxes for keeping manufacturing in the U.S.

And maybe tariff the hell out of them.