Author: azeeadmin

18 May 2021

Google Cloud launches Vertex AI, a new managed machine learning platform

At Google I/O today Google Cloud announced Vertex AI, a new managed machine learning platform that is meant to make it easier for developers to deploy and maintain their AI models. It’s a bit of an odd announcement at I/O, which tends to focus on mobile and web developers and doesn’t traditionally feature a lot of Google Cloud news, but the fact that Google decided to announce Vertex today goes to show how important it thinks this new service is for a wide range of developers.

The launch of Vertex is the result of quite a bit of introspection by the Google Cloud team. “Machine learning in the enterprise is in crisis, in my view,” Craig Wiley, the director of product management for Google Cloud’s AI Platform, told me. “As someone who has worked in that space for a number of years, if you look at the Harvard Business Review or analyst reviews, or what have you — every single one of them comes out saying that the vast majority of companies are either investing or are interested in investing in machine learning and are not getting value from it. That has to change. It has to change.”

Image Credits: Google

Wiley, who was also the general manager of AWS’s SageMaker AI service from 2016 to 2018 before coming to Google in 2019, noted that Google and others who were able to make machine learning work for themselves saw how it can have a transformational impact, but he also noted that the way the big clouds started offering these services was by launching dozens of services, “many of which were dead ends,” according to him (including some of Google’s own). “Ultimately, our goal with Vertex is to reduce the time to ROI for these enterprises, to make sure that they can not just build a model but get real value from the models they’re building.”

Vertex, then, is meant to be a flexible platform that allows developers and data scientists across skill levels to quickly train models (Google says it takes about 80% fewer lines of code to train a model than on some competing platforms) and then manage the entire lifecycle of those models.

Image Credits: Google

The service is also integrated with Vizier, Google’s AI optimizer that can automatically tune hyperparameters in machine learning models. This greatly reduces the time it takes to tune a model and allows engineers to run more experiments and do so faster.
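For readers unfamiliar with the term, hyperparameter tuning is the trial-and-error search that Vizier automates. A minimal, self-contained Python sketch of the manual alternative, a random search over a toy objective (the function names and search space here are illustrative only, not part of any Vertex API):

```python
import random

def train_and_score(learning_rate, batch_size):
    # Stand-in for a real training run: a toy objective whose best
    # score sits at learning_rate=0.1 and batch_size=64.
    return -((learning_rate - 0.1) ** 2) - ((batch_size - 64) / 64) ** 2

def random_search(trials=50, seed=0):
    # Blindly sample hyperparameter combinations and keep the best one.
    rng = random.Random(seed)
    best_params, best_score = None, float("-inf")
    for _ in range(trials):
        params = {
            "learning_rate": rng.uniform(0.001, 1.0),
            "batch_size": rng.choice([16, 32, 64, 128, 256]),
        }
        score = train_and_score(**params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score
```

A service like Vizier replaces this brute-force loop with managed black-box optimization, where the results of earlier trials inform which configurations get tried next.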

Vertex also offers a “Feature Store” that helps users serve, share and reuse machine learning features, as well as Vertex Experiments, which helps them accelerate the deployment of their models into production with faster model selection.

Deployment is backed by a continuous monitoring service and Vertex Pipelines, a rebrand of Google Cloud’s AI Platform Pipelines that helps teams manage the workflows involved in preparing and analyzing data for the models, training them, evaluating them and deploying them to production.

To give a wide variety of developers the right entry points, the service provides three interfaces: a drag-and-drop tool, notebooks for advanced users and — and this may be a bit of a surprise — BigQuery ML, Google’s tool for using standard SQL queries to create and execute machine learning models in its BigQuery data warehouse.

“We had two guiding lights while building Vertex AI: get data scientists and engineers out of the orchestration weeds, and create an industry-wide shift that would make everyone get serious about moving AI out of pilot purgatory and into full-scale production,” said Andrew Moore, vice president and general manager of Cloud AI and Industry Solutions at Google Cloud. “We are very proud of what we came up with in this platform, as it enables serious deployments for a new generation of AI that will empower data scientists and engineers to do fulfilling and creative work.”

18 May 2021

Google adds foldable-focused Android developer updates

Things have been a bit quiet on the foldables front of late, but plenty of parties are still bullish about the form factor’s future. Ahead of today’s big I/O kickoff, Samsung (undoubtedly the most bullish of the bunch) posted a bunch of metrics this morning, noting,

The global outlook is just as impressive. This year alone, the foldables market is expected to triple over last year — a year in which Samsung accounted for three out of every four foldable smartphones shipped worldwide.

Part of anticipating growth in the category is ensuring that the software is ready for it. Samsung has been tweaking things for a while now on its end, and at I/O in 2018, Google announced that it would be adding support for foldable screens. Recent rumors have suggested that the company is working on its own foldable Pixel, but even beyond that, it’s probably in the company’s best interest to ensure that Android plays nicely with the form factor.

“We studied how people interact with large screens,” the company said in today’s developer keynote. This includes a variety of different aspects, including where users place their hands while using the device — which can be a bit all over the place when dealing with different applications in different orientations and form factors. Essentially, you don’t want to, say, put buttons where people generally place their hands.

The list of upgrades includes the ability to resize content automatically, without overly stretching it out to fit multiple panels. All of this is no doubt going to be a learning curve as foldables end up in the hands of more users. But at the very least, it signals Google’s continued view of foldables as a growing category. It’s also one of multiple updates today that involve the company working more closely with Samsung.

The two tech giants also announced a joint Wear OS/Tizen play early today.

18 May 2021

Google updates its cross-platform Flutter UI toolkit

Flutter, Google’s cross-platform UI toolkit for building mobile and desktop apps, is getting a small but important update at the company’s I/O conference today. Google also announced that Flutter now powers 200,000 apps in the Play Store alone, including popular apps from companies like WeChat, ByteDance, BMW, Grab and DiDi. Indeed, Google notes that 1 in 8 new apps in the Play Store are now Flutter apps.

The launch of Flutter 2.2 follows Google’s rollout of Flutter 2, which first added support for desktop and web apps in March, so it’s no surprise that this is a relatively minor release. In many ways, the update builds on top of the features the company introduced in version 2, adding reliability and performance improvements.

Version 2.2 makes null safety the default for new projects, for example, to add protections against null reference exceptions. As for performance, web apps can now use background caching using service workers, for example, while Android apps can use deferred components and iOS apps get support for precompiled shaders to make first runs smoother.
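For context, null safety means the type system distinguishes values that may be null from values that cannot be, so missing checks are caught before the code runs. Flutter’s version of this is a Dart language feature; the same idea can be sketched as a rough analogy in Python’s optional typing, where a tool like mypy plays the role of the compiler (illustrative only):

```python
from typing import Optional

def greeting(name: Optional[str]) -> str:
    # The Optional annotation declares that name may be None. A type
    # checker then insists on the None check below before any string
    # method is called, which is what rules out null reference errors.
    if name is None:
        return "Hello, stranger"
    return f"Hello, {name.upper()}"
```

In Dart the distinction is written as `String` versus `String?`, and making null safety the default means new Flutter projects get these checks without opting in.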

Google also worked on streamlining the overall process of bringing Flutter apps to desktop platforms (Windows, macOS and Linux).

But as Google notes, a lot of the work right now is happening in the ecosystem. Google itself is introducing a new payment plugin for Flutter built in partnership with the Google Pay team, and Google’s ads SDK for Flutter is getting support for adaptive banner formats. Meanwhile, Samsung is now porting Flutter to Tizen and Sony is leading an effort to bring it to embedded Linux. Adobe recently announced its XD to Flutter plugin for its design tool, and Microsoft today launched the alpha of Flutter support for Universal Windows Platform (UWP) apps on Windows 10.

18 May 2021

Google launches the first beta of Android Studio Arctic Fox

At its I/O developer conference, Google today announced the first beta of the next version of its Android Studio IDE, Arctic Fox. For the most part, the idea here is to bring more of the tooling around building Android apps directly into the IDE.

While there is a lot that’s new in Arctic Fox, maybe the marquee feature of this update is the integration of Jetpack Compose, Google’s toolkit for building modern user interfaces for Android. In Android Studio, developers can now use Compose Preview to create previews of different configurations (think themes and devices) or deploy a preview directly to a device, all while the layout inspector makes it easier for developers to understand how (and why) a layout is rendered the way it is. With Live Updates enabled, any change is then also streamed directly to the device.

The team also integrated the Android Accessibility Test Framework directly into Android Studio to help developers find accessibility issues like missing content descriptions or low contrast in their designs.

Image Credits: Google

Just like with some of the updates to Android itself, the team is also looking at making it easier to develop for a wider range of form factors. To build Wear OS apps, developers previously had to physically connect the watch to their development machine or go through a lot of steps to pair the watch. Now, users can simply pair a watch and phone emulator (or physical phone) with the new Wear OS Pairing feature. All it takes now is a few clicks.

Also new on the Wear OS side is a new heart rate sensor for the Wear OS Emulators in Android Studio, while the Android Automotive emulator gains the ability to replay car sensor data to help those developers with their development and testing workflow.

Android Studio users who work on a Mac will be happy to hear that Google is also launching a first preview of Android Studio for the Apple Silicon (arm64) architecture.

Image Credits: Google

18 May 2021

Google is making a 3D, life-size video calling booth

Google is working on a video calling booth that uses 3D imagery on a 3D display to create a lifelike image of the people on both sides. While it’s still experimental, “Project Starline” builds on years of research and acquisitions, and could be the core of a more personal-feeling video meeting in the near future.

The system was only shown via video of unsuspecting participants, who were asked to enter a room with a heavily obscured screen and camera setup. Then the screen lit up with a video feed of a loved one, but in a way none of them expected:

“I could feel her and see her, it was like this 3D experience. It was like she was here.”

“I felt like I could really touch him!”

“It really, really felt like she and I were in the same room.”

CEO Sundar Pichai explained that this “experience” was made possible with high-resolution cameras and custom depth sensors, almost certainly related to these Google research projects into essentially converting videos of people and locations into interactive 3D scenes:

The cameras and sensors — probably a dozen or more hidden around the display — capture the person from multiple angles and figure out their exact shape, creating a live 3D model of them. This model, along with all the color and lighting information, is then (after a lot of compression and processing) sent to the other person’s setup, which shows it in convincing 3D. It even tracks their heads and bodies to adjust the image to their perspective.

But 3D TVs have more or less fallen by the wayside; it turns out no one wants to wear special glasses for hours at a time, and the quality of glasses-free 3D was generally pretty bad. So what’s making this special 3D image?

Pichai said “we have developed a breakthrough light field display,” probably with the help of the people and IP it scooped up from Lytro, the light field camera company that didn’t manage to get its own tech off the ground and dissolved in 2018.

Light field cameras and displays create and show 3D imagery using a variety of techniques that are very difficult to explain or show in 2D. The startup Looking Glass has made several that are extremely arresting to view in person, showing 3D models and photographic scenes that truly look like tiny holograms.

Whether Google’s approach is similar or different, the effect appears to be equally impressive, as the participants indicate. They’ve been testing this internally and are getting ready to send out units to partners in various industries (such as medicine) where the feeling of a person’s presence makes a big difference.

At this point Project Starline is still very much a prototype, and probably a ridiculously expensive one — so don’t expect to get one in your home any time soon. But it’s not wild to think that a consumer version of this light field setup may be available down the line. Google promises to share more later this year.

18 May 2021

May Mobility’s Edwin Olson and Nina Grooms Lee and Toyota AI Ventures’ Jim Adler on validating your startup idea

When a founder has a work history that includes the name of the parent company of one of their key investors, you probably assume that was one of the first deals to come together. Not so with May Mobility and Toyota AI Ventures, which connected for the company’s second seed round, after May went out and raised its original seed purely on the strength of its own ideas and proposed solutions.

That’s one of the many interesting things we learned from speaking to May Mobility co-founder and CEO Edwin Olson, as well as Chief Product Officer Nina Grooms Lee and Toyota AI Ventures founding partner Jim Adler on an episode of Extra Crunch Live.

Extra Crunch Live goes down every Wednesday at 3 p.m. EDT/noon PDT. Our next episode is with Sequoia’s Shaun Maguire and Vise’s Samir Vasavada, and you can check out the upcoming schedule right here.

Meanwhile, read on for highlights from our chat with Olson, Grooms Lee and Adler, and then stay tuned at the end for a recording of the full session, including our live pitch-off.

A different approach to corporate VC

One thing Adler brought up early in the chat is that Toyota AI Ventures likely takes a different approach than most traditional corporate VCs, which are often thought of as being more incentivized by strategic alignment than by venture-scale returns. Adler says the firm he founded within the automaker’s corporate umbrella actually does behave much more like a traditional VC in some ways than many would assume.

18 May 2021

Google partners with Shopify on online shopping expansion

Google today announced it’s partnering with Shopify, giving the e-commerce platform’s over 1.7 million merchants the ability to reach consumers through Google Search and its other services. The integration will allow merchants to sign up in just a few clicks to have their products appear across Google’s 1 billion “shopping journeys” that take place every day through Search, Maps, Images, Lens and YouTube.

The company didn’t offer extensive details about the integration when it was announced during Google’s I/O Developer event this afternoon. But the news follows a series of updates to Google Shopping resulting from Amazon’s increased investment in its own advertising business, which threatens Google’s core ads business.

Google made its pitch to online advertisers today, describing how its so-called “Shopping Graph” would now begin to pull together information from across websites, prices, reviews, videos and product data pulled directly from brands and retailers, to help better inform online shoppers about where to find items, how well they were received, which merchant has the best price, and more.

This Shopping Graph can span across Google’s platforms, whether someone is discovering products through Google Search or even watching videos on YouTube, among other things.

Image Credits: Google I/O 2021

For example, when you now view screenshots of products in Google Photos, there will be a suggestion to search the photo using Google Lens, to help you find the item for sale. And Google announced earlier this year it was pilot-testing a new experience on YouTube that allows users to shop products they learn about from their favorite creators — a move to counteract the growing threats from TikTok and Facebook, and their own investments in e-commerce.

But before any of this Shopping Graph functionality can really work, Google needs consumers to find shopping for products via Google actually useful. That’s partly why Google made it free for merchants to sell their products across Google this past year — a change that Google says drove an 80% increase in merchants on Google, with the “vast majority” being small to medium-sized businesses.

That’s where the partnership with Shopify comes in, too. Though this integration doesn’t mean that every Shopify storefront will be included on Google — the merchants have to take an action to make that happen — it would be almost a no-brainer for them to leverage the new option.

Shopify isn’t playing favorites when it comes to distribution, however. It’s integrated with other large platforms, too, including Facebook and TikTok. And it’s been working with Walmart to expand the retailer’s online marketplace, as well.

Investors seemed happy with the Shopify news this afternoon. Shortly after Google’s announcement, the stock popped 3.52%.

18 May 2021

With Android 12, Google will turn your smartphone into a car key

Google is working with BMW and other automakers to develop a digital key that will let car owners lock, unlock and start a vehicle from their Android smartphone, the company announced Tuesday during its 2021 Google I/O developer event.

The digital key is one of many new features coming to Android 12, the latest version of the company’s mobile operating system. The digital car keys will become available on select Pixel and Samsung Galaxy phones later this year, according to Sameer Samat, VP of product management for Android & Google Play. The digital car key will be available in as-yet-unnamed 2022 vehicle models, including ones made by BMW, as well as some 2021 models.

The digital key uses so-called Ultra Wideband (UWB) technology, a form of radio transmission in which the receiver can tell the direction of an incoming signal, sort of like a tiny radar. This lets the antenna in your phone locate and identify objects equipped with UWB transmitters. By using UWB technology, Android users will be able to lock and unlock their vehicle without taking their phone out.

Google Android digital car key

Image Credits: Google

Consumers who own car models that have enabled NFC technology, or near-field communication, will be able to unlock their car by tapping their phone against the door. The phone communicates with an NFC reader in the user’s car, which is typically located within the door handle. Google said users will also be able to securely and remotely share their car key with friends and family if they need to borrow the car.

The announcement follows a similar move made by Apple last year that allowed users to add a digital car key to their iPhone or Apple Watch. That feature, which was part of iOS 14, works over NFC and first became available in the 2021 BMW 5 Series.

A growing number of automakers have developed their own apps, which can also control certain functions such as remote locking and unlocking. The big benefit, in Google’s and likely Apple’s view, is that by offering the digital car key in its mobile operating system, users don’t have to download an app.

The intent is for a less clunky experience. And there’s a movement to make it even more seamless. The Car Connectivity Consortium, which counts Apple, Google and Samsung as members along with automakers BMW, GM, Honda, Hyundai and Volkswagen, has spent the past several years working to standardize a digital key solution that works seamlessly across devices and vehicles.

The development of the digital car key is just part of Google’s push to ensure the smartphone is the centerpiece of consumers’ lives. And it’s a goal that can’t be achieved without including vehicles.

“When purchasing a phone these days, we’re buying not only a phone, but also an entire ecosystem of devices that are all expected to work together — such as TVs, laptops, cars and wearables like smartwatches or fitness trackers,” Google’s VP of engineering Erik Kay wrote in a blog post accompanying the announcement during the event. “In North America, the average person now has around eight connected devices, and by 2022, this is predicted to grow to 13 connected devices.”

Google said it is expanding its “fast pair” feature, which lets users pair their devices via Bluetooth with a single tap, to other products, including vehicles. To date, consumers have used “fast pair” more than 36 million times to connect their Android phones with Bluetooth accessories, including Sony, Microsoft, JBL, Philips, Google and many other popular brands, according to Kay.

The feature will be rolled out to more devices in the coming months, including Beats headphones as well as cars from BMW and Ford, Samat said during Google I/O.

 


18 May 2021

After closing Fitbit acquisition, Google is going big with Wear OS

For years, Wear OS has been, at best, something of a dark horse among Google operating systems. It’s certainly not for lack of partnership or investment, but for whatever reason, the company has never really stuck the landing with its wearable operating system.

It’s a category in which Apple has been utterly dominant for some time. Google has largely failed to chip away at that market, in spite of enlisting some of the biggest names in consumer electronics as partners. Figures from Strategy Analytics classify Wear OS in the “others” category.

Google’s strategy is, once again, the result of partnerships – or, more precisely, partnerships combined with acquisitions. At the top of the list is an “if you can’t beat ’em, join ’em” approach to Samsung’s longstanding preference for open-source Tizen. It seemed like one of the stranger plays in the category, but building out its own version of Tizen has proven a winning strategy for Samsung, which trails only Apple in the space.

During today’s I/O keynote, the company revealed a new partnership with Samsung, “combining the best of Wear OS and Tizen.” We’re still waiting to see how that will play out, but it will be fascinating watching two big players combine forces to take on Apple. You come at the king, you best not miss, to quote a popular prestige television program. On the developer side, this seems to allude to the ability to create joint apps for both platforms, as third-party app selection has been a sticking point for both.

The other big change sheds some more light on precisely why the company was interested in Fitbit. Sure, Fitbit was a wearables leader that dominated fitness bands and eventually created its own solid smartwatches (courtesy of, among other things, its own acquisition of Pebble), but health is really the key here.

Health monitoring has become the dominant conversation around wearables in recent years, and Google’s acquisition seems to be, above all, about integrating that information. “[A] world-class health and fitness service from Fitbit is coming to the platform,” the company noted. Beyond adding Fitbit’s well-loved tracking features, the company will also be integrating Wear features into Google’s hardware, working to blur the line between the two companies.

Developing…