Robin.io, a cloud-native application and data management solution with enterprise customers like USAA, Sabre, SAP, Palo Alto Networks and Rakuten Mobile, today announced the launch of a free version of its service, alongside a major update to the core of its platform.
Robin.io says it brings cloud-native data management capabilities to containerized applications, with support for standard operations like backup and recovery, snapshots, rollbacks and more. It does all of that while offering bare-metal performance and support for all major clouds. The service is essentially agnostic to the underlying database and supports the likes of PostgreSQL, MySQL, MongoDB, Redis, MariaDB, Cassandra, Elasticsearch and others.
Image Credits: Robin.io
“Robin Cloud Native Storage works with any workload on any Kubernetes-based platform and on any cloud,” said Robin founder and CEO Partha Seetala. “With capabilities for storing, taking snapshots, backing up, cloning, migrating and securing data — all with the simplest of commands — Robin Cloud Native Storage offers developers and DevOps teams a super simple yet highly performant tool for quickly deploying and managing their enterprise workloads on Kubernetes.”
The new free version lets teams manage up to five nodes and 5TB of storage. The promise here is that this is a free-for-life offering, and the company obviously expects that it will let enterprises get a feel for the service and then upgrade to its paid enterprise plans over time.
As for those enterprise plans, the company also announced today that it is moving to consumption-based pricing, starting at $0.42 per node-hour (though it also offers annual subscriptions). The enterprise plan includes 24×7 support and doesn’t limit the number of nodes or storage capacity.
Among the new features in Robin’s core storage service are data management support for Helm charts (Helm being the Kubernetes package manager), the ability to specify exactly where data should reside (mostly meant to keep it close to the compute resources) and affinity policies that ensure availability for stateful applications that rely on distributed databases and data platforms.
Genies has updated its software development kit and added Giphy and Gucci as new partners, enabling their users to create personalized Genie avatars.
The company released the first version of its SDK in 2018, when it raised $10 million to directly challenge Snap and Apple for avatar dominance. Now, with the latest update, the company says it has created a new three-dimensional rendering that can be used across platforms, as long as developers let Genies handle the animation.
Genies has already managed to sign up many of the biggest names in entertainment, acting as their official avatar manager through its talent agency. These include celebrities like Shawn Mendes, Justin Bieber, Cardi B and Rihanna. Genies has also locked in deals with the National Football League Players Association, Major League Baseball and the National Basketball Association.
Now, those celebrities and athletes can monetize exclusive digital goods made by Genies on platforms like Gucci and Giphy, while the fashion house and the meme platform can give users their own digital identity to play around with.
“Over the past year, our technology has been sharpened by the exacting creative demands of celebrities. This advanced Genies’ march to be the go-to avatar globally,” said Akash Nigam, Genies CEO and co-founder, in a statement. “What was previously a celebrity exclusive experience, is now broadly available for consumers to use as their virtual portable identities. By opening up to the masses, we’ve now created an opportunity for tastemakers to forge new, unique relationships with their audiences through avatar digital goods.”
The SDK integrations are still highly curated and tailored (there’s a lot of heavy lifting that Genies needs to do with each one). For instance, Gucci users can try on the latest designs, and the company will sell digital goods on its platform created by Genies. Giphy users will be able to use their avatars as GIFs on its site and through its distribution network.
“Our Avatar Agency has served as the go-to platform for thousands of artists, and with our next-gen, highly expressive and dynamic 3D Genie, we will further solidify our position as the universal digital identity,” said Izzy Pollak, Director of Avatar SDK at Genies. “For celebrities and everyday users alike, it unlocks new arenas and verticals for users to cultivate their avatars in. On top of traditional 2D environments like mobile apps and websites, Genies can now live in AR/VR platforms, games, and in use cases or SDK partner platforms that demand a 360-degree rendering of the digital goods they purchase.”
Security testing company NSS Labs “ceased operations” last week, the company said in a notice on its website, citing impacts related to the ongoing coronavirus pandemic.
The Austin, Texas-based company was quietly acquired by private equity firm Consecutive last October. But last week, the company was reportedly preparing for layoffs, according to Dark Reading, which first reported news of the company’s shuttering.
In a brief post on LinkedIn, NSS Labs’ chief executive Jason Brvenik hinted at layoffs, adding: “If you are in need of excellent people that exceed my high standards, please get in touch.” (Brvenik listed himself as a former chief executive on his LinkedIn profile.)
Former employees told TechCrunch that they had been laid off as a result of the company’s closure.
NSS Labs, founded in 2007, was one of the most well-known product security testing companies, allowing customers to use real threat data to stress-test their products and discover potential vulnerabilities and security issues.
But the last few years have been rocky. NSS Labs retracted its “caution” rating for CrowdStrike’s Falcon platform in 2019, after the two companies confidentially settled a lawsuit challenging the results. NSS Labs also dropped its antitrust suit against the Anti-Malware Testing Standards Organization (AMTSO), Symantec and ESET, a suit it had filed after claiming to have discovered evidence that the companies were conspiring to make it harder to test their products.
Spokespeople for NSS Labs and Consecutive did not immediately return requests for comment.
The public sector usually publishes its business opportunities in the form of ‘tenders’ to increase transparency. However, this data is scattered, and larger businesses have access to more information, giving them opportunities to grab contracts before official tenders are even released. We have seen the controversy around UK government contracts going to private consultants with questionable prior experience in the areas where they are winning contracts.
Public-to-private sector business makes up 14% of GDP, and even a 1% improvement in efficiency could save taxpayers €20 billion per year, according to the European Commission.
Stotles is a new UK startup whose technology turns fragmented public sector data — such as spending, tenders, contracts, meeting minutes and news releases — into a clearer view of the market, extracting relevant early signals about potential opportunities.
It has now raised a £1.4 million seed round led by Speedinvest, with participation from 7Percent Ventures, FJ Labs and high-profile angels including Matt Robinson, co-founder of GoCardless and CEO at Nested; Carlos Gonzalez-Cadenas, COO at GoCardless; Charlie Songhurst, former head of corporate strategy at Microsoft; Will Neale, founder of Grabyo; and Akhil Paul. It had previously received investment from Seedcamp last year.
Stotles’ founders say they had “scathing” experiences dealing with public procurement in their previous roles at organizations like Boston Consulting Group and the World Economic Forum.
The private beta has been open for nine months and is used by companies including UiPath, Freshworks, Rackspace and Couchbase. With this funding announcement, the company is opening up an early access program.
Competitors include Global Data, Contracts Advance, BIP Solutions, Spend Network/Open Opps, Tussell and TenderLake. However, most of the players out there are focused on tracking cold tenders or providing contracting data for periodic, generic market research.
VENN, the streaming network hoping to be gaming culture’s answer to MTV, has raised $26 million to bring its mix of video game-themed entertainment and streaming celebrity features to the masses.
The financing came from previous investor Bitkraft, one of the largest funds focused on the intersection of gaming and synthetic reality, and new investor Nexstar Media Group, a publicly traded operator of regional television broadcast stations and cable networks around the U.S.
The investment from Nexstar gives Venn a toehold in local broadcast that could see the network’s shows appear on regular broadcast television in most major American cities, and adds to a roster of Nexstar properties including CourtTV, Bounce and Ion Television. The company has 197 television stations and a network of websites that average over 100 million monthly active users and 1 billion page views, according to a statement from Ben Kusin, Venn’s co-founder and chief executive.
“VENN is a new kind of TV network built for the streaming and digital generation, and it’s developing leading-edge content for the millennial and Gen Z cultures who are obsessed with gaming,” Nexstar Media Group President, Chief Operating Officer and Chief Financial Officer, Thomas E. Carter said in a statement. “Gaming and esports are two fast growing sectors and through our investment we plan to distribute VENN content across our broadcast platform to address a younger audience; utilize VENN to gain early access to gaming-adjacent content; and present local and national brands with broadcast and digital marketing and advertising opportunities to reach younger audiences.”
It’s unclear how much traction Venn has with younger audiences. The company’s YouTube channel has 14,000 subscribers and its Twitch channel boasts a slightly more impressive 57,700 subscribers. Still, it’s early days for the streaming network, which only began airing its first programming in September.
Since its launch a little over a year ago, Venn has managed to poach some former senior leadership from Viacom’s MTV and MTV Music Entertainment Group, the model the gaming-focused streaming network has set for itself. Among them is Jeff Jacobs, the former senior vice president for production planning, strategies and operations at MTV’s parent company, Viacom, and most recently an independent producer for Viacom, the NBA, Global Citizen and ACE Universe.
Venn is currently available on its own website and various streaming services as well as through partnerships with the Roku Channel, Plex, Xumo, Samsung TV Plus and Vizio.
The company has also managed to pick up some early brand partnerships with companies including Subway, DraftKings, Alienware, Adidas and American Eagle.
In the suit, the Justice Department is expected to argue that Google used anticompetitive practices to safeguard its monopoly position as the dominant force in search and search-advertising, which sit at the foundation of the company’s extensive advertising, data mining, video distribution, and information services conglomerate.
It would be the first significant legal challenge that Google has faced from U.S. regulators despite years of investigations into the company’s practices.
A 2012 attempt to bring the company to the courts to answer for anti-competitive practices was ultimately scuttled because regulators at the time weren’t sure they could make the case stick. Since that time Alphabet’s value has skyrocketed to reach over $1 trillion (as of today’s share price).
Alphabet, Google’s parent company, holds a commanding lead in both search and video. The company dominates the search market — with roughly 90% of the world’s internet searches conducted on its platform — and roughly three quarters of American adults turn to YouTube for video, as the Journal reported.
In the lawsuit, the Department of Justice will say that Alphabet’s Google subsidiary uses a web of exclusionary business agreements to shut out competitors. Billions of dollars that the search giant collects wind up going to mobile phone makers, carriers and browser developers to make Google’s search engine the preset default. That blocks competitors from accessing the kinds of queries and traffic they’d need to refine their own search engines.
It will be those relationships — alongside Google’s insistence that its search engine come pre-loaded (and un-deletable) on phones using the Android operating system and that other search engines specifically not be pre-loaded — that form part of the government’s case, according to Justice Department officials cited by the Journal.
The antitrust suit comes on the heels of a number of other regulatory actions involving Google, which is not only the dominant online search provider, but also a leader in online advertising and in mobile technology by way of Android, as well as a strong player in a web of other interconnected services like mapping, online productivity software, cloud computing and more.
Image Credits: Alex Tai/SOPA Images/LightRocket via Getty Images
A report last Friday in Politico noted that Democratic attorneys general would not be signing on to the suit. That report said those AGs have instead been working on a bipartisan, state-led approach covering a wider set of issues beyond search, the idea being that multiple suits potentially give the government a stronger bargaining position against the tech giant.
A third suit is being put together by the state of Texas, although that has faced its own issues.
While a number of tech leviathans are facing increasing scrutiny from Washington, with the U.S. now just two weeks from Election Day, it’s unlikely that we are going to see many developments around this and other cases before then. And in the case of this specific Google suit, in the event that Trump doesn’t get re-elected, a larger personnel shift at the DoJ could also change the profile and timescale of the case.
In any event, fighting these regulatory cases is always a long, drawn-out process. In Europe, Google has faced a series of fines over antitrust violations stretching back several years, including a $2.7 billion fine over Google Shopping, a $5 billion fine over Android dominance and a $1.7 billion fine over search ad brokering. While Google slowly works through appeals, there are also more cases ongoing against the company in Europe and elsewhere.
Google is not the only one catching the attention of Washington. Earlier in October, the House Judiciary Committee released a report of more than 400 pages in which it outlined how tech giants Apple, Amazon, Alphabet (Google’s parent company) and Facebook were abusing their power, covering everything from the areas in which they dominate, through to suggestions for how to fix the situation (including curtailing their acquisitions strategy).
That seemed mainly to be an exercise in laying out the state of things, which could in turn be used to inform further actions, although in itself, unlike the DoJ suit, the House report lacks teeth in terms of enforcement or remedies.
The Pocket 2 is the kind of device that makes me wish I got out a bit more. I’ve been testing it out for a few days, and, while it’s done a reasonably good job making my life look a bit more interesting, there’s only so much such a little device can do during this lockdown. That’s no fault of DJI’s of course. There’s only so much that can be done — and at the end of the day, a camera can only really work with the content you give it.
Even so, I’ve enjoyed my time with the product, as I did with its predecessor, the DJI Osmo Pocket. The device returns this week with a truncated name and a handful of improvements. Nothing on board is particularly revolutionary, but the original was such a cool and innovative product when it arrived roughly two years ago that the company can be forgiven for mostly focusing on refinement.
Image Credits: Brian Heater
The line builds on DJI’s know-how, developed through years of drone imaging and gimbal expertise. Unlike, say, the Ronin or Osmo Mobile lines, however, the product works as a standalone device, with a small built-in display, recording directly onto a microSD card. But as with the original, the whole getup works a heck of a lot better when you’ve got an Android or iOS handset attached. The Pocket still does the majority of the heavy imaging lifting, but your phone works as a much better preview screen and control center than the measly display built into the device.
The system ships with a pair of connectors: USB-C and Lightning, depending on your device. It’s a solid setup, best controlled with two hands. I didn’t have any issues, but I don’t entirely trust the integrity of a connector enough to hold it with one. Better yet, there are wireless accessories that allow you to control the system remotely via phone. And speaking of accessories, I highly recommend getting a mini tripod or splurging for the bonus pack that includes one. It can be tricky propping the system up correctly for those modes that require minutes-long record times. More than once a video ended when the device fell over due to a strong gust.
Image Credits: Brian Heater
The underlying imaging hardware has been improved throughout. The camera now sports a larger 1/1.7-inch, 64-megapixel sensor and a wider lens, shooting better videos and stills than the original. The device can zoom up to 8x — though I’d recommend sticking with the 4x lossless zoom, so as to not degrade those shots you’re taking. (HDR, incidentally, is coming at a later date.)
The mics, too, have been upgraded. There are four in total on board. Definitely use the optional wind noise reduction. For even better quality, the combo pack also includes a wireless microphone with windscreen, so that, too, may be worth investigating depending on what and where you plan to shoot. The three-axis gimbal does a good job keeping things steady and moves smoothly for a variety of different image and video capture tasks. As with the last version, I found the battery to be lacking — doubly so given that the gimbal charges an attached phone by default.
As usual, the shooting modes are the real secret sauce. In particular, I’m really smitten with timelapse and hyperlapse. The former offers a sped-up image, using the gimbal to stabilize the shot as you move:
Image Credits: Brian Heater
Hyperlapse takes it a step further, mechanically moving the gimbal from left to right in slow increments that give a sweeping shot of a space over time:
Image Credits: Brian Heater
The system also borrows subject tracking from the drone line. Draw a rectangle around an object on the smartphone display and the gimbal will move along with it. The tracking proved to be pretty accurate, though I ran into some issues in the shadows and in situations where there’s a lot of divergent movement happening — like when I attempted to capture the runner in a softball game. On the whole, however, it does a pretty solid job with people and animals alike.
The gimbal is also great for stitching together panorama shots — something that can be a pain on a standard smartphone. It can either stitch together a standard ultra-wide 180-degree shot or create a highly detailed 3×3 image by essentially combining nine images into one:
Image Credits: Brian Heater
The Pocket 2 occupies a strange territory. It’s essentially a $349 add-on designed to augment smartphone photography. It’s an easy shortcut for grabbing some really cool shots, but pros are going to be much more interested in shooting with, say, a Ronin and an SLR. That leaves hobbyists with cash to spend on something that will, say, really wow their friends on social media. It’s a way to capture some drone-style shots without ever having to leave the ground.
There’s an odd tension in only being able to review half of the new iPhone lineup. Though this review is focused on breaking down the iPhone 12 and iPhone 12 Pro, we all know that the iPhone 12 mini and the iPhone 12 Pro Max are sitting out there in the wings.
Fortunately, we can extrapolate a lot about those devices from these, especially as the iPhone 12 mini is a direct mini-turization of the iPhone 12. Because of the way that Apple has bifurcated the line into ‘Pro’ and ‘non-Pro’ options, these two represent the meaty center of this particular rack of ribs. I think it’s safe to say that for the broad majority, one of these devices is going to be the de-facto option this season.
In some ways, these two phones are closer than ever before. They share a large chunk of internals, a nicely refreshed design philosophy and essentially complete parity of daily utility.
In others, they diverge, taking two paths of that design ethos. One path towards the boardroom and the other path towards the coffee shop.
Let’s break it down.
Design
The 12 Pro is likely the most premium feeling piece of consumer electronics I’ve ever touched.
If you’ve ever had the pleasure of handling or wearing an insanely high-end timepiece you’ll know that there’s a particular blend of sensations that tells you you’re touching something special. The hundreds or thousands of person hours that went into its design and construction, the sheer density of its high quality materials and the finishes that defy the eye to differentiate it from something grown, rather than synthesized.
Most of the iPhone 12 Pro finishes still use a physical vapor deposition (PVD) process for the edge coating. But the new gold (which I do not have in person but which looks great) uses a special high-power impulse magnetron sputtering (HiPIMS) process that lays down the coating in a super dense pattern, allowing it to be tough and super bright, with a molecular structure that mimics the stainless steel underneath — making it more durable than “standard” PVD. One side effect is that it’s easier to wipe clean and picks up fewer fingerprints, something that my blue model was, uh, definitely prone to.
All of those characteristics of the world’s finest watches and jewelry pieces are present in the iPhone 12 Pro. Without the hundreds of thousands of dollars in usual cost.
And, like the proverbial ‘best soda’, you literally couldn’t pay anyone on the planet to make you a better one. When it comes to fine watches, the high end might as well be on another planet from most of us. When it comes to phones, the world is precisely 7.4mm thick.
Where the iPhone 12 Pro is jewel-like, the iPhone 12 is fun, bright and utilitarian. The PVD coatings of the stainless steel are deep and rich on the Pro — but they gather fingerprints like they were in the business of collecting evidence. The blasted aluminum sides of the iPhone 12 welcome you to grab and go.
The back color on the blue model I had was also very well chosen. It’s deep indoors, bright in the sun and feels like part of a modern palette. All of which makes me sad that, as someone who seeks the high end of camera offerings with any of my phones, I can’t carry a bold color any more.
As of a few models ago, Apple deemed bright colors ‘not high end’ and has produced mostly sedate (with a few gold-flavored exceptions) dark greens, greys and neutral silvers in its top line phones. If you want a nice cool mint or bold red, you’re going to have to be happy in the “middle” of the line. I hope that this situation changes and we see the same bright design energy that shows off so well in the iPhone 12 coming up to the top of the line. Maybe just produce a couple of special ‘Pro-only’ colors like gold or navy.
One thing worth mentioning here, too, is that the iPhone 12 Pro is 189 grams where the iPhone 12 is 164 grams. While it may seem silly to note a 25-gram difference, I can say that in practice the iPhone 12 does feel quite a bit lighter.
Overall, the iPhone 12 feels like the Timex to the iPhone 12 Pro’s Rolex. It’s a great daily driver that feels light and fun. The iPhone 12 Pro leverages refinement as a category differentiator projecting a solidity that plays into the “Pro” posturing.
I have seen a few fine scratches crop up on my iPhone 12’s screen. I am not particularly careful with my review units, as I think it is my duty to treat these things as utility items that will get intense daily usage. Which is what they are. Nothing insanely noticeable, mind you, but whatever improvements to overall hardness the new Corning Ceramic Shield process brings to the table, it is not and will not be invincible to wear and tear.
U.S. users will get one “special” design detail that other countries won’t have yet: a small translucent window on the right side of the device that allows Verizon 5G Ultra Wideband signals to pass through. It’s odd to see an external detail added to the iPhone when the company has been so obsessed with removing detail for a decade, especially when this feature is one that most people will never get the chance to use. There’s no designing-for-the-80% philosophy behind this decision.
One last note, the squared-off sides make it far easier to grip and to pick up from a flat surface than the iPhone 11’s rounded edges. As a fan of the iPhone 5’s ID I welcome this change back. Over time, it could lead to less grip fatigue among those who go caseless because less pressure is required to secure it.
For bottom finger resters, the return to a square edge means a bit more discomfort here for your pinky. But the new ‘unified’ mating of the edge metal and glass means that you get a nice bullnose and no additional ridge of glass like you did on the iPhone 4. This makes it much less of a factor.
Overall, a bunch of really stellar work on this refresh. It’s pleasant, durable and attractive.
Pricing
This year’s iPhones did not increase in price. In fact, on a per-gigabyte basis, the new iPhone 12 Pro is actually cheaper this year. The savings from dropping the adapters and headphones from the packaging are probably negligible on a per-unit basis, though they do contribute to margins, obviously. The expanded storage options are actually cheaper by $50 this year. The overall effect is to make these new models a better value on every vector, especially if you upgrade, even when you account for the loss of the accessories.
The $30 carrier-related surcharge is a bit of a cop-out, though. This fee, whether marketed as an ‘unlock’ or ‘upgrade’ fee, makes the iPhones more expensive across the board than Apple’s announced minimums. Zero points for clarity of messaging on this one, Apple.
Camera
When evaluating the cameras in the new models, it’s important to note that the iPhone 11 Pro, iPhone 12 Pro and iPhone 12 all share the exact same sensor and hardware for the Wide and Ultra Wide cameras. The iPhone 12 and iPhone 12 Pro have received an updated seven-element lens that Apple says helps with edge sharpness.
I saw some signs of improvement here but it can be difficult to tell for a few reasons. The iPhone 11 Pro was already very sharp across the image field, for one, and there is so much computational blending happening that it can be hard to differentiate between something that the software improved and something that the hardware has improved.
That, of course, is the whole point of software-driven photography. The hardware provides a foundation, but the image is built by algorithms whose parameters are decided on by engineers.
The other big upgrade in the Wide camera, though, is a new f/1.6 aperture, which lets in an Apple-quoted 27% more light. In my testing I found the image quality to be pretty spectacular, though it doesn’t render the iPhone 11 Pro obsolete except in some specific conditions. Simply put, the iPhone 11’s camera is already very, very good, but the moves forward in the iPhone 12 slot in above what would normally be a ‘one cycle’ difference.
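That 27% figure lines up with basic optics, assuming the iPhone 11 Pro’s f/1.8 Wide lens as the baseline (my arithmetic, not Apple’s): light gathered scales with the inverse square of the f-number.

```python
# Quick sanity check on Apple's "27% more light" claim.
# Assumption: the previous Wide camera (iPhone 11 Pro) had an f/1.8 aperture.
old_f, new_f = 1.8, 1.6
gain = (old_f / new_f) ** 2  # light gathered scales with 1 / f-number^2
print(f"{(gain - 1) * 100:.0f}% more light")  # -> 27%
```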
There are some special upgrades that may be enticing to heavy iPhone photographers though, and I’ll get into those.
The cameras in the iPhone 12 and 12 Pro, and therefore their performance, are very nearly identical. The major differences in the iPhone 12 Pro camera system come down to these items:
A telephoto lens
LiDAR assisted autofocus
LiDAR assisted Night Portrait Mode (wide lens only)
The LiDAR array is very nice to have on the iPhone 12 Pro. There is one completely new mode that is not available on the iPhone 12 here — Night Mode Portraits. The autofocus improvement is active in any low light situation.
The ISP and Neural Engine improvements on iPhone 12 mean that these devices can now use Deep Fusion and Smart HDR 3 on all cameras. And, of course, on the iPhone 12 Pro they also handle LiDAR integration for autofocus and even Night Mode portraits now. This means that at one time you could have a dozen layers of image processing, depth maps, segmentation maps, tone mapping and data from multiple sensors all firing off and being processed in the space of one shutter press.
When Apple talks about Neural Engine performance increases, it’s not just the pure ML models that get “faster”, it’s the integrations into systems like these that get more capable. Unlike a CPU that foregrounds its benefits for you in terms of raw speed on a single tough task, the Neural Engine works in a quieter fashion to enable machine learning and compute tasks across the breadth of Apple’s baked in apps and third party apps that use the ML frameworks.
This enhanced high-tension threading weaves itself into the fabric of standard processes, making them faster and a lighter lift computationally. That reduces power consumption, extending battery life, at the same time as it enables features like Deep Fusion to make the jump to the front camera and the Ultra Wide camera.
You also get a bump in raw range with the addition of highlight mapping in Smart HDR 3. This bumps your range up by as much as three stops in high-contrast situations. One byproduct is that shadows on the iPhone 12/12 Pro tend to look a bit more ‘open’ than they used to. This can be an adjustment from the more clipped blacks of the iPhone 11, but overall it allows for more detail. It’s one of those auteur-style choices made by the iPhone photography team.
The practical benefits can be seen in the iPhone 12’s increased Ultra Wide lens quality. I absolutely love the iPhone 11 Pro’s Ultra Wide as a photographer — its bold angles and wide perspective open up a nice toolbox drawer for images of big stuff in tight spaces. But the fixed focus and generally lower sharpness always made me a tad reluctant to use it.
The iPhone 12 fixes this. The Ultra Wide is sharper edge-to-edge, crisper overall and has some very judiciously applied perspective correction built in to make sure that you don’t get distracting distortion along architectural lines in your images.
I was a bit worried about this corrective approach, because those corrections can be notoriously sketchy. But I’m happy to say that it was applied with restraint. You probably won’t notice it unless you compare, but it’s great to have it running in the background for you.
For purists who love the distortion that wide-angle lenses bring, this can be toggled off in the Camera section of the Settings app.
The Ultra Wide lens getting Night Mode is hot; I’m glad this made the cut this year. The results mirror what you’d expect to see from the Wide camera, which is nice. The process is the same as it was last year when Night Mode was introduced, with a slider for intensity (length of exposure), but now you get a nice alignment crosshair that helps you keep your shot as straight as possible, which improves the ISP’s ability to align the multiple exposures being shot.
Both the iPhone 12 and iPhone 12 Pro have an improved lens for the Wide camera. It’s got an f/1.6 aperture now, which Apple says lets in 27% more light for low-light shooting. In my tests this held up, with a nice improvement in image quality and sharpness in dim lighting conditions. In order to test this, I made sure to turn off Night Mode completely on both of my test models, and you can clearly see better color rendition, better sharpness and greater tonal range.
Where it gets tricky in this test is the iPhone 12’s improved optical image stabilization. I tested the phone in a stable position, so it’s unlikely the OIS contributed much, but in handheld situations the 5,000 adjustments per second of the new OIS system will give an edge to low-light, non-Night-Mode shots as well.
I didn’t have time to test Night Mode timelapse, which is a thing that now exists.
There is also now a Scene Detection toggle in your camera settings. This enables or disables an additional layer of image improvement that uses ML models, trained on hundreds of different scene types, to apply adjustments to specific kinds of recognizable scenes.
Shooting food pics on a plate? It will ignore the broad, bright plate, which normally causes under-exposure. Shooting images of a big blue sky? It will minimize texture and moiré issues. If you don’t want this even more aggressive computational feature active, you can turn it off. This one is going to require more extensive testing; I wasn’t able to reliably detect the difference between two images shot with the toggle on or off.
At a time when Google’s Pixel line is leaning away from major camera improvements, there continues to be a lot of action in Apple’s camp. The majority of which you’ll get a benefit from whether you know the first thing about photography and editing or not.
Video
Apple has had a history of camera firsts with the iPhone and this year it’s Dolby Vision. Both the iPhone 12 and iPhone 12 Pro record Dolby Vision in up to 4K.
The iPhone 12 Pro can record 4K/60 fps in Dolby Vision HDR, but the iPhone 12 is limited to 4K/30 fps in HDR. Here’s a screenshot of the modes and megabytes from the iPhone 12 Pro’s Settings app.
10-bit HDR brings an expanded range of exposure and color possibilities for those who want to shoot extremely high-quality video on an iPhone. Apple continues to use its homegrown silicon to flex its video chops here. Processing over 50Mbps of 10-bit HDR video and then being able to edit it on device is pretty insane. In everyday use, the iPhone 11 shoots pretty great video already.
I’m going to be flat-out honest with you: the vast majority of iPhone users will never even access HDR footage or workflows, and will never need this. If you do most of your video shooting on an iPhone and then share directly to social networks or in Messages, HDR will likely bring you no major benefit. The iPhone 12 already shoots pretty amazing 4K footage at 30 and 60 fps right out of the camera.
However! If you shoot in demanding situations, and are among those who use the iPhone as a real filmmaking tool or simply love video as a hobby — you are in for a treat. In my testing, the iPhone 12 delivers a wider color gamut with an insane range of exposure. It retained detail in normally clipped highlights, displayed the ability to capture deep blacks with a real lack of crushing and blocking and was super forgiving in the edit bay.
This analogy is not precisely accurate, but the loose idea here is the same as shooting stills in RAW vs. letting the camera handle the processing. If you shoot RAW, you have more information to play with, but by default the image often starts out looking worse to some degree, because it requires that you, the shooter, make choices about it. That’s the burden the ISP in the iPhone takes on for you: it makes the adjustments to get your video looking good right out of the camera.
I shot similar casual video with both the iPhone 12 Pro at 4K/60/HDR and the iPhone 11 Pro. It’s difficult to present them side by side, because you must color grade the iPhone 11 Pro footage to fairly represent it, but I applied light grading to both in order to give you a feel for what a project shot on each might look like, and then exported it in SDR. The results speak for themselves, in my opinion. This is a big jump forward for iPhone video quality, which was already excellent if you are willing to adjust and color grade your footage. If you shoot both phones straight up and hold them side by side, the differences are pretty minimal.
It was pretty straightforward to drop the Dolby Vision footage into Final Cut Pro X and edit it on a Pro Display XDR. But how many people have a setup like that yet? Obviously, HDR monitors and workflows are getting more common as we go but this is clearly a carrot dangled at serious video shooters and editors for now.
As a bonus, while shooting test images, I also managed to test the water resistance of both of the phones (great) and a set of AirPods Pro (not great) when I stepped back into open air and fell right into my pool. You can see that at the end of the reel here. Great news: the phones kept recording, still work fine and the footage is crisp. The only long-term damage here involved my pride. Impressively, the AirPods Pro worked just fine after I fished them off the bottom of the pool.
Portrait mode improvements
The portrait mode on the iPhone 12 and iPhone 12 Pro is greatly improved in one major respect: it does a much better job of segmenting images along the borders of things like leaves, hair, fur and other areas of fine detail.
You’re going to get less ‘messy blur’ at the borders of heads and faces and more crisp differentiation and, for lack of a better term, more smoothly confident separation of foreground and background.
Night Mode portraits are also now a go on the Wide camera, but I’ll discuss those in the section below.
The TrueDepth camera gains Night Mode and Deep Fusion support. The Night Mode is welcome for sure, because without a real flash (the screen-blink flash has always been pretty low utility), selfies in dark places were essentially impossible on previous iPhones. Having this option makes a nice neon-lit selfie with yourself or friends a strong possibility. I must note here that Apple aggressively processes these shots, and Deep Fusion is in full effect. As in previous years, Apple is making pretty strong choices with regard to what it thinks the product of this front-facing camera should be. I found the results to be bright and well exposed, but a bit over-tweaked in many cases, leading to more skin smoothing, shadows that are more open than average and a flatter, lower-contrast look.
LiDAR Stuff
LiDAR is an iPhone 12 Pro only feature. It enables faster auto-focus lock-in in low light scenarios as well as making Portrait Mode possible on the Wide lens in Night Mode shots.
First, the auto-focus is insanely fast in low light. The image above is what is happening, invisibly, to enable that. The LiDAR array constantly scans the scene with an active grid of infrared light, producing depth and scene information that the camera can use to focus.
In practice, what you’ll see is that the camera snaps to focus quickly in dark situations where you would normally find it very difficult to get a lock at all. The LiDAR-assisted low light Portrait Mode is very impressive, but it only works with the Wide lens. This means that if you are trying to capture a portrait and it’s too dark, you’ll get an on-screen prompt that asks you to zoom out.
These Night Mode portraits are demonstrably better looking than the standard portrait mode of the iPhone 11 because those have to be shot with the telephoto, meaning a smaller, darker aperture. They also do not have the benefit of the brighter sensor or LiDAR helping to separate the subject from the background — something that gets insanely tough to do in low light with just RGB sensors.
As a note, the LiDAR features work great at ranges under 5 meters, in concert with Apple’s Neural Engine, to produce these low-light portraits. Beyond that, it’s not much use because of light falloff.
Well lit Portrait Mode shots on the iPhone 12 Pro will still rely primarily on the information coming in through the lenses optically, rather than LiDAR. It’s simply not needed for the most part if there’s enough light.
If you’re a camera-oriented iPhone user, your usage of the telephoto lens is probably the most crisp deciding factor between the iPhone 12 Pro and the iPhone 12. The LiDAR benefits are there, and they absolutely make a big difference. But not having a telephoto at all could be an easy make-or-break for some people.
One easy trick here is to make a smart album in Photos on a Mac (or sort your photos using another tool that can read metadata) specifying images shot with a telephoto lens. If that’s a sizeable portion of your pics over the last year, then you’ve got a decision to make about whether you’re comfortable losing that option.
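If you’d rather script that audit than build a smart album, here’s a rough sketch in Python using Pillow that tallies exported photos by lens via the EXIF LensModel tag (iPhone lens names contain the word “telephoto”); the folder path is a placeholder, and it assumes you’ve exported originals with metadata intact.

```python
from collections import Counter
from pathlib import Path

from PIL import Image

# Rough sketch: count exported iPhone photos by lens using the EXIF LensModel tag.
# "~/Pictures/export" is a placeholder path; export originals so EXIF survives.
LENS_MODEL = 0xA434  # EXIF "LensModel" tag

counts = Counter()
for path in Path("~/Pictures/export").expanduser().glob("*.jpg"):
    try:
        exif = Image.open(path).getexif().get_ifd(0x8769)  # Exif sub-IFD
    except OSError:
        continue  # skip unreadable files
    lens = str(exif.get(LENS_MODEL, "unknown"))
    counts["telephoto" if "telephoto" in lens.lower() else "other"] += 1

total = sum(counts.values()) or 1
print(f"telephoto share: {counts['telephoto'] / total:.0%} of {total} photos")
```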
When I did this, just about 19% of my iPhone 11 Pro shots were taken with the telephoto lens. Around 30% of those were portrait shots. So for me, 1 in every 5 images was shot with that tighter framing. It’s just something I find attractive. I like a slightly more precise crop and the nice amount of compression (for closer subjects) that comes with the longer focal length. Personally, I would absolutely miss it, which is a ding for me against the otherwise solid iPhone 12.
5G
I’m gonna be straightforward here: nothing that even approaches next-gen 5G is available in California’s Central Valley, where I live. On one hand, it’s not great that I can’t zip out to a 5G Ultra Wideband-enabled city like San Jose or SF, but on the other hand I think plenty of other reviews will touch on this.
The fact is that my experience will be shared by the vast majority of iPhone 12 buyers this year. The fastest flavors of 5G are available only on a few blocks of a handful of major cities at the moment and though the speeds are absolutely incredible there, that will have very little to do with the wider experience of buyers over the next 6 months. And, of course, millimeter wave 5G is not live for customers outside of the U.S. currently.
The LTE speeds delivered 50-80Mbps average performance, but I did not perform extensive testing in this regard. My AT&T iPhone 11 routinely hits 150Mbps in the same area (I am within spitting distance of a Verizon and an AT&T tower).
Accessories
There have been a lot of arguments about the fact that Apple is not including a power adapter with this iPhone. The truth is that this situation is simultaneously several things. It’s a big marketing coup for sustainability, it’s an actual step forward in reducing e-waste and it’s a cost savings for Apple, which did not reduce the prices of the phones to compensate for the missing headphones and power adapter. All in, you’re looking at a rough $40 expense to replace those items, or a $40 savings if you already have power adapters and headphones littering your place.
The big accessory news this time around, though, isn’t the lack of a power adapter or crapbuds, it’s MagSafe.
MagSafe
The reaction to MagSafe on iPhone will probably run the gamut from ‘this is a money grab’ to ‘finally’, to ‘meh’. I personally like it a lot as someone who charges almost exclusively via wireless charging now, but I can see the different perspectives here.
The MagSafe charger is finished nicely with an aluminum ring. The cord relief seems adequate, especially given that it’s not going to be getting the yanking that a Lightning cord often does. The top of it is a soft pad that prevents the phone from getting scratched while sitting on it or being attached.
Inside the back of the iPhone 12 lineup is a ring of magnets. There is no single polarization of the magnets in the phone; they alternate, which allows the alignment of accessories to be consistent and reliable. This means that no, they will not stick to your fridge.
As an aside, there is a slight magnetism if you place the MagSafe Charger onto the back of an iPhone 11 Pro. Not enough to align it properly at all, probably just residual attraction from the Qi charging array.
The array inside means that alignment works as advertised, at any rotational angle your iPhone will pop onto the charger and begin charging right away. There is a visual affordance on the screen that mirrors the shape of the MagSafe charger.
This process is way faster than fiddling with a Qi charger, and way less frustrating. Though I have had some luck with upright Anker chargers, the pad style chargers have always been crap at this. You have to fiddle to get them on right and often wake up realizing that you didn’t get it quite centered and your phone hasn’t charged at all.
In my testing, the iPhone 12 Pro charged an average of 11% every 20 minutes on the MagSafe charger connected to a 20W power adapter. For comparison, the iPhone 12 charged about 6% on a regular Qi charger over the same period. MagSafe also charges any phone that charges via Qi — it is in fact using a superset of Qi in a similar fashion to the way that AirPods use a superset of Bluetooth.
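As a rough back-of-the-envelope comparison of those observed rates (charging tapers off as the battery fills, so treat this as an early-charge comparison rather than a time-to-full estimate):

```python
# Back-of-the-envelope comparison of the observed charge rates.
# Charging is not linear -- it slows considerably near full -- so these are
# only rough comparisons of the early part of a charge.
magsafe_rate = 11 / 20   # % per minute on MagSafe + 20W adapter (observed)
qi_rate = 6 / 20         # % per minute on a standard Qi pad (observed)

print(f"MagSafe: ~{magsafe_rate * 60:.0f}%/hr, Qi: ~{qi_rate * 60:.0f}%/hr")
print(f"MagSafe is roughly {magsafe_rate / qi_rate:.1f}x faster in that window")
```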
You can also use your iPhone while it’s charging with the MagSafe charger attached to the back.
Of course, you can do that with a Lightning cable too. But there are ways you’ll be able to use MagSafe that you can’t with a Qi charger. I expect third-party accessories to enable mounting your phone via the charger to the side of a bed, desk or wall for vertical charging, for instance.
The MagSafe connection on the back of the phone is strong, but not massively so. The use of things like car mounts or PopSockets is going to depend on how strong the magnets in those units are. There is enough tension to hold the MagSafe charger on, no problem. That also means you must detach the charger by pulling it away or holding it down when you’re removing your phone. This will annoy some people, but in practice it’s really not that big of a deal. I personally will likely use a small 3M pad under mine on my nightstand to keep it stationary. But for more heavy-duty mounting situations, the combined magnet strength is going to matter a lot.
And yes, it’s quite obvious that the MagSafe charger is paving the way to a portless iPhone altogether. Apple will be contributing many of the ‘enhancements’ it is making to the Qi standard back to the consortium so you may see more aligned magnetized stands in the future. This is not a ‘pay us dearly to license this tech’ situation, Apple wants this standard to proliferate.
The MagSafe Charger does not come with a power adapter, so be prepared to spend at least $60 total ($40 for the charger) getting it up and running if you don’t have a USB-C wall wart already.
Cases
I tested the silicone case and the clear case with the iPhone 12/12 Pro. I like the silicone case just fine, and I’m glad that it now covers the bottom of the iPhone as well. The clear case is pretty straightforward, though I think a lot of people will be turned off by the large white ring (which does not center on the Apple logo) on the back.
That ring is a magnet array, the same as the one inside the silicone case or the MagSafe charger. You can just see it because the case is clear. The small vertical line is an orienting magnet that helps the wallet and other vertical accessories, like the Belkin card case, go on and stay on in the right orientation.
The magnets in the cases act as ‘passthrough’ for mounting, though the Qi field is just coming from the phone. The case allows other accessories like the wallet to be stacked, for instance — but you cannot charge through any more layers than you could before.
Wallet
About the wallet. I like the concept of this a lot, but the actual use of it is a bit meh. Here are some observations:
It’s shielded. This means that the magnets in MagSafe won’t wipe your cards’ mag strip. Though the bare phone does have the potential to do this through a thin wallet or in your pocket with a hotel key. Keep them separated.
The shielding works inwards and outwards, letting the wallet act as an NFC and RFID blocking wallet as well. This means no scanning your stuff and no accidental payment activations at tap stations.
You must remove it to get cards out reliably. On the back of the wallet there is a thumb hole that lets you push to slide a card up and out. Some folks who have used them for a while tell me that sliding the cards out from the front can work too. But in my experience, any way of getting them out that isn’t taking the wallet off the phone and pushing up through the back is an exercise in frustration.
It really only fits 3 “regular thickness” cards or 2 “premium thickness” cards like an Amex Platinum or Chase Reserve plus a thin ID card. Many credit cards like the Apple Card are now made of metal and thicker than they used to be. This means that the wallet actually has pretty limited storage capacity. Forget any folded cash.
The magnet effect is strong but not crazy strong. The wallet does not slide off but it’s not like it’s on there rock solid. It comes off pretty easily if you try. I am on the fence about how comfortable I feel trusting it to stay on there. More time with it is needed.
Performance and Battery
Battery usage seemed to be much in line with the iPhone 11 Pro. I typically clone all of my test devices off of my current devices and then do testing on performance once indexing has settled down. I got around 15 hours of heavy usage on the iPhone 12 Pro every day. The iPhone 12 seemed similar but it’s hard to say because I had to focus on one main device.
Performance-wise, I ran standard benchmarks on them but really can’t bring myself to do much more these days. They perform great, and there does not seem to be any big gap between Apple’s claims and what the devices actually deliver. The fact is that with the growing importance of the Neural Engine and background ML tasks, the raw clock speeds of Apple’s processors actually get less and less important with every release.
On the memory side, the iPhone 12 Pro seems to have 6GB of RAM and the iPhone 12 has 4GB of RAM.
Packaging
Apple has been working toward all-paper packaging across its products for a while; though there is still an outer plastic shell here, it gets closer than ever before.
The new packaging is worth talking about for a few reasons. Ditching the accessories means a much, much smaller profile. That means more iPhones per square foot and less weight, which makes them cheaper and more energy-efficient to ship, which in turn means lower emissions and expenditure down the complete line of the supply chain.
The boxes are thinner, because there is no need to accommodate the not-included power adapter or wired headphones. Apple has also completely eliminated the cheat sheet ‘manual’ from the package. That function is now performed by a simple screen protecting sheet of paper with the basic button functions stenciled on it. There’s even a wraparound ‘tail’ for the Lightning port.
You still do get a sticker though.
Finale
I’m in the awkward position here, maybe for the first time since I’ve been reviewing iPhones, of having neither of the devices that I want to daily drive in my hand. Though both of these phones have many features to commend them, they fall outside of the parameters that I use to decide what to carry with me.
My device picking rubric is purely defined by two characteristics:
The most compact and unobtrusive shape.
The best camera that I can afford.
The iPhone 12 Pro is bested (theoretically) in the camera department by the iPhone 12 Pro Max, which has the biggest and best sensor Apple has yet created. (But its dimensions are similarly the biggest.) The iPhone 12 has been precisely cloned in a smaller version with the iPhone 12 mini. By my simple decision-making matrix, either one of those is a better choice for me than either of the models I’ve tested. If the objective becomes finding the best compromise between the two, the iPhone 12 Pro is the pick.
But, for most people, the iPhone 12 is a really stellar buy. Its bright colors, lightweight but sound construction and improved camera make it the ‘easy choice’ for those confused by Apple’s broad current lineup.
As mentioned above in the camera section, if the telephoto lens is something you use a significant amount on your current phone, it’s a simple call: upgrade. If it’s not, do yourself a favor and think about putting a bit of color in your life. You’re not going to miss much by choosing the ‘regular’ iPhone 12.
Year after year, phishing remains one of the most popular and effective ways for attackers to steal your passwords. As users, we’re mostly trained to spot the telltale signs of a phishing site, but most of us rely on carefully examining the web address in the browser’s address bar to make sure the site is legitimate.
But even the browser’s anti-phishing features — often the last line of defense for a would-be phishing victim — aren’t perfect.
Security researcher Rafay Baloch found several vulnerabilities in some of the most widely used mobile browsers — including Apple’s Safari, Opera, and Yandex — which if exploited would allow an attacker to trick the browser into displaying a different web address than the actual website that the user is on. These address bar spoofing bugs make it far easier for attackers to make their phishing pages look like legitimate websites, creating the perfect conditions for someone trying to steal passwords.
The bugs worked by exploiting a weakness in the time it takes for a vulnerable browser to load a web page. Once a victim is tricked into opening a link from a phishing email or text message, the malicious web page uses code hidden on the page to effectively replace the malicious web address in the browser’s address bar with any other web address the attacker chooses.
In at least one case, the vulnerable browser retained the green padlock icon, indicating that the malicious web page with a spoofed web address was legitimate — when it wasn’t.
An address bar spoofing bug in Opera Touch for iOS (left) and Bolt Browser (right). These spoofing bugs can make phishing emails look far more convincing. (Image: Rapid7/supplied)
Rapid7’s research director Tod Beardsley, who helped Baloch with disclosing the vulnerabilities to each browser maker, said address bar spoofing attacks put mobile users at particular risk.
“On mobile, space is at an absolute premium, so every fraction of an inch counts. As a result, there’s not a lot of space available for security signals and sigils,” Beardsley told TechCrunch. “While on a desktop browser, you can either look at the link you’re on, mouse over a link to see where you’re going, or even click on the lock to get certificate details. These extra sources don’t really exist on mobile, so the location bar not only tells the user what site they’re on, it’s expected to tell the user this unambiguously and with certainty. If you’re on palpay.com instead of the expected paypal.com, you could notice this and know you’re on a fake site before you type in your password.”
“Spoofing attacks like this make the location bar ambiguous, and thus, allow an attacker to generate some credence and trustworthiness to their fake site,” he said.
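To make the palpay.com example concrete, the check your eyes do on a trustworthy address bar amounts to comparing the hostname against the domain you expect; here’s a minimal illustration (my sketch, not from the researchers):

```python
from urllib.parse import urlsplit

# Minimal illustration: trust a URL only if its hostname is the expected domain
# or a subdomain of it -- the comparison a spoofed address bar keeps you from making.
def matches_expected(url: str, expected_domain: str) -> bool:
    host = (urlsplit(url).hostname or "").lower()
    return host == expected_domain or host.endswith("." + expected_domain)

print(matches_expected("https://www.paypal.com/signin", "paypal.com"))  # True
print(matches_expected("https://palpay.com/signin", "paypal.com"))      # False
```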
Baloch and Beardsley said the browser makers responded with mixed results.
So far, only Apple and Yandex pushed out fixes in September and October. Opera spokesperson Julia Szyndzielorz said the fixes for its Opera Touch and Opera Mini browsers are “in gradual rollout.”
But the makers of UC Browser, Bolt Browser, and RITS Browser — which collectively have more than 600 million device installs — did not respond to the researchers and left the vulnerabilities unpatched.
TechCrunch reached out to each browser maker but none provided a statement by the time of publication.
Adobe is betting big on its Sensei AI platform, so it’s probably no surprise that the company continues to build more AI-powered features into its flagship Photoshop application. At its MAX conference today, Adobe announced a handful of new AI features for Photoshop, with Sky Replacement being the most obvious example. Other new AI-driven features include so-called “Neural Filters,” which are essentially the next generation of Photoshop filters, and new and improved tools for selecting parts of images, in addition to other tools that improve on existing features or simplify the photo-editing workflow.
Photoshop isn’t the first tool to offer a Sky Replacement feature. Luminar, for example, has offered that for more than a year already, but it looks like Adobe took its time to get this one right. The idea itself is pretty straightforward: Photoshop can now automatically recognize the sky in your images and then replace it with a sky of your choosing. Because the colors of the sky also influence the overall scene, that would obviously result in a rather strange image, so Adobe’s AI also adjusts the colors of the rest of the image accordingly.
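For a sense of the basic recipe (a conceptual sketch only; Adobe’s Sensei supplies the sky segmentation and far more sophisticated color harmonization), the operation boils down to a mask-driven composite plus a color shift on the foreground:

```python
import numpy as np
from PIL import Image

# Conceptual sky-replacement sketch: composite a new sky through a mask,
# then nudge the foreground's color balance toward the new sky's tint.
# In Photoshop the mask comes from an ML segmentation model; here it's an input.
def replace_sky(photo_path, sky_path, mask_path, out_path, harmonize=0.15):
    photo = np.asarray(Image.open(photo_path).convert("RGB"), dtype=np.float32)
    size = (photo.shape[1], photo.shape[0])  # (width, height) for PIL resize
    sky = np.asarray(Image.open(sky_path).convert("RGB").resize(size), dtype=np.float32)
    # Mask: white = sky, black = foreground; soft edges give a cleaner blend.
    mask = np.asarray(Image.open(mask_path).convert("L").resize(size),
                      dtype=np.float32)[..., None] / 255.0

    composite = mask * sky + (1.0 - mask) * photo

    # Crude harmonization: shift foreground colors a fraction of the way toward
    # the replacement sky's average tint so the scene reads as one image.
    shift = (sky.reshape(-1, 3).mean(0) - photo.reshape(-1, 3).mean(0)) * harmonize
    composite += (1.0 - mask) * shift

    Image.fromarray(np.clip(composite, 0, 255).astype(np.uint8)).save(out_path)
```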
Image Credits: Adobe
How well all of this works probably depends a bit on the images, too. We haven’t been able to give it a try ourselves, and Adobe’s demos obviously worked flawlessly.
Photoshop will ship with 25 sky replacements, but you can also bring in your own.
Neural Filters are the other highlight of this release. They provide you with new artistic and restorative filters for improving portraits, for example, or quickly replacing the background color of an image. The portrait feature will likely get the most immediate use, given that it allows you to change where people are looking, change the angle of the light source and “change hair thickness, the intensity of a smile, or add surprise, anger, or make someone older or younger.” Some of these are a bit more gimmicky than others, and Adobe says they work best for making subtle changes, but either way — making those changes would typically be a lot of manual labor, and now it’s just a click or two.
Image Credits: Adobe
Among the other fun new filters are a style transfer tool and a filter that helps you colorize black and white images. The more useful new filters include the ability to remove JPEG artifacts.
As Adobe noted, it collaborated with Nvidia on these Neural Filters, and, while they will work on all devices running Photoshop 22.0, there’s a real performance benefit to using them on machines with built-in graphics acceleration. No surprise there, given how computationally intensive a lot of these are.
Image Credits: Adobe
While improved object selection may not be quite as flashy as Sky Replacement and the new filters, “intelligent refine edge,” as Adobe calls it, may just save a few photo editors’ sanity. If you’ve ever tried to use Photoshop’s current tools to select a person or animal with complex hair — especially against a complex backdrop — you know how much manual intervention the current crop of tools still need. Now, with the new “Refine Hair” and “Object Aware Refine Mode,” a lot of that manual work should become unnecessary.
Other new Photoshop features include a new tool for creating patterns, a new Discover panel with improved search, help and contextual actions, faster plugins and more.
Also new is a plugin marketplace for all Creative Cloud apps that makes it easier for developers to sell their plugins.