The lock screen is the thing you interact with more than anything else on your phone – it’s the most personal part of the device – and yet with iOS 16, Apple is making fundamental changes to this flagship screen. Apple Senior Vice President of Software Engineering Craig Federighi described it to TechRadar this week as “a huge step forward.”

The roughly two-year journey from the early home screen personalization of iOS 14 to the rich, expressive tools of iOS 16 is, in one sense, straightforward: a joint effort between engineering and design to offer personalization without blurring what people know and love about the iOS interface. But it is also a story full of surprises and, yes, breakthroughs.

With Apple’s WWDC 2022 complete, Federighi and Apple Vice President of Human Interface Design Alan Dye sat down with us via video conference to walk us through the development, the decisions, and the cutting-edge technology behind the iPhone’s brand-new lock screen capabilities.

The lock screen is already a destination for utilities (access to the camera and flashlight), information (all those notifications that can fill the screen), and some light personalization (a photo of your partner or your cat). But Apple’s personalization changes to the iPhone home screen two years ago in iOS 14 (custom widgets that share the screen with app icons) set the stage for bigger changes to the lock screen.

“We knew this was a multi-act project, and we knew our next stop would be the lock screen,” Federighi said. “We saw a real opportunity to take this area that has evolved really slowly over time, and has never seen such a huge step forward, and to do something very big – but something very Apple and very personal. So, this is an act of love this year,” he added.
Federighi, who is something of a WWDC meme and is known for his effusive passion for all things Apple and his attention to detail, may be forgiven some exaggeration, but it fits the sense that Apple takes this more seriously than many other phone makers. Redesigning the iPhone’s face carries real risk – consumers need to feel the change is for the better, which means what’s on offer must be personal while retaining the brand recognition Apple is known for.

“Our goal,” Dye told us, “was to make the iPhone even more personal – and certainly more useful – but also to keep intact those key elements that make the iPhone, the iPhone.” More than once, Dye said that the lock screen is a key part of the iPhone’s iconic look.

It was about time

The Apple iOS 16 lock screen has a whole new look and feel, but how did it happen? (Image: Apple)

If you had to pick one element that really says “iPhone,” it might be the clock. You can look back at the iPhone’s 15-year history and instantly identify the device from the time displayed large and centered in the upper third of the screen. That will not change with the new lock screen – while Apple considered altering it, the decision was made to retain the iconic feature.

Instead, Dye described how his team designed a bold new, customizable version of its San Francisco font that, for the first time, lets iPhone users choose different font styles and clock colors. “Typography is a huge passion for us on the design team, and we have designed a number of other fonts at Apple, even some non-Latin scripts. So, for the first time, we let users choose their favorite,” Dye said.

Obviously, personalization does not stop with adjusting how you view the time. iOS 16 enhances all the basic functions of the lock screen (information, personalization, and utility) while also creating something far more visually impressive than anything before on an iPhone.

“From the design team’s point of view, our goal was to create something almost editorial and allow the user to create a lock screen that really… looks like a great magazine cover or movie poster, but to do it in a way that we hope is very simple to create, very fun, even with a lot of automation there,” Dye said.

This “magazine look” is achieved through a collection of new controls and customizations that bring together the updated clock, graphics, photos, and deep technology that identifies good lock screen images and can combine them with interface elements in new ways. Instead of a screen you can update with a favorite photo but otherwise cannot change, iOS 16 lets you dig into the lock screen by holding your finger on it. This opens a collection of lock screen options and the ability to customize each lock screen to your liking.
At the heart of all this customization and the new lock screen appearance is the photo you choose – or don’t. iOS 16 will ship with plenty of pre-built lock screen options to nudge you toward the look and style you think works best on your phone, without depriving you of the ability to make the changes you want.

The subject is photos

There is a lot of artificial intelligence behind letting her hair overlap the time, while not completely changing what makes this interface so uniquely Apple. (Image: Apple)

Starting with the iPhone X and the introduction of portrait photography, Apple embarked on a journey of photo understanding that has evolved into machine learning that can now recognize what makes a good lock screen photo. “There are, in fact, about a dozen neural networks that judge a photo based on whether it has a desirable subject, whether there are people there, how they are framed and cropped in the photo, their expressions. All of these things allow us to automatically come up with really great, exciting choices for people and then put them on screen in a way that makes them feel almost completely fresh,” Federighi said.

Choosing and suggesting photos that deserve the lock screen is one thing, but with iOS 16, Apple makes images – or rather their subjects – an integral part of the interface.

The “magazine look” Dye mentioned is more than just the overall composition of the lock screen elements. It is that tuft of dog fur or swell of flowing hair that intersects the time element and, instead of sitting behind the numbers, spreads over them. It is a striking – and professional – look that is created automatically. Apple calls it “segmentation.”

Creating this look is something Dye and his design team have been dreaming of for years. “We wanted to achieve this look, but only now has the segmentation become so good that we felt really comfortable putting it [in there]. Unless segmentation is just ridiculously good, it breaks the illusion.”
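As a rough illustration of the layering described above, here is a toy compositing sketch: the clock is drawn over the background photo, then the segmented subject is drawn back on top so that hair or fur overlaps the digits. Every name and parameter here is invented for illustration – this is not Apple's API, and in the real feature the subject matte comes from those dozen neural networks, not from anything this simple.

```python
import numpy as np

def composite_lock_screen(photo, clock_layer, clock_alpha, subject_mask):
    """Layer order: background photo -> clock glyphs -> segmented subject.

    photo, clock_layer: (H, W, 3) float arrays in [0, 1]
    clock_alpha:        (H, W) alpha of the rendered clock glyphs
    subject_mask:       (H, W) soft matte from a segmentation model
    (Illustrative only; not Apple's implementation.)
    """
    a = clock_alpha[..., None]
    # Draw the clock over the background photo...
    out = photo * (1 - a) + clock_layer * a
    # ...then bring the segmented subject back on top, so hair or fur
    # from the photo spreads over the digits instead of sitting behind them.
    m = subject_mask[..., None]
    return out * (1 - m) + photo * m
```

A soft (fractional) matte blends the subject's edges rather than cutting them hard, which is why, as Dye notes, the illusion only holds when the segmentation is "ridiculously good."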

Breakthroughs

Dragging and dropping a subject out of an image is all about segmentation. (Image: Apple)

Segmentation in iOS 16 actually goes beyond the lock screen. During the WWDC keynote, Federighi showed how an iPhone user could touch and hold a photo of a bulldog on the beach and simply drag the bulldog into, for example, Messages. It’s not a big leap to see the connection between segmentation on the lock screen and dragging and dropping perfectly selected subjects from one place to another.

“You’re right to connect those two things, and we’ve developed some new neural networks – using a technique called ‘attention’ that gives us this new level of precision in locating and segmenting subjects – that we could apply to other cases. Given what you’ve seen us do with lifting subjects to enable interactive photos, what’s also really amazing is that, using Apple’s Neural Engine, we’re able to do that in about 100 milliseconds,” Federighi explained.

That speed is evident in the lock screen’s segmentation, which makes the hair overlap appear instantly. The intelligence runs on the device, and the Apple Neural Engine in the A15 Bionic processor allows Apple, as Federighi explained, “to take a picture we’ve never seen before and figure it out and segment it, and make the interaction so fast that we can do it the moment your finger hits the glass.”

This segmentation, which reminds us of what the Google Pixel can do with its Magic Eraser, looks like a technological leap. “It’s definitely an area we’re working on, the area of depth and segmentation, but you’re right that this year we had some breakthroughs that we could apply to this problem,” Federighi added.

Apple’s understanding of your photos extends further still and helps you make filter-like adjustments that complement the image’s elements – although “filters” is a misnomer for the styles iOS 16 lets you apply to lock screen photos.
Instead of a fixed series of image filters, Apple uses this segmentation knowledge to deliver a personalized set of treatments. “These styles are so much more than filters,” Dye said. “We essentially use segmentation, tonal values, all of the scene understanding to really help us identify how we can cleverly offer a variety of treatments for each photo. Which is also nice because Apple’s very good at it. Design, technology, and engineering work together to offer something really, I think, very beautiful.”

Instead of eight or a dozen set filters, you may be offered only two styles for one photo, and they are unlikely to be the same two if you choose a different lock screen photo. Dye told us that if the system does not think the photo will be great, it will not …
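The scene-aware style suggestions Dye describes can be caricatured in a few lines. This is a purely illustrative heuristic assuming tonal statistics drive some of the picks – Apple's real pipeline uses segmentation and full scene understanding, and none of these style names are Apple's.

```python
import numpy as np

def suggest_styles(photo):
    """Toy, per-photo style suggestions from tonal statistics.

    photo: (H, W, 3) float array in [0, 1]
    Returns a short, photo-dependent list rather than a fixed menu –
    echoing how iOS 16 may offer only two styles for one photo and a
    different two for another. (Hypothetical names and thresholds.)
    """
    # Rec. 709 luma as a cheap tonal summary of the scene.
    luma = photo @ np.array([0.2126, 0.7152, 0.0722])
    mean, spread = float(luma.mean()), float(luma.std())
    styles = []
    if mean < 0.35:
        styles.append("deep-tone")            # dark image: moody treatment
    elif mean > 0.65:
        styles.append("bright-wash")          # light image: airy treatment
    if spread > 0.25:
        styles.append("high-contrast-duotone")  # wide tonal range
    return styles or ["natural"]               # fall back to no treatment
```

The design point the sketch mirrors is that suggestions are a function of the photo, not a catalog applied uniformly to every image.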