At WWDC 2023, as it does every year, Apple presented a series of privacy and data-security features for the new iPhone, iPad, and Mac operating systems arriving between summer and autumn. At a time when Meta is holding back the launch of Threads in Europe and blocking accounts over concerns about the Digital Markets Act, and when the generative AIs of OpenAI, Google, and Microsoft raise new risks and dilemmas around online data sharing, Apple's announcements run counter to the rest of the industry.
On-device generative AI
iOS 17, iPadOS 17, and macOS Sonoma introduce new machine learning-based features that demonstrate how AI models can be leveraged while minimizing the amount of data users have to share. The most striking example is the new Personal Voice feature, which lets users generate a synthetic voice that matches their own, with processing that takes place entirely on the device.
The new feature was built from the ground up with privacy at its core, Apple told us in a briefing dedicated to the new features. The model is generated on the iPhone overnight, while the device is charging and not in use, and is not shared in any way with third-party apps. The only case in which the voice model leaves the iPhone is an iCloud backup, which can still be end-to-end encrypted.
Filters for sensitive content
Personal Voice joins a now long list of iPhone, iPad, and Mac features built on machine learning (Apple never uses the generic term "artificial intelligence"). One of them is the filter that limits the sharing of sensitive content on devices used by minors.
With iOS 17, iPadOS 17, and macOS Sonoma, the filter has been expanded to work not only in Messages but also in other apps such as FaceTime (which can now receive video messages) and even AirDrop. In this case too, the analysis of images and videos happens entirely on the device, with no data shared externally.
Sensitive-content warnings can be enabled through Family Sharing for children's devices, and are now also available to adults who want them. The option, off by default, lives in the Privacy settings. It relies on the same filter and blocks unwanted nude or pornographic photos received via AirDrop, when sharing a contact, or on FaceTime.
Apple is also working on a framework for third-party developers based on the same visual-content moderation model. Thanks to a new API, developers will be able to integrate the sensitive-content filter into their apps with a couple of lines of code. For many smaller messaging, social-networking, and dating apps, the new API could offer a free and quick alternative solution to the age-old problem of moderating pornographic content uploaded to their platforms.
Less tracking on Safari
With the new operating systems, Apple will also strengthen Safari's protection against the tracking of data and users for commercial purposes. The change most feared by advertising networks is a new system that removes tracking parameters (such as click IDs) from links. The browser will automatically strip any code appended to the end of a URL to track or recognize the user, as already happens when sharing a link in Messages.
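Conceptually, this kind of link cleaning amounts to parsing the URL's query string and dropping parameters from a known blocklist. The sketch below illustrates the idea in Python; the parameter names are common click identifiers chosen for illustration, not Apple's actual (non-public) blocklist:

```python
from urllib.parse import urlparse, urlunparse, parse_qsl, urlencode

# Illustrative blocklist: common click identifiers used to recognize a
# user across sites. Safari's real blocklist is not public.
TRACKING_PARAMS = {"clickid", "fbclid", "gclid", "msclkid"}

def strip_tracking_params(url: str) -> str:
    """Return the URL with known tracking parameters removed."""
    parts = urlparse(url)
    kept = [
        (key, value)
        for key, value in parse_qsl(parts.query, keep_blank_values=True)
        if key.lower() not in TRACKING_PARAMS
    ]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(strip_tracking_params("https://example.com/offer?id=42&fbclid=abc123"))
# → https://example.com/offer?id=42
```

Note that the functional parameters (here `id=42`) survive, so the link still works; only the identifiers that serve no purpose other than recognizing the user are dropped.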
These codes are particularly harmful during private browsing sessions, because they can identify the session and the user despite the anonymity such browsing is meant to provide. For added protection, private sessions on macOS Sonoma will also be locked behind Touch ID or a password when Safari is not in use, so a session stays private even when the user temporarily steps away from the computer.
Controlled access to photos
On iOS 17 and iPadOS 17, Apple has redesigned the alert window for accessing the photo library. The pop-up now shows the total number of photos and videos in the library, reminding the user that granting an app access to the entire library means offering a complete view of their collection of images and videos.
The new pop-up also displays a selection of images as a visual hint of the kind of content the application can access. The company explained that the selection is made with the same technology that generates Memories in the Photos app: it aims to show a varied sample containing selfies or portraits (of the user or of friends and family), places, important moments, and pets.
On the new operating systems, Apple has also strengthened a series of key safety features. A particularly interesting novelty is the ability to share login credentials saved in iCloud Keychain with other family members. In the settings, under Family Passwords, you can add the Apple IDs with which to share certain passwords. Each of those users can then autofill the shared credentials when logging in. The system also works with passkeys.
There is also news for Lockdown Mode, the extreme security mechanism intended for people who may be targeted by spyware and high-profile attacks, such as journalists or government officials. The system now shows an activation notice on all of the user's devices, suggesting they extend the protection to every Apple device they own.
Compared with the previous version, the updated Lockdown Mode further tightens restrictions on data sharing, media access, app sandboxing, and network security, with the aim of shrinking the potential attack surface even more. For the first time since its launch, Lockdown Mode will also be available on the Apple Watch.