From smart AirPods Pro to the home assistant: the smart news according to Apple

Active noise canceling headphones have many advantages but also some drawbacks. For example, they can block out sounds you need to hear, from an alarm going off to a person turning to talk to you. Thanks to machine learning, Apple has addressed this problem: in a few weeks, with the arrival of the new software version, the AirPods Pro 2, the latest version of its noise canceling earphones, will gain a new Adaptive Audio mode.

The system will let you keep noise cancellation active while the software is intelligent enough to recognize particular voices or sounds we want to hear: conversations, important noises and so on. As explained, the feature will arrive “later” and will only be available for the AirPods Pro 2. The other models with active noise cancellation (1st-generation AirPods Pro and AirPods Max) can’t get it because they lack the H2 chip, the only one powerful enough to make everything work.

This is among the news that emerged at Apple’s developer conference, WWDC, held in Cupertino a few weeks ago, where many announcements were made; the one that perhaps monopolized media attention was the Vision Pro headset. But there were actually many other innovations, and not just for Mac, iPhone and iPad. There are also a number of new features made possible mainly by artificial intelligence, or rather by machine learning, which is pretty much the same thing (here our glossary), even if Apple employees persist in avoiding the expression, whether aloud or on the official website. Just as, during the keynote in which the headset was presented, no one ever mentioned the metaverse (even though everyone must certainly have thought of it).

Verbal acrobatics aside, the new intelligent noise cancellation feature for the AirPods Pro could still prove very interesting, especially if developed further, perhaps as a form of filter and protection in noisy environments: those attending concerts (or the musicians themselves) are exposed to an excessive number of decibels that can damage hearing. The mechanical seal of the silicone ear tips combined with intelligent filtering of outside audio could make them a very useful tool. But this is not the only application made possible by the new algorithms developed by Apple.


Machine learning, the glue of systems

Among the novelties Apple is building for the home, with machine learning as the glue holding together the new ways of using the software presented at WWDC, are those for the Apple TV. The device that connects to the TV screen will, with the new version of its operating system, be able to run FaceTime (and later other dedicated apps) using the camera of an iPhone or iPad. It is the Continuity Camera function, already available for a year between Mac and iPhone/iPad, that now lands on the big screen at home or in the office and promises high-quality meetings. In addition, Apple TV gains new features for screen sharing and multi-user management with separate profiles.

And that’s not all: again using machine learning, it will be possible to use photographs or short animations as screen savers, with computer-interpolated frames filling in the gaps to stabilize slow-motion movement. A use conceptually not unlike deepfakes, but for a good purpose. Apple is also improving the management of many features: sharing the audio of Music or other apps in the car or at home (which the company calls SharePlay, convenient for particularly long car journeys) and improved AirPlay connections, which allow you to connect to compatible monitors and televisions not only in the office but also, for example, in hotels.

On the automotive and home front, Apple is working to improve Car Key support, i.e. the possibility of using the iPhone as a key to open and start the car, as well as the management of digital locks and padlocks, which can be controlled remotely, with an unlock history showing who has opened them in recent days. Fundamental in this area is the growth of the secure Matter standard, the result of an alliance between manufacturers that is bearing fruit, with dozens of compatible smart objects announced every month.

If some features (such as the one that will let you find the Apple TV remote using your phone) are convenient but traditional, others are possible thanks to behind-the-scenes machine learning. Like the one that, in a house with several Apple devices, learns the user’s preferences depending on the time of day and the individual room and proposes the right automation (the new episode of a favorite podcast on the HomePod above the fridge while cooking, for example), always with the philosophy that machine learning should make life easier but must respect privacy. If you switch devices, the training (which requires no user activity) starts again in the background.

Apple never utters the words artificial intelligence and continues to defend user privacy, but it demonstrates, with novelties arriving in the coming weeks on the AirPods, HomePod and Apple TV fronts, that with machine learning it is possible to do a lot, and to do it safely.
