Apple focuses on intuitive AI (Indian Express)
Friday 15 September 2023 / 17:37
At its recent Wonderlust event, Apple stunned the world with its most innovative iPhone Pro models yet. However, the tech giant was conspicuously silent about anything related to artificial intelligence.
Technology experts see this as a sign of Apple's distinct approach to the current wave of artificial intelligence that has swept the technology industry.
The company has apparently opted for what might be called intuitive AI rather than generative AI: subtle, AI-powered improvements to everyday use cases such as photography and answering calls.
With its latest event, Apple put an end to the rumors and speculation surrounding its plan to infuse generative artificial intelligence into its latest iPhones. It should be noted, however, that the company continues to work on its own generative AI framework, Ajax, which competes with ChatGPT.
The company had similarly avoided the subject of generative artificial intelligence at its developer conference earlier this year.
One can get a better sense of Apple's approach to artificial intelligence from the new chip it designed for the iPhone 15 Pro models, the A17 Pro. It is the company's most powerful chip yet, built in part to accelerate its machine learning algorithms.
How does Apple use intuitive AI?
The most noteworthy feature is the use of machine learning to recognize the user's voice, which allows the device to quiet background noise on calls. The camera and computational photography also rely on AI features, including automatic detection of people and pets in the frame, capturing the depth information needed to turn these images into portraits at a later stage.
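This kind of voice isolation is applied by the system rather than by individual apps, but developers can check which microphone mode the user has selected. Below is a minimal Swift sketch, assuming iOS 15 or later; the AVFoundation microphone-mode API is Apple's documented interface, while the helper function itself is hypothetical:

```swift
import AVFoundation

// Inspect the system-wide microphone mode the user has chosen.
// Voice Isolation uses on-device machine learning to suppress
// background noise on calls; apps observe the mode, they don't implement it.
func logMicrophoneMode() {
    switch AVCaptureDevice.preferredMicrophoneMode {
    case .voiceIsolation:
        print("Voice Isolation: background noise is filtered out")
    case .wideSpectrum:
        print("Wide Spectrum: ambient sound is kept in the mix")
    default:
        print("Standard microphone processing")
    }
}
```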
Apple is also planning to introduce further features with the latest iOS 17 operating system, including smarter inline predictive text suggestions from the keyboard and automated transcription of voicemail messages. Although these additions may not be as exciting as an AI chatbot, they offer greater convenience for users.
In addition to the above, the Point and Speak feature in the Magnifier app helps users with low vision read labels on objects: they simply point the phone at the object and the device reads the text aloud. For users at risk of losing their speech, Apple's latest operating system can produce a synthetic voice that sounds like their own. To set it up, they simply need to read about 15 minutes of text prompts.
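This synthetic-voice capability, branded Personal Voice, is also exposed to third-party apps through AVFoundation on iOS 17. Here is a minimal Swift sketch, assuming the user has already created a Personal Voice in Settings and grants the app access; the class and function names are illustrative, while the AVSpeechSynthesizer calls are Apple's documented API:

```swift
import AVFoundation

// Minimal sketch: speak text using the user's Personal Voice (iOS 17+).
final class PersonalVoiceSpeaker {
    // Keep the synthesizer alive so playback isn't cut short
    // by deallocation mid-utterance.
    private let synthesizer = AVSpeechSynthesizer()

    func speak(_ text: String) {
        AVSpeechSynthesizer.requestPersonalVoiceAuthorization { [weak self] status in
            guard status == .authorized else { return }
            // Personal voices carry the .isPersonalVoice trait.
            let voice = AVSpeechSynthesisVoice.speechVoices()
                .first { $0.voiceTraits.contains(.isPersonalVoice) }
            let utterance = AVSpeechUtterance(string: text)
            utterance.voice = voice
            self?.synthesizer.speak(utterance)
        }
    }
}
```

The synthesizer is held as a stored property rather than created inside the closure, since a locally created synthesizer could be deallocated before the utterance finishes.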
The AirPods have also gained some notable artificial intelligence features: the Adaptive Audio feature blends music or calls with background sounds based on the surroundings. Likewise, the much-discussed double-tap gesture on the Apple Watch Series 9 is powered by machine learning, The Indian Express reported.