Tech
Apple iPhone 16 has GenAI and hearing aid AirPods
At the recent launch event in Cupertino, Apple Intelligence debuted, marking a significant milestone for the tech giant.
The new iPhone series is Apple’s first to fully integrate on-device artificial intelligence (AI) capabilities, including the latest A18 series of chipsets. The rollout of Apple Intelligence features will begin in the US in English in October 2024, with plans to expand to additional countries in December and support more languages in 2025.
While Apple was the last major OEM to incorporate AI into its devices, the company largely reiterated the AI announcements made at its Worldwide Developers Conference (WWDC) in June 2024, introducing little that was new. Apple Intelligence will focus on language, image, and action generation, leveraging personal context and supported by various on-device generative models. These models will dynamically learn from and adapt to individual users.
While Apple Intelligence will include both on-device and cloud-based generative models, on-device processing sits at its core. Siri will play a central role in the effort, offering conversational context, the ability to switch between text and voice, on-screen awareness, and the capability to perform actions within an app.
Competition with Google’s Gemini AI
Google brought the fight to Apple’s backyard by aiming to integrate its Gemini AI models into 200 million Android devices by the end of 2024, prompting Apple to announce its own AI features.
However, Apple Intelligence will initially launch in beta on the latest models, with integration into the company’s ecosystem occurring gradually over the coming years. Meanwhile, Google’s latest innovation, its GenAI-based voice assistant, Gemini Live, is already accessible not only on the newly announced Pixel 9 series of smartphones but also on a wide array of other Android devices.
This upcoming iPhone cycle is crucial for Apple, as it aims to recover from lacklustre sales, particularly in China, navigate regulatory scrutiny, and regain market cap lost to Microsoft and Nvidia due to their AI businesses. Apple will need to promote its AI capabilities effectively to succeed.
Market potential for Apple AirPods Pro 2
The integration of hearing aid functionality into earbuds has long been anticipated in the industry. Apple’s AirPods Pro 2 is the first consumer device to incorporate hearing aid capability, making hearing assistance more accessible to users.
Machine learning models enhance the user’s hearing based on a preliminary hearing test. All Apple consumer devices owned by a user will incorporate the hearing score obtained through the AirPods Pro 2, automatically adjusting their audio output accordingly. This exemplifies the seamless functionality of the company’s ecosystem, which Android rivals will struggle to replicate.
The inclusion of hearing aid capability is likely to prompt competitors to adopt similar features swiftly. Apple’s third-generation AirPods have seen lacklustre sales, but the enhancement of the AirPods Pro represents a significant improvement. With an estimated 1.5 billion people worldwide experiencing some degree of hearing loss, the market potential for this product is substantial.
“Apple iPhone 16 has GenAI and hearing aid Airpods” was originally created and published by Verdict, a GlobalData owned brand.