Apple brings ChatGPT to Apple Intelligence, debuts Visual Intelligence in latest developer beta
10/24/2024 00:30
Apple is rolling out its latest iOS developer beta, complete with ChatGPT integration for Apple Intelligence.
Apple (AAPL) is preparing to roll out the first raft of its much-hyped Apple Intelligence features next week. But the company is already preparing its next phase of AI offerings, complete with ChatGPT integration and Visual Intelligence, as part of its latest developer beta available Wednesday.
The software — iOS 18.2, iPadOS 18.2, and macOS Sequoia 15.2 — will run on the iPhone 15 Pro and iPhone 15 Pro Max, the iPhone 16 line, and iPads running on Apple’s A17 Pro and M1 chips or later, as well as Macs equipped with the M1 and newer. A developer beta is a type of software that provides app developers with upcoming features before they’re released to the public so that they can integrate them into their apps.
Apple’s initial Apple Intelligence update, which includes notification summaries, Writing Tools for summarizing and rewriting text, and the AI image-editing Clean Up tool in Photos, is unlikely to get consumers to start buying up the company’s latest iPhones, iPads, or Macs in droves. Those features just don’t offer enough new functionality to justify spending hundreds of dollars on new devices at this point.
But couple those first Apple Intelligence capabilities with the ones included in the iOS 18.2, iPadOS 18.2, and macOS 15.2 developer betas, and you can begin to see Apple’s strategy for turning its generative AI push into the kind of product that gets customers excited to upgrade.
Wall Street is certainly hoping that’s the case too. Investors and analysts are looking toward Apple Intelligence as the catalyst for the next major iPhone sales cycle. Shares of Apple are up some 35% over the last 12 months, and 19% since the company announced Apple Intelligence during its Worldwide Developer Conference (WWDC) in June.
The latest Apple Intelligence developer beta includes some of the more intriguing features Apple showed off at WWDC. The most interesting is the ChatGPT integration with Siri. Apple says Siri will automatically detect when it needs help answering a question and will ask for your permission each time before it passes the request along to ChatGPT.
The idea is to provide users with peace of mind that they aren’t constantly sending information to ChatGPT parent OpenAI. To that end, Apple says OpenAI won’t store your requests and won’t use your data to train its models. What’s more, Apple says it will obscure your IP address when you use the bot and that you don’t need to sign up for a ChatGPT account to access it.
If you do sign in to your ChatGPT account, however, your data will be covered by ChatGPT’s data-use policies.
Apple is also integrating ChatGPT into its system-wide Writing Tools so you’ll be able to ask the bot to help you compose text, as well as generate images for use in select apps.
In addition to ChatGPT integration, Apple’s latest developer beta includes its Visual Intelligence tool. Available for the iPhone 16 line via the new Camera Control button, Visual Intelligence will allow you to point your camera at, say, a restaurant and immediately get information like its hours of operation and reviews.
Apple says Visual Intelligence will also summarize text you point your camera at, read text out loud, detect phone numbers and email addresses and offer to add them to your contacts, copy real-world text, and scan QR codes. The company says you’ll also be able to point your camera at an item and see where to buy it via Google or get more information about it via ChatGPT.
On the creativity front, Apple is adding the ability to tweak text via Writing Tools by describing how you want to change it. For instance, if you write an invitation to a party, you can tell Writing Tools to make the text sound more enthusiastic.
Apple’s Image Playground, meanwhile, lets you create an image from a range of concepts that you can combine, or transform a photo of a person from your photo library. You’ll be able to switch between animation and illustration styles and save images to your own library.
The Image Wand option lets you turn a rough sketch into a complete image, while Genmoji allows you to create generative AI-based emojis. You can, for instance, describe something like a pumpkin spice latte and Genmoji will create one that you can then drop into your texts.
Genmoji also understands context when it comes to people in your contacts. So you’ll be able to tell it to make a Genmoji of your brother Michael running, and it will make an emoji of him doing just that.
Apple Intelligence is one of the most consequential products in Apple’s recent history. But it’s also a big risk for the tech titan. Despite companies like Google, Samsung, Microsoft, Intel, AMD, and Qualcomm touting AI features for smartphones and PCs, there’s still no solid proof that consumers are especially interested in generative AI versus, say, better hardware like faster devices, larger displays, or better battery life.
That said, Apple is releasing a slew of application programming interfaces, or APIs, for Apple Intelligence that will allow developers to take advantage of its various capabilities. And if developers build the kind of apps that pique consumers’ interest, Apple’s bet is far more likely to pay off.
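For a rough sense of what that developer integration looks like, here’s a minimal Swift sketch using Apple’s App Intents framework, one of the main ways apps expose actions to Siri and Apple Intelligence. The intent name and its note-summarizing behavior are hypothetical examples, not an actual Apple API.

    import AppIntents

    // Hypothetical example: an app exposes a "Summarize Note" action that Siri
    // and Apple Intelligence can invoke on the user's behalf.
    struct SummarizeNoteIntent: AppIntent {
        static var title: LocalizedStringResource = "Summarize Note"

        // The note the user refers to, e.g. "summarize my packing list."
        @Parameter(title: "Note Title")
        var noteTitle: String

        func perform() async throws -> some IntentResult & ProvidesDialog {
            // A real app would look up the note and run its own summarizer here;
            // this placeholder just echoes the request back to Siri.
            return .result(dialog: "A summary of \(noteTitle) would appear here.")
        }
    }

Apps that adopt interfaces like these are how Apple expects the features described above to show up beyond its own built-in apps.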
Email Daniel Howley at [email protected]. Follow him on Twitter at @DanielHowley.