
Tim Cook and Craig Federighi at Apple's WWDC 2025.
Jason Hiner/ZDNet
After the WWDC 2025 keynote, it's official: Apple is going in a different direction on AI than the rest of the industry.
There were no bizarre promises or predicted breakthroughs, in contrast to last year's moonshots of "personal context" and a reimagined version of Siri. And there was very little talk of chatbots.
Also: The 7 best AI features announced at Apple's WWDC that I can't wait to use
Instead, Apple is taking the less speculative, more established strengths of large language models (LLMs) and integrating them piece by piece into the iPhone and other devices, without even having to mention the word AI.
AI for the iPhone (and other Apple devices)
First and foremost is Apple's Live Translation feature. Language translation is one of the things LLMs do really well. In most cases, you have to copy and paste into a chatbot or use a language app such as Google Translate to take advantage of those superpowers. In iOS 26, Apple integrates live translation directly into the Messages, FaceTime, and Phone apps, so you can use it in the places where you actually have conversations.
Then there is Visual Intelligence. Apple now lets you use it from any app or screen on your phone by integrating it directly into the screenshot interface. With iOS 26, Visual Intelligence can recognize what is on your screen, understand the context, and recommend actions. The example shown in the keynote was an event flyer: you take a screenshot, and Visual Intelligence automatically offers to create a calendar event.
This is actually a step in the direction of an AI agent, one of the most popular (and sometimes overhyped) tech trends of 2025. I look forward to trying this feature and seeing what else it can do. I've had good luck with the Samsung/Google version of a feature like this, Circle to Search. Another new thing Visual Intelligence will let you do in iOS 26 is ask ChatGPT questions about what you've captured on your screen. Visual Intelligence can also take the text from your screenshot and read it aloud or summarize it.
Also: Apple's Goldilocks approach to AI at WWDC is a winner. Here's why
Another of the excellent LLM capabilities improved this year can be seen in Shortcuts, which can now tap the Apple Intelligence models. For example, you can create a shortcut in which every file you save to the desktop on macOS 26 is examined by Apple Intelligence (while preserving privacy), categorized, and moved into one of several folders you've named, based on categories you define. You can even automate this to run whenever you save a file to the desktop, which again looks more like an AI agent.
Another way Apple is tapping LLMs is in new functionality behind the Share button in iOS 26. Apple Intelligence will use its generative models to analyze a list you share and turn each item into a task in the category you choose in the Reminders app. If it's a long list, you can even have the AI break it into subcategories for you, again using the natural language processing (NLP) capabilities of LLMs.
Apple’s AI for developers
Finally, while most of the leading players in generative AI typically offer both a chatbot for the general public and a coding companion for software developers (because these are two of the things LLMs are best known for), Apple did not say much about either at WWDC 2025. With thousands of developers on the Apple Park campus, it may have seemed like the perfect time to talk about both.
But all Apple would say about the long-delayed, reimagined version of Siri is that it is still working on it and will not release the next Siri until it meets the company's high standards for user experience.
And when it comes to coding companions, Apple did not reveal its own coding copilot for developer tools such as Swift and Xcode, despite teasing Swift Assist at last year's WWDC. Instead, Apple made a couple of big moves to empower developers. It opened up its Foundation Models framework so developers can tap the power of Apple Intelligence, with as few as three lines of Swift code, Apple claims. Moreover, it all happens on device and at no cost.
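Based on Apple's description, a minimal sketch of what those few lines might look like is below. It assumes the session-and-prompt style API Apple demonstrated for the Foundation Models framework (the session type and helper function here are illustrative, not copied from Apple's documentation), wrapped in a small function of my own invention to show it in context:

```swift
import FoundationModels

// Illustrative sketch: ask the on-device Apple Intelligence model to
// suggest a folder name, e.g. for the file-sorting shortcut described above.
// Requires iOS 26 / macOS 26 on Apple Intelligence-capable hardware.
func suggestFolderName(for fileSummary: String) async throws -> String {
    // Open a session with the on-device foundation model.
    let session = LanguageModelSession()

    // Send a plain-language prompt and await the generated response.
    let response = try await session.respond(
        to: "Suggest a one-word folder name for a file about: \(fileSummary)"
    )

    // Return the model's generated text.
    return response.content
}
```

The appeal, as Apple pitched it, is that this runs locally with no API keys, no per-token charges, and no data leaving the device.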
Also: How Apple has just changed the developer world with this AI announcement
And in Xcode 26, Apple will now let developers use the generative coding companion of their choice. ChatGPT is integrated by default in Xcode 26, but developers can also use their API keys from other providers to bring those models into Xcode. Apple sees this as a fast-moving space and wants developers to have access to the latest tools of their choice, rather than limiting them to things built only by Apple.
All in all, Apple is making a lot of pragmatic choices when it comes to AI, leaning into the things LLMs do best and simply using generative AI to build better features on its phones, laptops, and other devices.
Stay informed about Apple's latest AI developments and the rest of the AI ecosystem with the Tech Today newsletter.