Apps that generate custom to-do lists are completely different from apps for playing first-person shooters. Apps that allow you to order personal taxis and book hotel rooms are very different from apps that can design 3D objects.
Many apps are more than a clean interface. For example, consider Instacart. Certainly, the app has a database of products to choose from, an e-commerce component for managing purchases and billing, and a messaging interface between customers and shoppers.
Also: The best AI for coding in 2025 (and what not to use, including DeepSeek R1)
However, it also has a vast back-end infrastructure of deals with grocery retailers that lets it keep inventory up to date. It has mapping and route-optimization capabilities to manage shoppers and optimize their routes.
Writing code is not just complicated; it's interdisciplinary. At least it is for any project of real size.
Apple’s Vision
That's the context for today's topic: vibe coding apps using Siri.
The fine folks at 9to5Mac recently ran an article titled "Apple wants to let people vibe code Vision Pro apps using Siri." It was based on a report explaining that Apple wants even people who don't know computer code to be able to build AR apps on its headset via the Siri voice assistant.
Apparently, Apple executives have discussed such a feature, but it has not yet been implemented.
On the one hand, the idea seems ludicrous. How many times has Siri failed at merely transcribing a text message to a friend? To assume Siri could do something as complex and powerful as creating apps is to assume Siri is not the rather simple-minded AI we all know and love.
Also: How to use this AI tool to build an app with just one prompt
But let's assume Apple Intelligence eventually rises above the crushing disappointment it has been, and that Siri gains the same AI chops as ChatGPT or Google Gemini. Then we have a starting point.
I've repeatedly shown that AIs can code. In my recent tests, ChatGPT and Gemini Pro 2.5 knocked it out of the park.
So it's not unrealistic to think that Apple will (someday soon?) have a Siri that performs at least at the level of its competitors.
So what would it take for Apple to enable vibe coding (that is, AI coding) of apps using Siri? Three key factors need to be examined: the technology, Apple's relationship with citizen coders, and managed expectations.
The technology is here
There are several precedents for the idea that you can describe an app in a sentence and an AI can write it. Just last week, GitHub Spark built a code-analysis tool for me from a single sentence. Yes, the interface was ugly. Yes, my attempts to refine it were somewhat fruitless. But the fact remains that the AI built a working app from a one-sentence description.
Shortly after ChatGPT first hit it big, I asked it to create a complete WordPress plugin, including a user interface. The plugin was fairly simple and took a few sentences to describe, but ChatGPT amazed me with its ability to get the job done.
So it may take a little time to get it right, but the technology is there to do the job.
Apple's history of empowering citizen developers
Apple has a long history of empowering citizen developers, and also of misunderstanding what they need. I know. I was there. The Apple II was a hit as an early consumer computer not only because of its friendly shell, but because Apple included the BASIC programming language, which made the machine accessible to new users.
When Apple introduced the Mac, it also introduced a huge library of books, including interface and coding guidelines, which allowed third-party developers to create Mac apps that looked like Mac apps.
Apple introduced each of these products with the recognition that developers, developers, developers drive hardware adoption. After all, it's what you can do with a machine that makes it worthwhile, right?
Also: Brace yourself: Thanks to AI, the era of "citizen developers" creating apps is here
Apple's first major low-code product was the innovative HyperCard, a tool that let you draw a user interface and wire modules together with minimal code. (I know this one well, because I founded the first company to build tools for HyperCard developers and ran projects for Apple's HyperCard team.)
But Apple also had a big disconnect. I remember sitting in the office of Apple's HyperCard product manager and listening as he told me that Apple users don't want customization.
Yet every day, I spoke with schoolteachers, sports coaches, doctors, shopkeepers, small business owners, and even the occasional big-budget film director or sitcom star who were doing exactly that.
Also: The most popular programming languages (and what that means)
Other low-code tools Apple has introduced include Automator, Shortcuts, Swift Playgrounds, and Xcode's Interface Builder. Apple has also dabbled in AR creation tools like Reality Composer, introduced in 2019, which let developers drag and drop 3D assets, animations, and basic interactions into place without writing code.
I see these tools as resources that empower citizen developers: people who aren't necessarily professional developers, but who are willing to learn the skills they need to get a job done. Not everyone wants to build apps. Some want to build apps because they think they'll get rich, but there is also a surprisingly large and diverse group who want to build apps simply because they want their computers to perform custom tasks.
Managing expectations
This will be at the heart of all AI coding for the masses. A naive newbie might want to issue a single one-line command and suddenly find themselves at the helm of the next billion-dollar Uber.
That's never going to happen. But it's entirely possible for AI coding tools to help Uber's developers maintain and improve their code.
AI tools can code apps; we saw that with the GitHub Spark and WordPress plugin examples. Drag-and-drop interfaces can create interactive experiences; Reality Composer helped people do that six years ago.
The real questions are: What kinds of apps can you build with AI? How much of the work can it do? And to what extent can AI handle iteration and incremental improvement?
Also: This AR headset is changing the way surgeons see inside their patients
So far, AIs have been terrible at incrementally improving existing work. They do much better when asked to reproduce something in its entirety, with a few new elements added. That makes it particularly difficult to get an AI to make incremental changes without it randomly altering the underlying code between iterations.
Some projects simply aren't practical for the sort of armchair commands that vibe coding seems to imply. For example, it may be possible for a non-coder or low-coder to build simple AR and VR environments, but it takes a team of highly experienced engineers to build an AR experience that a spine surgeon can confidently rely on while cutting into and repairing a patient's spine.
If you're considering vibe coding to create apps, it's important to recognize that such tools are well suited to specific kinds of applications (particularly form-based apps) and don't work nearly as well for other types, especially the large-scale apps that drive billion-dollar businesses.
Painting a picture
In marketing, "painting a picture" refers to the practice of presenting a message vividly enough to create a mental image, capturing both the essence of what you're trying to sell and the imagination of your prospects. This practice often exaggerates the real experience of using the product, but it resonates with prospects and drives sales.
So, is it ridiculous to expect people to vibe code Vision Pro apps with Siri? There's some serious picture-painting going on there, I'll tell you that.
First, the Vision Pro has been a struggle to sell. It would help if the people who do want the device could build their own applications, since it isn't selling well enough to justify development efforts from many commercial coding shops.
Also: The Apple Vision Pro's killer feature is finally here
Second, Siri still needs a lot of work before most of us will trust it to send a text properly.
But is AI-assisted app development for AR and VR part of a potential future? Sure. It's not impossible. The technology is already here (just not from Apple Intelligence). The rest is a matter of incremental improvement, learning what works and what needs help, and waiting for it to be implemented.
Conclusion
Keep your expectations in check. Learn the tools, what they do well, and where you'll run into walls. I doubt Apple will approve many amateur-coded VR and AR apps for the App Store, but I have no doubt we'll see some great work by people who don't code for a living.
Also: My favorite XR glasses for productivity and travel got 3 major upgrades
Bottom line: Vibe coding Vision Pro apps with Siri is not an unrealistic expectation. But there's work to be done before we get there, and you'll need to keep managing your expectations.
What do you think? Can you see yourself building an app just by describing it to Siri? Have you tried low-code and no-code tools like HyperCard, Shortcuts, or Reality Composer? Do you think Apple is on the right track with this vision, or is it just painting the picture a little too vividly? Let us know in the comments below.
You can follow my day-to-day project updates on social media. Be sure to subscribe to my weekly update newsletter, and follow me on Twitter/X at @DavidGewirtz, on Facebook at facebook.com/davidgewirtz, on Instagram at instagram.com/davidgewirtz, on Bluesky at @davidgewirtz.com, and on YouTube at youtube.com/davidgewirtztv.
Get ZDNET's biggest stories every Friday with the Week in Review newsletter.