GenAI Killed the Software App Star
The Buggles, Freezers, and AI Tools that Disrupt Past Patterns of Human-Computer Interaction
In early September (which now seems like forever ago), when I was reading about all the pending changes to our interactions with genAI tools, I was reminded of this classic 1970s song, recorded at the same time by two groups: The Camera Club and the Buggles.
Some thoughts on this:
This song is apropos in multiple ways, but the fact that it was recorded at essentially the same time by two groups is particularly appropriate. Claude and ChatGPT seem to be working with similar clientele. I’m not saying that, like the Buggles, one of them is going to overpower the other in terms of social memory. But one (ChatGPT) definitely has wider capabilities at the moment.
This song is about the society of the 1970s being terrified of present and future technological advancements significantly altering the way that media was created. I do not have to tell you that we are having similar fears (even those of us who are more “pro-” than “anti-” AI).
One of the songwriters, Trevor Horn, was inspired to write it by reading a 1960 sci-fi dystopian short story by J.G. Ballard. “I… had this vision of the future where record companies would have computers in the basement and manufacture artists.”
The Human Is Still the Boss
From the 2000s to the 2010s (in my memory), some believed that software apps were the end-all, be-all of tech progress, and each app had its own task: one for writing, another for design, yet another for calculations. Now, genAI is shaking things up, bringing together these tasks into one flexible system that can help you write, format, code, and create—often in a single session. But as powerful as genAI may seem, it’s essential to remember that AI is a tool, not a collaborator or a creative equal. AI doesn’t think; it doesn’t feel. It doesn’t make decisions. It just follows commands. Keeping humans in control of this process is not just practical; it’s necessary, as I have written before.
A recent article in Analytics India Mag (AIM) notes that with recent developments in AI tools, most notably the “Canvas” and “Artifacts” features of ChatGPT and Claude, users can create “on-demand software” instead of relying on professional developers. Some members of AIM hypothesized that just as no-code programs brought software development to the general public and reduced the need for professional developers, Claude and ChatGPT are reducing the public’s reliance on no-code offerings.
This reminds me of the axiom popularized by David Wiley, “Don’t focus on your solution, focus on the problem.”
One popular way to illustrate this principle is the long history of keeping food and other things cold. For a more detailed look at this history, read Steven Johnson’s How We Got to Now.
First, people had root cellars that they dug deep in the ground. Then, people began to cover things with ice that was delivered by an “ice-man” (hence the popular play, “The Iceman Cometh”). When freezer or “refrigerator” cars were developed, they eliminated the need for an ice-man except in the most rural communities. Ice boxes also reduced how often one needed to purchase ice. Then freezers were invented, along with the ability to make ice in one’s own home. The need for constant electric power for freezing was further reduced when coolers and gel ice packs were invented.
With each new advance, the people who had invested in and promoted the current solution as the “only way” lost money. Those who were comfortable letting go of the old solution adapted quickly to the new answer to the same old problem.
To be clear, ice itself didn’t go away (we still make ice today, even though we don’t necessarily need it to keep food fresh). Ice packs and coolers exist, but we still use freezers. There are four or five ways that we can keep food frozen and edible, and we use them all in different ways, for specific contexts and purposes.
Human-Machine Interactions Lead to Human-Machine Products
This image was created by Ideogram, on August 8, 2024.
ChatGPT Voice Mode and the Role of Human Direction
AI voice modes, like ChatGPT’s latest, have gotten surprisingly good at holding conversations. You can talk with the model about history, brainstorm ideas, or ask it to explain a concept, all in a style that feels natural and fluid. But without thoughtful questions, careful guidance, and clear boundaries, voice mode is just a sophisticated text processor with audio output.
As I discussed in my post on “Artificial Agency,” because the AI tool communicates in a voice, people tend to think of it as a person. That tendency opens up a whole new way of engaging with historical people and ideas, and it could disrupt certain ways of interrogating historical records. Teachers could, for example, ask students to upload historical documents to an AI tool, have a “conversation” with them, and record the interview.
Artificial Agency: Characters, Friends, and Social Trends
Human input keeps the conversation focused, ensuring that it doesn’t devolve into random or irrelevant responses. Users set the tone, the purpose, and the direction. AI may be able to keep up, but only humans can lead. This distinction is key as we explore new interaction styles because the AI, no matter how engaging, is still just a machine.
Creating A 50-50 Digital Alloy
***NOTE: In my mind, the words “teacher,” “trainer,” and “instructor” are interchangeable, as are the words “student,” “learner,” and “trainee.” The concepts in this post are applicable in formal and informal education, in online and in-person courses, and in educational, corporate, institutional, and public instruction.
Claude Artifacts and ChatGPT Canvas: Tools, Not Teammates
Anthropic’s Claude Artifacts and OpenAI’s ChatGPT Canvas are turning heads by offering creative and coding assistance directly within an AI-powered workspace. Both services offer real-time previews: Claude’s is better suited to developers who want immediate feedback as they code or design, while Canvas is well suited to longer-term projects with its intuitive editing and content management. But both systems require human oversight, input, and direct human manipulation for the highest-quality output. You, as the user, make the decisions, add the final touches, and determine what’s worth keeping or discarding. You decide what to export and in which file formats. Claude is better for websites or code files; ChatGPT is better for presentation slides, documents, PDFs, and the like.
With these solutions in place, it is not difficult to imagine the impact these tools will have on the market for traditional content-creation and development applications.
These tools are like well-equipped workshops where your hands remain on the controls. It’s tempting to think the AI is a partner, but it’s more of a high-tech assistant, one that waits for your command to move forward. In this emerging AI-powered “app store,” human agency matters even more; without it, there’s no direction, no true creation, and certainly no accountability.
More on ChatGPT Canvas will come in a future article.
Script Programming and Chrome Extensions: Reinvented, Not Replaced
Script programming and Chrome extensions were once the ultimate ways to customize tech, helping us automate tasks or collect specific data from the web. Now, GenAI is capable of stepping in to do even more of this work, adapting on the fly without the need for scripts or plugins. While this streamlines things, the crucial element here remains the human user. Only you know what outcomes you need, and only you have the insight to determine if AI-generated solutions actually fit.
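To make the contrast concrete, here is a minimal sketch, in Python, of the kind of one-off data-collection script a user once had to write by hand or package as a browser extension. The URL, the collect_headlines helper, and the choice of elements are all hypothetical placeholders, not a real workflow; today, the same request can often simply be described to a genAI tool in plain language.

```python
# A minimal sketch of the kind of one-off script users once wrote by hand.
# The URL and the <h2> selector below are hypothetical placeholders.
import requests
from bs4 import BeautifulSoup


def collect_headlines(url: str) -> list[str]:
    """Fetch a page and return the text of its headline elements."""
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")
    # A real script would use a selector tailored to the specific site.
    return [h.get_text(strip=True) for h in soup.find_all("h2")]


if __name__ == "__main__":
    for headline in collect_headlines("https://example.com/news"):
        print(headline)
```

Whether the script is hand-written or AI-generated, the human still defines the target site, the data worth collecting, and what counts as a useful result.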
There’s a growing perception that AI is here to “take over” all tech tasks, but the reality is that users still guide and shape these interactions. AI might be able to aggregate data or present it, but users set the parameters. Without your goals, intentions, and adjustments, the AI’s outputs are just educated guesses.
Perplexity and SearchGPT: AI Does the Searching, Humans Set the Criteria
AI-driven search options like Perplexity and SearchGPT are great examples of AI’s evolving role in tech. They have moved past traditional search by focusing on context and answering questions in conversational ways. But they only work because users set specific search parameters and frame meaningful queries.
This shift emphasizes the importance of human judgment. AI may deliver answers, but it lacks the wisdom to evaluate which details are meaningful or how to connect these pieces to your broader goal. SearchGPT finds websites and can summarize what it finds in various formats (and can even find relevant images if you want), but it’s up to you to decide if those summaries hold value for your project or purpose.
More on SearchGPT (including the fact that it is NOT a search engine) will come in a future article.
ChatGPT as a Formatting Tool: The User’s Hand Is Always on the Wheel
ChatGPT now boasts document-formatting abilities, helping users turn rough drafts into polished pieces without relying on separate word processors. I have been able to direct ChatGPT to successfully complete tasks that its developers had not told people about. For example, I had it create a PDF before the developers had publicly released PDF creation as a feature.
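To make that concrete: when ChatGPT produces a PDF, it is typically writing and running a short Python script behind the scenes. The sketch below, which uses the open-source fpdf2 library and a made-up draft_to_pdf helper, is my assumption about what such a generated script might look like, not OpenAI’s actual implementation.

```python
# A hedged sketch of the kind of script an AI tool might generate to build a PDF.
# Uses the open-source fpdf2 library; illustrative only, not OpenAI's actual code.
from fpdf import FPDF


def draft_to_pdf(title: str, paragraphs: list[str], outfile: str) -> None:
    """Render a titled draft as a simple one-column PDF."""
    pdf = FPDF()
    pdf.add_page()
    pdf.set_font("Helvetica", style="B", size=16)
    pdf.multi_cell(0, 10, title)      # title across the full page width
    pdf.ln(4)
    pdf.set_font("Helvetica", size=11)
    for para in paragraphs:
        pdf.multi_cell(0, 6, para)    # wrap each paragraph automatically
        pdf.ln(3)
    pdf.output(outfile)


if __name__ == "__main__":
    draft_to_pdf(
        "Sample Report",
        ["First paragraph of the rough draft.", "Second paragraph, reformatted."],
        "report.pdf",
    )
```

The “feature,” in other words, is really just generated code; the user’s prompt still determines the structure, the content, and whether the result is worth keeping.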
Formatting and structuring are a helpful upgrade to simple text generation, but the responsibility to craft a coherent, compelling document lies with the writer. AI can follow instructions to structure and organize text, but it cannot understand your intent, tone, or priorities in the same way you do.
Using ChatGPT or Claude to format a report or create a proposal requires active oversight. AI might get the sections and headings right, but only a human writer can ensure the document’s flow, accuracy, and voice. AI formatting tools serve as powerful assistants, but the author’s judgment remains central.
You.com: AI’s Take on the News, Curated by the User
With You.com, AI has entered the news aggregation arena, presenting a tailored, streamlined way to stay updated. Yet, as with search engines, the AI lacks perspective. It doesn’t know which news stories you’ll find insightful or which angles you care about most. Users must still curate their news feeds by selecting preferred topics, sources, and formats. AI can pull up the latest headlines, but only you can determine which news matters.
The Bottom Line: AI as the Tool, Human as the Creator
Generative AI is changing the game, not by replacing traditional apps but by centralizing their functions in one adaptable system. However, it’s essential to remember that AI’s complexity does not make it an independent creator. It’s a tool—more sophisticated, but not fundamentally different from a pen or a camera. Humans set the goals, and humans make the calls. Every output that AI generates requires human direction and refinement to transform it from a machine-made artifact into something truly meaningful.
In the end, AI might automate tasks, provide recommendations, and streamline processes, but only humans bring purpose, discernment, and creativity. GenAI may have killed the “app star,” but it’s the human behind the interface who ensures that the song, the code, the document, or the design tells a story worth sharing.