
AI Needs UI

6 min read · Jul 21, 2025

It seems like every day, someone who doesn’t know anything about design proclaims “UI is going away” thanks to advances in AI. The logic goes that soon we’ll just converse with an AI assistant to get everything done. We won’t need any of these pesky menus, buttons, maybe not even screens.

But user interfaces aren’t disappearing; they’re evolving. Remember when mobile became ascendant 20 years ago? We added new UI components to take advantage of what mobile could do. AI is the same. AI makes great UI more important than ever, so that we can understand and use it effectively, building better mental models of what this technology can and cannot do. We cannot learn AI’s capabilities and limitations from a text box alone.

A Text Box isn’t the Ultimate UX

Let’s stop pretending that a single chat box (or a voice assistant) is the pinnacle of user experience. Conversational AI is powerful, but one interaction style doesn’t fit every task. In many cases, a visual interface is far more efficient and user-friendly than typing or speaking. Consider voice assistants: Alexa and Siri were originally voice-only, but even Amazon realized pure voice has limits — hence the Echo Show and other devices with screens. Why? Because humans consume visual information faster than spoken information. We can read ~250 words per minute but speak or listen at ~150 wpm. If you ask an AI assistant for the top five movies playing tonight, do you really want to sit and listen as it reads the list aloud? Probably not. It’s so much faster to glance at a screen showing movie titles or thumbnails. Scanning items on a screen takes a fraction of the time it would take to hear them. Multimodal beats monomodal: combining AI with visual cues or GUI elements gives a better experience than voice or text alone. A UI should be as visible as it needs to be.

There’s also a practical accessibility aspect: not everyone can (or wants to) type elaborate prompts or talk to their devices all day. Typing is a skill we in tech take for granted, but many people aren’t fast typists. Speech recognition still makes mistakes and isn’t ideal in noisy environments or for all languages/accents. And crafting the “perfect prompt” to coax the right answer from an AI is a new kind of literacy not everyone has—or will ever have. Relying 100% on chat or voice UIs would exclude or frustrate a lot of users. Sometimes tapping a button or swiping a screen is just easier and more universally accessible. Stop making users guess a secret formula for completing their tasks.

AI is Driving More UI Innovation, Not Less

The rise of AI is leading to new kinds of UI, not a UIpocalypse. We’re already seeing the advent of UI for AI: interfaces designed specifically to harness AI’s power without dumping the burden on the user to craft perfect prompts. Instead of hiding functionality behind a blank text box, give people intuitive controls to direct the AI. Imagine an image-editing AI. Rather than forcing the user to type “make the sky brighter and remove the tree on the right,” why not let them click or highlight the parts of the image they want changed? Select a region and adjust a slider, or paint over the object to remove it. Tools, not just text boxes. This kind of direct manipulation is often more precise and user-friendly than playing AI Mad Libs with a prompt.
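
To make that concrete, here’s a rough sketch in TypeScript. The types, endpoints, and handlers are all hypothetical (this is not any real product’s API); the point is that the user’s gesture carries the intent, and the UI builds the request:

```typescript
// A minimal sketch, assuming a hypothetical /api/image-edit endpoint.
// The user's gestures (a selection, a slider, a brush stroke) are translated
// into a structured request; nobody types "make the sky brighter."

interface Region {
  x: number;
  y: number;
  width: number;
  height: number;
}

interface EditRequest {
  operation: "adjust_brightness" | "remove_object";
  region: Region;
  amount?: number; // slider position, e.g. -1.0 (darker) to 1.0 (brighter)
}

// Fired when the user releases the brightness slider over a selected region.
function onBrightnessSlider(selection: Region, sliderValue: number): EditRequest {
  return { operation: "adjust_brightness", region: selection, amount: sliderValue };
}

// Fired when the user paints over an object to remove it.
function onObjectPainted(paintedArea: Region): EditRequest {
  return { operation: "remove_object", region: paintedArea };
}

// The UI, not the user, does the "prompting": the structured request goes
// to whatever model or service actually performs the edit.
async function applyEdit(request: EditRequest): Promise<void> {
  await fetch("/api/image-edit", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(request),
  });
}
```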

In professional software, we see this trend clearly. The most successful implementations of AI are blended into existing UIs. Photoshop, Figma, Canva — all these creative apps have added AI features by augmenting their toolsets, not by throwing up a chat window and saying “you figure it out.” These tools integrate AI with minimal or no prompting required from the user. The AI operates behind the scenes, while the user continues to interact through familiar menus, buttons, and canvas controls. This means less cognitive load on the user. They invoke AI powers through a well-designed UI affordance, instead of recalling arcane commands. The interface remains in the foreground, guiding the AI and the user. It’s a partnership: the AI does heavy lifting, but the UI gives the human understandable control and feedback.

AI is also enabling hyper-personalization of interfaces. Rather than one generic UI to rule them all, AI can tailor the layout, content, and functionality to each user’s needs in real time. Far from disappearing, UIs might become even more present but highly individualized. The future of UX could be one where every interaction is an individualized experience, with interfaces adapting on the fly to a user’s context and preferences. Your app might rearrange itself or surface different options than mine does, because the AI behind it knows what each of us likely wants next. This kind of dynamic, adaptive, context-aware UI will keep users more satisfied than any one-size-fits-all chat prompt could. AI is helping create UIs that are smarter and more responsive — not making UI obsolete.
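
Here’s one way that could look, as a rough TypeScript sketch. Everything in it (the intents, modules, and endpoint) is made up for illustration; the idea is simply that a model’s prediction chooses which parts of the interface to surface:

```typescript
// A minimal sketch, assuming a hypothetical /api/intent endpoint that scores
// the user's current context and returns a predicted next intent.

type Intent = "compose_message" | "review_schedule" | "track_order";

interface UiModule {
  id: string;
  title: string;
}

// Which modules to surface for each predicted intent (illustrative only).
const MODULES: Record<Intent, UiModule[]> = {
  compose_message: [{ id: "composer", title: "New message" }],
  review_schedule: [{ id: "agenda", title: "Today's agenda" }],
  track_order: [{ id: "orders", title: "Recent orders" }],
};

// Stand-in for whatever model predicts what this user likely wants next.
async function predictNextIntent(userId: string): Promise<Intent> {
  const res = await fetch(`/api/intent?user=${encodeURIComponent(userId)}`);
  const { intent } = (await res.json()) as { intent: Intent };
  return intent;
}

// The interface doesn't vanish; it rearranges itself around the prediction,
// so your home screen and mine can surface different things.
async function buildHomeScreen(userId: string): Promise<UiModule[]> {
  const intent = await predictNextIntent(userId);
  return MODULES[intent];
}
```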

Even AI Assistants Need an Interface (or, The “AI OS” Still Has a UI)

Let’s address the idea of the AI operating system — the vision of an AI agent that will manage apps and tasks for you, so you never have to touch the UI. It’s true that we’re moving towards AI that can orchestrate between apps and other software. Researchers and companies are building AI agents that can click, scroll, and type in apps on your behalf, essentially using the UI just like a human would. So yes, an AI might navigate between your calendar, email, and booking app to schedule a meeting while you sit back. But notice what that means: the UI still exists, only now the AI is a user of it as well. If anything, that puts more pressure on designers to create clear, consistent UIs, because both humans and AI bots need to understand them. The agent operates the GUI like we do, and if the GUI is messy, the AI struggles too.

Sure, many agents will bypass the UI entirely, going straight to APIs to perform actions. But what happens when there’s a problem? Or options to choose from? Even if an AI agent does 90% of a multi-step task behind the scenes, the user still wants visibility and control at critical points. You’ll likely have a UI showing what the AI is doing, or a confirmation step for key decisions. For example, you might tell an AI OS, “Book me a flight to London next month.” The AI can do the searching and narrow down the options, but chances are you will want to pick from a short list of, say, three flights it presents rather than just having it decide for you. How will those be shown to you? Not by having the AI read every detail aloud. You’ll hopefully get a nice interface with the flight times, prices, airlines… aka a visual comparison UI so you can quickly choose the one you prefer. The AI might then handle the purchase transaction in the background, but you stay in the loop through the interface. In domains like finance, medicine, or anything high-stakes, a “human in the loop” via a clear UI is essential for trust. We simply aren’t going to accept an invisible AI doing everything with zero UI feedback. We’ll demand dashboards, notifications, and controls to monitor and adjust what our AI assistants do.
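
As a rough sketch of that loop, in TypeScript with made-up endpoints (not any real agent’s API): the agent narrows the search, the interface shows a comparison, and nothing gets booked until the person confirms:

```typescript
// A minimal sketch, assuming hypothetical /api/agent/flights and
// /api/agent/book endpoints. The agent searches; the human chooses and confirms.

interface FlightOption {
  airline: string;
  departs: string; // ISO timestamp
  arrives: string; // ISO timestamp
  price: number;
}

// The agent's behind-the-scenes search, narrowed to a short list.
async function searchFlights(query: string): Promise<FlightOption[]> {
  const res = await fetch(`/api/agent/flights?q=${encodeURIComponent(query)}`);
  const options = (await res.json()) as FlightOption[];
  return options.slice(0, 3); // present three choices, not three hundred
}

// The UI's job: a visual comparison. In a real product these would be cards
// or a table, not console lines.
function renderComparison(options: FlightOption[]): void {
  options.forEach((o, i) => {
    console.log(`${i + 1}. ${o.airline}  ${o.departs} -> ${o.arrives}  $${o.price}`);
  });
}

// Booking only runs after the human has explicitly chosen and confirmed.
async function confirmAndBook(chosen: FlightOption, userConfirmed: boolean): Promise<void> {
  if (!userConfirmed) return; // the human stays in the loop
  await fetch("/api/agent/book", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(chosen),
  });
}
```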

I also want to push back on the notion that voice or text-based UI isn’t UI. It is. Just because an LLM can craft words doesn’t mean we should delegate design responsibility to it entirely. Just like with visuals, there are good and bad ways to respond, to provide feedback, to prompt for more information.

The Future is AI and UI, Together

Rumors of UI’s demise are greatly exaggerated. User interfaces are not going away — they’re shifting, adapting, and working in tandem with AI. From multi-modal experiences that blend conversational AI with visual elements, to adaptive UIs personalized by AI, to new design patterns for AI-first products, it’s an exciting evolution. But nowhere in this future does the UI vanish into a black box. Good UI will be a competitive advantage and a key to unlocking AI’s potential for users.

Some believe designers should stop polishing interfaces because AI agents will soon make front-ends irrelevant. Sure. Ok. That perspective ignores how real people behave and what they need. Until the day we humans evolve into purely voice- or text-driven beings (hint: not happening), we’ll benefit from having visual affordances and interactive elements to engage with our tech. Maybe not all the time, but definitely some of the time. The chat box is not the endgame of UX; it’s just one tool in a very large toolbox. The future of UX isn’t AI versus UI. It’s AI-augmented UI.

Written by Dan Saffer

Designer. Product Leader. Author. Professor.
