How AI Assistant made a dumb search a little smarter

This is the story of designing Webex’s first smart search experience.

This design story starts like most. With a problem.

After our work on the Cisco AI Assistant character prompt, I got the call to help design AI search with Uday, the lead engineer on the Control Hub AI team.

Why? Global search in Control Hub was basically broken.

As a user, you get something like this more often than not.

The previous framework relied on manual indexing from other teams, with zero standards or governance.

So obviously, things stopped getting indexed. No indexing. No search.

We needed a way to index pages without doing it all by hand. Luckily, Uday’s team was already exploring a solution to this problem for AI Assistant responses.

Then someone (can’t remember who, so I’m claiming it) goes: “Wait, can we use the same tech the AI Assistant is using for search?”

Answer: yes. It worked, thanks to a combo of Page Assist and AI-driven unique attribute labeling.
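If you’re wondering what that roughly means in practice, here’s a minimal sketch of the idea (my own hypothetical names, not the actual Page Assist internals): an AI pass labels each page with unique attributes, and those labels land in the same index the Assistant already reads, instead of hand-written entries from each team.

```typescript
// Sketch of AI-driven attribute labeling feeding a search index — all names hypothetical.
interface PageIndexEntry {
  pageId: string;        // e.g. "org-settings/authentication"
  attributes: string[];  // AI-generated labels like "SSO", "single sign-on", "SAML"
}

// No more hand-indexing by each team: an AI pass labels the page content,
// and the result lands in the index that both search and the Assistant use.
async function indexPage(
  pageId: string,
  pageText: string,
  labelWithAI: (text: string) => Promise<string[]>,
): Promise<PageIndexEntry> {
  const attributes = await labelWithAI(pageText);
  return { pageId, attributes };
}
```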

Which brings us to the design part.

Step one was a quick audit of what was already out there. At the time, Google and Perplexity were the big ones experimenting with similar search ideas.

(Little shoutout to Jason Fox for the elevator chat that jump-started my brain. It’s a great tool. Highly recommend.)

Anyways, what really mattered was defining the intent. The intent defined the structure.

Like, ask how do I screw in a lightbulb, you want steps. Ask what is SSO, you want a summary.
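To make that concrete, here’s a rough TypeScript sketch of the idea (toy intent names and a keyword classifier of my own, not the shipped logic): figure out the intent first, and let the intent decide the shape of the answer.

```typescript
// Sketch only: "the intent defines the structure." Names are illustrative, not the real system.
type Intent = "how-to" | "definition" | "navigation";

type ResponseShape =
  | { kind: "steps"; steps: string[] }                  // "how do I…" → ordered steps
  | { kind: "summary"; text: string }                   // "what is…" → short summary
  | { kind: "deepLink"; label: string; path: string };  // "turn X off" → jump to the setting

// Toy keyword classifier; the real thing leans on the AI model, not string matching.
function classifyIntent(query: string): Intent {
  const q = query.toLowerCase();
  if (q.startsWith("how do i") || q.startsWith("how to")) return "how-to";
  if (q.startsWith("what is") || q.startsWith("what's")) return "definition";
  return "navigation";
}

// Each intent maps to a different structure for the answer.
const shapeForIntent: Record<Intent, ResponseShape["kind"]> = {
  "how-to": "steps",        // "how do I screw in a lightbulb" → steps
  "definition": "summary",  // "what is SSO" → summary
  "navigation": "deepLink", // "profile avatars on/off" → straight to the setting
};
```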

Research and early AI Assistant testing showed users were turning to AI to solve these things anyways.

So upping search functionality seemed like an easy win, with things like these three (rough sketch after the list):

  1. An overview or definition of a setting (like what is SSO?)

  2. Fast lookup and navigation of individual settings (like turning profile avatars on/off, currently buried under three layers of navigation)

  3. Suggested search-related questions (search “virtual background” and you might also want to know how to adjust its settings)
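Put together, a single result could carry all three of those at once. A minimal sketch of what that payload might hold (the field names are mine, not the actual API):

```typescript
// Hypothetical shape for one smart-search result — purely illustrative field names.
interface SmartSearchResult {
  query: string;                 // e.g. "virtual background"
  overview?: string;             // short definition, e.g. an answer to "what is SSO?"
  settings: Array<{
    label: string;               // e.g. "Profile avatars"
    path: string;                // deep link instead of three layers of navigation
    toggle?: boolean;            // current on/off state, if the setting has one
  }>;
  suggestedQuestions: string[];  // e.g. "How do I adjust virtual background settings?"
}
```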

This approach also helped build trust, making the AI Assistant feel more useful in context, as if it was actually paying attention to what users needed where they were.

Forgive the messy file. We were moving fast.

You know the drill: feedback, tweak, more feedback, more tweaks.

Got to my final version.

It’s simple. It’s low-key.

I went with the same search component users already knew.

Nothing flashy. Just familiar but powerful in the best way.

After a bit more back-and-forth with engineering (I’ll spare you the yada yada)…

Basically, this also lined up with the AI Assistant redesign I was involved in, which aligned it with the other versions across the company so they would look, interact, and feel as similar as possible.

So we worked the new AI Assistant design into the scope of this project too.

And then… we got it live.

Not just a prototype. Production. Real.

The experience is still young (as of 2025), but early usage shows 15% higher click-through rates on settings and double the activation rate of the AI Assistant.
