“Our data science teams went back and looked at some of the three million questions we’ve collected over time, and there’s a significant number of questions that we see time and time again, around topics such as configuration management. Or ‘how do I do this in my network environment,’ and ‘what happens when I add some feature that I’m not familiar with?’” Ni said. “In the past, we would point you to a document that might be five pages, or it could be 50 pages, right? And it was incumbent on the end user to kind of comb through that documentation to figure out how to configure something or locate that specific section. Now, we generate specific, optimized responses and specific documentation to vastly improve accuracy, speed and detail.”
“Accurately understanding the intent of a user’s question is paramount for better responses,” Ni added. “This can be a significant time saver for network operators trying to find a documentation answer they’re looking for.”
The genAI LLM search capability is available now. It’s built into HPE Aruba Networking Central’s AI Search feature and expands on existing ML-based AI capabilities to provide deeper insights, better analytics, and more proactive skills, Ni said.