I haven’t tested this summarizing feature, but I can share some thoughts on it from a practical perspective, which I hope will help in determining the direction of this project.
In my experience with Google and Brave Search, one thing AI is particularly bad at is acting as an infobox. Hallucination isn’t actually unique to AI, but while a search engine’s off-topic results are easy to spot at a glance, AI summarizers invent or string together wrong results that can sound plausible to someone who lacks deep knowledge of the topic (which, unfortunately, is often the very reason they’re searching in the first place). So although many people love the convenience, the results are muddled by so many hard-to-recognize false positives that I find them terribly unreliable, and I frequently just ignore AI-generated infoboxes.
On the other hand, I’ve noticed that AI excels at seeding ideas. When I have difficulty thinking of keywords due to a lack of knowledge on a subject, or when a question-form query is hard to distill into keywords, AI can come in handy by suggesting keywords I can use, or websites where I can start the search.
With that, I mostly agree with @mike’s idea of a separate URL, like what you did with the RAG search. Listing the referenced websites alongside the response makes it easier to check the quality of those websites’ contents and to spot hallucinations by comparing them against the response. Moreover, the related questions section helps narrow down vague searches.