Kavli Affiliate: Dan Luo
| First 5 Authors: Manisha Mukherjee, Sungchul Kim, Xiang Chen, Dan Luo, Tong Yu
| Summary:
The Adobe Experience Platform AI Assistant is a conversational tool that
enables organizations to interact seamlessly with proprietary enterprise data
through a chatbot. However, due to access restrictions, Large Language Models
(LLMs) cannot retrieve these internal documents, limiting their ability to
generate accurate zero-shot responses. To overcome this limitation, we use a
Retrieval-Augmented Generation (RAG) framework powered by a Knowledge Graph
(KG) to retrieve relevant information from external knowledge sources, enabling
LLMs to answer questions over private or previously unseen document
collections. In this paper, we propose a novel approach for building a
high-quality, low-noise KG. We apply several techniques, including incremental
entity resolution using seed concepts, similarity-based filtering to
deduplicate entries, assigning confidence scores to entity-relation pairs and
retaining only high-confidence ones, and linking facts to source documents for
provenance. Our KG-RAG system retrieves relevant tuples, which are added to the
user prompt's context before the prompt is sent to the LLM that generates the
response (a minimal sketch follows the summary). Our
evaluation demonstrates that this approach significantly enhances response
relevance, reducing irrelevant answers by over 50% and increasing fully
relevant answers by 88% compared to the existing production system.
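| Illustrative Sketch (editor's addition): The abstract does not include implementation details, so the following is a minimal, self-contained Python sketch of the kind of pipeline it describes: similarity-based deduplication of entity names, confidence filtering of entity-relation tuples with document provenance, retrieval of relevant tuples, and augmentation of the user prompt. All names here (Triple, canonicalize_entities, the thresholds, the SequenceMatcher string similarity, and the word-overlap retrieval) are illustrative assumptions, not the paper's actual method; the real system would use its own embedding-based similarity and KG retriever.

from dataclasses import dataclass
from difflib import SequenceMatcher

# Hypothetical triple record: (subject, relation, object) plus a confidence
# score and the source document it was extracted from, for provenance.
@dataclass
class Triple:
    subject: str
    relation: str
    obj: str
    confidence: float
    source_doc: str

def canonicalize_entities(triples, sim_threshold=0.9):
    """Collapse near-duplicate entity strings onto one canonical form.
    SequenceMatcher stands in for whatever similarity measure the real
    pipeline uses; the threshold is an assumed value."""
    canonical = []   # canonical entity names seen so far
    mapping = {}     # raw name -> canonical name
    def resolve(name):
        if name in mapping:
            return mapping[name]
        for cand in canonical:
            if SequenceMatcher(None, name.lower(), cand.lower()).ratio() >= sim_threshold:
                mapping[name] = cand
                return cand
        canonical.append(name)
        mapping[name] = name
        return name
    return [Triple(resolve(t.subject), t.relation, resolve(t.obj),
                   t.confidence, t.source_doc) for t in triples]

def filter_by_confidence(triples, min_conf=0.7):
    """Keep only high-confidence entity-relation tuples."""
    return [t for t in triples if t.confidence >= min_conf]

def retrieve(triples, question, top_k=5):
    """Toy retrieval: rank tuples by word overlap with the question."""
    q_words = set(question.lower().split())
    def score(t):
        words = set(f"{t.subject} {t.relation} {t.obj}".lower().split())
        return len(q_words & words)
    return sorted(triples, key=score, reverse=True)[:top_k]

def build_prompt(question, retrieved):
    """Prepend retrieved tuples (with provenance) to the user prompt
    before it is sent to the LLM."""
    context = "\n".join(
        f"- ({t.subject}, {t.relation}, {t.obj}) [source: {t.source_doc}]"
        for t in retrieved
    )
    return f"Context tuples:\n{context}\n\nQuestion: {question}\nAnswer:"

if __name__ == "__main__":
    kg = [
        Triple("AEP AI Assistant", "answers questions about", "Experience Platform docs", 0.92, "doc_17"),
        Triple("AEP ai assistant", "built on", "KG-RAG pipeline", 0.85, "doc_03"),
        Triple("AEP AI Assistant", "replaces", "legacy search", 0.40, "doc_21"),
    ]
    kg = filter_by_confidence(canonicalize_entities(kg))
    question = "What is the AEP AI Assistant built on?"
    print(build_prompt(question, retrieve(kg, question)))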
| Search Query: ArXiv Query: search_query=au:"Dan Luo"&id_list=&start=0&max_results=3