Consistency-Aware Parameter-Preserving Knowledge Editing Framework for Multi-Hop Question Answering

Kavli Affiliate: Long Zhang

| First 5 Authors: Lingwen Deng

| Summary:

Parameter-Preserving Knowledge Editing (PPKE) enables updating models with
new or corrected information without retraining or parameter adjustment. Recent
PPKE approaches use knowledge graphs (KGs) to extend knowledge editing (KE)
capabilities to multi-hop question answering (MHQA). However, these methods
often lack consistency, leading to knowledge contamination, unstable updates,
and retrieval behaviors that fail to reflect the intended edits. Such
inconsistencies undermine the reliability of PPKE in multi-hop reasoning. We
present CAPE-KG, Consistency-Aware Parameter-Preserving Editing with Knowledge
Graphs, a novel framework for consistency-aware PPKE in MHQA. CAPE-KG ensures
KG construction, update, and retrieval are always aligned with the requirements
of the MHQA task, maintaining coherent reasoning over both unedited and edited
knowledge. Extensive experiments on the MQuAKE benchmark show accuracy
improvements in PPKE performance for MHQA, demonstrating the effectiveness of
addressing consistency in PPKE.
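
To make the parameter-preserving editing idea concrete, the following is a minimal Python sketch of the general PPKE-with-KG pattern the summary alludes to: edited facts are stored in an external graph that is consulted before the frozen model at each reasoning hop, so edits propagate through multi-hop questions without touching model parameters. All names here (`EditableKG`, `answer_hop`, `multi_hop_answer`) and the toy facts are hypothetical illustrations, not CAPE-KG's actual components.

```python
# Hypothetical sketch of parameter-preserving knowledge editing (PPKE) with an
# external knowledge graph; this is NOT CAPE-KG's algorithm, only the general
# edit-then-retrieve pattern the summary describes.
from __future__ import annotations
from dataclasses import dataclass


@dataclass(frozen=True)
class Triple:
    subject: str
    relation: str
    obj: str


class EditableKG:
    """External store of edited facts; the base model's parameters stay frozen."""

    def __init__(self) -> None:
        self._edits: dict[tuple[str, str], str] = {}

    def apply_edit(self, triple: Triple) -> None:
        # Overwrite any earlier edit on the same (subject, relation) so the
        # graph reflects only the most recent edit request.
        self._edits[(triple.subject, triple.relation)] = triple.obj

    def lookup(self, subject: str, relation: str) -> str | None:
        return self._edits.get((subject, relation))


def answer_hop(kg: EditableKG, subject: str, relation: str, base_lookup) -> str:
    """Resolve one hop: prefer the edited KG, fall back to the frozen model."""
    edited = kg.lookup(subject, relation)
    return edited if edited is not None else base_lookup(subject, relation)


def multi_hop_answer(kg: EditableKG, hops: list[str], start: str, base_lookup) -> str:
    """Chain single-hop answers so edits propagate through multi-hop reasoning."""
    entity = start
    for relation in hops:
        entity = answer_hop(kg, entity, relation, base_lookup)
    return entity


if __name__ == "__main__":
    # Toy frozen "model knowledge" with an outdated head-of-government fact.
    frozen_facts = {
        ("United Kingdom", "head_of_government"): "Boris Johnson",
        ("Boris Johnson", "spouse"): "Carrie Johnson",
        ("Rishi Sunak", "spouse"): "Akshata Murty",
    }
    base_lookup = lambda s, r: frozen_facts.get((s, r), "unknown")

    kg = EditableKG()
    kg.apply_edit(Triple("United Kingdom", "head_of_government", "Rishi Sunak"))

    # Two-hop question: "Who is the spouse of the UK's head of government?"
    print(multi_hop_answer(kg, ["head_of_government", "spouse"],
                           "United Kingdom", base_lookup))  # -> Akshata Murty
```

The consistency concern the paper targets shows up precisely in this loop: if construction, update, and retrieval of the external graph disagree (e.g., stale or conflicting edits survive `apply_edit`), intermediate hops can mix edited and unedited knowledge and the final answer no longer reflects the intended edit.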

| Search Query: ArXiv Query: search_query=au:"Long Zhang"&id_list=&start=0&max_results=3
