Kavli Affiliate: Long Zhang | First 5 Authors: Lingwen Deng | Summary: Parameter-Preserving Knowledge Editing (PPKE) enables updating models with new or corrected information without retraining or parameter adjustment. Recent PPKE approaches leverage knowledge graphs (KGs) to extend knowledge editing (KE) capabilities to multi-hop question answering (MHQA). However, these methods […]
Consistency-Aware Parameter-Preserving Knowledge Editing Framework for Multi-Hop Question Answering