Hierarchical Self-Prompting SAM: A Prompt-Free Medical Image Segmentation Framework

Kavli Affiliate: Jing Wang

| First 5 Authors: Mengmeng Zhang, Xingyuan Dai, Yicheng Sun, Jing Wang, Yueyang Yao

| Summary:

Although the Segment Anything Model (SAM) is highly effective for natural
image segmentation, it depends on user-provided prompts, which limits its
applicability to medical imaging, where manual prompts are often unavailable.
Existing efforts to fine-tune SAM for medical segmentation typically struggle
to remove this dependency. We propose Hierarchical Self-Prompting SAM
(HSP-SAM), a novel self-prompting framework that enables SAM to achieve strong
performance in prompt-free medical image segmentation. Unlike previous
self-prompting methods, which remain limited to positional prompts as in
vanilla SAM, we are the first to introduce learned abstract prompts into the
self-prompting process. This simple and intuitive framework
achieves superior performance on classic segmentation tasks such as polyp and
skin lesion segmentation, while maintaining robustness across diverse medical
imaging modalities. Furthermore, it exhibits strong generalization to unseen
datasets, achieving improvements of up to 14.04% over previous state-of-the-art
methods on some challenging benchmarks. These results suggest that abstract
prompts encapsulate richer and higher-dimensional semantic information compared
to positional prompts, thereby enhancing the model’s robustness and
generalization performance. All models and code will be released upon
acceptance.
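
The abstract gives no implementation details, but the core idea of replacing SAM's manual positional prompts with learned abstract prompt embeddings can be sketched roughly as below. The module name (`SelfPromptHead`), the number of prompts, the embedding size, and the use of cross-attention over image features are all assumptions made for illustration; this is not the HSP-SAM architecture itself.

```python
# Minimal sketch of prompt-free "self-prompting" on top of a SAM-style
# encoder/decoder. Everything here (SelfPromptHead, tensor shapes, the way
# prompts are injected) is an illustrative assumption, not HSP-SAM itself.
import torch
import torch.nn as nn


class SelfPromptHead(nn.Module):
    """Predicts abstract prompt embeddings from image features,
    replacing the manual point/box prompts of vanilla SAM."""

    def __init__(self, embed_dim: int = 256, num_prompts: int = 8):
        super().__init__()
        # Learnable queries that are refined against the image features.
        self.queries = nn.Parameter(torch.randn(num_prompts, embed_dim))
        self.attn = nn.MultiheadAttention(embed_dim, num_heads=8, batch_first=True)
        self.norm = nn.LayerNorm(embed_dim)

    def forward(self, image_embeddings: torch.Tensor) -> torch.Tensor:
        # image_embeddings: (B, C, H, W) from a frozen SAM-style image encoder.
        b, c, h, w = image_embeddings.shape
        tokens = image_embeddings.flatten(2).transpose(1, 2)   # (B, H*W, C)
        queries = self.queries.unsqueeze(0).expand(b, -1, -1)  # (B, N, C)
        prompts, _ = self.attn(queries, tokens, tokens)        # cross-attend to image
        return self.norm(prompts + queries)                    # (B, N, C) abstract prompts


if __name__ == "__main__":
    head = SelfPromptHead()
    feats = torch.randn(2, 256, 64, 64)   # stand-in for SAM image embeddings
    abstract_prompts = head(feats)
    print(abstract_prompts.shape)         # torch.Size([2, 8, 256])
    # These embeddings would be fed to the mask decoder in place of the sparse
    # prompt embeddings SAM normally derives from user clicks or boxes.
```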

| Search Query: ArXiv Query: search_query=au:"Jing Wang"&id_list=&start=0&max_results=3
