LoGU: Long-form Generation with Uncertainty Expressions

Fudan University · University of Cambridge · Tencent AI Lab



Introduction

While Large Language Models (LLMs) demonstrate impressive capabilities, they still frequently generate factually incorrect content (i.e., hallucinations). A promising approach to mitigating this issue is to enable models to express uncertainty when they are unsure. Previous research on uncertainty modeling has primarily focused on short-form QA, but real-world applications often require much longer responses. In this work, we introduce the task of Long-form Generation with Uncertainty (LoGU). We identify two key challenges: Uncertainty Suppression, where models hesitate to express uncertainty, and Uncertainty Misalignment, where models convey uncertainty inaccurately.

To tackle these challenges, we propose a refinement-based data collection framework and a two-stage training pipeline. Our framework adopts a divide-and-conquer strategy, refining uncertainty at the level of atomic claims. The collected data are then used for training via supervised fine-tuning (SFT) and direct preference optimization (DPO) to enhance uncertainty expression. Extensive experiments on three long-form instruction-following datasets show that our method significantly improves accuracy, reduces hallucinations, and maintains the comprehensiveness of responses.
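For intuition, here is a minimal, illustrative sketch of the claim-level refinement idea described above: decompose a long response into atomic claims, check each claim, and rewrite unsupported claims with an explicit uncertainty expression. All helper functions are hypothetical stand-ins (in practice these would be LLM calls and a retrieval-based fact checker), not the code used in this repository.

from typing import List

def decompose_into_claims(response: str) -> List[str]:
    # Hypothetical stand-in for an LLM-based atomic-claim decomposer.
    return [s.strip() for s in response.split(".") if s.strip()]

def claim_is_supported(claim: str) -> bool:
    # Hypothetical stand-in for a fact checker; always returns True here.
    return True

def add_uncertainty(claim: str) -> str:
    # Hypothetical stand-in for an LLM rewrite that hedges an unsupported claim.
    return "I am not sure, but " + claim[0].lower() + claim[1:]

def refine(response: str) -> str:
    # Divide-and-conquer refinement: keep supported claims, hedge the rest.
    refined = [c if claim_is_supported(c) else add_uncertainty(c)
               for c in decompose_into_claims(response)]
    return ". ".join(refined) + "."

print(refine("Marie Curie won two Nobel Prizes. She was born in Warsaw."))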


How to Install

You can use the following commands to install the environment for LoGU:

conda create -n LoGU python=3.8
conda activate LoGU
pip install -r lf_requirements.txt
pip install -r vllm_requirements.txt
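
After installation, a quick sanity check of the environment might look like the snippet below (it assumes the two requirements files install torch and vllm, which is inferred from the file names):

# Sanity check; assumes torch and vllm were installed by the requirements files.
import torch
import vllm

print("torch:", torch.__version__, "| CUDA available:", torch.cuda.is_available())
print("vllm:", vllm.__version__)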

Run

Try the following commands to test our method on Bios, LongFact, and WildHallu:

  • Generate answers
cd ./scripts
bash generate_vllm_responses.sh
  • Calculate Factual Accuracy (FA)
bash eval_pipeline.sh
  • Calculate Uncertain Precision (UC)
bash generate_unc_answers.sh
bash factcheck_unc_answers.sh
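
For intuition only, the sketch below shows one plausible way to aggregate claim-level fact-check verdicts into the two scores above; the exact definitions implemented by eval_pipeline.sh and factcheck_unc_answers.sh may differ, and the field names are hypothetical.

def factual_accuracy(verdicts):
    # Fraction of asserted (non-hedged) claims judged factually supported.
    asserted = [v for v in verdicts if not v["hedged"]]
    return sum(v["supported"] for v in asserted) / len(asserted) if asserted else 0.0

def uncertain_precision(verdicts):
    # Fraction of hedged claims where expressing uncertainty was warranted,
    # i.e., the underlying claim could not be verified as correct.
    hedged = [v for v in verdicts if v["hedged"]]
    return sum(not v["supported"] for v in hedged) / len(hedged) if hedged else 0.0

verdicts = [
    {"hedged": False, "supported": True},
    {"hedged": False, "supported": False},
    {"hedged": True, "supported": False},  # this hedge was warranted
]
print(factual_accuracy(verdicts), uncertain_precision(verdicts))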

Training Data

Coming Soon!

We also provide several uncertainty-expression models on the Hugging Face model hub for a quick trial:

Model | Link
rhyang2021/uncertain_llama3_8b | HuggingFace
rhyang2021/uncertain_mistral_7b | HuggingFace
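
The snippet below is a minimal way to try one of these checkpoints with the transformers library; it assumes the checkpoint is a standard causal LM with a chat template, and the prompt and generation settings are illustrative rather than prescribed by this repository.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load one of the released uncertainty-expression checkpoints.
model_id = "rhyang2021/uncertain_llama3_8b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Ask a long-form biographical question (illustrative prompt).
messages = [{"role": "user", "content": "Tell me a bio of Marie Curie."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

with torch.no_grad():
    output = model.generate(inputs, max_new_tokens=512, do_sample=False)

print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))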

If you have any questions, please feel free to email me or open an issue.