Consumer Health Question Answering

Project information

The Consumer Health Question Answering project was launched to support NLM customer services, which receive about 90,000 requests a year from a worldwide pool of customers. The customer support staff categorize the requests and either answer them with one of about 300 stock answers (with or without modification) or research and answer them manually. Responding to a customer with a stock reply takes approximately 4 minutes; answering with a personalized stock reply takes about 10 minutes. To reduce the time and cost of customer services, NLM launched the Consumer Health Information and Question Answering (CHIQA) project, which conducts research on both the automatic classification of customers' requests and the automatic answering of consumer health questions.

Analysis of the requests identified subsets of reference questions that can be answered automatically. LHC researchers have developed a customer service support system that categorizes incoming requests and prepares answers for review by the staff responding to them. The system combines statistical methods with knowledge-based natural language processing techniques. The pilot system was integrated into the customer services workflow in May 2014. As the system matures, it could provide answers to customers immediately, while they are visiting NLM Web pages.
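The triage step described above (matching an incoming request against a bank of stock answers, and flagging the rest for manual research) can be illustrated with a minimal sketch. This is not NLM's actual implementation; the function names, the bag-of-words similarity measure, and the threshold are all illustrative assumptions.

```python
# Hypothetical sketch of stock-answer triage: route an incoming request
# to its closest stock answer by cosine similarity over bag-of-words
# vectors, or flag it for manual research if no answer is close enough.
# All names and the threshold below are illustrative, not NLM's system.
import math
import re
from collections import Counter

def tokenize(text):
    """Lowercase word tokens."""
    return re.findall(r"[a-z']+", text.lower())

def cosine(a, b):
    """Cosine similarity between two term-count vectors."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def route_request(request, stock_answers, threshold=0.3):
    """Return (best_key, score) when a stock answer matches well enough,
    else (None, score) to flag the request for manual handling."""
    req_vec = Counter(tokenize(request))
    best_key, best_score = None, 0.0
    for key, text in stock_answers.items():
        score = cosine(req_vec, Counter(tokenize(text)))
        if score > best_score:
            best_key, best_score = key, score
    return (best_key, best_score) if best_score >= threshold else (None, best_score)

# Two illustrative stock-answer topics.
stock = {
    "pubmed-access": "How to search PubMed and access full text articles",
    "medlineplus": "Finding consumer health information on MedlinePlus",
}
print(route_request("How do I search PubMed for articles?", stock))
```

A production classifier would use richer statistical and knowledge-based features, as the description above notes, but the routing logic (score every category, answer automatically only above a confidence threshold) is the same shape.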


Question Decomposition Data

Question Type Data

CHQA Named Entity Dataset

Consumer Health Spelling Error Dataset

Demner-Fushman D, Mrabet Y, Ben Abacha A. Consumer health information and question answering: helping consumers find answers to their health-related information needs. Journal of the American Medical Informatics Association. 2020;27(2):194-201.
Savery M, Ben Abacha A, Gayen S, Demner-Fushman D. Question-Driven Summarization of Answers to Consumer Health Questions. arXiv preprint arXiv:2005.09067.
Sarrouti M, Ben Abacha A, Demner-Fushman D. Visual Question Generation from Radiology Images. Proceedings of the First Workshop on Advances in Language and Vision Research.
Sarrouti M, Ouatik El Alaoui S. SemBioNLQA: A semantic biomedical question answering system for retrieving exact and ideal answers to natural language questions. Artif Intell Med. 2020 Jan;102:101767. doi: 10.1016/j.artmed.2019.101767. Epub 2019 Nov 28.
Ben Abacha A, Demner-Fushman D. A Question-Entailment Approach to Question Answering. arXiv:1901.08079.
Ben Abacha A, Hasan SA, Datla V, Liu J, Demner-Fushman D, Müller H. VQA-Med: Overview of the medical visual question answering task at ImageCLEF 2019. CEUR Workshop Proceedings, 9-12, 2019.
Goodwin T, Demner-Fushman D, Fung K, Do P. Overview of the TAC 2019 Track on Drug-Drug Interaction Extraction from Drug Labels. Proceedings of the Text Analysis Conference (TAC) 2019, Gaithersburg, MD, USA, November 12-13, 2019.
Mrabet Y, Demner-Fushman D. On Agreements in Visual Understanding. 2019 Conference on Neural Information Processing Systems, December 8-14, 2019, Vancouver, Canada.
Lu C, Payne A, Demner-Fushman D. Classification Types: A New Feature in the SPECIALIST Lexicon. AMIA Fall Symposium, 2019.
Goodwin T, Demner-Fushman D. Bridging the Knowledge Gap: Enhancing Question Answering with World and Domain Knowledge. arXiv preprint arXiv:1910.07429