Bridging the Knowledge Gap: Enhancing Question Answering with World and Domain Knowledge.

arXiv preprint arXiv:1910.07429
Abstract: 

In this paper we present OSCAR (Ontology-based Semantic Composition Augmented Regularization), a method for injecting task-agnostic knowledge from an ontology or knowledge graph into a neural network during pretraining. We evaluated the impact of including OSCAR when pretraining BERT with Wikipedia articles by measuring the fine-tuning performance on two question answering tasks involving world knowledge and causal reasoning and one requiring domain (healthcare) knowledge, and obtained 33.3%, 18.6%, and 4% improved accuracy compared to pretraining BERT without OSCAR, achieving new state-of-the-art results on two of the tasks.

Goodwin T, Demner-Fushman D. Bridging the Knowledge Gap: Enhancing Question Answering with World and Domain Knowledge. arXiv preprint arXiv:1910.07429.