Core Concepts
This research introduces TabVer, a novel system that integrates arithmetic reasoning into natural logic inference, enabling more accurate and explainable fact-checking of claims against tabular data.
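A minimal sketch can make this idea concrete. The Python below is illustrative only and not TabVer's implementation: the function names (arithmetic_natop, verdict), the operator set, and the aggregation rule are all assumptions. It shows how a value computed from a table (e.g. a count or sum) might be compared against a value stated in the claim and mapped to a natural logic operator, which is then aggregated into a verdict.

```python
# Illustrative sketch only; names and the aggregation rule are assumptions,
# not TabVer's actual implementation.
from enum import Enum


class NatOp(Enum):
    EQUIVALENCE = "="          # claimed value matches the computed value
    FORWARD_ENTAILMENT = "<"   # claimed condition is subsumed (e.g. "at least" holds)
    ALTERNATION = "|"          # claimed value contradicts the computed value
    INDEPENDENCE = "#"         # no informative relation


def arithmetic_natop(claimed: float, computed: float, comparator: str = "==") -> NatOp:
    """Compare a value stated in the claim against a value computed from the
    evidence table (e.g. a sum or count over a column) and return a NatOp."""
    if comparator == "==":
        return NatOp.EQUIVALENCE if computed == claimed else NatOp.ALTERNATION
    if comparator == ">=":  # claim says "at least <claimed>"
        return NatOp.FORWARD_ENTAILMENT if computed >= claimed else NatOp.ALTERNATION
    if comparator == "<=":  # claim says "at most <claimed>"
        return NatOp.FORWARD_ENTAILMENT if computed <= claimed else NatOp.ALTERNATION
    return NatOp.INDEPENDENCE


def verdict(ops: list) -> str:
    """Toy aggregation of span-level operators into a claim-level verdict,
    loosely mirroring the finite-state scoring used in natural logic proofs."""
    if NatOp.ALTERNATION in ops:
        return "REFUTED"
    if NatOp.INDEPENDENCE in ops:
        return "NOT ENOUGH INFO"
    return "SUPPORTED"


# Claim: "The team scored at least 30 points in total"; the table column sums to 34.
print(verdict([arithmetic_natop(claimed=30, computed=34, comparator=">=")]))  # SUPPORTED
```

Routing arithmetic through explicit operators like these, rather than through an opaque classifier, is what keeps each comparison inspectable as part of the proof.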
Stats
TabVer achieves an accuracy of 71.4 on FEVEROUS, outperforming both fully neural and symbolic reasoning models by 3.4 points.
When evaluated on TabFact without any further training, TabVer remains competitive, with an accuracy lead of 0.5 points.
In a few-shot setting with 64 training instances on the tabular subset of the FEVEROUS dataset, TabVer outperforms previous symbolic reasoning systems, including LPA, SASP, and Binder, with a lead of 10.5 accuracy points over the best-performing baseline, Binder.
TabVer outperforms the highest-scoring neural entailment model, a classifier version of the same language model that TabVer itself uses, by 3.4 accuracy points.
TabVer's claim prediction is maintained in only 36.3% of cases when 1 is added to the original number in the claim.
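The perturbation probe behind this number is straightforward to outline. The sketch below is an assumption-laden illustration, not the paper's evaluation code: verify(claim, table) is a hypothetical stand-in for any verdict-returning verifier, and the probe adds 1 to the first number in each claim and measures how often the prediction stays the same.

```python
# Illustrative sketch of the +1 perturbation probe; verify() is a hypothetical
# stand-in for a claim verifier such as TabVer and is not defined here.
import re


def perturb_number(claim: str, delta: int = 1) -> str:
    """Add `delta` to the first number mentioned in the claim, if any."""
    match = re.search(r"\d+(?:\.\d+)?", claim)
    if match is None:
        return claim
    value = float(match.group()) + delta
    rendered = str(int(value)) if value.is_integer() else str(value)
    return claim[:match.start()] + rendered + claim[match.end():]


def maintained_rate(examples, verify) -> float:
    """Fraction of (claim, table) pairs whose verdict is unchanged after perturbation."""
    kept = sum(
        verify(claim, table) == verify(perturb_number(claim), table)
        for claim, table in examples
    )
    return kept / len(examples)
```

The rate is only meaningful for claims whose truth value actually depends on the perturbed number.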
Quotes
"Fact verification on tabular evidence incentivises the use of symbolic reasoning models where a logical form is constructed (e.g. a LISP-style program), providing greater verifiability than fully neural approaches."
"This paper is the first attempt to extend natural logic inference for fact verification to the tabular domain."