This is a news story, published by Ars Technica, about research from the University of Oxford.
Large language models sometimes give blatantly false answers to queries, and they do so with confidence that is indistinguishable from when they get things right.
Researchers from the University of Oxford say they've found a relatively simple way to detect when LLMs appear to be confabulating, one that works with all popular models and across a broad range of subjects.
The new research is strictly about confabulations, not cases where a model repeats false statements it absorbed from its training data.
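The summary above doesn't describe the detection method itself. The Oxford work this story covers is built around measuring "semantic entropy": sample several answers to the same prompt, group answers that share a meaning, and treat high entropy across those groups as a sign of likely confabulation. Below is a minimal sketch of that idea, not the paper's implementation; the `semantic_entropy` function, its `equivalent` callback, and the toy exact-match check are illustrative placeholders (a real system would use a learned equivalence test, such as bidirectional entailment between answer pairs, in their place).

```python
import math

def semantic_entropy(answers, equivalent):
    """Estimate semantic entropy over answers sampled for one question.

    answers: list of answer strings sampled from the model.
    equivalent: callable(a, b) -> bool; a stand-in for a real semantic-
        equivalence check, e.g. bidirectional entailment via an NLI model.
    """
    # Greedily cluster answers that the checker deems equivalent in meaning.
    clusters = []
    for ans in answers:
        for cluster in clusters:
            if equivalent(cluster[0], ans):
                cluster.append(ans)
                break
        else:
            clusters.append([ans])

    # Shannon entropy over the empirical distribution of meaning-clusters.
    # Many clusters of similar size -> high entropy -> likely confabulation;
    # one dominant cluster -> low entropy -> the model answers consistently.
    n = len(answers)
    return -sum((len(c) / n) * math.log(len(c) / n) for c in clusters)

# Toy usage with a crude exact-match placeholder for equivalence.
samples = ["Paris", "Paris.", "Lyon", "Paris", "Paris"]
normalize = lambda s: s.strip(" .").lower()
print(semantic_entropy(samples, lambda a, b: normalize(a) == normalize(b)))
# ~0.50 nats: the answers mostly agree, suggesting no confabulation here.
```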
VR Score: 84
Informative language: 89
Neutral language: 57
Article tone: informal
Language: English
Language complexity: 53
Offensive language: not offensive
Hate speech: not hateful
Attention-grabbing headline: not detected
Known propaganda techniques: not detected
Time-value: long-living
External references: no external sources
Source diversity: no sources
Affiliate links: no affiliate links