This is a news story, published by Quanta Magazine, covering Liu's research on Kolmogorov-Arnold networks and related AI research.
A Kolmogorov-Arnold network ( KAN ) is based on a mathematical idea from the mid-20th century that has been rediscovered and reconfigured for deployment in the deep learning era.
KANs go about function fitting — connecting the dots of the network’s output — in a fundamentally different way than multilayer perceptrons (MLPs).
Their edge functions are also learnable, so they can be tweaked with far greater sensitivity than an MLP’s simple numerical weights.
The Kolmogorov-Arnold theorem essentially provides a blueprint for such a structure.
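In its standard form, the theorem states that any continuous multivariate function can be written as sums of compositions of univariate functions:

```latex
f(x_1,\dots,x_n) = \sum_{q=0}^{2n} \Phi_q\!\left(\sum_{p=1}^{n} \phi_{q,p}(x_p)\right)
```

Each inner function depends on only one input variable, which is what suggests a network whose edges carry one-dimensional functions rather than scalar weights.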
A KAN approximates smooth target functions with smooth building blocks, typically splines.
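The idea can be illustrated with a minimal sketch (an assumption for illustration, not the authors’ implementation): each edge carries a learnable one-dimensional function — here a piecewise-linear spline whose control-point heights are the trainable parameters — and nodes simply sum their incoming edge activations, per the Kolmogorov-Arnold representation.

```python
import numpy as np

class EdgeSpline:
    """A learnable univariate edge function: linear interpolation
    between control points. The heights `coefs` are the trainable
    parameters, playing the role a single scalar weight plays in an
    MLP, but far more expressive."""
    def __init__(self, grid_min=-1.0, grid_max=1.0, n_points=5, rng=None):
        rng = rng or np.random.default_rng(0)
        self.grid = np.linspace(grid_min, grid_max, n_points)
        self.coefs = rng.normal(scale=0.1, size=n_points)

    def __call__(self, x):
        # np.interp clamps inputs outside the grid; fine for a sketch.
        return np.interp(x, self.grid, self.coefs)

class KANLayer:
    """One KAN-style layer: every (input, output) pair has its own
    edge function; output nodes just add the edge activations."""
    def __init__(self, n_in, n_out):
        self.n_in, self.n_out = n_in, n_out
        self.edges = [[EdgeSpline(rng=np.random.default_rng(i * n_out + j))
                       for j in range(n_out)] for i in range(n_in)]

    def __call__(self, x):  # x: shape (n_in,)
        return np.array([sum(self.edges[i][j](x[i]) for i in range(self.n_in))
                         for j in range(self.n_out)])

layer = KANLayer(n_in=2, n_out=3)
y = layer(np.array([0.3, -0.7]))
print(y.shape)  # → (3,)
```

In a real KAN the spline coefficients would be fit by gradient descent; the point of the sketch is only the structural contrast with an MLP, where each edge holds one number instead of one function.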
The biggest advantage that KANs hold over other forms of neural networks lies in their interpretability.
With each additional layer, the network is able to align with a more complicated output function.
A paper by Yizheng Wang of Tsinghua University and others that appeared online in June showed that their KAN-based neural network “significantly outperforms” MLPs for solving partial differential equations.
Liu and Tegmark’s KAN paper quickly caused a stir, garnering 75 citations within about three months.
Liu is striving to make KANs more practical and easier to use.