Generative AI in the era of 'alternative facts'
MIT Open Publishing Services
While neural networks used in practice are often very deep, the benefit of depth is not well understood; indeed, increasing depth is known to be harmful for many regression tasks. In this work, we show that, in contrast to regression, very deep networks can be Bayes optimal for classification. In particular, we provide simple and explicit activation functions that can be used with standard neural network architectures to achieve consistency. This work offers a fundamental understanding of classification with deep neural networks, and we envision it will help guide the design of future architectures.
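To make the setting concrete, the sketch below builds a very deep fully connected classifier in which the activation function is a plug-in choice. This is only an illustrative forward pass under assumed conventions: the paper's explicit activation functions are not reproduced here, and `tanh` stands in as a placeholder, as do the hypothetical layer widths and depth.

```python
import numpy as np

def init_params(layer_sizes, seed=0):
    """Randomly initialize weights and biases for a fully connected network."""
    rng = np.random.default_rng(seed)
    return [
        (rng.standard_normal((m, n)) / np.sqrt(m), np.zeros(n))
        for m, n in zip(layer_sizes[:-1], layer_sizes[1:])
    ]

def forward(params, x, activation=np.tanh):
    """Forward pass through a deep classifier.

    The activation is a swappable argument -- tanh here is only a
    placeholder for the explicit activations studied in the paper.
    """
    h = x
    for w, b in params[:-1]:
        h = activation(h @ w + b)
    w, b = params[-1]
    logits = h @ w + b
    return 1.0 / (1.0 + np.exp(-logits))  # sigmoid output for binary classification

# A 10-hidden-layer network on toy 2-D inputs (widths are illustrative).
params = init_params([2] + [32] * 10 + [1])
x = np.array([[0.5, -1.0], [1.0, 2.0]])
probs = forward(params, x)
print(probs.shape)  # one probability per input row
```

Because the architecture itself is standard, only the choice of activation would need to change to instantiate the construction described above.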