A Gentle Introduction to Deep Nets and Opportunities for the Future


Abstract

The first half of this tutorial will make deep nets more accessible to a broader audience, following “Deep Nets for Poets” and “A Gentle Introduction to Fine-Tuning.” We will also introduce gft (general fine-tuning), a little language for fine-tuning deep nets with short (one-line) programs that are as easy to code as regression in statistics packages such as R using glm (general linear models). Based on the success of these methods on a number of benchmarks, one might come away with the impression that deep nets are all we need. However, we believe the glass is half-full: while there is much that can be done with deep nets, there is always more to do. The second half of this tutorial will discuss some of these opportunities.
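The glm analogy can be made concrete: in a statistics package, fitting a classifier is a single line of code. The sketch below illustrates that one-line spirit in Python with scikit-learn; it is only an illustration of the analogy, not gft's actual syntax, and the dataset and model choices are assumptions for the example.

```python
# Illustrative only: a one-line model fit in the spirit of
# R's glm(y ~ x, family = binomial) -- not gft's actual syntax.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

# A small, standard dataset chosen purely for demonstration.
X, y = load_iris(return_X_y=True)

# The entire "program" is a single fit call, as with glm in R.
model = LogisticRegression(max_iter=1000).fit(X, y)

print(round(model.score(X, y), 2))
```

In the same spirit, a gft program names a pretrained model, a dataset, and a task in roughly one line, delegating the training loop to the toolkit, just as glm delegates the optimization details to R.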


Citation (APA)

Church, K., Kordoni, V., Marcus, G., Davis, E., Ma, Y., & Chen, Z. (2022). A Gentle Introduction to Deep Nets and Opportunities for the Future. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 1–6). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.acl-tutorials.1

