1
Inductive biases, pretraining and fine-tuning jointly account for brain responses to speech
2
Thinking ahead: spontaneous prediction in context as a keystone of language in humans and machines
3
Transformers: State-of-the-Art Natural Language Processing
4
Integrative Benchmarking to Advance Neurally Mechanistic Models of Human Intelligence
5
Language processing in brains and deep neural networks: computational convergence and its limits
6
Lack of selectivity for syntax relative to word meanings throughout the language network
7
Wiring Up Vision: Minimizing Supervised Synaptic Updates Needed to Produce a Primate Ventral Stream
8
A map of object space in primate inferotemporal cortex
9
On the Predictive Power of Neural Language Models for Human Real-Time Comprehension Behavior
10
Language Models are Few-Shot Learners
11
Comparing Transformers and RNNs on predicting human sentence processing data
12
No evidence for differences among language regions in their temporal receptive windows
13
A Systematic Assessment of Syntactic Generalization in Neural Language Models
14
Experience Grounds Language
15
Incremental language comprehension difficulty predicts activity in the language network but not the multiple demand network
16
What Limits Our Capacity to Process Nested Long-Range Dependencies in Sentence Comprehension?
17
What Happens To BERT Embeddings During Fine-tuning?
18
What counts as an exemplar model, anyway? A commentary on Ambridge (2020)
19
Lossy-Context Surprisal: An Information-Theoretic Model of Memory Effects in Sentence Processing
20
Understand It in 5 Minutes!? Skim-Reading Famous Papers: Jacob Devlin et al.: BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
22
Fine-grained neural decoding with distributed word representations
23
Controversial stimuli: pitting neural networks against each other as models of human recognition
24
Unsupervised Cross-lingual Representation Learning at Scale
25
Contrasting the impact of cytotoxic and cytostatic drug therapies on tumour progression
26
Inducing brain-relevant bias in natural language processing models
27
Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer
28
HuggingFace's Transformers: State-of-the-art Natural Language Processing
29
The neurobiology of language beyond single-word processing
30
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter
31
Linking artificial and human neural representations of language
32
Neural dynamics of semantic composition
33
ALBERT: A Lite BERT for Self-supervised Learning of Language Representations
34
The Representation of Semantic Information Across Human Cerebral Cortex During Listening Versus Reading Is Invariant to Stimulus Modality
35
Topographic Deep Artificial Neural Networks (TDANNs) predict face selectivity topography in primate inferior temporal (IT) cortex
36
Brain-Like Object Recognition with High-Performing Shallow Recurrent ANNs
37
CTRL: A Conditional Transformer Language Model for Controllable Generation
38
Direct Fit to Nature: An Evolutionary Perspective on Biological and Artificial Neural Networks
39
How Contextual are Contextualized Word Representations? Comparing the Geometry of BERT, ELMo, and GPT-2 Embeddings
40
The Domain-General Multiple Demand (MD) Network Does Not Support Core Aspects of Language Comprehension: A Large-Scale fMRI Investigation
41
A critique of pure learning and what artificial neural networks can learn from animal brains
42
fMRI reveals language-specific predictive coding during naturalistic sentence comprehension
43
RoBERTa: A Robustly Optimized BERT Pretraining Approach
44
Topic Modeling in Embedding Spaces
45
XLNet: Generalized Autoregressive Pretraining for Language Understanding
46
COMET: Commonsense Transformers for Automatic Knowledge Graph Construction
47
A Structural Probe for Finding Syntax in Word Representations
48
Interpreting and improving natural-language processing (in machines) with natural language-processing (in the brain)
49
BERT Rediscovers the Classical NLP Pipeline
50
What do you learn from context? Probing for sentence structure in contextualized word representations
51
SuperGLUE: A Stickier Benchmark for General-Purpose Language Understanding Systems
52
Evolving Images for Visual Neurons Using a Deep Generative Network Reveals Coding Principles and Neuronal Preferences
53
Recurrence is required to capture the representational dynamics of the human visual system
54
The Lottery Ticket Hypothesis at Scale
55
Cross-lingual Language Model Pretraining
56
Transformer-XL: Attentive Language Models beyond a Fixed-Length Context
57
A Unified Theory Of Early Visual Representations From Retina To Cortex Through Anatomically Constrained Deep CNNs
58
Word meanings and sentence structure recruit the same set of fronto-temporal regions during comprehension
59
Neural population control via deep image synthesis
60
Composition is the Core Driver of the Language-selective Network
61
Neural-Symbolic VQA: Disentangling Reasoning from Vision and Language Understanding
62
Predictive Processing: A Canonical Cortical Computation
63
Language Modeling Teaches You More Syntax than Translation Does: Lessons Learned Through Auxiliary Task Analysis
64
Brain-Score: Which Artificial Neural Network for Object Recognition is most Brain-Like?
65
A Neural Model of Adaptation in Reading
66
Modelling the N400 brain potential as change in a probabilistic representation of meaning
67
Does the brain represent words? An evaluation of brain decoding studies of language understanding
68
Neural Network Acceptability Judgments
69
Incorporating Context into Language Encoding Models for fMRI
70
A Task-Optimized Neural Network Replicates Human Auditory Behavior, Predicts Brain Responses, and Reveals a Cortical Processing Hierarchy
71
GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
72
An Evaluation of Subject-Paced Reading Tasks and Other Methods for Investigating Immediate Processes in Reading
73
Toward a universal decoder of linguistic meaning from brain activation
74
Promises and limitations of human intracranial electroencephalography
75
On the Optimization of Deep Networks: Implicit Acceleration by Overparameterization
76
Deep Learning: A Critical Appraisal
77
Large-Scale, High-Resolution Comparison of the Core Visual Object Recognition Behavior of Humans, Monkeys, and State-of-the-Art Deep Artificial Neural Networks
78
Sensory cortex is optimized for prediction of future input
79
Deep convolutional models improve predictions of macaque V1 responses to natural images
80
Explainable Artificial Intelligence: Understanding, Visualizing and Interpreting Deep Learning Models
81
The Natural Stories corpus: a reading-time corpus of English texts containing rare syntactic constructions
82
SemEval-2017 Task 1: Semantic Textual Similarity Multilingual and Crosslingual Focused Evaluation
83
Toward Goal-Driven Neural Network Models for the Rodent Whisker-Trigeminal System
84
Using stochastic language models (SLM) to map lexical, syntactic, and phonological information processing in the brain
85
MEG Evidence for Incremental Sentence Composition in the Anterior Temporal Lobe
86
Domain-General Brain Regions Do Not Track Linguistic Input as Closely as Language-Selective Regions
87
A Broad-Coverage Challenge Corpus for Sentence Understanding through Inference
88
Neurophysiological dynamics of phrase-structure building during sentence processing
89
On the Robustness of Convolutional Neural Networks to Internal Architecture and Weight Perturbations
91
Assessing the Ability of LSTMs to Learn Syntax-Sensitive Dependencies
92
Pointer Sentinel Mixture Models
93
Neural correlate of the construction of sentence meaning
94
Connectivity precedes function in the development of the visual word form area
95
The grammar of mammalian brain capacity
96
SQuAD: 100,000+ Questions for Machine Comprehension of Text
97
Comparison of deep neural networks to spatio-temporal cortical dynamics of human visual object recognition reveals hierarchical correspondence
98
Prediction During Natural Language Comprehension
99
Abstract linguistic structure correlates with temporal activity during naturalistic comprehension
100
Language structure in the brain: A fixation-related fMRI study of syntactic surprisal in reading
101
Natural speech reveals the semantic maps that tile human cerebral cortex
102
Syntactic processing is distributed across the language system
103
Exploring the Limits of Language Modeling
104
Neural responses to grammatically and lexically degraded speech
105
What do we mean by prediction in language comprehension?
106
Cortical Tracking of Hierarchical Linguistic Structures in Connected Speech
107
Simple Learned Weighted Sums of Inferior Temporal Neuronal Firing Rates Accurately Predict Human Core Object Recognition Performance
109
GloVe: Global Vectors for Word Representation
110
Aligning context-based statistical models of language with brain activity during reading
111
A functional dissociation between language and multiple-demand systems revealed in patterns of BOLD signal fluctuations
112
Performance-optimized hierarchical models predict neural responses in higher visual cortex
113
Reworking the language network
114
The P-chain: relating sentence production and its disorders to comprehension and acquisition
115
One billion word benchmark for measuring progress in statistical language modeling
116
Distributed Representations of Words and Phrases and their Compositionality
117
Selective and Invariant Neural Responses to Spoken and Written Narratives
118
Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank
119
The effect of word predictability on reading time is logarithmic
120
A Model of Language Processing as Hierarchic Sequential Prediction
121
How do forward models work? And why would you want them?
122
Whatever next? Predictive brains, situated agents, and the future of cognitive science
123
Rational integration of noisy evidence and prior semantic expectations in sentence interpretation
124
ImageNet classification with deep convolutional neural networks
125
Canonical Microcircuits for Predictive Coding
126
Thought Beyond Language
127
The Winograd Schema Challenge
128
The origin of extracellular fields and currents — EEG, ECoG, LFP and spikes
129
The cortical language circuit: from auditory perception to sentence comprehension
130
Multi-column deep neural networks for image classification
131
Functional specificity for high-level linguistic processing in the human brain
132
Insensitivity of the Human Sentence-Processing System to Hierarchical Structure
133
How to Grow a Mind: Statistics, Structure, and Abstraction
134
Cortical representation of the constituent structure of sentences
135
New method for fMRI investigations of language: defining ROIs functionally in individual subjects
136
Data from eye-tracking corpora as evidence for theories of syntactic processing complexity
137
Predicting Human Brain Activity Associated with the Meanings of Nouns
138
Expectation-based syntactic comprehension
139
Random Features for Large-Scale Kernel Machines
140
Cognition and anatomy in three variants of primary progressive aphasia
141
Voxel-based lesion–symptom mapping
142
A Probabilistic Earley Parser as a Psycholinguistic Model
143
Incremental interpretation at verbs: restricting the domain of subsequent reference
144
Computing the discrete-time 'analytic' signal via FFT
145
Language acquisition in the absence of explicit negative evidence: how important is starting small?
146
Toward a connectionist model of recursion in human linguistic performance
147
Syntactic ambiguity resolution in discourse: modeling the effects of referential context and lexical frequency
148
Linguistic complexity: locality of syntactic dependencies
149
Modeling the Influence of Thematic Fit (and Other Constraints) in On-line Sentence Comprehension
150
The Effects of Visual Masking on Recognition: Similarities to the Generation Effect
151
The Contributions of Verb Bias and Plausibility to the Comprehension of Temporarily Ambiguous Sentences
152
Human Brain Language Areas Identified by Functional Magnetic Resonance Imaging
153
A Probabilistic Model of Lexical and Syntactic Access and Disambiguation
154
Integration of visual and linguistic information in spoken language comprehension
155
The lexical nature of syntactic ambiguity resolution
156
Semantic Influences On Parsing: Use of Thematic Role Information in Syntactic Ambiguity Resolution
157
Learning and development in neural networks: the importance of starting small
158
Verb-specific constraints in sentence processing: separating effects of lexical preference from garden-paths
159
The Crosslinguistic Study of Sentence Processing
160
Distributed representations, simple recurrent networks, and grammatical structure
161
Finding Structure in Time
162
On language and connectionism: Analysis of a parallel distributed processing model of language acquisition
163
Word learning in children: an examination of fast mapping
164
First impressions: Children’s knowledge of words gained from a single exposure
165
Predictive coding: a fresh view of inhibition in the retina
166
Paradigms and processes in reading comprehension
167
Acquiring a Single New Word
168
Eye movements in reading and information processing
169
Geometrical and Statistical Properties of Systems of Linear Inequalities with Applications in Pattern Recognition
170
Language processing in brains and deep neural networks: computational convergence and its limits
171
Thinking ahead: prediction in context as a keystone of language in humans and machines
172
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
173
Language Models are Unsupervised Multitask Learners
174
Evolving Images for Visual Neurons Using a Deep Generative Network Reveals Coding Principles and Neuronal Preferences
175
ATOMIC: An Atlas of Machine Commonsense for If-Then Reasoning
176
Social IQa: Commonsense Reasoning about Social Interactions
177
Building an ACT-R Reader for Eye-Tracking Corpus Data
178
Predictive power of word surprisal for reading times is a linear function of language model quality
179
Improving Language Understanding by Generative Pre-Training
180
A theory of reading: From eye fixations to comprehension
181
The ERP response to the amount of information conveyed by words in sentences
182
First steps towards an intelligent laser welding architecture using deep neural networks and reinforcement learning
183
Human intracranial recordings and cognitive neuroscience
184
Neural dissociation of algebra and natural language
185
Lexical and syntactic representations in the brain: An fMRI investigation with multi-voxel pattern analyses
186
Automatically Constructing a Corpus of Sentential Paraphrases
187
On the computational architecture of the neocortex
189
The Syntactic Process
190
Predictive coding in the visual cortex: a functional interpretation of some extra-classical receptive-field effects
191
Exploring the Limits: Europe’s Changing Communication Environment
192
The lexical nature of syntactic ambiguity resolution [corrected]
193
Preface: Cerebral Cortex Has Come of Age
194
Distributed hierarchical processing in the primate cerebral cortex.
195
Maximum likelihood models for sentence processing research
196
Neocognitron: A hierarchical neural network capable of visual pattern recognition
197
Computational psycholinguistics
198
The anatomy of language: a review of 100 fMRI studies published in 2009
199
Computational principles of working memory in sentence comprehension
200
Frontiers in Systems Neuroscience
201
Whatever Next? Predictive Brains, Situated Agents, and the Future of Cognitive Science
202
SI-5: Model performance on diverse language tasks vs. model-to-brain fit
203
SI-4: Language specificity
204
2019) and report the final task score as accuracy for SST-2, MNLI, RTE, and QNLI, Matthews correlation for CoLA, and the average of accuracy and F1 score for
205
Performance on next-word prediction selectively predicts model-to-brain fit. Performance on GLUE tasks was evaluated as described in SI-5
206
Model architecture contributes to brain predictivity and untrained performance predicts trained performance
207
Good-enough representations in language comprehension