#78 – Prof. NOAM CHOMSKY (Special Edition)

Patreon: https://www.patreon.com/mlst
Discord: https://discord.gg/ESrGqhf5CB

In this special edition episode, we have a conversation with Prof. Noam Chomsky, the father of modern linguistics and widely regarded as one of the most important intellectuals of the 20th century.

With a career spanning the better part of a century, we took the chance to ask Prof. Chomsky his thoughts not only on the progress of linguistics and cognitive science but also the deepest enduring mysteries of science and philosophy as a whole – exploring what may lie beyond our limits of understanding. We also discuss the rise of connectionism and large language models, our quest to discover an intelligible world, and the boundaries between silicon and biology.

We explore some of the profound misunderstandings of linguistics in general, and of Chomsky’s own work specifically, which have persisted at the highest levels of academia for over sixty years.

We have produced a significant introduction section where we discuss in detail Yann LeCun’s recent position paper on AGI, a recent paper on emergence in LLMs, empiricism related to cognitive science, cognitive templates, “the ghost in the machine” and language.

Dr. Tim Scarfe
Dr. Keith Duggar
Dr. Walid Saba

Pod version: https://anchor.fm/machinelearningstreettalk/episodes/MLST-78—Prof–NOAM-CHOMSKY-Special-Edition-e1l0760

00:00:00 Kick off
00:02:24 C1: LeCun’s recent position paper on AI, JEPA, Schmidhuber, EBMs
00:48:38 C2: Emergent abilities in LLMs paper
00:51:32 C3: Empiricism
01:25:33 C4: Cognitive Templates
01:35:47 C5: The Ghost in the Machine
02:00:08 C6: Connectionism and Cognitive Architecture: A Critical Analysis by Fodor and Pylyshyn
02:20:12 C7: We deep-faked Chomsky
02:29:58 C8: Language
02:34:34 C9: Chomsky interview kick-off!
02:35:32 Large Language Models such as GPT-3
02:39:07 Connectionism and radical empiricism
02:44:37 Hybrid systems such as neurosymbolic
02:48:40 Computationalism silicon vs biological
02:53:21 Limits of human understanding
03:00:39 Semantics state-of-the-art
03:06:36 Universal grammar, I-Language, and language of thought
03:16:20 Profound and enduring misunderstandings
03:25:34 Greatest remaining mysteries science and philosophy
03:33:04 Debrief and ‘Chuckles’ from Chomsky


(Currently incomplete, we will add to this)

LeCun Path to Autonomous AI paper

Tim’s marked up version:

Emergent Abilities of Large Language Models [Wei et al] 2022

Connectionism and Cognitive Architecture: A Critical Analysis [Fodor, Pylyshyn] 1988

Ghost in the machine
https://news.ycombinator.com/item?id=26448901 (thanks to user tikwidd for your analysis)

Noam Chomsky in Greece: Philosophies of Democracy (1994) [Language chapter]

Richard Feynman clip

Chomsky Bryan Magee BBC interview:

Randy Gallistel’s work (question 3)
Helmholtz “NNs: they’re damn slow”
Purkinje cells

Barbara Partee

Iris Berent

Penrose Orch OR

Alan Turing “Systems of Logic Based on Ordinals”

Fodor “The Language of Thought”

Least Effort

structure dependence in grammar formation

three models

Darwin’s problem

Descartes’s problem

Control Theory

Thanks to Pikachu for helping us get the Chomsky NVIDIA Tacotron 2 model working.
