#78 – Prof. NOAM CHOMSKY (Special Edition)

Patreon: https://www.patreon.com/mlst
Discord: https://discord.gg/ESrGqhf5CB

In this special edition episode, we have a conversation with Prof. Noam Chomsky, the father of modern linguistics and widely regarded as one of the most important intellectuals of the 20th century.

With a career spanning the better part of a century, we took the chance to ask Prof. Chomsky his thoughts not only on the progress of linguistics and cognitive science but also the deepest enduring mysteries of science and philosophy as a whole – exploring what may lie beyond our limits of understanding. We also discuss the rise of connectionism and large language models, our quest to discover an intelligible world, and the boundaries between silicon and biology.

We explore some of the profound misunderstandings of linguistics in general, and of Chomsky's own work specifically, which have persisted at the highest levels of academia for over sixty years.

We have produced a significant introduction section where we discuss in detail Yann LeCun’s recent position paper on AGI, a recent paper on emergence in LLMs, empiricism related to cognitive science, cognitive templates, “the ghost in the machine” and language.

Panel:
Dr. Tim Scarfe
Dr. Keith Duggar
Dr. Walid Saba

Pod version: https://anchor.fm/machinelearningstreettalk/episodes/MLST-78—Prof–NOAM-CHOMSKY-Special-Edition-e1l0760

00:00:00 Kick off
00:02:24 C1: LeCun’s recent position paper on AI, JEPA, Schmidhuber, EBMs
00:48:38 C2: Emergent abilities in LLMs paper
00:51:32 C3: Empiricism
01:25:33 C4: Cognitive Templates
01:35:47 C5: The Ghost in the Machine
02:00:08 C6: Connectionism and Cognitive Architecture: A Critical Analysis by Fodor and Pylyshyn
02:20:12 C7: We deep-faked Chomsky
02:29:58 C8: Language
02:34:34 C9: Chomsky interview kick-off!
02:35:32 Large Language Models such as GPT-3
02:39:07 Connectionism and radical empiricism
02:44:37 Hybrid systems such as neurosymbolic
02:48:40 Computationalism silicon vs biological
02:53:21 Limits of human understanding
03:00:39 Semantics state-of-the-art
03:06:36 Universal grammar, I-Language, and language of thought
03:16:20 Profound and enduring misunderstandings
03:25:34 Greatest remaining mysteries science and philosophy
03:33:04 Debrief and ‘Chuckles’ from Chomsky

References:

(Currently incomplete, we will add to this)

LeCun Path to Autonomous AI paper
https://openreview.net/forum?id=BZ5a1r-kVsf

Tim’s marked up version:
https://acrobat.adobe.com/link/review?uri=urn:aaid:scds:US:8c5260f5-8959-3f11-bb3b-befb3bc65f13

Emergent Abilities of Large Language Models [Wei et al] 2022
https://arxiv.org/abs/2206.07682

Connectionism and Cognitive Architecture: A Critical Analysis [Fodor, Pylyshyn] 1988
http://ruccs.rutgers.edu/images/personal-zenon-pylyshyn/docs/jaf.pdf

Ghost in the machine
https://psychology.fandom.com/wiki/Ghost_in_the_machine
https://forum.wordreference.com/threads/in-an-aristotelian-sense.3350478/
https://news.ycombinator.com/item?id=26448901 (thanks to user tikwidd for your analysis)

Noam Chomsky in Greece: Philosophies of Democracy (1994) [Language chapter]

Richard Feynman clip
https://vimeo.com/340695809

Chomsky Bryan Magee BBC interview:

Randy Gallistel’s work (question 3)
Helmholtz on neural transmission: "they're damn slow"
Purkinje cells

Barbara Partee

Iris Berent

Iris Berent

Penrose Orch OR
https://en.wikipedia.org/wiki/Orchestrated_objective_reduction
https://en.wikipedia.org/wiki/Shadows_of_the_Mind

Alan Turing “Systems of Logic Based on Ordinals”
https://rauterberg.employee.id.tue.nl/lecturenotes/DDM110%20CAS/Turing/Turing-1939%20Sysyems%20of%20logic%20based%20on%20ordinals.pdf

Fodor “The Language of Thought”

Least Effort
http://materias.df.uba.ar/dnla2019c1/files/2019/03/scaling_in_language.pdf

structure dependence in grammar formation
https://www.jstor.org/stable/415004

three models
https://chomsky.info/wp-content/uploads/195609-.pdf
https://en.wikipedia.org/wiki/Transformational_grammar

Darwin’s problem
https://chomsky.info/20140826/

Descartes’s problem
https://en.wikipedia.org/wiki/Mind%E2%80%93body_problem

Control Theory
https://en.wikipedia.org/wiki/Control_(linguistics)

Thanks:
Thanks to Pikachu for helping us get the Chomsky NVIDIA Tacotron 2 model working.
