Paul SMOLENSKY — Johns Hopkins University, Microsoft Research AI — Human language is profoundly shaped by the brain


Human language is profoundly shaped by the brain

Paul Smolensky, Microsoft Research AI & Johns Hopkins University

Overview: Prince & Smolensky, 1997, Science

SOPHI.A AI and Cognition, Sophia Antipolis, 8 Nov 2018


Human language is profoundly shaped by the brain. Crux: grammatical computation. Fundamental: the type of computation. Grammatical computation is neural computation.

Outline (keywords: abstract, combinatorial, optimal)
⓪ Understanding the brain requires an abstract computational theory of cognition
① Cognition is combinatorial computation ⟶ a new theory of neural coding
② Neural computation is optimal satisfaction of desiderata ⟶ new grammars
③ Neural gradience percolates up to give novel Gradient Symbolic Computation



⓪ Understanding the brain requires an abstract computational theory of cognition

Unlike organs with a physical function (heart, lungs), there is no pre-existing science of the brain's function: cognition. To understand how the brain achieves its function, we need a science of the function: cognitive science.


Evidence for cognitive science is generalizations about human behavior, not individual events. In nonword reading from a computer, people map pixel patterns ⟶ speech; a generalization: 'C is pronounced [k] except before I, E, H.' Stating such generalizations requires abstractions: the pixel patterns eliciting a given response defy physical definition, and so do the sound patterns elicited by it. The contexts governing these abstractions are defined not physically but by other abstractions, i.e. by formal principles: axioms. Axiomatic knowledge (mathematics) is the most certain knowledge we have, and has been for more than 2000 years.
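To make this concrete, here is a minimal sketch (my own illustration, not from the talk) that states the slide's generalization over abstract letters and contexts rather than over pixel patterns; the function name and the 'not [k]' label are expository assumptions.

```python
# Minimal sketch (not from the talk): the generalization
# "C is pronounced [k] except before I, E, H", stated over abstract
# letters and contexts rather than over pixel patterns.

def c_pronunciations(nonword: str) -> list[str]:
    """Label each 'c' in a (non)word as [k] or as an exception."""
    w = nonword.lower()
    labels = []
    for i, ch in enumerate(w):
        if ch != "c":
            continue
        following = w[i + 1] if i + 1 < len(w) else ""
        if following in ("i", "e", "h"):
            labels.append("not [k]")   # the exception clause of the rule
        else:
            labels.append("[k]")       # the default case
    return labels

print(c_pronunciations("clarp"))   # ['[k]']
print(c_pronunciations("cipe"))    # ['not [k]']
```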


Noam Chomsky: observable-based accounts of language are extremely vague; abstract theories are precise, especially with respect to language. Restricting science to observables bankrupted psychology (behaviorism).


The function of these abstractions is to compute, and computational systems can be precisely described at multiple levels: 'virtual' (abstract) machines.



① Cognition is combinatorial computation ⟶ a new theory of neural coding


Key to human cognition: productivity, e.g. knowledge of arithmetic and algebra (5(2 + 3) = ?) and of language. [Tree diagrams: × over 5 and (+ over 2, 3); S over NP 'Frodo' and VP 'lives']

Productivity is explained through the combinatorial strategy. Human cognition can: (i) decompose a novel situation into familiar parts; (ii) process the parts; (iii) recombine the results. The parts can be Ⓜ symbol structures or ⓜ neural activity patterns.
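As a concrete illustration of the combinatorial strategy at the symbolic level Ⓜ, here is a minimal sketch (mine, not the talk's) that decomposes the slide's expression 5(2 + 3) into familiar parts, processes the parts, and recombines the results; the tuple encoding of the tree is an assumption.

```python
# The combinatorial strategy on 5(2 + 3):
# (i) decompose a novel expression into familiar parts,
# (ii) process the parts, (iii) recombine the results.

def evaluate(tree):
    if isinstance(tree, (int, float)):       # an atomic, familiar part
        return tree
    op, left, right = tree                   # (i) decompose
    l, r = evaluate(left), evaluate(right)   # (ii) process the parts
    return l + r if op == "+" else l * r     # (iii) recombine

expr = ("*", 5, ("+", 2, 3))                 # the tree: × over 5 and (+ over 2, 3)
print(evaluate(expr))                        # 25
```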



ⓜ Neural activity patterns combine by superposition. [Figure: activity pattern for 'Frodo' + activity pattern for 'lives' = activity pattern for 'Frodo lives']



Ⓜ/ⓜ Tensor Product Representations (TPRs): a constituent's activity pattern is the binding of a filler (e.g., Frodo) to a role, formed by the product filler × role; the pattern for the whole structure superimposes these bindings. [Figure: the filler vector for 'Frodo' combined with its role vector]
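A minimal numerical sketch of a TPR for 'Frodo lives' (my own illustration; the particular filler and role vectors, and their dimensions, are arbitrary assumptions): each constituent is a filler bound to a role by an outer product, the whole is the superposition of the bindings, and fillers can be read back out with dual role vectors.

```python
import numpy as np

# Arbitrary illustrative filler vectors (what) and role vectors (where).
fillers = {"Frodo": np.array([1.0, 0.0, 0.0]),
           "lives": np.array([0.0, 1.0, 0.0])}
roles   = {"subj": np.array([1.0,  1.0]),
           "pred": np.array([1.0, -1.0])}

# Bind each filler to its role with the tensor (outer) product, then
# superimpose the bindings: the TPR of the structure 'Frodo lives'.
tpr = (np.outer(fillers["Frodo"], roles["subj"]) +
       np.outer(fillers["lives"], roles["pred"]))

# Unbinding: project onto the dual role vectors (columns of the
# pseudo-inverse of the role matrix) to recover each filler.
R = np.stack([roles["subj"], roles["pred"]])   # rows are role vectors
duals = np.linalg.pinv(R)                      # columns are dual roles
print(np.round(tpr @ duals[:, 0], 3))          # filler bound to subj: Frodo
print(np.round(tpr @ duals[:, 1], 3))          # filler bound to pred: lives
```

Superimposing the bindings is exactly the 'Frodo' + 'lives' = 'Frodo lives' addition of activity patterns pictured above.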


Many powerful symbolic computations can be carried out on TPRs using neural computation. In AI applications, TPRs can be learned in deep neural networks that outperform comparable previous models and have greater interpretability. The roles learned for natural-language input or output (e.g., question answering, image captioning, multiple NLP tasks) have much grammatical content. [See multiple examples on arXiv.]


Currently using newly developed MVPA methods designed to look for TPRs in fMRI images of the brain processing combinatorial language stimuli. Synchronous-firing binding is a TPR in spacetime. A TPR model captures error patterns in spelling by brain-damaged patients. Spatial encoding in parietal cortex is a TPR.

TPRs thus provide a new theory of neural coding.



② Neural computation is optimal satisfaction of desiderata ⟶ new grammars


Neural computation is maximization of Harmony (= well-formedness): maximal satisfaction of ⓜ-desiderata. Example ⓜ-desiderata: 'ⓐ and ⓑ should not both be active' (else −2 from Harmony) and 'ⓐ and ⓑ should both be active' (if so, +3 to Harmony). [Figure: units ⓐ, ⓑ, ⓒ with weighted connections −2, +3]

With TPRs, cognitive computation is maximization of Ⓜ-desiderata. Ⓜ-desiderata for sentences: SUBJ (a sentence has a subject) and FULLINT (no meaningless words). These conflict: how to express <rains>?
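A minimal sketch of Harmony maximization for just these two desiderata (my own illustration; a real network maximizes Harmony by spreading activation rather than by enumerating states, and unit ⓒ from the figure is omitted here):

```python
from itertools import product

# The two conflicting micro-desiderata from the slide, resolved by weighting:
#   "a and b should not both be active"  ->  -2 to Harmony if both are active
#   "a and b should both be active"      ->  +3 to Harmony if both are active

def harmony(a: int, b: int) -> float:
    return (-2 + 3) if (a and b) else 0.0   # net weight on co-activity

# Neural computation as Harmony maximization: choose the activation
# pattern (a, b) with maximal Harmony.
best = max(product([0, 1], repeat=2), key=lambda ab: harmony(*ab))
print(best, harmony(*best))   # (1, 1) 1.0 -- the stronger desideratum wins
```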


Conflicts among desiderata are resolved by weighting; variant: ranking — Optimality Theory (Prince & Smolensky 1991/1993/2004). Many theorems; much software for automatically computing typologies; a rich field of learning algorithms (roa.rutgers.edu). English: SUBJ ≫ FULLINT ('it rains'); Italian: FULLINT ≫ SUBJ ('piove').

English (weights: SUBJ −3, FULLINT −2):
   candidate      SUBJ   FULLINT   H
   rains           *               −3
☞  it rains               *        −2
   it rains it             **      −4

Typology by re-weighting: Italian (weights: FULLINT −3, SUBJ −2):
   candidate      FULLINT   SUBJ   H
☞  piove 'rains'             *     −2
   it rains          *             −3
   it rains it        **           −6

The weaker constraint is still active.
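The two tableaux can be reproduced by a small weighted-Harmony computation; this sketch is mine (not the OT software referenced above) and uses exactly the violation counts and weights shown in the tableaux.

```python
# Violation counts from the tableaux, for the input <rains>.
candidates = {
    "rains":       {"SUBJ": 1, "FULLINT": 0},
    "it rains":    {"SUBJ": 0, "FULLINT": 1},
    "it rains it": {"SUBJ": 0, "FULLINT": 2},
}

# Typology by re-weighting: each grammar assigns (negative) weights.
grammars = {
    "English": {"SUBJ": -3, "FULLINT": -2},
    "Italian": {"SUBJ": -2, "FULLINT": -3},
}

def harmony(violations, weights):
    return sum(weights[c] * n for c, n in violations.items())

for language, weights in grammars.items():
    H = {cand: harmony(v, weights) for cand, v in candidates.items()}
    winner = max(H, key=H.get)
    print(language, H, "->", winner)
# English -> 'it rains' (H = -2); Italian -> 'rains', i.e. 'piove' (H = -2)
```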


What does this have to do with neural computation? [The English tableau is repeated on the slide.]


The candidate outputs (it rains, it rains it, rains, rains it) are produced with probability p ∝ e^{H/T}; ☞ marks the maximum-Harmony candidate, 'it rains', which is the most probable. [Animation: the probability distribution over the four candidates]
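A minimal sketch of the stochastic reading of p ∝ e^{H/T}, using the English Harmony values from the tableau; the Harmony assigned to 'rains it' (one SUBJ plus one FULLINT violation under the English weights) and the temperature values are my own assumptions, since the slide lists that candidate without a score.

```python
import numpy as np

# English Harmony values from the tableau; the value for "rains it"
# (one SUBJ + one FULLINT violation under the English weights) is inferred.
H = {"it rains": -2.0, "rains": -3.0, "it rains it": -4.0, "rains it": -5.0}

def sample(H, T=1.0, n=10_000, seed=0):
    """Draw candidates with probability proportional to exp(H/T)."""
    rng = np.random.default_rng(seed)
    cands = list(H)
    logits = np.array([H[c] for c in cands]) / T
    p = np.exp(logits - logits.max())
    p /= p.sum()
    draws = rng.choice(cands, size=n, p=p)
    return {c: round(float((draws == c).mean()), 3) for c in cands}

print(sample(H, T=1.0))   # 'it rains' is already the most frequent output
print(sample(H, T=0.2))   # at low T the maximum-Harmony candidate dominates
```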



③ Neural gradience percolates up to give novel Gradient Symbolic Computation


Phonology: French liaison. Gradient input ➛ discrete output (Smolensky & Goldrick 2016, ROA 1286):
   'petite copine'   input: petit copine                              ☟ optimize ☞  pe.tit.co.pine
   'petit copain'    input: peti(0.5⋅t) copain                        ☟             pe.ti.co.pain
   'petit ami'       input: peti(0.5⋅t) (0.3⋅t + 0.3⋅n + 0.3⋅z)ami    ☟             pe.ti.ta.mi
   'petit héros'     input: peti(0.5⋅t) éro                           ☟             pe.ti.é.ro
   ...and 15 more patterns.

A gradient symbol structure: fillers in roles may be partially active blends, e.g. 0.7⋅A + 0.2⋅B (a filler blend of A & B) or 0.3⋅A + 0.4⋅C, rather than a single fully active symbol A in a role. [Figure: a gradient symbol structure]
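A minimal numerical sketch of a gradient symbol structure, using the liaison blend from the table above for 'petit ami' (0.5⋅t in the word-final slot of 'petit'; 0.3⋅t + 0.3⋅n + 0.3⋅z in the word-initial slot of 'ami'); the one-hot filler vectors and the two role vectors are arbitrary illustrative assumptions.

```python
import numpy as np

# Illustrative one-hot filler vectors for the liaison consonants.
fillers = {"t": np.array([1.0, 0.0, 0.0]),
           "n": np.array([0.0, 1.0, 0.0]),
           "z": np.array([0.0, 0.0, 1.0])}
# Illustrative orthonormal role vectors for the two positions.
roles = {"final slot of 'petit'": np.array([1.0, 0.0]),
         "initial slot of 'ami'": np.array([0.0, 1.0])}

# Gradient symbol structure = superposition of partially active bindings,
# as in the slide's input for 'petit ami'.
gss = (0.5 * np.outer(fillers["t"], roles["final slot of 'petit'"]) +
       np.outer(0.3 * fillers["t"] + 0.3 * fillers["n"] + 0.3 * fillers["z"],
                roles["initial slot of 'ami'"]))

# Read out each filler's activity in each role (roles are orthonormal,
# so unbinding is a dot product with the role vector).
for role_name, r in roles.items():
    activity = gss @ r
    print(role_name, {f: round(float(activity[i]), 2)
                      for i, f in enumerate(fillers)})
```

Harmony optimization over such a gradient input then yields the discrete output, here pe.ti.ta.mi, as in the table above.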


Thank you for your attention!


Gratefully acknowledging the support of US NSF INSPIRE grant BCS-1344269.

