Events

(In collaboration with: Román Orús, Roger Martin, Michael Jarret, and Ángel Gallego.)  

There is a sense in which an object occupying two different configurations as in (1a) is puzzling, as is the fact that it can be pronounced “up” (1b) or “down” (1c), but not in both sites (1d) (similar considerations obtain for interpretation):  

(1a)   [Armies [arrived armies]]

(1b)   Armies arrived.

(1c)   There arrived armies.

(1d) *Armies arrived armies.

Also puzzling is the fact that the behavior in (1) apparently extends to situations as in (2); but not completely, given that (2c) is ungrammatical:

(2a)  [Armies tried [armies to [arrive armies]]]

(2b)  Armies tried to arrive.

(2c) *There tried to arrive armies.

(2d) *Armies tried (armies) to arrive armies.

While linguists have familiar names for these situations, questions remain about their behavior, and it is still unclear how the objects involved (if they are even the same type) relate to regular configurations that clearly do not involve such distributed behaviors.  

The present study was conceived as a way to deal with “long-distance” dependencies of this sort, for which we propose an approach that is common in other disciplines, though it has not been systematically pursued in syntax. We begin by considering Chomsky’s (1974) distinctions in (3) for the “parts of speech”:  

(3a) Noun: [+N, -V]

(3b) Adjective: [+N, +V]

(3c) Verb: [-N, +V]

(3d) Adposition: [-N, -V]

To this, we apply the Fundamental Assumption in (4):  

(4a) Attribute “N” is to be taken as the unit of the real numbers, 1

(4b) Attribute “V” is to be taken as the unit of the imaginary numbers, √-1 = i.

It is easy to see that this assumption results in vectors as in (5), a minimal extension of which (placing each vector on the main diagonal of a 2×2 matrix) gives us the matrices in (6):  

(5a) [1, -i]  (5b)  [1, i]  (5c) [-1, i]  (5d) [-1, -i].  

(6a) $\begin{pmatrix} 1 & 0 \\ 0 & -i \end{pmatrix}$   (6b) $\begin{pmatrix} 1 & 0 \\ 0 & i \end{pmatrix}$   (6c) $\begin{pmatrix} -1 & 0 \\ 0 & i \end{pmatrix}$   (6d) $\begin{pmatrix} -1 & 0 \\ 0 & -i \end{pmatrix}$
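For concreteness, the step from (3)-(4) to (5)-(6) can be reproduced in a few lines of Python with NumPy; this is merely our sketch, and the category names used as dictionary keys are descriptive labels, not part of the formalism:

```python
import numpy as np

# A sketch (ours, using NumPy) of the step from (3)-(4) to (5)-(6):
# attribute N is the real unit 1 and attribute V is the imaginary unit i.
N, V = 1, 1j

# (5): the feature vectors, following the categories in (3).
vectors = {
    "noun":       [+N, -V],   # [+N, -V] -> [ 1, -i]
    "adjective":  [+N, +V],   # [+N, +V] -> [ 1,  i]
    "verb":       [-N, +V],   # [-N, +V] -> [-1,  i]
    "adposition": [-N, -V],   # [-N, -V] -> [-1, -i]
}

# (6): each vector placed on the main diagonal of a 2x2 matrix.
chomsky = {cat: np.diag(vec) for cat, vec in vectors.items()}

for cat, M in chomsky.items():
    print(cat, M.tolist())
```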

Matrices as in (6) are very well understood, and under common assumptions concerning the anti-symmetry of Merge can be shown to yield an interesting group:  

(7a) First Merge (for head-complement relations) is represented as matrix multiplication.

(7b) Elsewhere Merge (for head-specifier relations) is represented as a tensor product.
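A minimal sketch of (7), again in NumPy, where the two flavors of Merge are ordinary matrix multiplication and the Kronecker product (the standard concrete realization of a tensor product of matrices); the function names and the choice of example matrix are ours:

```python
import numpy as np

# A sketch of (7), not the authors' implementation: First Merge as ordinary
# matrix multiplication (7a), Elsewhere Merge as a Kronecker (tensor) product (7b).
def first_merge(head, complement):
    """Head-complement relation: 2x2 matrix product."""
    return head @ complement

def elsewhere_merge(head, specifier):
    """Head-specifier relation: tensor product, realized as the Kronecker product."""
    return np.kron(head, specifier)

# Illustrative example with the noun matrix (6a), diag(1, -i):
C1 = np.diag([1, -1j])
print(first_merge(C1, C1))        # 2x2 result
print(elsewhere_merge(C1, C1))    # 4x4 result, living in the tensor-product space
```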

Applying (7a) reflexively (that is, multiplying each matrix by itself) to all the Chomsky matrices in (6) yields the same result:  

(8) $\begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}$  

This is a well-known object: the third Pauli matrix, Z. Further First Merges among the Chomsky matrices, (8), or the outputs of these combinations yield three more matrices:  

(9a) $\begin{pmatrix} -1 & 0 \\ 0 & 1 \end{pmatrix}$   (9b) $\begin{pmatrix} -1 & 0 \\ 0 & -1 \end{pmatrix}$   (9c) $\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$
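Both claims (that reflexive First Merge of any Chomsky matrix yields (8), and that further products among the Chomsky matrices yield only the objects in (9)) can be checked mechanically; the following NumPy sketch, ours, does so by brute force:

```python
import numpy as np

# Brute-force check (ours) of (8) and (9): squaring any Chomsky matrix gives
# Z = diag(1, -1), and all pairwise First Merges of Chomsky matrices yield
# only four distinct objects: Z, I, -Z, -I.
chomsky = [np.diag([1, -1j]), np.diag([1, 1j]),
           np.diag([-1, 1j]), np.diag([-1, -1j])]
Z = np.diag([1, -1])

assert all(np.allclose(C @ C, Z) for C in chomsky)      # (8): reflexive First Merge

distinct = []
for A in chomsky:
    for B in chomsky:
        P = A @ B
        if not any(np.allclose(P, Q) for Q in distinct):
            distinct.append(P)

for P in distinct:                                      # Z plus the three matrices in (9)
    print(np.real_if_close(P).tolist())
```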

All objects in (9) are within the Pauli group, lending credence to the claim in Piattelli-Palmarini and Vitiello (2015) that the Pauli matrices play a central role in “projection.” Here, they emerge from operating with Chomsky matrices under minimal assumptions (matrix multiplication). It can be proven that objects of the Pauli (10a) and Chomsky (10b) sort arrange themselves into the 32-element group in (11):

(10a) $I = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$, $X = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}$, $Y = \begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix}$, $Z = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}$

(10b) $C_1 = \begin{pmatrix} 1 & 0 \\ 0 & -i \end{pmatrix}$, $C_2 = \begin{pmatrix} 1 & 0 \\ 0 & i \end{pmatrix}$, $S_1 = \begin{pmatrix} 0 & 1 \\ -i & 0 \end{pmatrix}$, $S_2 = \begin{pmatrix} 0 & 1 \\ i & 0 \end{pmatrix}$  

(11) $G_{cp} = \{\pm I, \pm X, \pm Y, \pm Z, \pm iI, \pm iX, \pm iY, \pm iZ, \pm C_1, \pm C_2, \pm S_1, \pm S_2, \pm iC_1, \pm iC_2, \pm iS_1, \pm iS_2\}$
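That the 32 matrices in (11) are closed under matrix multiplication (one of the requirements for $G_{cp}$ being a group) can likewise be verified by brute force; the sketch below is ours and assumes the definitions in (10):

```python
import numpy as np
from itertools import product

# Brute-force check (ours) that the 32 matrices in (11) are closed under
# matrix multiplication, one of the requirements for Gcp being a group.
# Definitions follow (10a)-(10b).
I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Y = np.array([[0, -1j], [1j, 0]])
Z = np.diag([1, -1])
C1, C2 = np.diag([1, -1j]), np.diag([1, 1j])
S1 = np.array([[0, 1], [-1j, 0]])
S2 = np.array([[0, 1], [1j, 0]])

base = [I, X, Y, Z, C1, C2, S1, S2]
Gcp = [sign * phase * M for M in base for sign in (1, -1) for phase in (1, 1j)]
assert len(Gcp) == 32

def member(M, group):
    return any(np.allclose(M, G) for G in group)

# Every product of two group elements is again in the group.
assert all(member(A @ B, Gcp) for A, B in product(Gcp, Gcp))
print("Gcp closed under matrix multiplication")
```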

Having a group to work with is useful. Just as Chomsky’s matrices correspond to “lexical categories”, we expect “grammatical categories” to be other elements within the group. Moreover, only certain correspondences between “heads” (lexical items in the group) and “projections” (other elements in the group) yield “endocentric” structures. This is particularly so if we interpret “syntactic label” as in (12):

(12) The “label” of a matrix obtained by Merge is its determinant (the product of the entries on the matrix’s main diagonal minus the product of the entries on the anti-diagonal).  
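The following sketch, ours, simply computes such labels; which label values count as “projected” structures, and which group elements correspond to NP, PP, and related projections in (13), is part of the classification argued for in the text and is not encoded in the snippet:

```python
import numpy as np

# A sketch (ours) of (12): the "label" of a merged object is its determinant.
# Which label values count as "projected" structures in (13) is part of the
# classification argued for in the text and is not encoded here.
chomsky = {"noun": np.diag([1, -1j]), "adjective": np.diag([1, 1j]),
           "verb": np.diag([-1, 1j]), "adposition": np.diag([-1, -1j])}

def label(M):
    """Determinant of a 2x2 matrix: main-diagonal product minus anti-diagonal product."""
    return M[0, 0] * M[1, 1] - M[0, 1] * M[1, 0]

for cat, M in chomsky.items():
    print(cat, label(M))                        # labels of the bare heads

merged = chomsky["verb"] @ chomsky["noun"]      # First Merge: verbal head, nominal complement
print("label of the merged object:", label(merged))
```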

Given this principled notion of “label”, the only first-merges that yield “projected” head-complement structures are those in (13a), and the only elsewhere-merges that yield equally projected head-specifier structures are those in (13b):  

(13a-i) Complements of nouns/adjectives are PP and related grammatical projections.

(13a-ii) Complements of verbs/prepositions are NP and related grammatical projections.

(13b) Specifiers of any of the categories are NP and related grammatical projections.

This result has never been predicted in a unified manner, much less without reliance on external (interface) considerations. Moreover, the tensor-product space resulting from taking tensor products of all 32 matrices among themselves has well-known properties.  

Among the 1024 tensor products that the group allows, several matrix combinations are orthogonal. Matrices in that situation stand in a tight relation, which allows us to think of them in pairs (of specifiers). When orthogonal matrices are added, they yield objects with a “dual” character, of the sort seen, for instance, in the UP and DOWN states of an electron’s angular momentum (the context in which the Pauli matrices were first used), with the probability of each relevant state being one half. When they are multiplied, the product does not commute: the operation performed in one order is not the same as the operation performed in the reverse order. This yields a characteristic “uncertainty” in which the two states are not simultaneously realizable, and it is in part this character that is responsible for the appearance of the Pauli group in the context of quantum computation. We have a mathematically identical scenario, this time involving two orthogonal specifiers, which, if they are of this sort, literally superpose. Thus, the strange behavior of so-called copies can be deduced as superposition within this tensor-product space, so long as we define “chains” as in (14):  

(14) A chain C {A, B} is the sum of two orthogonal specifier matrices A and B.
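A schematic rendering of (14) and of the surrounding discussion, assuming that “orthogonal” is understood in the Hilbert-Schmidt (trace) sense and using the Pauli matrices X and Z as stand-ins for the specifier matrices of an actual chain; this is our illustration, not the authors’ implementation:

```python
import numpy as np

# A schematic rendering (ours) of the discussion above and of (14), taking
# "orthogonal" in the Hilbert-Schmidt (trace) sense and using the Pauli
# matrices X and Z as stand-ins for two specifier matrices.
X = np.array([[0, 1], [1, 0]])
Z = np.diag([1, -1])

# Orthogonality: Tr(A^dagger B) = 0.
assert np.isclose(np.trace(X.conj().T @ Z), 0)

# Non-commutation: multiplying in one order differs from the other order.
assert not np.allclose(X @ Z, Z @ X)

# A chain as in (14): the sum of the two orthogonal matrices.
chain = X + Z

# Spin analogy: an eigenstate of X assigns probability 1/2 to each Z outcome.
_, eigenvectors = np.linalg.eigh(X)
up_probability = abs(eigenvectors[0, 0]) ** 2   # squared amplitude on the Z "up" component
print(chain.tolist(), up_probability)
```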

We have yet to fully pursue the next step in the reasoning, but the conditions that yield chain superposition entail the possibility of matrix entanglement of different chains into a super-chain. Superposition happens within a chain; matrix entanglement happens across chains, when they are “prepared” into one of these super-chains. Mathematically, there is no problem in obtaining entanglement; the empirical program is to determine when to allow it. In particular, we want to prevent it in situations as in (15d), which mimic those in (15b) (a schematic tensor-product sketch follows the examples in (15)):

(15a)  Armies seem [armies to have arrived armies]

(15b) *Armies seemed [armies have arrived armies]

(15c)  Armies tried [ARMIES to have arrived armies]

(15d) *Armies tried [ARMIES have arrived armies]
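Finally, a schematic rendering of the tensor-product step discussed above, with np.kron as the concrete tensor product. The particular chains and the rank-based separability check are our illustrative choices, intended only to make the separable/entangled contrast concrete, not part of the proposal:

```python
import numpy as np

# A schematic rendering (ours) of the entanglement step, with np.kron as the
# concrete tensor product. The particular chains and the rank-based
# separability check are illustrative choices, not part of the proposal.
X = np.array([[0, 1], [1, 0]])
Z = np.diag([1, -1])

chain_1 = X + Z          # a chain as in (14)
chain_2 = X - Z          # a second chain (X and -Z are also trace-orthogonal)

# A separable super-chain: a single tensor product of the two chains.
separable = np.kron(chain_1, chain_2)

# An entangled combination: a sum of tensor products that does not factor
# into any single Kronecker product A (x) B.
entangled = np.kron(X, X) + np.kron(Z, Z)

def kron_rank(M):
    """Rank of the rearranged 4x4 matrix; rank 1 means M = A (x) B for some A, B."""
    R = M.reshape(2, 2, 2, 2).transpose(0, 2, 1, 3).reshape(4, 4)
    return np.linalg.matrix_rank(R)

print(kron_rank(separable), kron_rank(entangled))   # 1 vs. 2
```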