This is a practice run of a talk to be given at the Society for Computation in Linguistics (SCiL) meeting, co-located with the LSA annual meeting in January 2020. The abstract for the SCiL talk is below.
We ask how the representation of person features in syntax affects learning in a Bayesian model, focusing on the Person Case Constraint (PCC). In PCC languages, certain clitic combinations are disallowed with ditransitive verbs. We compare a simple theory of the PCC, where person features are represented as atomic units, to a feature-based theory of the PCC, where person features are represented as feature bundles. We find that both theories can learn the target grammar given enough data, but the feature-based theory requires substantially less data. These results suggest that developmental trajectories could provide insight into representations in this domain.
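The abstract does not spell out the model's details, but the core comparison can be illustrated with a toy sketch. Below, a Bayesian learner with a uniform prior and a size-principle likelihood is run over two hypothesis spaces: an "atomic" space where any set of clitic combinations is a possible grammar, and a "feature-based" space where grammars must be statable as conditions on a [participant] feature. The specific target pattern (a "strong PCC" where the direct object must be 3rd person), the hypothesis spaces, and the data are all invented here for illustration; they are not taken from the talk.

```python
import itertools

persons = [1, 2, 3]
pairs = [(io, do) for io in persons for do in persons]  # indirect/direct object person combos

# Toy target grammar: a "strong PCC" pattern -- the direct object must be 3rd person
target = frozenset(p for p in pairs if p[1] == 3)

# Atomic theory: persons are unanalyzed atoms, so any nonempty set of
# combinations is a candidate grammar (2^9 - 1 = 511 hypotheses)
atomic_hyps = [frozenset(s)
               for r in range(1, len(pairs) + 1)
               for s in itertools.combinations(pairs, r)]

# Feature-based theory: grammars must be stated over [participant]
# (1st/2nd person = +participant), so only a few hypotheses are expressible
conditions = [lambda do: True,       # no restriction on the direct object
              lambda do: do == 3,    # DO must be -participant (the target)
              lambda do: do != 3]    # DO must be +participant
feature_hyps = [frozenset(p for p in pairs if c(p[1])) for c in conditions]

def posterior_on_target(hyps, data):
    """Posterior probability of the target grammar under a uniform prior
    and the size principle: P(datum | h) = 1/|h| if datum is in h, else 0."""
    weights = [len(h) ** -len(data) if all(d in h for d in data) else 0.0
               for h in hyps]
    return sum(w for h, w in zip(hyps, weights) if h == target) / sum(weights)

data = [(1, 3), (2, 3), (3, 3)] * 3  # nine observed grammatical combinations
print(posterior_on_target(feature_hyps, data))  # feature-based learner
print(posterior_on_target(atomic_hyps, data))   # atomic learner
```

On the same nine data points, the feature-based learner is already nearly certain of the target, while the atomic learner still spreads substantial posterior mass over the 63 consistent superset grammars: a small-scale analogue of the abstract's finding that the feature-based theory requires substantially less data.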