This year’s Baggett Lectures will present an approach to unifying discrete symbolic and continuous neural computation: Gradient Symbolic Computation (GSC). This first of the three lectures will present GSC’s novel neural architecture, capable of encoding and processing symbol structures, and will describe and illustrate the new grammatical theories that emerge from this architecture: theories in which grammars are evaluators of well-formedness, and grammatical structures are those that are maximally well-formed, or optimal. The lecture will also define Gradient Symbol Structures: structures (such as phonological strings or syntactic trees) in which each location hosts a blend of symbols, each present (or active) to a continuously variable degree.
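The idea of a blend of symbols, each active to a continuously variable degree, can be sketched concretely. The following is a minimal illustration under assumed encodings (the symbol names, role names, and one-hot vectors here are hypothetical choices for exposition, not the lectures' exact formalism): symbols are represented as "filler" vectors, structural positions as "role" vectors, and a structure as a sum of filler–role outer products, so that a single position can host a weighted superposition of symbols.

```python
import numpy as np

# Hypothetical encodings: each symbol is a one-hot "filler" vector,
# each structural position a one-hot "role" vector.
symbols = {"A": np.array([1.0, 0.0]), "B": np.array([0.0, 1.0])}
roles = {"pos1": np.array([1.0, 0.0]), "pos2": np.array([0.0, 1.0])}

# Position 1 hosts a gradient blend: symbol A active to degree 0.7,
# symbol B to degree 0.3. Position 2 hosts B at full activity.
blend_pos1 = 0.7 * symbols["A"] + 0.3 * symbols["B"]

# The whole structure is the sum of filler-role outer products.
structure = (np.outer(blend_pos1, roles["pos1"])
             + np.outer(symbols["B"], roles["pos2"]))

def activity(symbol, role):
    """Degree to which `symbol` is present at position `role`."""
    return float(symbols[symbol] @ structure @ roles[role])

print(activity("A", "pos1"))  # 0.7
print(activity("B", "pos1"))  # 0.3
print(activity("B", "pos2"))  # 1.0
```

A discrete symbol structure is then just the special case in which every position's activities are 0 or 1; the continuous activity values are what make the structure "gradient."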