The association between word forms and their meanings is in constant flux. In this work we apply computational linguistics to model meaning change, extending classic hypotheses from historical linguistics that date back to the 19th century. The model draws on vector semantics, in which the meaning of a word is represented as a point in a vector space, and a word's evolution is represented as a path through this space over time. We trace the semantic evolution of 10,000 English words over the last century, uncovering novel quantitative regularities in how words shift in meaning and how new words arise, and show that we can predict semantic shifts decades into the future.
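To make the vector-semantics framing concrete, here is a minimal sketch of one standard way to measure semantic shift between two time periods: align the two embedding spaces with an orthogonal Procrustes rotation, then take the cosine distance between a word's old and new vectors. The data below is synthetic (random matrices standing in for per-decade embeddings), so this illustrates the mechanics rather than the actual model or dataset used in the work.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for embedding matrices from two decades:
# rows = words, columns = embedding dimensions. Here the "later"
# matrix is just an arbitrary rotation of the "earlier" one.
X = rng.standard_normal((50, 20))                  # e.g. embeddings c. 1900
R_true = np.linalg.qr(rng.standard_normal((20, 20)))[0]
Y = X @ R_true                                     # e.g. embeddings c. 2000

def procrustes_align(X, Y):
    """Orthogonal rotation R minimizing ||X R - Y||_F (closed form via SVD)."""
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

R = procrustes_align(X, Y)
aligned = X @ R

# Semantic shift of word i = cosine distance between its aligned old
# vector and its new vector. It is ~0 here because the two toy spaces
# differ only by a rotation; with real diachronic embeddings, a large
# value would flag a word whose meaning moved.
shift = 1.0 - cosine(aligned[0], Y[0])
print(f"{shift:.6f}")
```

The alignment step matters because embedding spaces trained independently on different decades are only defined up to rotation; without it, raw distances between vectors from different periods are meaningless.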
This talk describes work with Will Hamilton and Jure Leskovec.