The term "bootstrapping" appears frequently in the literature on child language acquisition, but it is often defined vaguely (if at all) and can mean different things to different people. In this talk, I define bootstrapping as the use of structured correspondences between different levels of linguistic representation to aid learning, and I discuss how probabilistic models can be used to investigate the nature of these correspondences and how they might help the child learner. I present two specific examples, showing (1) that exploiting correspondences between acoustic and syntactic information can aid syntactic learning ("prosodic bootstrapping"), and (2) that exploiting correspondences between syntactic and semantic information in a joint learning model can aid the learning of both syntax and semantics, while also simulating important findings from the child language acquisition literature.