Semantics is a key feature of language, but whether music can activate brain mechanisms related to the processing of semantic meaning is not known. We compared the processing of semantic meaning in language and music, investigating the semantic priming effect as indexed by behavioral measures and by the N400 component of the event-related brain potential (ERP) measured by electroencephalography (EEG). Human subjects were presented visually with target words after hearing either a spoken sentence or a musical excerpt. Target words that were semantically unrelated to prime sentences elicited a larger N400 than target words preceded by semantically related sentences. Target words preceded by semantically unrelated musical primes likewise showed an N400 effect compared with target words preceded by related musical primes. The N400 priming effect did not differ between language and music with respect to time course, strength, or neural generators. Our results indicate that both music and language can prime the meaning of a word, and that music, like language, can determine physiological indices of semantic processing.
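To illustrate how an N400 priming effect of the kind described above is typically quantified, the following is a minimal sketch, not the authors' analysis pipeline: single-trial EEG epochs for two conditions (targets after related vs. unrelated primes) are averaged into condition ERPs, and the mean amplitude of their difference in a 300-500 ms window indexes the N400 effect. All data here are simulated; the sampling rate, trial counts, and amplitudes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 250                       # sampling rate in Hz (assumed)
t = np.arange(0, 0.8, 1 / fs)  # 0-800 ms after target-word onset
n_trials = 50                  # trials per condition (assumed)

def simulate_epochs(n400_amp_uv):
    """Simulate single-trial epochs: Gaussian noise plus a negativity
    centered at 400 ms with the given peak amplitude (microvolts)."""
    n400 = n400_amp_uv * np.exp(-((t - 0.4) ** 2) / (2 * 0.05 ** 2))
    noise = rng.normal(0.0, 2.0, size=(n_trials, t.size))
    return noise + n400

# Unrelated targets carry a larger (more negative) N400 than related ones.
related = simulate_epochs(-1.0)
unrelated = simulate_epochs(-4.0)

# Condition ERPs: average across trials.
erp_related = related.mean(axis=0)
erp_unrelated = unrelated.mean(axis=0)

# N400 effect: mean amplitude of the difference wave in 300-500 ms.
win = (t >= 0.3) & (t <= 0.5)
n400_effect = (erp_unrelated - erp_related)[win].mean()
print(f"N400 effect (unrelated minus related): {n400_effect:.2f} uV")
```

With these simulated amplitudes, the difference wave comes out negative (unrelated more negative than related), mirroring the direction of the priming effect reported for both sentence and musical primes.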