Emotion discrimination from the electroencephalogram (EEG) has gained attention over the last decade as a user-friendly and effective basis for EEG-based emotion recognition (EEG-ER) systems. Nevertheless, challenging issues regarding the emotion elicitation procedure, and especially its effectiveness, remain. In this work, a novel method is proposed that not only evaluates the degree of emotion elicitation but also localizes the emotion-related information in the time-frequency domain. The latter incorporates multidimensional directed information into the time-frequency EEG representation, extracted using empirical mode decomposition, and introduces an asymmetry index for adaptive selection of emotion-related EEG segments. Experimental results from 16 subjects, visually stimulated with pictures spanning the valence/arousal space drawn from the International Affective Picture System database, justify the effectiveness of the proposed approach and its potential contribution to the enhancement of EEG-ER systems.
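To give a rough flavor of two of the ingredients named above, empirical mode decomposition of EEG channels and a hemispheric asymmetry index, a minimal Python sketch follows. It is an illustration under stated assumptions, not the method proposed in this work: it assumes the third-party PyEMD package, hypothetical left/right frontal channels (e.g., F3/F4), a 128 Hz sampling rate, and a simple energy-based definition of asymmetry over the first few intrinsic mode functions.

```python
# Illustrative sketch only -- not the proposed algorithm.
# Assumes the PyEMD package (pip install EMD-signal), two hypothetical
# frontal EEG channels (e.g., F3/F4), and a 128 Hz sampling rate.
import numpy as np
from PyEMD import EMD

fs = 128                                   # assumed sampling rate (Hz)
t = np.arange(0, 4, 1.0 / fs)              # a 4-s EEG segment
left = np.random.randn(t.size)             # placeholder for a left channel (e.g., F3)
right = np.random.randn(t.size)            # placeholder for a right channel (e.g., F4)

emd = EMD()
imfs_left = emd(left)                      # rows: intrinsic mode functions (IMFs)
imfs_right = emd(right)                    # columns: time samples

# A simple hemispheric asymmetry index: normalized difference of IMF
# energies between the two channels, over the first few IMFs.
k = min(3, imfs_left.shape[0], imfs_right.shape[0])
e_left = np.sum(imfs_left[:k] ** 2, axis=1)
e_right = np.sum(imfs_right[:k] ** 2, axis=1)
asymmetry = (e_left - e_right) / (e_left + e_right + 1e-12)
print("per-IMF asymmetry index:", asymmetry)
```

In such a scheme, segments whose asymmetry index exceeds a chosen threshold could be retained as candidate emotion-related segments; the actual selection criterion used here relies on multidimensional directed information, as described in the body of the paper.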