Brain-Computer Music Interfaces (BCMIs) enable users to control music with their brain activity. This thesis presents the design and implementation of a BCMI for controlling the expressive content of musical pieces through emotion. Human emotions can be characterized as a combination of arousal and valence values; however, the variability of these values across subjects complicates the use of emotions to control musical expression. Two experiments are presented to determine the best approach for calculating the boundaries of arousal and valence values. The results indicate that calibrating the BCMI with emotionally evocative images is less reliable than instructing subjects to consciously modulate their excitation/relaxation state. A further conclusion is that the computed valence value is less reliable than the arousal value for controlling the BCMI. The impact of combining musical and visual feedback in the BCMI is also investigated; the main finding is that most participants in the experiment control the BCMI better when receiving musical feedback alone than when receiving both musical and visual feedback.