Monitoring the affective state of a person can be highly relevant for numerous disciplines, including adaptive user interfaces, entertainment, ergonomics, medicine, and therapy. In many situations, the affective state of a user is not easily observable externally via audio or video, but it may be identified by a brain-computer interface (BCI). Functional near-infrared spectroscopy (fNIRS) is a brain imaging modality gaining increasing attention in the BCI community. However, fNIRS emotion recognition studies have so far only analyzed stimulus-locked effects. For realistic human-machine interaction scenarios, the point in time of an emotion-triggering event and the time span of an affective state are unknown. In this paper, we investigate a BCI that monitors the affective state of the user continuously over time (i.e., an asynchronous BCI). In our study, fNIRS signals from eight subjects were recorded at eight prefrontal locations in response to three different classes of affect induction by emotional audio-visual stimuli plus a neutral class. Our system evaluates short windows of 5 s length to continuously recognize affective states. We analyze hemodynamic responses, present a careful evaluation of binary classification tasks, compare time-domain and wavelet-based signal features, and investigate classification accuracies over time.
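The abstract does not specify implementation details, but the described pipeline (sliding 5 s windows, time-domain vs. wavelet features, binary classification) can be illustrated with a minimal sketch. Everything here is an assumption for illustration: the 10 Hz sampling rate, the Daubechies-4 wavelet, the shrinkage-LDA classifier, the helper `window_features`, and the synthetic data standing in for real fNIRS recordings.

```python
# Hypothetical sketch of a sliding-window affect classifier for fNIRS.
# Sampling rate, features, and classifier are assumptions, not the
# authors' exact pipeline.
import numpy as np
import pywt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

FS = 10          # assumed fNIRS sampling rate in Hz
WIN = 5 * FS     # 5 s window length in samples

def window_features(win):
    """Concatenate time-domain and wavelet features for one window.

    win: array of shape (channels, WIN) holding hemodynamic signals.
    """
    feats = []
    for ch in win:
        t = np.arange(len(ch)) / FS
        slope = np.polyfit(t, ch, 1)[0]          # linear trend of the window
        feats += [ch.mean(), ch.std(), slope]    # simple time-domain features
        # Wavelet decomposition (Daubechies-4, 2 levels); energy per sub-band
        coeffs = pywt.wavedec(ch, "db4", level=2)
        feats += [np.sum(c ** 2) for c in coeffs]
    return np.array(feats)

# Synthetic demo: 40 windows, 8 prefrontal channels, class 1 shifted in mean
rng = np.random.default_rng(0)
X = np.stack([window_features(rng.standard_normal((8, WIN)) + label * 0.3)
              for label in (0, 1) for _ in range(20)])
y = np.repeat([0, 1], 20)

# Binary task (one affect class vs. neutral) with shrinkage LDA,
# a common choice for small fNIRS datasets
clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```

In an asynchronous setting, `window_features` would be applied to every consecutive (or overlapping) 5 s segment of the incoming signal, so the classifier produces a continuous stream of affect estimates rather than stimulus-locked decisions.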