Affective annotation is the process of labeling media content according to the emotions it evokes. Because such experiences are inherently subjective and depend on individual differences, the central challenge is associating digital content with the affective experience it elicits across individuals. Here, we present a first-of-its-kind methodology for affective annotation directly from brain signals, monitoring the affective experience of a crowd of individuals via functional near-infrared spectroscopy (fNIRS). An experiment is reported in which fNIRS was recorded from 31 participants to develop a brain-computer interface (BCI) for affective annotation. Brain signals evoked by images were used to predict the affective dimensions that characterize the stimuli. By combining annotations across participants, the results show that monitoring crowd responses can yield accurate affective annotations, with performance improving significantly as crowd size increases. Our methodology demonstrates a proof of concept for sourcing affective annotations from a crowd of BCI users without requiring any auxiliary mental or physical interaction.
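The crowd-aggregation idea can be pictured as combining per-participant classifier outputs, for instance by majority vote over random sub-crowds of increasing size. The sketch below is a minimal illustration of that evaluation scheme, not the paper's implementation: the data are synthetic stand-ins, and the names (`predictions`, `crowd_accuracy`) are hypothetical.

```python
import numpy as np

# Assumed setup: each of 31 participants yields a binary prediction of an
# affective dimension (e.g., high vs. low valence) for each image, and the
# images carry reference labels. All values here are random stand-ins.
rng = np.random.default_rng(0)
n_participants, n_images = 31, 60
predictions = rng.integers(0, 2, size=(n_participants, n_images))
true_labels = rng.integers(0, 2, size=n_images)

def crowd_accuracy(predictions, true_labels, crowd_size, n_draws=1000):
    """Mean accuracy of majority-vote annotations over random crowds."""
    accuracies = []
    for _ in range(n_draws):
        # Draw a random crowd of the given size without replacement.
        crowd = rng.choice(predictions.shape[0], size=crowd_size, replace=False)
        votes = predictions[crowd].mean(axis=0)   # fraction voting "high"
        annotation = (votes > 0.5).astype(int)    # majority vote per image
        accuracies.append((annotation == true_labels).mean())
    return float(np.mean(accuracies))

for size in (1, 5, 15, 31):
    acc = crowd_accuracy(predictions, true_labels, size)
    print(f"crowd of {size:2d}: accuracy = {acc:.3f}")
```

With real per-participant predictions that are individually better than chance, this kind of aggregation is where the reported improvement with crowd size would appear; with the random stand-in data above, accuracy stays near chance regardless of crowd size.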