This abstract points to a great example of the work being done to verify what AVS/entrainment enthusiasts already know: that emotions, brain rhythms and audio-visual stimulation are usefully interrelated.
When looking for answers that apply to people in general, statistical science has a lot to offer: it is reassuring to know that what you think is happening with your sessions has a sound underlying mechanism. On the other hand, once you become familiar with your own personal responses, emotion/stimulus correspondences well above the reported "average recognition rate of 56.66% and 66.67%" can be routinely achieved. In other words, knowing what your own EEG typically looks like, you can tell far more about yourself than you could with a "blind" EEG, and you will quickly learn to predict your own response to a particular stimulus.
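Why would a personal baseline beat a group-average model? A toy sketch (not from the abstract; all features, offsets and the nearest-centroid classifier are illustrative assumptions) shows the idea: if each person's "EEG features" sit on a different baseline, a model pooled across subjects blurs those baselines together, while a model fitted to one person's own data keeps them.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_subject(offset, n=60):
    """Synthetic 'EEG features' for one person: two emotion classes whose
    means differ by 1 unit, both shifted by a subject-specific baseline."""
    X0 = rng.normal(offset + 0.0, 1.0, size=(n, 4))
    X1 = rng.normal(offset + 1.0, 1.0, size=(n, 4))
    return np.vstack([X0, X1]), np.array([0] * n + [1] * n)

def fit_centroids(X, y):
    # One mean vector per emotion class.
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict(model, X):
    # Assign each sample to the nearest class centroid.
    classes = sorted(model)
    dists = np.stack([np.linalg.norm(X - model[c], axis=1) for c in classes], axis=1)
    return np.array(classes)[dists.argmin(axis=1)]

offsets = (-3.0, 0.0, 3.0)  # three hypothetical subjects with different baselines
train = [make_subject(o) for o in offsets]
test = [make_subject(o) for o in offsets]

# "Blind" group model: everyone's training data pooled together.
pooled = fit_centroids(np.vstack([X for X, _ in train]),
                       np.concatenate([y for _, y in train]))

pooled_accs, personal_accs = [], []
for (Xtr, ytr), (Xte, yte) in zip(train, test):
    pooled_accs.append((predict(pooled, Xte) == yte).mean())
    personal = fit_centroids(Xtr, ytr)  # model of this subject's own EEG
    personal_accs.append((predict(personal, Xte) == yte).mean())

print("pooled model accuracy per subject:  ", [round(a, 2) for a in pooled_accs])
print("personal model accuracy per subject:", [round(a, 2) for a in personal_accs])
```

For the two subjects whose baselines sit far from the group mean, the pooled model hovers near chance, while each personal model stays accurate; this is the same intuition as knowing what your own EEG typically looks like.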
Abstracts like this are a goldmine for session developers – tantalising snippets just begging to be incorporated into AVS sessions.
With such safety and simplicity, it’s easy to underestimate what can be achieved with sound and light.