To understand the brain, we need to understand how it uses information. To measure specific brain activity, scientists devise unique methods that aren’t necessarily compatible with each other. This means that around the world, useful data are being generated in different formats, making it hard to collaborate on projects across continents. The CARMEN Neuroscience Platform solves this with a secure cloud repository, standardised metadata, and format conversion.
I created aural and visual metaphors for measuring the brain in action. For example, in one shot the brain is wrapped in a relief map of activity. In another, we tune in to a neurone like it’s a radio station.
Initial work was done in Blender, with render passes and virtual cameras exported to After Effects for compositing. Here, Rowbyte’s Plexus plugin was used to add more neuronal network detail to the shots, and Element 3D was used to bring the 2D website comp into this CGI world.
The first thing that comes to mind when you hear the word ‘Carmen’ is Bizet’s opera; it would have been a terrible cliché to use that particular piece. Instead I opted for a concert-music sound of staccato woodwind and strings, referencing the abundant brain activity stored by CARMEN. I produced an original track in FL Studio by slicing up lots of different orchestral sample libraries.