The Gender Generator: Towards a Brain-Computer Interaction to Evoke Gender Dysphoria Symptoms

February 07, 2018

Extended Abstracts, CHI '18

Watch Gender Generator Demo Video

Science and technology have profoundly affected our abilities to observe, transform, and manipulate bodily functions, as well as our concepts of the body. Recent research into empathic technologies, and embodied empathic technologies in particular, suggests that our embodied experience in virtual reality strongly influences our cognitive state and social biases. However, this trend of examining the effect embodied experience has on the mind is complicated when the target group of the empathic exercise may experience a disconnection between their physical body and their experienced body. Recent investigations indicate a potential biological basis for gender dysphoria. This raises the question: how can we create empathic technologies to explore the embodied experience of a transgender person, when a transgender person may not identify with their own embodied experience? To explore this, we present three Brain-Computer Interfaces (BCIs) for the development of empathic technologies beyond immersive embodiment. Hex Plexus investigates BCI music composition practices as the basis for a series of mobile performances. Synapstraction is an installation which allows users to create abstract paintings based on neurofeedback. The Gender Generator allows users to explore gender expression and construction in the form of a P300-based BCI paradigm. We conducted a series of user studies to understand the effectiveness of these machines as expressive tools and educational experiences. These projects address the notion of empathic technologies for body-dysmorphic users by building towards a Machine-Empathy Interface system which serves to create an empathic link between users.
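A P300-based BCI paradigm, as used in the Gender Generator, detects the positive voltage deflection that appears roughly 300 ms after a rare, attended stimulus by averaging many time-locked EEG epochs. The sketch below illustrates the averaging step on synthetic data; the sampling rate, epoch length, and Gaussian-bump P300 model are illustrative assumptions, not the project's actual signal chain.

```python
import numpy as np

def average_erp(epochs):
    """Average time-locked EEG epochs (n_epochs x n_samples) to reveal the ERP."""
    return np.mean(epochs, axis=0)

def p300_amplitude(erp, fs=250, window=(0.25, 0.45)):
    """Peak amplitude in the 250-450 ms post-stimulus window, where the P300 appears."""
    start, stop = int(window[0] * fs), int(window[1] * fs)
    return float(np.max(erp[start:stop]))

# Synthetic demo: target epochs carry a positive deflection near 300 ms.
rng = np.random.default_rng(0)
fs, n_samples = 250, 200                                   # 250 Hz, 800 ms epochs
t = np.arange(n_samples) / fs
p300 = 5.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))   # Gaussian bump at 300 ms

targets = rng.normal(0, 2, (40, n_samples)) + p300         # attended stimuli
nontargets = rng.normal(0, 2, (40, n_samples))             # ignored stimuli
```

Averaging suppresses the zero-mean noise (by roughly the square root of the epoch count), so the averaged target ERP shows a markedly larger peak in the P300 window than the non-target average.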

Synapstraction: Brain-Computer Interaction for Tangible Abstract Painting

March 01, 2017

Performance, Digital Arts Expo 2017, Black Visual Arts Center

Watch Synapstraction Demo Video

Synapstraction is a brain-computer interaction project which allows users to create an abstract painting based on neurofeedback. The system uses a headset that measures electrical activity at a person's scalp, processed by machine learning algorithms to discriminate a stimulus's effect on the brain. Each visitor is invited to enter the installation and approach one of five "sense" stations, each with an electroencephalography (EEG) headset. We then use the EEG to measure the event-related potential (ERP) elicited in the visitor's brain when the visitor receives one of five stimuli. These five stimuli correspond to the five senses (sight, touch, taste, sound, smell). In addition, each stimulus is mapped to a specific creative instrument such as a paintbrush, sponge, or marker. Once the visitor has secured their headset, they are presented with a stimulus (e.g. a Chopin nocturne at the "sound" station or freshly ground clove at the "scent" station). Our system then uses a machine learning method called linear discriminant analysis to map the activity of the visitor's brain while experiencing the stimulus to acoustic frequencies which actuate the painting implements. After visiting each of the five "sense" stations within the installation, the participant is invited to keep their finished painting. The participant's brain serves as a conduit, translating the stimulation of each sense into the finished image. In this way, the usual methodology of an artist using their senses to create a media object is inverted; the senses use the artist to create an image. Synapstraction largely takes its aesthetic interests from the abstract expressionists of the 20th century and its conceptual framework from aleatory artists such as John Cage. Unlike Cage, however, Synapstraction maps all senses to image, and renders the consumption of sensory stimuli as an act of image creation.
The material of this artwork doesn’t necessarily lie in the paintings themselves, nor the equipment used in the installation, but instead rests in the speculative reconsideration of potential alternative roles for human senses in art making. This project premiered during the Digital Arts Festival at the Black Visual Arts Center in 2017. Press: Junction Magazine
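The linear discriminant analysis step described above can be sketched as follows. This is a minimal illustration, not the installation's actual pipeline: the feature vectors are synthetic, and the sigmoid squashing of the LDA decision score into an 80–880 Hz range is an assumed mapping chosen only to show how a classifier score could drive an acoustic frequency.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)

# Synthetic ERP features: two classes of brain response (e.g. stimulus vs. rest),
# each a cloud of feature vectors (mean amplitudes over EEG channels/time bins).
X = np.vstack([rng.normal(0.0, 1.0, (100, 8)),
               rng.normal(1.5, 1.0, (100, 8))])
y = np.array([0] * 100 + [1] * 100)

lda = LinearDiscriminantAnalysis().fit(X, y)

def to_frequency(features, f_lo=80.0, f_hi=880.0):
    """Map the LDA decision score for one trial to an acoustic frequency."""
    score = lda.decision_function(features.reshape(1, -1))[0]
    norm = 1.0 / (1.0 + np.exp(-score))          # squash score into (0, 1)
    return f_lo + norm * (f_hi - f_lo)

freq = to_frequency(rng.normal(1.5, 1.0, 8))     # one "stimulus" trial
```

In an installation like this, the resulting frequency would then be sent to transducers that vibrate the painting implements; stronger responses to a stimulus push the frequency toward one end of the range.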

PsycheVR: Virtual Reality and Biofeedback to Promote Wellness in Astronauts

February 01, 2016

Project, NASA, DALI Lab

Watch PsycheVR Demo Video

PsycheVR is a virtual reality and biofeedback project conducted in collaboration with the Space Medicine Laboratory at the Geisel School of Medicine and the National Aeronautics and Space Administration (NASA). The project consists of developing virtual reality content to promote relaxation when the user is confined to small, isolated spaces for long durations. The products of this project were used in trial experimentation with NASA astronauts intended for future missions to Mars. We also prototyped biofeedback mechanisms which take in a user's biometric state, as determined by GSR, EEG, facial expression, and other stress indicators, and subsequently adjust the content of the VR environment to help calm the user.
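A biofeedback loop of this kind can be sketched in a few lines. Everything below is hypothetical: the indicator weights, the normalization of GSR/EEG/facial signals into [0, 1], and the choice of scene pacing and ambient intensity as the adjusted VR parameters are illustrative assumptions, not the prototype's actual design.

```python
def stress_index(gsr, eeg_beta_alpha, facial_tension, weights=(0.4, 0.4, 0.2)):
    """Blend normalized stress indicators (each in [0, 1]) into one stress score."""
    signals = (gsr, eeg_beta_alpha, facial_tension)
    return sum(w * s for w, s in zip(weights, signals))

def adjust_environment(stress, calm_pace=0.3, max_pace=1.0):
    """Higher stress slows scene pacing and dims ambience to help calm the user."""
    pace = max_pace - stress * (max_pace - calm_pace)
    ambience = 1.0 - 0.5 * stress
    return {"scene_pace": pace, "ambient_intensity": ambience}

# A stressed reading yields a slower, dimmer environment than a relaxed one.
relaxed = adjust_environment(stress_index(0.1, 0.2, 0.1))
stressed = adjust_environment(stress_index(0.9, 0.8, 0.7))
```

In practice such a loop would run continuously, resampling the biometric signals every few seconds and easing the environment parameters toward their new targets rather than jumping.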