Posts by Collection



The Gender Generator: Towards a Brain-Computer Interaction to Evoke Gender Dysphoria Symptoms

Published in Extended Abstracts of CHI '18, 2018

Watch Gender Generator Demo Video

Science and technology have profoundly affected our ability to observe, transform, and manipulate bodily functions, as well as our concepts of the body. Recent research into empathic technologies and empathic embodied technologies suggests that our embodied experience through virtual reality strongly influences our cognitive state and social biases. However, this trend of examining the effect embodied experience has on the mind is complicated when the target group of the empathic exercise may experience a disconnection between their physical body and their experienced body. Recent investigations indicate a potential biological basis for gender dysphoria. This raises the question of how to create empathic technologies to explore the embodied experience of a transgender person, when that person may not necessarily identify with their own embodied experience. To explore this, we present three Brain-Computer Interfaces (BCIs) for the development of empathic technologies beyond immersive embodiment. Hex Plexus investigates BCI music composition practices as the basis for a series of mobile performances. Synapstraction is an installation that allows users to create abstract paintings based on neurofeedback. The Gender Generator allows users to explore gender expression and construction in the form of a P300-based BCI paradigm. We conducted a series of user studies to understand the effectiveness of these machines as expressive tools and educational experiences. These projects address the notion of empathic technologies for body-dysmorphic users by building toward a Machine-Empathy Interface system that serves to create an empathic link between users.

Recommended citation: Davis, Josh Urban. (2018). "The Gender Generator: Towards a Brain-Computer Interaction to Evoke Gender Dysphoria Symptoms." Extended Abstracts of the Conference on Human Factors in Computing Systems (CHI '18). Montreal, QC.
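The P300 paradigm mentioned above selects an item by detecting which flashed stimulus evokes the strongest event-related response. A minimal, hypothetical sketch of that selection step (the data layout and single-peak criterion are illustrative assumptions, not the paper's implementation):

```python
# Hypothetical sketch of a P300 selection step: for each candidate item,
# average the EEG epochs time-locked to that item's flashes, then pick
# the item whose averaged response contains the largest peak (the P300).

def p300_select(epochs_by_item):
    """epochs_by_item: {item: [epoch, ...]}, each epoch a list of
    voltage samples. Returns the item with the largest averaged peak."""
    def averaged_peak(epochs):
        n = len(epochs)
        # Average across epochs at each time point, then take the peak.
        avg = [sum(samples) / n for samples in zip(*epochs)]
        return max(avg)
    return max(epochs_by_item, key=lambda item: averaged_peak(epochs_by_item[item]))
```

Averaging across repeated flashes is what makes the P300 stand out against background EEG noise, since the evoked response is time-locked while the noise is not.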

Indutivo: Contact-Based Object-Driven Interactions with Inductive Sensing

Published in User Interface Software and Technology (UIST '18), 2018

Watch Indutivo Demo Video

We present Indutivo, a contact-based inductive sensing technique for contextual interactions. Our technique recognizes conductive objects (primarily metallic) that are commonly found in households and daily environments, as well as their individual movements when placed against the sensor. These movements include sliding, hinging, and rotation. We describe our sensing principle and how we designed the size, shape, and layout of our sensor coils to optimize sensitivity, sensing range, and recognition and tracking accuracy. Through several studies, we also demonstrate the performance of our proposed sensing technique in environments with varying levels of noise and interference. We conclude by presenting demo applications on a smartwatch, as well as insights and lessons we learned from our experience.

Recommended citation: Jun Gong, Xin Yang, Teddy Seyed, Josh Urban Davis, Xing-Dong Yang. "Indutivo: Contact-Based Object-Driven Interactions with Inductive Sensing." Proc. of User Interface Software and Technology (UIST). Berlin, Germany, 2018.
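Object recognition of the kind described above can be thought of as matching a live sensor reading against pre-recorded per-object signatures. A minimal sketch of that idea, assuming a small coil array and nearest-neighbor matching (the object names, coil count, and readings are illustrative, not values from the paper):

```python
# Illustrative sketch: recognize a conductive object from the inductance
# shifts it induces across an array of sensor coils, by nearest-neighbor
# matching against pre-recorded template signatures.
import math

# Hypothetical per-object signatures: inductance shift (arbitrary units)
# on each of four coils when the object rests against the sensor.
TEMPLATES = {
    "scissors": [0.82, 0.40, 0.11, 0.05],
    "key":      [0.30, 0.55, 0.20, 0.02],
    "soda_can": [0.90, 0.88, 0.85, 0.80],
}

def classify(reading):
    """Return the template object whose signature is closest (Euclidean)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(TEMPLATES, key=lambda name: dist(TEMPLATES[name], reading))
```

With this scheme, a reading such as `[0.88, 0.86, 0.84, 0.79]` would match the `soda_can` template; tracking movements like sliding or hinging would then come from how the per-coil values change over time.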


PsycheVR: Virtual Reality and Biofeedback to Promote Wellness in Astronauts


Watch PsycheVR Demo Video

PsycheVR is a virtual reality and biofeedback project conducted in collaboration with the Space Medicine Laboratory at the Geisel School of Medicine and the National Aeronautics and Space Administration (NASA). The project consists of developing virtual reality content to promote relaxation when the user is confined to small, isolated spaces for long durations. The products of this project were used in trial experimentation with NASA astronauts intended for future missions to Mars. We also prototyped biofeedback mechanisms that take in a user's biometric state, as determined by GSR, EEG, facial expression, and other stress indicators, and subsequently adjust the content of the VR environment to assist in calming the user.
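The biofeedback loop described above can be sketched as two steps: fuse the stress indicators into one score, then nudge the environment toward calmer settings when that score rises. The weights, thresholds, and the single "intensity" parameter below are illustrative assumptions, not values from the project:

```python
# Illustrative biofeedback sketch: combine normalized stress indicators
# (GSR, EEG arousal, facial tension) into one score, then ease the VR
# environment's intensity up or down in response.

def stress_score(gsr, eeg_arousal, facial_tension,
                 weights=(0.4, 0.4, 0.2)):
    """Weighted sum of indicators, each already normalized to 0..1."""
    w_gsr, w_eeg, w_face = weights
    return w_gsr * gsr + w_eeg * eeg_arousal + w_face * facial_tension

def adjust_environment(score, current_intensity):
    """Lower scene intensity (motion, sound) when the user is stressed,
    raise it slowly when relaxed; intensity stays clamped to 0..1."""
    step = 0.1 if score < 0.3 else -0.2 if score > 0.6 else 0.0
    return min(1.0, max(0.0, current_intensity + step))
```

Running this once per update tick gives a simple closed loop: stressed users see the environment soften faster than it ramps back up, a common asymmetry in relaxation-oriented feedback systems.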

Synapstraction: Brain-Computer Interaction for Tangible Abstract Painting


Watch Synapstraction Demo Video

Synapstraction is a Brain-Computer Interaction project that allows users to create an abstract painting based on neurofeedback from various stimuli. Using electroencephalography (EEG) headsets, we measure the event-related potential (ERP) elicited by each user's brain when receiving one of five stimuli. These five stimuli correspond to the five senses (sight, touch, taste, sound, smell). In addition, each stimulus is mapped to a specific creative instrument such as a paintbrush, sponge, or marker. Each participant is invited to try on the EEG headset, experience the stimuli, and in doing so create a painting. When a participant is finished, they are invited to take their finished painting with them. This project premiered during the Digital Arts Festival at the Black Visual Arts Center in 2017.

Press: Junction Magazine
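The mapping described above pairs each sense with one painting instrument and scales that instrument's mark by the measured ERP. A minimal sketch of that mapping, assuming made-up instrument names beyond the paintbrush/sponge/marker mentioned and an illustrative amplitude range:

```python
# Illustrative sketch of Synapstraction's stimulus-to-instrument mapping:
# each sense drives one tool, and ERP amplitude scales the stroke.

SENSE_TO_INSTRUMENT = {
    "sight": "paintbrush",
    "touch": "sponge",
    "taste": "marker",
    "sound": "palette_knife",   # hypothetical instrument
    "smell": "spray",           # hypothetical instrument
}

def stroke_for(sense, erp_amplitude_uv, max_uv=20.0):
    """Map a sense to its instrument, and ERP amplitude (microvolts,
    clamped at max_uv) to a stroke size in the 1..10 range."""
    instrument = SENSE_TO_INSTRUMENT[sense]
    size = 1 + 9 * min(erp_amplitude_uv, max_uv) / max_uv
    return instrument, round(size, 1)
```

For example, a touch stimulus eliciting a 10 µV response would produce a mid-sized sponge mark, while stronger responses produce bolder strokes.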

IllumiWear: A Bendable Interactive Fiber-Optic eTextile for Audio and Visual Interactions


Watch IllumiWear Demo Video

We present IllumiWear, a novel eTextile prototype that uses fiber-optic cables as interactive input and visual output. Fiber-optic cables are separated into bundles and then woven like a basket into a bendable, glowing fabric. By equipping one side of these bundles with light-emitting diodes and the other with photodiode light-intensity sensors, loss of light intensity can be measured when the fabric is bent. The sensing technique of IllumiWear is not only able to discriminate between discrete touches, slight bends, and harsh bends, but can also recover the location of deformation. In this way, our computational fabric prototype uses its intrinsic means of visual output (light) as a tool for interactive input. We provide design and implementation details for our prototype, as well as a technical evaluation of its effectiveness and limitations as an interactive computational textile. In addition, we examine the potential of this prototype's interactive capabilities by extending our eTextile to create a tangible user interface for audio and visual manipulation.

Download here
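The sensing principle described above reduces to thresholding the fractional light loss per bundle, and localizing deformation by finding the bundle that lost the most light. A minimal sketch under assumed thresholds (the cutoff values are illustrative, not from the paper's evaluation):

```python
# Illustrative sketch of IllumiWear's sensing idea: each woven bundle has
# an LED on one end and a photodiode on the other; bending attenuates the
# light, and the fractional loss distinguishes kinds of deformation.

def classify_bend(baseline, measured):
    """Classify deformation from fractional light-intensity loss."""
    loss = (baseline - measured) / baseline
    if loss < 0.05:
        return "none"
    if loss < 0.20:
        return "touch"
    if loss < 0.50:
        return "slight bend"
    return "harsh bend"

def locate_deformation(baselines, measurements):
    """The bundle with the greatest fractional loss approximates where
    the fabric was deformed; returns that bundle's index."""
    losses = [(b - m) / b for b, m in zip(baselines, measurements)]
    return losses.index(max(losses))
```

Because the fibers glow anyway, the same light that serves as the display doubles as the sensing channel, which is the prototype's central design idea.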

