
FEELTRACE and the Emotions (after Charles Darwin)

 

Debra Swack

 

 

ABSTRACT

Rapid changes in science, technology and new media will lead to more sophisticated ideas about what it means to be human, in thought, body, emotional response and artistic expression. New relationships will form between humans, machines and animals with the human functioning as a networked resource that can be accessed globally over the internet.

Genetically, emotionally or otherwise enhanced individuals could become the fashionable norm; synthetic biology could replace plastic surgery, with the further complication of not knowing where those genetic modifications will take them as individuals or us as a species.

This paper documents both the technical and theoretical development of the collaborative interactive new media video project "The Emotions (after Charles Darwin)", which explores some of the above concepts. "The Emotions" first tries to establish the universality of emotions at a biological level, as empirically measured and documented by the results of the control group (non-autistic subjects, as the goal is to document "normal", i.e. universal, emotional response) at the Brain Mind Institute in Switzerland (and later through FEELTRACE). Secondly, it suggests the potential for subsequent futuristic misuse through genetic and/or technological modification (demonstrated by the observer's ability to interactively modify or transform a given emotion's video stream at will).

Keywords

Cognitive and computational neuroscience, embodiment, bioethics, emotions, interactivity, FEELTRACE, Plutchik, amygdala, face perception, synthetic biology, computer science.


 

INTRODUCTION

Princeton's WordNet web dictionary defines universal behavior as a "convention or pattern characteristic of all members of a particular culture or of all human beings; some form of religion seems to be a human universal."

Donald E. Brown, an anthropologist, shares that view and believes that certain behavioral traits, including facial expressions of emotions, are common to all humans irrespective of culture. He compiled a list of approximately 400 behavioral traits and their implications that is included as an appendix in Steven Pinker's book The Blank Slate: The Modern Denial of Human Nature. The list includes items such as "ambivalence", "males engage in more coalitional violence", "the facial expression of anger" and "rape proscribed".

Although Darwin was incredibly prescient in his discoveries about what role the nervous system might play in regulating emotions, developments in neuroscience did not begin until well over 100 years later, partially due to the lack of sophisticated recording and analytical tools such as neuroimaging, and of the computational power afforded by software algorithms and applications executed on computers.

This co-mingling of previously unrelated and seldom overlapping disciplines means that new media itself, its practices, applications and theories will continue to be in constant flux and development. It used to be standard practice in beginning art classes to ask what is art? But now the question is not only what is art, but who or what makes art (i.e., sometimes art now takes on a life of its own, extending beyond the control of its creator).

For example, the interactive new media video project "The Emotions (after Charles Darwin)" attempts to prove the universality of emotions by transcending cultural categorizations such as species, race, age and gender and instead relates emotions to their neurobiological origins and functions. It further suggests that, once empirically known, this information can be used to genetically or technologically alter human emotion(s) in individuals or groups to create new beings or new emotional interiors that better conform to culturally desirable behaviors. This of course raises bioethical questions about the future nature of life for humans and animals; the embodiment and containment of the self and its symbiotic integration and enhancement with technology and machines.

"No Longer is human existence defined by its unique temporal and spatial coordinate; one body, one life in a specific space and time. Instead human life is increasingly defined by the agential, instrumental deployment of resources for bodily renewal, both its temporal and spatial context subject to extensions or translocations", according to Susan Merrill Squier, in Liminal Lives: Imagining the Human at the Frontiers of Biomedicine.

As Joanna Zylinska states in her book Bioethics in the Age of New Media, "This is by no means to suggest that the human has been reduced to information in the age of new media and that we can therefore do away with embodiment; it is only to point to the emergence of new discourses of the human which undermines its centering around some fixed biological characteristics or moral values."

She adds, "The human does not disappear from the kind of nonhumanist bioethics envisaged here: in fact, it functions as its strategic point of entry. What we are dealing with, however, is not so much a "human being" understood as a discrete and disembodied moral unity but rather a "human becoming"; relational, co-emerging with technology, materially implicated in sociocultural networks, and kin to other life forms."

Neil Badmington in Alien Chic describes how recent trends in techno-science have unsettled posthumanist critics. For example, he discusses how Donna Haraway's "A Cyborg Manifesto" (1991) first deconstructed humanist oppositions such as organism/machine, reality/fiction, human/animal, physical/non-physical and self/other and replaced them with chimeras: cyborgian fabrications of machine and organism. He goes on to say that the latest trend in posthumanism seems to involve merging with animals, which ironically was not a concept alien to Darwin 140 years ago when he studied, documented and sought to define similarities between animals' emotions and our own.

Badmington cites numerous television and news reports, from Newsweek to Nature, finding that reason, tool use, tool making, altruism and language are not unique to humans; neither, I might add, is making or performing music (in 2008 I presented "Birdsongs; the Language Gene", which digitally reconfigures bird songs into human music, at the "Sonic Fragments Soundart Festival" at Princeton University).


DARWIN AND NEUROSCIENCE

Over a hundred years ago, Charles Darwin theorized that the universality of emotions existed in humans and animals at a biological level. He posed questions such as: can we feel happy, sad or fearful when we are alone, or do emotions arise only from being with others in a social situation? He suggested that the reason for the universality of emotions was an underlying biological basis that communicated our needs to others: we experience an emotion, and specific areas of the brain send signals to specialized muscle groups that respond to communicate our feelings.

Darwin believed that the following principles were responsible for most of the expressions and gestures involuntarily exhibited by humans and animals while experiencing emotions: habitual actions initiated by certain states of mind in order to relieve or gratify certain sensations; habitual inverse actions initiated by the exact opposite states of mind; and actions initiated by the nervous system, mostly independent of both will and habit.

Since Darwin, scientists have studied which regions and chemicals in the brain control different emotions, and whether these regulators can be modified to elicit alternative results. For example, emotions are studied to determine their effect on the immune, cardiovascular and endocrine systems. There is also the possibility of misuse: what if we could invoke certain emotions in people at will through a drug, or by permanently or temporarily altering structures in their brain? Perhaps at the same time we could remove their ability to feel remorse or guilt. Could this form of genetic intervention be used randomly against individuals, or during wartime to induce people to commit violent acts?

The neuroscientist Joseph LeDoux says the brain has not evolved to the point where connectivity exists for cognitive systems to control our emotions. But even so, he says that wouldn't necessarily be good, because Mr. Spock (a character lacking human emotions from the 1960s TV show Star Trek) may not be the ideal kind of human we'd like to become. Additionally, LeDoux talks about futuristically controlling undesirable emotions such as fear through drug regulation, stating that once we can identify the neurotransmitters involved in producing fear, we could create a chemical profile of fear in the amygdala and then develop a drug to attack it.

The amygdala is an almond-shaped structure in the anterior portion of the temporal lobe, near the hippocampus, that allows us to both feel and perceive negative emotions. It regulates our reactions to events that are important for survival, such as the presence of danger, sexual partners, enemies, food and those in need. The amygdala works as a system with other related structures, because unique sets of regions in the brain are connected to each other and work in tandem to control different emotions. It also plays an important role in emotional regulation, and studies have shown that emotional disorders can manifest themselves both functionally and structurally (the amygdala can become asymmetrically enlarged in depressed individuals). Patients whose amygdala has been destroyed by stroke are able to recognize all facial expressions except fear.

The amygdala's connectivity with the neocortex is also not symmetrical: the amygdala's connection to the neocortex is much stronger than the neocortex's connection to it (as shown in David Amaral's studies of primate brains), which in part explains, according to LeDoux, why emotions are often hard to turn off once initiated. The body also releases hormones and long-acting substances at the exact time that we experience strong emotions. Additionally, there is a relationship between the visual system and emotions. In The Expression of the Emotions in Man and Animals, Darwin talks about the importance of visual cues when seeking mates and prey and avoiding danger, so it is not surprising that studies show the visual cortex is more activated in response to visual emotional stimuli than to visual non-emotional stimuli.

Darwin acknowledged individual variance in emotional reactivity due to differences in development (for example, he noticed that insane persons had strong passions which they openly expressed). But he never addressed the idea of emotion regulation, which did not emerge until the development of neuroscience a hundred years later.

Davidson defines the study of individual differences in emotional reactivity and emotion regulation as affective style, consisting of the threshold to respond, the magnitude of the response, the rise time to the peak of the response, the recovery function of the response and the duration of the response. The duration of emotional responding is important in understanding individual differences and can also indicate psychopathology, since some mood disorders are associated with either an abnormally early onset or an inability to turn off a response quickly enough.


THE EMOTIONS

"The Emotions" is a multi- channel interactive video where each of four panels will display close-up graphic, moving images of men, women and children of all ages and races, expressing a specific emotion such as happiness, sadness, fear or anger (categorized as such by the results of the control group). Each panel's images will morph/blend to form a continuous stream of soundless images whose emotion will not be identified so as to allow the viewer the ability to form their own conclusion as to what emotion they feel is being expressed (which will also test the universality of emotions).

A fifth panel will record live audience reaction/participation at the actual site of the installation in order to test mirroring behavior of the emotions displayed in the other four panels. Additionally, the observer could have the ability to interactively modify, convert or morph emotions, demonstrating a futuristic ability to alter emotions genetically and/or technologically at will. "The Emotions" is a collaboration with the Brain Mind Institute in Switzerland, whose experiments using my photographs validate their universality as images of specific emotions and form the basis for the video.

Shortly after "The Emotions" was accepted into the Rhizome Artbase (an affiliate of the New Museum) in the fall of 2007, I was contacted by Britt Russo, a neuroscientist who had seen the project posted on their website. She asked me if I would be interested in collaborating with her lab at the Brain Mind Institute in Switzerland by allowing them to use my photographs for emotion perception research in autistic subjects. The lab had never used photographs from life before, only those of staged actors. In return they would present my work at international meetings and publish it in scientific journals. Although the lab wanted to use my photographs for research in autism, a neurodevelopmental disorder that impairs social functioning, I knew I would be primarily interested in the results of the control group, as I wanted to document what was perceived as "normal" or "neurotypical" response, and therefore universal, not the responses evidenced solely in autistic patients. However, I thought that I might learn more about emotional response in general, its measurement and analysis, by including the observation of autistic patients since I had the opportunity.

At the first meeting I had with Britt in Manhattan in December 2007, she informed me about the institute and its practices. The Brain Mind Institute was considered a world-class research facility for neuroscience whose goal was to synthesize and create a knowledge base by advocating a multidisciplinary approach and by linking different research laboratories.

As taken from their website: "The mission of the Brain Mind Institute is to understand the fundamental principles of brain function in health and disease, by using and developing unique experimental, theoretical, technological and computational approaches. The scientific challenge addressed by the BMI consists in connecting different levels of analysis of brain activity, such that cognitive functions can be understood as a manifestation of specific brain processes; specific brain processes as emerging from the collective activity of thousands of cells and synapses; synaptic and neuronal activity in turn as emerging properties of the biophysical and molecular mechanisms of cellular compartments." The group that I would be working with was headed by Dr. Nouchine Hadjikhani, a specialist in neuroimaging.


Testing at the BMI Lab

In the lab, functional magnetic resonance imaging (fMRI), electroencephalography (EEG) and magnetoencephalography (MEG) were used to visualize brain activity, and electromyography (EMG) was used to measure the facial muscle activity of autistic subjects while they viewed images of human emotional facial expressions (autistic people display different brain activity patterns and facial muscle reactions than normal or "neurotypical" people). A Tobii eye tracker was used to trace the path of the subjects' eyes while they viewed images.

According to Dr. Hadjikhani's research, autism was thought to be related to dysfunction of the mirror neuron system, which plays a critical role in perceiving other people's intentions and in empathy. Autism Spectrum Disorder (ASD) is a behaviorally defined neurodevelopmental disorder of early onset whose subjects suffer from a social disability that profoundly affects their ability to understand other people's feelings and to establish reciprocal, rewarding relationships. The disorder manifests itself in restricted and/or repetitive interests and behaviors. Persons with ASD typically fail to engage in social interactions because of an inability to correctly interpret facial expressions and their meanings. Understanding abnormalities in face perception (crucial to social-communicative competence) and accurately identifying the deficient components of the face processing system are essential to the understanding of ASD.

The lab's primary area of study (Figure 1) was the functional and structural integrity of the social cognition network as it relates to autism and the amygdala's connectivity to the mirror neuron system.

 

Figure 1. Social Cognition Network

 


A.   Elements of the network exert reciprocal influences on each other. Face processing deficits can arise from the dysfunction of one or more elements of the network, or from disruption of the connections to or from each element.
B. During face perception, the face identification system is activated both in healthy controls and in individuals with ASD when cued to look at the eye region. However, face perception also activates areas of the MNS (see a and b) in healthy controls, while these same areas remain quasi-silent (see c and d) and exhibit a thinner cortex (see e) in individuals with ASD. The face processing difficulties exhibited by ASD individuals could be due to dysfunction of the MNS.

In summary, the lab's studies showed cortical thinning of the mirror neuron system and an abnormal recruitment of mirror neuron areas during face perception, as well as abnormal temporal activity in face-processing areas. They had also disproved a popular theory that autistic patients were lacking in the brain area devoted to face identification, opening up new therapeutic strategies and areas of inquiry.


Image Preparation

Britt sent me instructions on how I needed to prepare the photographic images for the MRI scanner experiments (Figure 2) to be performed by the autistic subjects and the control group (I would later extract the results of the control group and use them for my video). The goal was to make the photographs neutral and uniform in appearance, displayed with minimal luminance and no distracting background elements.

Each image was cropped from the hairline to the chin and formatted so that the eyes were always in the center of each photograph, so that the subject did not have to move their eyes in order to focus on a red fixation cross while in the MRI scanner. Dr. Hadjikhani had discovered that by placing a red fixation cross in the center of each image and telling the subjects to focus on it while in the scanner, the fusiform face area was activated in autistic brains just as it was in non-autistics. Earlier studies had failed to show activation of the face area in autistics, probably because the subjects weren't actually looking at the faces in the photographs.
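The lab's actual preparation tooling is not described; below is a minimal sketch of this kind of cropping and fixation-cross placement using the Pillow imaging library, assuming the eye-midpoint pixel coordinates of each photograph are already known (the function and file names are hypothetical):

```python
from PIL import Image, ImageDraw

def prepare_stimulus(path, eye_center, out_size=(400, 500), cross_px=10):
    """Crop a face photo so the eye midpoint lands at the image center,
    then draw a red fixation cross at that center (hypothetical sketch)."""
    img = Image.open(path).convert("RGB")
    ex, ey = eye_center                      # known eye-midpoint pixel coordinates
    w, h = out_size
    # Crop box centered on the eyes; Pillow pads with black if it overruns.
    box = (ex - w // 2, ey - h // 2, ex + w // 2, ey + h // 2)
    face = img.crop(box)
    draw = ImageDraw.Draw(face)
    cx, cy = w // 2, h // 2
    draw.line((cx - cross_px, cy, cx + cross_px, cy), fill="red", width=3)
    draw.line((cx, cy - cross_px, cx, cy + cross_px), fill="red", width=3)
    return face

# Example: prepare_stimulus("subject01.jpg", eye_center=(512, 380)).save("stim01.png")
```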

The lab at first wanted me to mask out the backgrounds but then decided that they wanted to test (using an eye tracker) what part of the photograph the autistic person spent more time looking at: the faces or the backgrounds. Previous studies had found that autistic persons spent more time looking at backgrounds than at faces in photographs. They also performed experiments, using magnetoencephalography (MEG) to visualize brain activity, comparing responses to the staged photos of actors used by the lab with responses to my photographs from life.

I adapted a lot of the lab's methodology, not only in the way I prepared images for their experiments but also in how I planned to later group (according to the results of the control group), animate and display them in the video. I wanted my images to appear as objective and scientific as possible. For example, I also centered the eyes in the images and completely masked out any distracting background elements in my photos, just as the lab did.


Figure 2. Modified photo for fMRI experiments

 

Luminance could be controlled if desired by creating an adjustment layer in Photoshop. I planned to import the photos as multi-layered PSD image sequences into Photoshop Extended, edit them, and export them as live Photoshop 3D layers into Adobe After Effects in order to animate them with behaviors and apply 3D lighting and camera effects. For the fifth channel I planned to hook up a digital camcorder to a projector to capture possible mirroring behavior and to also allow observers to interactively modify, convert or morph emotions. I sent Britt a color-coded schematic of what I envisioned for the four channels of my video, consisting of the emotions happy, angry, surprise and sad. I wanted to relate each photograph graphically and logically to a specific emotion (Figure 3).

 

Figure 3. Color-coded Schematic for "The Emotions"


Plutchik's Emotional Index

The lab typically used black and white photos for their testing but decided to use my color images in an eye tracking experiment. They could then later convert them to black and white and flatten the luminance if needed (as previously shown to be necessary in early eye-tracking experiments) if the autistic subjects were distracted by the glare unavoidably caused by high-contrast lighting situations.

Britt sent me a schematic representation of Plutchik's color-coded "Emotional Index" which was comprised of eight basic emotions arranged as four pairs of opposites and their increasingly less intense variations (Figure 4). Plutchik believed that emotions were evolutionarily adaptive and part of a process involving both cognition and behavior. The cone's vertical dimension represents intensity and the circle represents degrees of similarity among the emotions.

She had the control group categorize each photo by choosing one of the words from the entire diagram instead of just limiting them to one of the eight basic emotions because she thought that would generate a more accurate rating given the subtlety of some of the photographs that I sent her.  

After the Plutchik test, an eye tracking pupillometry study would then be conducted on the control group subjects to systematically rate each photo by its emotional intensity, from bad through neutral to good. I could then select images by emotion and/or emotional intensity to be used in the video. For example, I could select faces that were rated high intensity (terror), medium intensity (fear) or low intensity (apprehension). Additionally, by using Plutchik's schematic I could relate each emotion in the video not just to an emotional category and/or intensity but also to its associated symbolic color as it appeared on the chart, as illustrated in the sketch below.
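As an illustration of this selection step, here is a small Python sketch; the record layout and file names are hypothetical, standing in for the control group's actual ratings:

```python
# Hypothetical ratings of the kind produced by the control group: one record
# per photo, with a Plutchik emotion family and an intensity tier.
ratings = [
    {"photo": "img_014.jpg", "emotion": "fear", "intensity": "terror"},        # high
    {"photo": "img_027.jpg", "emotion": "fear", "intensity": "fear"},          # medium
    {"photo": "img_033.jpg", "emotion": "fear", "intensity": "apprehension"},  # low
    {"photo": "img_041.jpg", "emotion": "joy",  "intensity": "serenity"},
]

def select(ratings, emotion=None, intensity=None):
    """Return photos matching an emotion family and/or an intensity tier."""
    return [r["photo"] for r in ratings
            if (emotion is None or r["emotion"] == emotion)
            and (intensity is None or r["intensity"] == intensity)]

print(select(ratings, emotion="fear", intensity="terror"))  # ['img_014.jpg']
```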

According to The Handbook of Psychological Testing by Paul Kline, Plutchik's Emotional Profile Index is based on eight basic emotions: joy, acceptance, surprise, fear, sadness, disgust, expectation and anger. Individuals choose from pairs of personality traits that describe them, and each trait results from combining two or more primary emotions (e.g., shyness implies fear; gloominess implies sadness). The results are then plotted on a circumplex arranged according to similarities and bipolarities.

An fMRI study was performed after rating the photographs by emotional intensity. Other considerations were evaluating direct versus indirect gaze, group make-up and image order. The lab administered oxytocin and, using the eye tracker, found that oxytocin reduced the activation of the amygdala while subjects viewed photos of direct gazes from neutral, unfamiliar faces. This enabled the participants to feel more relaxed, which increased their amount of direct eye contact. In previous studies (Guastella, Mitchell and Dadds, 2007), oxytocin was shown to greatly increase gaze to the eye region (the focal point for emotion, threat and interpersonal interest), which enabled participants to better detect emotions in others.

The lab sorted my photographs into direct and averted gaze because the brain responds more dramatically to direct gazes than averted ones. They were also grouped into children and adults. Two sets of images were created (so the lab could experiment with the same group of subjects but use a fresh set of faces) that were balanced in terms of age, sex, emotion and intensity.

A small pilot study was conducted to look at the possible effects of image order on each subject's ratings.  If presented one at a time, then ratings could be unduly influenced by the previously presented photo, for example, a mildly sad photo following an intensely happy one might be thought of as more intensely sad than it would be if presented by itself. If this proved to be the case, an entire set of photographs could instead be presented simultaneously, and each subject would be asked to rate individual photos relative to each other. There were disadvantages to this method but at least the lab would have a whole set of photographs that would be internally consistent.

 

Figure 4. Plutchik's Emotional Index

 

The order in which the photographs were presented was found to affect a perceived emotion's intensity. I would make the video accordingly, being careful to place photographs with similarly ranked emotions and intensity ratings together within an individual video channel, which would have the effect of displaying a group of related photos simultaneously, as described in the pilot study.

Additionally, the lab was thinking of adding a self-recognition test to the protocol (it has been suggested that autistics have self-face recognition deficits) by randomly inserting photos of the subject brought in from home and also by presenting new ones that the lab would take themselves but that the subject wouldn't see before the experiment.

The idea of the self-recognition test reinforced my idea of including a fifth interactive "self-recognition" video channel (by hooking up a digital camcorder to a projector at the exhibition site) to record live emotional reaction, including possible mirroring behavior, and to allow the participant to be part of the experiment. Additionally, the observer could have the ability to interactively "intensify, convert or morph" emotions, demonstrating a futuristic ability to modify emotions genetically and/or technologically at will.

The "Intensify Emotion" command could use a slider to make emotions appear more intense. This could be achieved by interactively applying behaviors/animations globally to a specified video stream by using After Effects/Flash software (animations could be achieved by creating frame by frame parent/child relationships affecting the eye and mouth regions). "Morph Emotions" could utilize program/behaviors that scrambles all four channels simultaneously by selecting and replacing video content from each of the four channels at random. "Convert Emotion" could allow the user to morph any stream of emotions into another by utilizing parameters that would select and replace video content from one video stream to another. The original color filter associated with Plutchik's color coded schematic could be applied to the new video stream, maintaining its original Emotional Index categorization reference point.

The lab decided to organize an open house of talks and presentations for the public to celebrate the first World Autism Day on April 2, 2008, as instituted by the U.N. They teamed up with two other autism labs, one that worked with rats and another that worked with robots. They hoped that it would generate more research subjects and also enlighten the public about autism. The lab's areas of research (including the brain areas studied) and my collaborative role are graphically summarized in Figure 5.


Figure 5. Hadjikhani Autism Lab

 

We finished corresponding in the summer of 2008, as the research was completed and my photographic images had been categorized and documented by the control group. Throughout our correspondence, I had Britt send me any relevant documentation on what her group under Dr. Hadjikhani was researching: the technological and computational tools used to both measure and record experiments, and their theoretical methods, applications and implications. The photographs that I submitted to Britt were spontaneous photos from life, never posed, and taken well before I had ever thought of doing the project (so I never associated any of them with a particular emotion). They were fairly objective, the only issue being that the person being photographed was sometimes briefly aware of my presence (the lab previously used only staged photographs of actors for their testing).

 

INCORPORATING FEELTRACE

One of the applications I am currently researching in 2011 is FEELTRACE, a computer software program developed by Roddy Cowie at Queen's University Belfast as part of the PHYSTA project. PHYSTA (which first organized on September 1, 1998) also includes as institutional members King's College London (KCL), the Image, Video and Multimedia Systems Laboratory (NTUA), Katholieke Universiteit Nijmegen (KUN) and the University of Milan (UM). PHYSTA's goal is to review existing techniques and develop new ones in several areas: the ability of artificial intelligence and neural networks to map signals to symbols (artificial neural networks are composed of interconnected artificial neurons that mimic properties of biological neurons in the peripheral or central nervous system, and may be used either to gain an understanding of biological neural networks or to solve artificial intelligence problems without necessarily creating actual biological systems); the understanding of human emotions, human-computer interactions and associated software applications; and the development of feature representations from emotionally coded facial signals and speech to be used as test materials. According to the "Test Material Format and Availability Report", PHYSTA's goal is not simply to attach categorical labels to extreme emotional states but to capture, in database form, the more complex richness of naturalistic, emotion-based scenarios experienced in daily life, to be used as visual, audio and audio-visual testing and training materials for analyzing emotions.

FEELTRACE allows participants to continuously track the emotional content of an audio-visual stimulus as it changes over time (recording any variations and gradations). FEELTRACE replays video from MPEG files on a single screen; emotion-related audio-visual material is presented on the left side, while the right side simultaneously displays an activation-evaluation circle which allows the subject to analyze and track the material.

FEELTRACE would also allow me to research and perhaps incorporate elements of sound within "The Emotions" video (which currently has no audio), or to create a separate companion audio project on emotions, so that I can test which type of presentation has the most emotional power and immediacy for both subjects and viewers: video, audio or audio-visual material.

 

FEELTRACE OVERVIEW

The book jacket for Plutchik's The Emotions: Facts, Theories, and a New Model, published in 1962, states that "the author draws an analogy between emotions and colors: primary emotions are likened to primary colors; and, like colors, they tend to be grouped in pairs of complementary relationship. Interesting parallels can be drawn between complementary color pairs such as red and green, or yellow and blue, and emotions such as love and hate, or joy and sorrow."

In Plutchik's Emotional Index, primary emotions are arranged as colored hues that vary in degree of intermixture (saturation) and intensity around a circular, color-wheel-like structure. Primary emotions opposite each other are complementary in the sense that mixing them would, according to Plutchik, "create the psychic or biological equivalent of gray" (similar to the result achieved by mixing their equivalent paint colors). Since adjacent emotions are more similar than those further away, they are color coded as such. Mixing or combining the primary emotions in various proportions will produce all of the emotions that we know and experience in life.

FEELTRACE (see Figure 6) uses a color-coded cursor derived from Plutchik's Emotional Index that changes color as you move it around the circle. The cursor (when pressed) changes to a colored disc depending on its position (green indicates a highly positive emotional state, yellow a highly active one, blue a very inactive one, and red a highly negative one). For example, if I FEELTRACED a surprised face for "The Emotions" video, the face would display a high level of activation, so the subject would probably place the cursor near the top of the circle but not in either the very positive or very negative regions.

 

Figure 6. Example of FEELTRACE display during a tracking session (the cursor changes color clockwise starting at top left with red/orange, yellow at the active/passive axis, bright green at the negative/positive axis and blue-green at the bottom right).

 


FEELTRACE also uses activation and evaluation spaces and landmark words. The activation-evaluation space visually represents emotional states in two dimensions and is represented in FEELTRACE by an x and y axis drawn within a circle on a computer screen. Evaluation runs along the x axis from left to right, from very negative to very positive, and activation runs along the y axis from top to bottom, from very active to very passive.

Activation measures how dynamic the subject's emotional state is (for example, exhilaration involves a very high level of activation while boredom enlists almost none). Evaluation measures the degree of positive or negative feeling associated with the emotional state (for example, happiness involves a very positive evaluation while despair involves a very negative one). Beyond how dynamic and how positive or negative the emotional state is, the space can be treated as naturally circular (like Plutchik's Emotional Index): the most intense emotional states lie on the circle's perimeter, equidistant from an emotionally neutral point of alert neutrality at the circle's center.
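As a rough illustration of reading a cursor position in this space, here is a Python sketch, assuming coordinates normalized to the unit circle (evaluation on x, activation on y, with active as positive y); the color rule is a simplification of the blended coloring described above, letting the dominant axis pick the color:

```python
import math

def read_cursor(x, y):
    """Map a cursor position inside the unit circle to FEELTRACE-style
    measures: evaluation (x: negative..positive), activation (y:
    passive..active), intensity (distance from the neutral center),
    and the dominant quadrant color described in the text."""
    intensity = math.hypot(x, y)  # 0 = alert neutrality, 1 = perimeter
    if intensity > 1.0:
        raise ValueError("position lies outside the activation-evaluation circle")
    # Simplified coloring: green = positive, red = negative,
    # yellow = active, blue = passive, by whichever axis dominates.
    if abs(x) >= abs(y):
        color = "green" if x > 0 else "red"
    else:
        color = "yellow" if y > 0 else "blue"
    return {"evaluation": x, "activation": y,
            "intensity": intensity, "color": color}

# A surprised face: highly active, near-neutral evaluation.
print(read_cursor(0.1, 0.85))   # -> color 'yellow', high intensity
```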

Landmark words identifying the most intense emotional states, provided by Plutchik and Russell, were placed around the circle's periphery. Further within the circle, less extreme descriptive emotion words were placed at coordinates extracted from Whissell's published tables, which provided empirical evidence for valid activation-evaluation co-ordinates of common emotion words (the landmark words were later revised according to FEELTRACE's own research as part of BEEVer, the Basic English Emotion Vocabulary project).

Landmark words orient the subject so that they can easily relate their current mouse position within the FEELTRACE circle to everyday categorical descriptions of emotionality. In addition to the landmark words, the color-coded cursor represents time indirectly: the current mouse position is recorded continuously, while previous positions remain visible, getting smaller and appearing further away over time.

 

FEELTRACE AND SOUND

FEELTRACE can also be used to analyze the emotive qualities of music or sound. A study was performed to test FEELTRACE's ability to capture emotional variations, using musical extracts chosen for their ability to evoke consistent emotional responses at specific activation-evaluation regions of the FEELTRACE circle. Some were intended to evoke relatively neutral emotions, while other passages, chosen for their ability to provoke strong emotions, were expected to demonstrate distinct changes from one part of the extract to the next. Ten subjects rated the passages. Figure 7 shows examples of the ratings produced on one of the dramatic samples chosen to show significant change, the "Great Gate of Kiev" from Mussorgsky's Pictures at an Exhibition. Utilizing FEELTRACE's color coding, each vertical line represents a cursor position (its length is the distance from the center of the activation-evaluation circle to the cursor position, and its color is the color that the cursor takes on at that position).

 

Figure 7. FEELTRACE ratings by 3 raters for the "Great Gate of Kiev" (musical passage from Mussorgsky's Pictures at an Exhibition) exhibiting three separate phases (the horizontal axis shows time in seconds).

 

The results exhibited some degree of change from one emotional passage to another. After break points were identified, responses were averaged within the time intervals defined for each passage. The middle section (about 60 to 100 seconds into the piece) was found to be consistently below the midline in activation, whereas the first and last passages were consistently above it.

A FEELTRACE file was produced by the musical study (each line consisted of a time code indicating the exact location in the music passage currently being analyzed, followed by the associated activation and evaluation co-ordinates created by the subject's simultaneous movement of the mouse). In addition to being used to visually generate Figure 7, additional software applications were developed to generate summaries: the mean and standard deviation for each cursor's x and y co-ordinate position (evaluation and activation respectively) and the polar co-ordinates (symbolizing emotional orientation and emotion strength) were computed for any given sequence.
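The report's exact file layout is not reproduced here; the sketch below assumes one whitespace-separated sample per line (time code, activation, evaluation) and computes the summary statistics just mentioned, including the polar co-ordinates:

```python
import math
import statistics

def parse_feeltrace(path):
    """Parse a FEELTRACE-style trace file, assumed here to hold one sample
    per line: time-code, activation, evaluation (whitespace separated)."""
    samples = []
    with open(path) as f:
        for line in f:
            t, act, ev = line.split()[:3]
            samples.append((float(t), float(act), float(ev)))
    return samples

def summarize(samples):
    """Mean and standard deviation per axis, plus polar co-ordinates:
    emotion strength (radius) and emotional orientation (angle)."""
    acts = [a for _, a, _ in samples]
    evs = [e for _, _, e in samples]
    mean_a, mean_e = statistics.mean(acts), statistics.mean(evs)
    return {
        "activation": (mean_a, statistics.stdev(acts)),
        "evaluation": (mean_e, statistics.stdev(evs)),
        "strength": math.hypot(mean_e, mean_a),                    # radius
        "orientation": math.degrees(math.atan2(mean_a, mean_e)),   # angle
    }
```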

An application that identifies "tunes" (active segments bordered by periods of silence) developed by PHYSTA was used in the music study to divide FEELTRACED files into significant sequences defined by temporal boundaries.

An experiment using 16 clips of real (not acted) interactions from TV shows, each lasting 15-30 seconds, was carried out to assess the reliability of the FEELTRACE system with these adaptations. Sixteen people rated each clip, with presentation order varied to counterbalance the test design (there were two clips in each quadrant of activation-evaluation space exhibiting relatively strong emotions and two clips of the same people exhibiting neutral emotional states).

Mean co-ordinates and standard deviations were calculated for each subject's response to each clip. The results (see Figure 8) are graphically represented as ellipses, each with its center at the mean co-ordinates for the clip, and its radius in a particular direction equal to the standard deviation with respect to the appropriate axis of the space.

Significant differences in intensity were found to exist within highly emotional passages and between emotional and neutral passages (intensity is determined by the distance from each evaluation-activation x, y co-ordinate to the circle's center). Paired t-tests showed significant differences between the two emotional clips in three out of four quadrants, indicating statistically that FEELTRACE is a reliable and precise measurement tool: it was able to discriminate between episodes of only moderately strong emotion within the same quadrant of activation-evaluation space.
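A paired t-test of this kind can be illustrated as follows; the per-subject intensity values are invented for the example, and scipy's ttest_rel stands in for whatever statistics package the study actually used:

```python
import numpy as np
from scipy.stats import ttest_rel

# Hypothetical per-subject intensities (distance of each subject's mean
# cursor position from the circle's center) for two clips in one quadrant.
clip_1 = np.array([0.62, 0.55, 0.70, 0.58, 0.66, 0.61, 0.59, 0.64])
clip_2 = np.array([0.44, 0.41, 0.52, 0.39, 0.47, 0.45, 0.40, 0.48])

t_stat, p_value = ttest_rel(clip_1, clip_2)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # small p: clips rated differently
```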

 

Figure 8. Summaries of FEELTRACE ratings in the validation experiment, showing clips judged by the experimenter as neutral or emotional (ellipse height and width are the standard deviations of the average observer's activation and evaluation respectively).

 

CONCLUSION AND FUTURE WORK

In conclusion, the interactive new media project "The Emotions (after Charles Darwin)", a multi-channel interactive video consisting of multiple panels displaying close-up graphic, moving images of men, women and children of all ages and races, each expressing a specific emotion such as happiness, sadness, fear or anger (as categorized by the results of the control group), supported Darwin's ideas about the universality of emotions on a biological level.

A strong relationship was shown to exist between the control group's rating and ranking of each image's emotion (as determined by Plutchik's Emotional Index) and emotional intensity as determined by the battery of tests including pupillometry eyetracking after oxytocin administration, functional magnetic resonance imaging (fMRI), electroencephalography (EEG) and magnetoencephalography (MEG) to visualize brain activity and electromyography (EMG) to measure facial muscle activity.

So far, emotions appear to be universal at a biological level, which raises a futuristic question: now that we know this, how might we modify emotions to elicit more desirable behaviors? Does the intensification, conversion and morphing (induced by the application of random software behaviors) of the universal, scientifically determined emotions used in this project suggest ideas about the genetic and technological modification of emotion regulation in the future?

In My Mother Was a Computer, N. Katherine Hayles states, "where the Holocaust and other atrocities provide horrifying examples of humans not counting as persons, intelligent software packages offer the spectacle of bots being mistaken for human interlocutors." She later states that "we are both in the world and of it, a truth that becomes only more inescapable as we create machines in our own image and envision ourselves as computational mechanisms like them."

Although acceptance and performance of universally endorsed behaviors and characteristics are necessary for all people and animals to effectively communicate and co-exist within groups, one of the primary dangers in proposals such as "The Emotions" is this: if we were to use the results of the control group to define a range of acceptable universal behaviors, and then alter subjects' emotional capabilities to conform to them, using synthetic biology and other appropriate methods much as a plastic surgeon alters appearance, there may be unforeseen and equally undesirable consequences or dangerous side effects, both for the individual and for us as a species.
Ongoing work will include new methodologies used in emotion-related research such as FEELTRACE (potentially incorporating aspects of computer science, synthetic biology, genetics, game theory, robotics and artificial intelligence) and ways to represent these ideas, perhaps interactively, in a visual, audio or audio-visual fashion, either as an enhancement to "The Emotions (after Charles Darwin)" or as a companion, perhaps strictly audio, work.

 


 

REFERENCES


Amaral, David G. 2003. The Amygdala, Social Behavior, and Danger Detection. Center for Neuroscience. University of California-Davis, Davis.

Badmington, Neil. 2004. Alien Chic: Posthumanism and the Other Within. Routledge, New York.

Bradley, Margaret M., Miccoli, Laura, Escrig, Miguel A. and Lang, Peter J. 2008. The Pupil as a Measure of Emotional Arousal and Autonomic Activation. Psychophysiology 45 (2008).

Brockman, John. 1997. Parallel Memories: Putting Emotions Back Into the Brain. Joseph LeDoux interviewed, http://www.edge.org/3rd_culture/ledoux/ledoux_p1.html.

Brown, D.E. 1991. Human Universals. McGraw-Hill, New York.

Brown, D.E. 2000. Human Universals and their Implications. In Being Humans: Anthropological Universality and Particularity in Transdisciplinary Perspectives. Ed. Neil Roughley. Walter de Gruyter, Berlin. Pp. 156-174.

Corden, Ben, Chilvers, Rebecca, Skuse, David. 2008. Avoidance of Emotionally Arousing Stimuli Predicts Social-perceptual Impairment in Asperger's Syndrome. Neuropsychologia 46 (2008) 137-147.

Cowie, Roddy, Douglas-Cowie, Ellen, Cox, Cate. 2005. Beyond Emotion Archetypes: Databases for Emotion Modeling using Neural Networks. Elsevier Ltd.

Cowie, Roddy, Schroeder, Marc, Sawey, Martin, Douglas-Cowie, Ellen, McMahon, Edelle, Savvidou, Suzie. FEELTRACE Application and instructions. Ver. 1.0. Viewed July 19th, 2011. http://www.dfki.de/~schroed/feeltrace


Darwin, Charles. 1998. The Expression of the Emotions in Man and Animals. Third Edition, Oxford University Press, New York (First Edition 1872, Murray, John, Great Britain).

Davidson, Richard J. 2003. Darwin and the Neural Bases of Emotion and Affective Style. Laboratory for Affective Neuroscience. University of Wisconsin, Madison.

Douglas-Cowie, Ellen, Cowie, Roddy, Schroder, Marc. 2000. A New Emotion Database: Considerations, Sources and Scope. ISCA Archive, Newcastle, Northern Ireland, UK. http://www.isca-speech.org/archive.


Ekman, Paul, Campos, Joseph J., Davidson, Richard J., de Waal, Frans B.M., editors. 2003. Emotions Inside and Out: 130 Years after Darwin's The Expression of the Emotions in Man and Animals. Annals of the New York Academy of Sciences, Volume 1000. New York.

Ekman, Paul. 2003. Emotions Revealed. Recognizing Faces and Feelings to Improve Communication and Emotional Life. Second Edition, Henry Holt and Company, New York.

Guastella, Adam J., Mitchell, Philip B., Dadds, Mark R. 2007. Oxytocin Increases Gaze to the Eye Region of Human Faces. Biol Psychiatry (2007).

Hadjikhani N, Joseph R.M., Snyder J., Chabris C.F., Clark J. and Steele S., et al. 2004. Activation of the Fusiform Gyrus when Individuals with Autism Spectrum Disorder view Faces. NeuroImage 22 (2004) 1141-1150.

Haraway, Donna. 1991. Simians, Cyborgs and Women: The Reinvention of Nature. Routledge, New York.

Harrison, Neil A., Singer, Tania, Rotshtein, Pia, Dolan, Raymond J. and Critchley, Hugo D. 2006. Pupillary Contagion: Central Mechanisms Engaged in Sadness Processing. Soc Cogn Affect Neurosci (2006).

Hayles, N. Katherine. 2005. My Mother Was a Computer. University of Chicago Press, Chicago and London.

Klin, Ami, Jones, Warren, Schultz, Robert, Volkmar, Fred and Cohen, Donald. 2002. Defining and Quantifying the Social Phenotype in Autism. AM J Psychiatry 159:6 (2002).

Kline, Paul. 2000. Handbook of Psychological Testing. Taylor & Francis, New York.

Kollias, Stefanos, Piat, F. 1999. PHYSTA Test Material Format and Availability Report- Principled Hybrid systems: Theory and Applications, National Technical University of Athens, Greece.

LeDoux, Joseph, Debiec, Jacek and Moss, Henry, editors. 2003. The Self from Soul to Brain. Annals of the New York Academy of Sciences. Volume 1001, New York.

Pinker, Steven. 2002. The Blank Slate: the Modern Denial of Human Nature. Viking Press, New York.

Plutchik, Robert. 2001. The Nature of Emotions. American Scientist, Volume 89. http://www.americanscientist.org/articles/01articles/plutchik.html

Plutchik, Robert, Kellerman, Henry, Ed. 1989. Emotion-Theory, Research and Experience-Vol 4: the Measurements of Emotions. Academic Press Inc., New York.

Plutchik, Robert. 1962. The Emotions: Facts, Theories, and a New Model. Random House, New York.

Princeton WordNet Web Dictionary, viewed 10/28/10. http://wordnetweb.princeton.edu/perl/webwn

School of Life Sciences-Brain Mind Institute, viewed 09/21/09. http://bmi.epfl.ch

Spezio, Michael L., Adolphs, Ralph, Hurley, Robert S. E. and Piven, Joseph. 2007. Abnormal Use of Facial Information in High-Functioning Autism. J. Autism Dev. Disord (2007) 37:929-939.

Squier, Susan Merrill. 2004. Liminal Lives: Imagining the Human at the Frontiers of Biomedicine. Duke University Press, Durham.

Stock, Oliviero and WP8 members. Humaine D8a Report on Basic Cues and Open Research Topics in Communication and Emotions. 2004.

Waldby, Catherine. 2000. The Visible Human Project: Informatic Bodies and Posthuman Medicine. Routledge , New York.

Zylinska, Joanna. 2009. Bioethics in the Age of New Media. MIT Press, Massachusetts.

 

Debra Swack (www.debraswack.com) is a new media artist affiliated with Rhizome at the New Museum whose projects have been presented at the Re-New Digital Arts Festival in Copenhagen (Aalborg University Press), "Virtual Public Art" at the Philadelphia Festival of the Arts, "Robots and Representation" at Purdue University, "Post Human/Future Tense" at Columbia College, "25th Anniversary" at Xerox PARC ("Art and Innovation"; MIT Press), Real Art Ways (Sol LeWitt Collection), "Soundlab VII" in Cologne, the University of California ("After Media"; University of California Press), White Box Gallery, "Sonic Fragments" at Princeton University, "Tomorrow" at the New York Hall of Science (curated by Anne Barlow of the New Museum), "The Gun Show" at Aaron Packer Gallery, "Voices I" at the University of Illinois in Chicago, Northern Illinois University Museum, Banff Center for the Arts, the Arts and Genomics Center in Amsterdam ("Kloone4000") and Vancouver ("Allegories of the Genome") and the Beecher Center for Arts and Technology. She also does software testing and technical writing for the SUNY Research Foundation.

 

 

 
