Social AI Design
CHRISTINE MEINDERS

Inter-Action: Intelligent Gestural Conversations Across Cultures

WHAT

Inter-Action is a jacket that uses a customizable gesture and touch language system to explore what it would feel like to receive varying levels of haptic touch at different locations on the body. The jacket can be controlled by the wearer (to turn it on and off, and to map modular sensor locations), by the gestures of others (remotely), or by a conversational user interface. This integrates the digital and analog aspects of communication. Inter-Action is more than an intelligent communication tool: it’s a haptic conversation.
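The wearer-side controls described above (a power toggle plus remappable modular sensor locations) could be sketched as below. All names, locations, and the sensor-map structure are hypothetical illustrations, not the jacket's actual firmware.

```python
# Sketch of the wearer's controls: toggle the jacket on/off and remap
# where each modular sensor sits on the body. Names are hypothetical.
class Jacket:
    LOCATIONS = ["shoulder", "upper_arm", "forearm", "back"]

    def __init__(self):
        self.powered = False
        # which body location each modular sensor is currently mapped to
        self.sensor_map = {"sensor_1": "shoulder", "sensor_2": "back"}

    def toggle(self):
        """Turn the jacket on or off; returns the new state."""
        self.powered = not self.powered
        return self.powered

    def remap(self, sensor, location):
        """Move a modular sensor to a different body location."""
        if location not in self.LOCATIONS:
            raise ValueError(f"unknown location: {location}")
        self.sensor_map[sensor] = location

jacket = Jacket()
jacket.toggle()                       # jacket is now on
jacket.remap("sensor_2", "forearm")   # relocate a modular sensor
print(jacket.sensor_map)
```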

WHY

Inter-Action addresses issues such as access and control, evolving concepts of privacy, and how we communicate with our bodies. Questions like who is allowed to touch us, and where and why, become increasingly relevant when we converse through our bodies.

Haptic communication is becoming increasingly common. Using synchronous and/or asynchronous behavior, wearers can use this modular system to connect immediately or defer a conversation, much like a phone call, but through touch. A tap on the shoulder could be answered with a quick swipe to indicate “I can feel you, but let me call you back.” The conversation can be localized to the wearer based on their relationship to the sender, who can touch through a system recognizable only to them. How we prioritize whom we communicate with is key to curating an intimate conversation. Trust in both the communicator and the system is important; you wouldn’t want someone touching you constantly all day.
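The synchronous/asynchronous behavior above (deliver a touch now, or defer it like a call going to voicemail) could be sketched as a small message queue. This is an illustrative assumption about the interaction model, not the project's actual protocol; class and method names are invented.

```python
from collections import deque

class HapticChannel:
    """Minimal sketch of a synchronous/asynchronous haptic conversation:
    touches are delivered immediately when the wearer is available,
    otherwise queued for later, like returning a missed call."""

    def __init__(self):
        self.available = True   # wearer's "let me call you back" switch
        self.inbox = deque()    # deferred (asynchronous) touches

    def receive(self, sender, gesture):
        """Deliver a touch now if possible; otherwise queue it."""
        if self.available:
            return f"deliver {gesture} from {sender}"
        self.inbox.append((sender, gesture))
        return "queued"

    def reply_later(self):
        """Replay deferred touches once the wearer is ready."""
        while self.inbox:
            sender, gesture = self.inbox.popleft()
            yield f"deliver {gesture} from {sender}"

channel = HapticChannel()
print(channel.receive("friend", "tap"))    # delivered immediately
channel.available = False                  # "let me call you back"
print(channel.receive("friend", "swipe"))  # deferred
channel.available = True
print(list(channel.reply_later()))         # deferred touch replayed
```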

Interactions are moving from screens to voice and touch, changing our communication culture to be more accessible with regard to age and gender. This raises many questions about privacy, bonding, and thinking with our bodies.

RESEARCH

In my prototyping I proposed gathering movement data from users of this haptic and/or voice-based conversational system to recognize movement and voice (tone, frequency) patterns across cultures that are associated with a specific word or concept. Using this information, the conversational system could communicate with users more effectively across cultures (and subcultures).

    User Research

    • I used Google Forms to ask questions about touch.
    • I engaged in immersive research, having individuals interact with the "touch jacket" and "touch tool".
    • I collected data about comfort with touch (location, intention, and privacy).

    Literature Review Research: I conducted a literature review about touch (cultural, social and experiential). 

    • Explored what makes a touch realistic.
    • Examined touch considerations: element of surprise, flow of air (and rhythms), temperature of air, duration of air. 
    • Examined ideal touch: light; temperature should mimic that of the human body (89 °F); not too fast or too slow.
    • Looked at potential issues with air-based touch, such as air pressure, portability, and inappropriate use.

     

    LEARNINGS


    Insights: 

    • Touch, gestures, and vocals should be incorporated as part of a CUI because they provide a more natural interface than typing on a keypad. This can help address the lack of connectedness felt when using CUIs.
    • Connecting intonation (through emotive analytics), gesture, and touch may provide a feeling of connectedness.
    • Gestures combined with vocal intonation could provide more accurate data for the CUI to evaluate and usefully respond to the emotional state of the user.

    Sociocultural Considerations: 

    • Access and technology (gender, age)
    • Cultural translation as a social tool (additive touch from friends, or physical support in a tense situation)
    • Regional dialects (such as personalized emoji languages)
    • Cultural issues around touch: do we project our ideal touch onto the device?
    • Do we project our ideal touch onto the conversational user interface?
    • Re-introducing touch into specific cultures

    Year: 2015

    Credits: ArtCenter/MDP Creative Tech student research project (advisor: Philip van Allen)