Physicalization of Social Physical Interaction

Initial thoughts

After we had previously developed an idea of how to haptify our data, we started thinking about the possibilities of physicalization. To structure the work, we again set a goal for ourselves in the form of a question that our prototype has to answer:

HOW CAN THE EFFECT OF SOCIAL PHYSICAL INTERACTIONS ON FEELINGS BE EXPLORED THROUGH DYNAMIC PHYSICAL REPRESENTATION?

This process was essentially driven by two core realizations:

  1. Emotionally important situations can happen anytime and anywhere.
  2. Emotions are hidden and change over time.

In addition, the process was informed by our persona, introduced in a previous step.

As our persona wants to use our device to get an overview of their emotions, the only logical consequence of the first realization is that we need to build a portable device; only then can users track their emotions everywhere. The second realization leads to the conclusion that users also need the possibility to revisit old situations, to transfer the hidden data into something observable, and to compare the data of different situations. And because emotions are often hidden and felt rather than expressed, we want to offer the possibility to explore the data by touch. Based on these thoughts, we developed the following requirements:

  1. Emotions should be easy to track.
  2. Information should be explorable through physical objects.
  3. Information should be explorable through touch.
  4. The product should be aesthetically pleasing and should not clutter the space.
  5. The product should make the user more mindful in an enjoyable way.

To be able to fulfill all of these requirements, we further decided to split them across two devices: a portable input device, and an output device that enables the user to observe several situations at once or to look into one situation in detail.

Output device

Tree 

From the functional perspective, we were looking for an “accommodation” for a set of physical user interfaces, one for each saved interaction. It should also allow users to browse the data and to interact more actively with the data of interest.

In the hedonic aspect, the tree served as a conceptual metaphor in our design. On the one hand, since the hidden data tracked with the device is rather emotional, we opted for a biomorphic and biophilic design to imply that our physical interactions with others influence us and that these influences live with us in some way. On the other hand, the form also metaphorically expresses that those influences fade with time, just as fruits and leaves come and go; as we say, “Data dies.”

From there, we researched different concepts of how such a tree could be built. These ranged from natural designs that can grow, shrink, and form fruits, to conceptual trees that drop metaphorical fruits. In the end, we decided on a combination of these ideas: a natural-looking tree that bears multiple fruits whose height can be varied. The tree was finally built from wood.

The individual fruits are realized as modules so that further fruits can be added to the tree if required. A single module consists of a stepper motor that can lower its fruit and raise it again.
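
As an illustration, a minimal Arduino-style sketch for one fruit module could look like the following. The pin numbers, step counts, and travel range are assumptions, not the values used in our prototype.

    #include <Stepper.h>

    // Assumed hardware: a 28BYJ-48-style geared stepper with 2048 steps
    // per revolution, driven on pins 8-11. Adjust for the actual module.
    const int STEPS_PER_REV = 2048;
    Stepper fruitMotor(STEPS_PER_REV, 8, 10, 9, 11);

    long currentSteps = 0;  // current rope position, in motor steps

    // Move the fruit to a target height, given as a fraction 0.0..1.0 of
    // the full rope travel (assumed here to be two motor revolutions).
    void setFruitHeight(float fraction) {
      const long fullTravel = 2L * STEPS_PER_REV;
      long target = (long)(fraction * fullTravel);
      fruitMotor.step((int)(target - currentSteps));  // positive lowers, negative raises
      currentSteps = target;
    }

    void setup() {
      fruitMotor.setSpeed(10);  // RPM; slow enough for a smooth motion
    }

    void loop() {
      // Demo: lower the fruit halfway, then pull it back up.
      setFruitHeight(0.5f);
      delay(2000);
      setFruitHeight(0.0f);
      delay(2000);
    }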

Cubes

For the design of the individual fruits, we went back to our haptifications and revisited the cube shape. In an earlier iteration, this had allowed us to distinguish between outside and inside: while the outer shell was directly visible and encoded some of the data, the full information could only be obtained by touching the cube to sense its filling.

To bring this concept into our physicalization, we first investigated different actuators and how they could represent the data we were tracking. After extensive research into which of our data is categorical and which is metric, we developed the following coding scheme (a sketch of the resulting mapping follows the list):

  1. Type of emotion felt – LED color / metric scale
  2. Strength of the emotion felt – LED brightness / metric scale
  3. Motivation of the interaction – vibration pattern / categorical
  4. Type of touch – rope length to the tree / metric scale
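
To make the scheme concrete, here is a small C++ sketch of how one logged interaction could be translated into actuator settings. All names, ranges, and the color palette are illustrative assumptions rather than our actual encoding.

    #include <cstdint>

    // Hypothetical categories; the prototype's motivation set may differ.
    enum class Motivation : uint8_t { Greeting, EmotionalExpression, Support, Other };

    struct Interaction {
      uint8_t emotionType;    // mapped to LED color
      uint8_t strength;       // 0..10, mapped to LED brightness
      Motivation motivation;  // mapped to a vibration pattern
      uint8_t touchAmount;    // 0..10, mapped to rope length
    };

    struct ActuatorState {
      uint32_t ledColor;         // packed RGB
      uint8_t ledBrightness;     // 0..255
      uint8_t vibrationPattern;  // index into a table of patterns
      float ropeFraction;        // 0.0 (at the crown) .. 1.0 (fully lowered)
    };

    // Illustrative palette, one color per emotion type.
    const uint32_t kEmotionColors[] = {0xFF0000, 0xFFA500, 0x00FF00, 0x0000FF};

    ActuatorState encode(const Interaction& in) {
      ActuatorState out;
      out.ledColor = kEmotionColors[in.emotionType % 4];
      out.ledBrightness = in.strength * 255 / 10;
      out.vibrationPattern = static_cast<uint8_t>(in.motivation);
      out.ropeFraction = in.touchAmount / 10.0f;
      return out;
    }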

As the material for the individual cubes, we chose epoxy resin, because it lets the whole cube glow. The user is thus able to visually perceive some aspects of an interaction directly through the LEDs, but to perceive an interaction in its entirety, the cube must be touched to sense its vibration pattern. For this purpose, LEDs, a vibration motor, and a touch sensor for activating the cube were embedded in each cube.
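
The touch-triggered behavior could look roughly like the following Arduino-style sketch, assuming a touch sensor with a digital output and a vibration motor switched through a transistor; the pins and the example pattern are hypothetical.

    const uint8_t TOUCH_PIN = 2;  // touch sensor output (assumed wiring)
    const uint8_t VIB_PIN   = 5;  // vibration motor via transistor (assumed wiring)

    // Example pattern: alternating (on, off) durations in milliseconds.
    const unsigned int pattern[] = {150, 100, 150, 400};
    const size_t patternLen = sizeof(pattern) / sizeof(pattern[0]);

    void playPattern(const unsigned int* p, size_t n) {
      for (size_t i = 0; i + 1 < n; i += 2) {
        digitalWrite(VIB_PIN, HIGH);
        delay(p[i]);
        digitalWrite(VIB_PIN, LOW);
        delay(p[i + 1]);
      }
    }

    void setup() {
      pinMode(TOUCH_PIN, INPUT);
      pinMode(VIB_PIN, OUTPUT);
    }

    void loop() {
      // The LEDs showing color and brightness run continuously elsewhere;
      // the vibration pattern only plays while the cube is being touched.
      if (digitalRead(TOUCH_PIN) == HIGH) {
        playPattern(pattern, patternLen);
      }
    }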

Input device

As already mentioned, the input artifact should support the user in transferring their felt emotions into data that can be revisited later on. We therefore developed an input cube that receives the individual data as input and at the same time offers a representation of this input.

Thus, a selected vibration pattern can be felt directly through a built-in vibration motor, and built-in LEDs show the selected color and the brightness set for it. Only the set rope length has no direct feedback in the current prototype. For the inputs, sliders and rotary potentiometers are used, so that the user already has to think about their emotions while entering them.
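
Here is a minimal sketch of this input side, assuming a slide potentiometer for the strength, a rotary potentiometer for the emotion type, and a single NeoPixel-style LED for the visual feedback; the wiring and the palette are assumptions.

    #include <Adafruit_NeoPixel.h>

    const uint8_t STRENGTH_SLIDER = A0;  // slide potentiometer (assumed wiring)
    const uint8_t EMOTION_POT     = A1;  // rotary potentiometer (assumed wiring)
    const uint8_t LED_PIN         = 6;

    Adafruit_NeoPixel led(1, LED_PIN, NEO_GRB + NEO_KHZ800);

    // Illustrative palette, one color per selectable emotion.
    const uint32_t colors[] = {0xFF0000, 0xFFA500, 0x00FF00, 0x0000FF};

    void setup() {
      led.begin();
    }

    void loop() {
      // Map the 10-bit ADC readings onto the encoding ranges.
      uint8_t brightness = map(analogRead(STRENGTH_SLIDER), 0, 1023, 0, 255);
      uint8_t emotion    = map(analogRead(EMOTION_POT), 0, 1023, 0, 3);

      led.setBrightness(brightness);
      led.setPixelColor(0, colors[emotion]);
      led.show();  // immediate visual feedback for the current input
      delay(50);
    }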

The finished product thus allows a user to track emotional data on the fly and later to revisit and compare it on the tree. So that the user is not limited to viewing only the most recent interactions, constrained by the number of attached cubes, we further extended our tree with a button to scroll through the available data.
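
The scrolling can be thought of as sliding a window over the stored interactions; the following sketch illustrates the idea, with the pin, the debouncing, and the update logic all being assumptions.

    const uint8_t SCROLL_BTN = 3;  // button to ground, internal pull-up (assumed)
    const int NUM_CUBES = 2;       // cubes currently attached to the tree

    int windowStart = 0;  // index of the oldest interaction currently shown
    int totalLogged = 6;  // example count; grows as new data arrives

    void setup() {
      pinMode(SCROLL_BTN, INPUT_PULLUP);
    }

    void loop() {
      static bool wasPressed = false;
      bool pressed = (digitalRead(SCROLL_BTN) == LOW);
      if (pressed && !wasPressed && totalLogged > NUM_CUBES) {
        // Advance the window by one interaction, wrapping around at the end.
        windowStart = (windowStart + 1) % (totalLogged - NUM_CUBES + 1);
        // A function like updateCubes(windowStart) would then re-drive the
        // LEDs, vibration patterns, and rope heights of the visible cubes.
      }
      wasPressed = pressed;
      delay(20);  // crude debounce
    }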

User Study

After we finished developing our prototype, we conducted a small study to test our product and to steer its future development in the right direction. For the study, we recruited two participants whose profiles closely matched our persona. Both participants moved to Germany for their education, are in their twenties, and care about their mental well-being.

      Age   Gender   Reason for moving to Germany   Relationship status
P1    22    Female   Education                      Single
P2    24    Female   Education                      Married

Procedure

The study had three parts, designed to give us as many insights into the current state of the prototype as possible in the limited amount of time. The first part, journaling, was done remotely by the participants before parts two and three, prototype testing and an interview, which took place in person in a single session.

Phase 1: Journaling

Each participant was asked to log five interactions in written form during the day before the study. For each interaction, the following aspects were to be recorded:

  1. What was the interaction (a hug, a high-five, a touch, a kiss…)?
  2. What was the motivation behind it (a traditional greeting/goodbye, an emotional expression, a supportive gesture…)?
  3. How did it make the participant feel?
  4. Was this feeling positive, neutral, or negative?
  5. How strong was this feeling (on a scale from 0 for weak to 10 for strong)?

The notes were to be taken in any form comfortable for the participant and then brought to the in-person study for phases two and three.

Phase 2: Prototype testing

Each participant was asked to test the prototype through three tasks: to input three feelings from social physical interactions recorded in their journal, then to locate them on the tree (only two were shown at once), and finally to describe them based on their physicalizations.

Phase 3: Interview

Prototype testing was followed by an interview with questions about the experience with the prototype: the input and output modalities, comfort of use, and so on.

While we got positive feedback on the choice of color to represent emotions, and especially on the input cube, which was easy to understand and to control, the participants also mentioned some negative aspects: for example, the rope length as a representation of the amount of body contact was confusing, and they reported a mismatch between the vibration patterns and the motivations they encode.

Altogether, we consider the prototype successful for its purpose. Perhaps not fully, yet it helped the participants become more aware of their emotions and draw some insights from their experience with the tree.