Sunday, 11 May 2014

INNOVATION- Final Leap Motion Tests

These are the final tests that will be used as the main points of evidence in the research paper. They were developed iteratively from a number of previous experiments and were felt to be the most revealing.

First Person Scene Navigation


Manipulate Object


Pull Objects

 

Point


Tuesday, 6 May 2014

INNOVATION- Leap Motion UI Design Talk

I found a video from a Leap Motion Developer Meetup in which some findings from UI design research are presented. While it focuses exclusively on UI design, it still raises some very interesting results that are highly applicable to my own project.

https://www.youtube.com/watch?v=XI3yvLOon08

The two points I feel are most valuable for the project are:

1- Users are not very aware of their hand position relative to the device when moving in Z space (forward and backward from the user). This must be taken into consideration when asking users to move along this axis, particularly if accuracy is required from the user. Accurately pointing toward the screen (while physically moving the hand forward) also proves difficult, as our hands generally move in arcs and so change their X/Y position as they move along Z.

2- Dynamic feedback showing the on-screen position relative to the user's hand position is also absolutely crucial. Without this, users can become very confused or disoriented, hampering the user experience. Any small visual change that shows awareness of the user's action is better than none, and if it can reinforce action (or in some cases dissuade it) then this can only improve the experience. A rough sketch of this kind of cursor feedback follows below.
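
To make the second point concrete, here is a minimal sketch (illustrative only, not taken from the project itself) of mapping a tracked palm position to an on-screen cursor and changing its visual state as the hand pushes forward in Z. The interaction box, the push threshold and the Leap-style millimetre coordinates passed in are all assumptions; in the project itself this kind of mapping would sit inside the Kismet/Leap setup described below.

```python
# Minimal sketch (illustrative, not project code): map a tracked palm position
# to an on-screen cursor and change its visual state as the hand pushes
# forward in Z, so the user always gets feedback on where they are.

SCREEN_W, SCREEN_H = 1280, 720
# Assumed interaction box in Leap-style millimetre coordinates.
INTERACTION_BOX = {"x": (-150, 150), "y": (100, 400)}
PUSH_THRESHOLD_Z = -50  # palms closer to the screen than this count as a "push"

def normalise(value, lo, hi):
    """Clamp value into [lo, hi] and scale it to the 0..1 range."""
    value = max(lo, min(hi, value))
    return (value - lo) / (hi - lo)

def cursor_state(palm_x, palm_y, palm_z):
    """Return cursor pixel position and a feedback state for the current palm."""
    cx = int(normalise(palm_x, *INTERACTION_BOX["x"]) * SCREEN_W)
    cy = int((1.0 - normalise(palm_y, *INTERACTION_BOX["y"])) * SCREEN_H)  # screen Y runs downward
    # Feedback states: the cursor should visibly change as the hand
    # approaches the push threshold, never leaving the user guessing in Z.
    if palm_z < PUSH_THRESHOLD_Z:
        state = "activated"
    elif palm_z < 0:
        state = "approaching"  # e.g. grow or brighten the cursor
    else:
        state = "idle"
    return cx, cy, state

# Palm roughly centred, 20 mm in front of the device origin.
print(cursor_state(0, 250, -20))  # -> (640, 360, 'approaching')
```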

INNOVATION- Hook/UDK system

This is the final Kismet node that handles the Leap Motion data. Using any number of these variables allows for a comprehensive system that can track the gestural features that Hook identified in her research.

This is a simple example of how these variables can be used. This test is designed to register whether a user is making slow or fast movements and to respond differently in each case. Here the user spawns projectiles along the hand's forward vector when a fast 'throw' gesture is made. Using scripts like this allows great scope in how the Leap data can be used. A rough sketch of the same logic outside Kismet is shown below.
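
For readers without UDK, the following is a minimal Python sketch of the same idea, not the project's actual Kismet/UnrealScript setup: it derives a palm velocity from two successive frame positions, classifies the movement as slow or fast against a threshold, and on a fast movement returns a 'spawn projectile' action along the hand's forward vector. The 800 mm/s threshold, the 60 Hz frame time and the example frame data are illustrative assumptions.

```python
import math

FAST_THRESHOLD_MM_S = 800.0  # illustrative cut-off between "slow" and "fast"

def palm_velocity(prev_pos, curr_pos, dt):
    """Velocity vector (mm/s) from two palm positions sampled dt seconds apart."""
    return tuple((c - p) / dt for c, p in zip(curr_pos, prev_pos))

def classify_movement(velocity):
    """Label a movement 'fast' or 'slow' by the magnitude of its velocity."""
    speed = math.sqrt(sum(v * v for v in velocity))
    return "fast" if speed >= FAST_THRESHOLD_MM_S else "slow"

def handle_frame(prev_pos, curr_pos, hand_direction, dt=1 / 60):
    """On a fast 'throw', return a spawn action along the hand's forward vector."""
    velocity = palm_velocity(prev_pos, curr_pos, dt)
    if classify_movement(velocity) == "fast":
        # In the UDK test this is the point where the projectile is spawned;
        # here we simply return the spawn direction (the hand's forward vector).
        return {"action": "spawn_projectile", "direction": hand_direction}
    return {"action": "none"}

# Example frame: the palm moves 20 mm forward in one 60 Hz frame (~1200 mm/s).
print(handle_frame(prev_pos=(0, 200, 0), curr_pos=(0, 200, -20),
                   hand_direction=(0, 0, -1)))
```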

Monday, 5 May 2014

SPATIAL CONSTRUCTS- Final Screens





Final Shot

SPATIAL CONSTRUCTS- V4





SPATIAL CONSTRUCTS- V3





INNOVATION- Kristina Hook- Affective Interaction

After discovering Rosalind Picard and the field of Affective Computing I started to look for more research in this area and for other academics writing about the potential for emotive communication with computers. I was also looking for more specific research around these principles, focused on how gesture relates to this subject.

Through looking for other papers that reference Rosalind's work I discovered a researcher called Kristina Hook. Hook and her colleagues have focused on Affective Computing from an interaction design perspective: exploring how we can interface with computers through the communication of emotion. This is in order to add greater depth to the human-computer dialogue and to create an environment that more closely resembles human communication.

Hook's work was particularly relevant for the development of the project. Not only did it focus on affective computing through gesture interaction but it also included a series of design paradigms and an interaction model for affective gestures. These proved invaluable and formed the basis for the critical framework of the project.

My aim was now to see how well these interaction models held up in a digital entertainment context. Could affective interaction be used effectively in this environment? Can games utilize affective design principles when building interaction?

The discussed paper can be found here: http://soda.swedish-ict.se/145/1/PaperA.pdf

A breakdown of hand gestures through observation of numerous emotions portrayed by actors. (Hook et al 2003).












The Affective Gestural Plane model (Hook et al 2003).





Hook’s design principles are as follows:

Embodiment

Embodiment is the notion that meaning can be created, manipulated and shared through engaged interaction with an artefact (Dourish 2001). This is to say that users can create and communicate meaning through interaction with a system and with each other through the system (Hook et al 2003). Embodied interactions must therefore physically capture an abstract emotional concept and allow for interpretation of meaning (Hook et al 2003).

Natural but Designed Expressions

When approaching gestural interaction there are often two opposing paradigms: designed gestures (Long et al 2000) and natural gestures (Cassell 1998). Designed gestures can be equated to sign language, having very specific semantics, while natural gestures are built around normal human expression (Hook et al 2003). Hook (2003) argues that affective interactions should strive to be both natural and designed: built from natural human action to facilitate affect but structured around meaning interpretable by a computer system. 

Affective Loop

The concept of the affective loop is to match the emotional communication channels of a computer system to those of a human (Hook et al 2003; Sundstrom 2005). By interacting with such a system, users are engaging in an affective loop, where affect is created either by the interaction itself or by the system's response to its interpretation of that interaction (Hook et al 2003; Sundstrom 2005). By reacting to that response, the user continues the loop.


Ambiguity

Hook (2003) argues that ambiguity is an integral component of an affective system. Ambiguity allows for more personal expression and interpretation of emotion, giving users the opportunity to tailor the affective dialogue to their own life experiences, thus increasing affect (Hook et al 2003). Furthermore, an ambiguous system will also create a sense of mystery, keeping the user engaged (Hook et al 2003). Care must be taken, however, to ensure the system does not become too ambiguous, as unclear communication of affect may cause frustration in the user (Hook et al 2003).