
Interaction

[Image: European Flag]

Interaction [inter- + action]: From Latin inter- ("between, amid"). From Middle English accion, from Latin actio ("act of doing or making").

The situation or occurrence in which two or more objects or events act upon one another to produce a new effect; a conversation or exchange between people. (Source: Wiktionary)

Collaborative Interfaces The term 'co-creation' was used in Virtual Worlds: No Interface to Design (Bricken, 1991) to describe collaboration in cyberspace between engineers, designers, and participants. John Feland’s case study (2007) of a virtual innovation team discusses how design thinking is able to "evolve concepts across [...] technical, business, and human issues." In his thesis, Martin Mauve (2000) developed a communication protocol and a synchronous VRML web application, based on his model of distributed interactive media that includes shared whiteboards, networked computer games, and virtual environments. Thomas Leerberg refers to Stankiewicz's evolutionary technology systems in his discussion of the Topos virtual design space, while Developing Future Interactive Systems (Bellman, 2005) examines a collaborative environment where embodied utilities are themselves users interacting with each other as well as with humans. In their paper entitled Environments for Creativity – A Lab for Making Things (2007), Ellen Yi-Luen Do and Mark Gross propose a shared design space that is extended indefinitely by building tools, iterative prototypes, and 'objects to think with'. Mary Lou Maher et al. (2006) consider computer-mediated communication in the Virtual Design Studio, emphasising the role of shared representation in successful collaborative designs. Research by Albert Esterline et al. (2006) examines the LOGOS Multi-agent System, a sophisticated environment developed by NASA. Its software agents collaborate with each other, as well as with human operators, while resolving faults, retrieving database information, or paging experts for assistance.
 
Responsive UI Design Users want the best possible interactive experience regardless of the time of day, their position on earth, or the device before them. To that end, interface designers need to anticipate increasingly complex tensions between humans and technology.

Adaptive layouting: Concepts for a responsive UI should consider parameters like device type, screen size, aspect ratio and screen resolution, as well as interaction paradigms and bandwidth. Adaptive layouting enables page contents to be scaled, compressed, repositioned, or replaced in order to fit the target system.
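
The selection step described above can be sketched as a small decision function. This is a minimal illustration only; the breakpoint values and layout names are assumptions, not taken from the text, and a real implementation would also weigh device type, aspect ratio, and bandwidth.

```javascript
// Hedged sketch: map an available viewport width to a layout variant.
// Breakpoints (600, 1024) and layout names are illustrative assumptions.
function selectLayout(viewportWidth) {
  if (viewportWidth < 600) return "single-column";  // e.g. phones
  if (viewportWidth < 1024) return "two-column";    // e.g. tablets
  return "three-column";                            // e.g. desktops
}

selectLayout(480); // → "single-column"
```

In a browser, the same decision is usually delegated to CSS media queries rather than script, but the underlying mapping from measured parameters to a layout is the same.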

Content adaptation: Pictures stored in the cloud as 0s and 1s are useless to the human eye until they are interpreted by a local computer and converted into pixels on the screen. Content is like water that will only assume its final shape in conjunction with a vessel, or viewport.

Digital library: Icons, graphics and images should be loaded as required, or scaled to fit, in order to populate a certain layout.
  • Incremental adaptation: The most suitably dimensioned bitmap is selected by the browser depending on the available space [screen real estate].
  • Continuous adaptation: Digital fonts and vector-based images (SVG) are calculated by the browser - that is to say, adjusted mathematically depending on the available space.
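
Incremental adaptation can be illustrated with a best-fit selection, much like a browser choosing among bitmap candidates. The candidate list and file names here are hypothetical; browsers perform this selection natively (e.g. via the srcset attribute), so this is only a sketch of the logic.

```javascript
// Sketch of incremental adaptation: choose the smallest bitmap that
// still covers the available space. Candidates are assumed examples.
function pickBitmap(availableWidth, candidates) {
  const sorted = [...candidates].sort((a, b) => a.width - b.width);
  for (const c of sorted) {
    if (c.width >= availableWidth) return c.src; // smallest sufficient
  }
  return sorted[sorted.length - 1].src; // fall back to the largest
}

const icons = [
  { src: "icon-64.png", width: 64 },
  { src: "icon-128.png", width: 128 },
  { src: "icon-256.png", width: 256 },
];

pickBitmap(100, icons); // → "icon-128.png"
```

Continuous adaptation needs no such list: a vector asset (SVG, digital fonts) is recomputed mathematically at any size.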

Absolute and relative units: Device pixels are device-dependent. On a desktop monitor, text set at font-size 16px may be perfectly legible - but on another display with smaller pixels the same text could be too small. Code that uses relative units supports high-quality visual interfaces across systems. If you define normal text with font-size 1em and headings with font-size 2em, you are telling device X to render your headings twice as big as your text. How many pixels this ultimately requires is decided by the terminal device. (See also: Pixel to Em Converter)
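
A pixel-to-em conversion of the kind mentioned above is a single division. The sketch below assumes the common browser default of a 16px base font size; the function name is illustrative.

```javascript
// Hedged sketch of a pixel-to-em converter, assuming a base
// font-size of 16px (the usual browser default).
function pxToEm(px, base = 16) {
  return px / base;
}

pxToEm(32); // → 2  (a 32px heading is 2em at the default base)
pxToEm(16); // → 1
```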
 
UI Patterns There is a continuum that runs from routine design, where a problem may be well defined, through innovative design, where "a network of options" initiates new goods, methods, and meanings, to creative design, which occurs when requirements are as yet unknown.
"Patterns can be a description of best practices within a given design domain. They capture common solutions to design tensions" (Tidwell, 2006). UI Patterns promote user understanding because the essential elements of an interface can be associated with a known experience.

 
Pervasive Computing In pervasive computing, humans, objects, and environments cluster into ad-hoc ecologies to form a "multidimensional web of relationships" (Aarts and Marzano, 2003). The model of a 'pervasive interaction' needs to include (1) the user, who initiates, and (2) a responsive structure, which facilitates.
The evolution of a running system may be represented as an exponentiation - that is, a raised to the n-th power, where a is the responsive system and n refers to the number of instantiations. A dynamic constellation could be modelled in a multidimensional phase space, where N stands for the number of entities in the system and each entity is linked with three position and three momentum variables to describe its trajectory in space.
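
One way to write out the two models just described, using only the symbols given in the text (a: responsive system, n: instantiations, N: number of entities):

```latex
% Growth of a running system as exponentiation:
a^{n}

% Dimension of the phase space for N entities, each carrying
% three position and three momentum variables:
\dim(\Gamma) \;=\; \underbrace{3N}_{\text{positions}} + \underbrace{3N}_{\text{momenta}} \;=\; 6N
```

The second line makes explicit what the text implies: the phase space of such a constellation has 6N dimensions.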
 
Smart Spaces "A smart space is an environment with numerous elements that sense + think + act + communicate + interact with people in a way that is robust, self-managing, and scaleable." (CSIRO, 2004)

1) Smart spaces are fitted with sensors and multimodal interfaces to facilitate the management of, and interaction with, people, machines, and artificial entities. They bear a relation to pervasive computing, involve a bundling of components and functions, and provide a technological basis for natural and synthetic user experiences. This begins with room temperature control and simple device management, includes collaboration support and service discovery, and may extend to data visualization as well as telerobotics. Smart spaces often require software agent technology to ensure interoperability of multi-component systems, including the introduction of new mobile devices.

2) In Teleoperations, the computer-generated graphics of virtual reality (VR) provide the basis for an interface to produce mechanical change. This involves a servo-mechanical conversion of the operator's intent - that is to say, a "projection" of the senses, as well as physical movement, into the remote artificial body. The Virtual Environment Vehicle Interface (VEVI) developed at NASA thus enabled the teleoperation of mechanical robots with a user environment that incorporated machine vision and a head-mounted display (HMD).
Under interplanetary conditions, the transmission of commands is constrained by the speed of light, resulting in significant delays. In that case, the solution was to reintegrate text input and a windowed point-and-click interface. The Rover Control Workstation for the Mars Pathfinder mission relied on stereo images from the lander which were used to generate and display a three-dimensional terrain model in the operator's HMD. Based on this information, a computer model of the rover was first guided through virtual space using a joystick in order to plot a suitable path, the actual coordinate data of which were eventually uploaded to the physical rover along with scientific task instructions.
 
Data Visualization With a multi-representational database, users can absorb and comprehend larger amounts of alphanumeric, visual, and audio information. Visualisation includes the collection of data, processing for human readability, graphics algorithms, and computer displays. Data essentially consists of entities, or [visual] objects, and the relations between them. Entities and relations may have attributes with scalar or vector quantities and support a number of operations.
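
The entity-relation-attribute model above can be made concrete with a small data sketch. All names and values here are illustrative assumptions, not drawn from the source.

```javascript
// Hedged sketch of the data model: an entity with scalar and
// vector attributes, and a relation linking two entities.
const entity = {
  id: "sensor-1",
  attributes: {
    temperature: 21.5,          // scalar quantity
    position: [4.0, 1.2, 0.8],  // vector quantity (x, y, z)
  },
};

const relation = {
  from: "sensor-1",
  to: "room-a",
  type: "located-in",           // relations may carry attributes too
};
```

Operations on such structures - filtering entities by attribute, traversing relations, projecting vectors onto a display - are what the visualisation pipeline then renders.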

Interactive data visualisation empowers the user to manipulate data, to navigate globally, locally, and contextually, and to solve problems. The focus/context problem [movement between overviews and levels of detail] may be solved when information spaces respond to user commands by prioritising selected data objects spatially. Zooming mechanisms provide dynamic, user-defined narrative paths through data as well as customised screen displays. Bifocal zooming affords context and focus simultaneously.
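
A zooming mechanism of this kind often switches representations by level of detail (sometimes called semantic zooming). The thresholds and representation names below are assumptions made for the sketch, not part of the source.

```javascript
// Hedged sketch of level-of-detail switching: the rendering of a
// data object changes with the zoom level. Thresholds are illustrative.
function representation(zoomLevel) {
  if (zoomLevel < 0.5) return "dot";        // overview: presence only
  if (zoomLevel < 1.5) return "icon";       // mid-range: category glyph
  return "detail-card";                     // focus: full attributes
}

representation(0.2); // → "dot"
representation(2.0); // → "detail-card"
```

Bifocal zooming would render both regimes at once: a focused region at high detail embedded in a surrounding overview.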
 
Cosmic Cubes (Doc. ref.)
Web Usability User-centered design for the Web is informed by HCI principles, user culture, and task analysis. The interface is not only a mechanism to advance users from one state of the system to the next - allowing them to access databases, scripts, or applications - but also a presentation metaphor to develop mental models.
Wayfinding provides interactive choices for users navigating globally, locally, and contextually. The interface becomes transparent ["State of Flow"] when user interactions occur at the semantic level of the space and intellect can be applied directly to the task.
Many interfaces rely on layout grids and modular units. Information on the web may be structured sequentially - perhaps leading the user through a series of topics. It can also be organised in a hierarchy of menus, outlines, and content pages. Information architecture often revolves around existing source material, or begins with a hierarchical shell followed by content insertions. Hypertext spaces promote rapid non-sequential reading, but long documents should not be 'chunked' at the expense of context. Browsing structures provide the overview and relations among units, while searching by keyword initiates a path to locate specific data.
Design Journal (Doc. ref.)