ECCE Toolkit: Prototyping Sensor-Based Interaction

ABSTRACT

Building and exploring physical user interfaces requires advanced technical skills and hours of specialized work. The behavior of multiple devices with heterogeneous input/output channels and connectivity has to be programmed in a context where not only the software interface matters but the hardware components (e.g., sensors and actuators) are also critical. Prototyping physical interaction is hindered by the challenges of (1) programming interactions among physical sensors/actuators and digital interfaces; (2) implementing functionality for different platforms in different programming languages; and (3) building custom objects with embedded electronics. We present ECCE (Entities, Components, Couplings and Ecosystems), a toolkit for non-programmers that addresses these issues by abstracting away low-level implementation details, thus lowering the complexity of prototyping small-scale, sensor-based physical interfaces to support the design process. A user evaluation provides insights and use cases of the kinds of applications that can be developed with the toolkit.

sensors-17-00438-f003: A screenshot of the web interface for the definition of Tinkerkit-based interactive entities. Users can (a) drag and drop sensors and actuators from a palette of components onto the desired port.

Mentions: The Entities & Components Editor module (Figure 3 and Figure 4) enables new devices to be added to the ecosystem by (i) using existing devices such as tablets, smartphones, laptops, and multi-touch surfaces such as tabletops, see-through displays or projected surfaces, or (ii) building custom sensor-based interactive objects with off-the-shelf micro-controllers, sensors and actuators. In the Entities & Components Editor, each entity is designed as an aggregation of different components, both physical and digital. Examples of physical components are (i) sensors such as accelerometers, gyroscopes, and distance, luminosity, load and flex sensors; (ii) physical input devices such as potentiometers, joysticks or RFID readers; and (iii) actuators such as speakers, motors or LEDs. Digital components, the elements of the graphical interface, can be defined for entities that feature a display screen; they include labels, digital buttons, sliders, video streams and the like. New entities can be created by selecting from a list of predefined entities (Figure 1).
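
As a rough illustration of this entity/component aggregation, an entity defined in the editor might be captured by a data structure like the hypothetical TypeScript sketch below. The type names, component vocabularies and port identifiers ("I0", "O0") are assumptions made for illustration only and do not reflect the toolkit's actual internal format or API.

```typescript
// Hypothetical model of an ECCE-style entity: an aggregation of
// physical components (sensors, input devices, actuators) and
// digital components (screen widgets). Illustrative only.

type PhysicalComponent =
  | { kind: "sensor"; type: "accelerometer" | "gyroscope" | "distance" | "luminosity" | "load" | "flex"; port: string }
  | { kind: "input"; type: "potentiometer" | "joystick" | "rfid-reader"; port: string }
  | { kind: "actuator"; type: "speaker" | "motor" | "led"; port: string };

type DigitalComponent =
  | { kind: "label"; text: string }
  | { kind: "button"; label: string }
  | { kind: "slider"; min: number; max: number }
  | { kind: "video-stream"; source: string };

// Digital components only apply to entities that feature a display screen.
interface Entity {
  name: string;
  hasDisplay: boolean;
  physical: PhysicalComponent[];
  digital: DigitalComponent[];
}

// Example: a custom micro-controller-based object with a flex sensor
// and an LED wired to (assumed) ports I0 and O0.
const bendableToy: Entity = {
  name: "bendable-toy",
  hasDisplay: false,
  physical: [
    { kind: "sensor", type: "flex", port: "I0" },
    { kind: "actuator", type: "led", port: "O0" },
  ],
  digital: [],
};

// Example: a tablet entity exposing a label and a slider on its screen.
const controlTablet: Entity = {
  name: "control-tablet",
  hasDisplay: true,
  physical: [],
  digital: [
    { kind: "label", text: "Bend level" },
    { kind: "slider", min: 0, max: 100 },
  ],
};
```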

