ECCE Toolkit: Prototyping Sensor-Based Interaction

View Article: PubMed Central - PubMed

ABSTRACT

Building and exploring physical user interfaces requires advanced technical skills and hours of specialized work. The behavior of multiple devices with heterogeneous input/output channels and connectivity has to be programmed in a context where not only the software interface matters, but also the hardware components are critical (e.g., sensors and actuators). Prototyping physical interaction is hindered by the challenges of: (1) programming interactions among physical sensors/actuators and digital interfaces; (2) implementing functionality for different platforms in different programming languages; and (3) building custom electronics-incorporated objects. We present ECCE (Entities, Components, Couplings and Ecosystems), a toolkit for non-programmers that copes with these issues by abstracting from low-level implementations, thus lowering the complexity of prototyping small-scale, sensor-based physical interfaces to support the design process. A user evaluation provides insights and use cases of the kind of applications that can be developed with the toolkit.

No MeSH data available.


© Copyright Policy - open-access

sensors-17-00438-f012: Physical buttons (left) and touch sensors (right) are used to interact with a 3D replica of a funerary mask.

Mentions: Figure 12 shows examples of two implemented designs, one per group, both using the same artifact (a 3D replica of a funerary mask). The first group wanted to develop a console with physical buttons that would display or hide content projected on the mask (left). To this end, the group used a pico-projector to overlay the additional content on the mask, creating a custom device as explained in Section 3.5, together with an Arduino with a TinkerKit shield and four buttons, each one toggling the display of one piece of content. While the first group envisioned that users would not interact with the artifact directly (they treated the replica as if it were the real artifact), the second group decided to use the 3D replica as a prop for interacting with the real artifact. Participants attached touch sensors to the replica (right), which would activate digital content. They prototyped the interface on a laptop and envisioned the content being displayed in the room or directly on the mask.
