Internet of Touch: Analysis and Synthesis of Touch Across Wearable and Mobile Devices

Aduén Darriba Frederiks

Digital Life Centre, Amsterdam University of Applied Sciences
Wibautstraat 2-4, 1091 GM Amsterdam, the Netherlands
a.darriba.frederiks@hva.nl

Ben J.A. Kröse

Digital Life Centre, Amsterdam University of Applied Sciences
Wibautstraat 2-4, 1091 GM Amsterdam, the Netherlands
b.j.a.krose@hva.nl

Gijs Huisman

Human Media Interaction, University of Twente
P.O. Box 217, 7500 AE Enschede, the Netherlands
gijs.huisman@utwente.nl

Abstract

We demonstrate a method that allows two users to communicate remotely using their sense of touch by dynamically applying vibrotactile feedback to one user's forearm using two different input methods. User input on a standard mobile touch-screen device or a purpose-built touch-sensitive wearable is analyzed in real time and used to control the intensity, location, and motion parameters of the vibrotactile output, synthesizing the stroke on the second user's arm. Our method demonstrates that different input methods can be used to generate similar vibrotactile sensations.

Author Keywords

Affective touch; Vibrotactile stimuli; Mediated social touch

ACM Classification Keywords

H.5.2 [Information interfaces and presentation (e.g., HCI)]: Haptic I/O, Prototyping.

Introduction

We have a plethora of remote communication technologies at our disposal, but they are mostly limited to the visual and auditory senses. According to Boernstein, such spatial senses, among which he also counts the sense of touch, are important for dealing with the concept of structure in physical objects [2].


Figure 1: Vibrotactile array with actuators spaced 3 cm apart

Figure 2: Wearable touch sensor array with one sensor exposed

Figure 3: The mobile device interface for stroke input

Thus, like the visual and auditory senses, the sense of touch is important for human-computer interaction when dealing with the physical environment. Examples of applications are spatial navigation aids, such as for drivers, helicopter pilots, and astronauts [6, 19], as well as the incorporation of haptic feedback into virtual environments, where users can push, pull, feel, and manipulate objects in virtual space [16].

Human-human communication also includes this physical element. Research has shown that the sense of touch is highly important in social interactions [15]. Gentle stroking touches applied at velocities between 1 and 10 cm/s have been suggested to be especially relevant in social-affective touch interactions [4, 5, 14]. Researchers have now begun to investigate how such social-affective aspects of touch can be used in HCI, for example for remote communication between partners [9, 21], or in interactions with social robots [20].

Considering the importance of touch in physical HCI on the one hand, and in interpersonal communication on the other, there is a challenge for wearable interfaces to address both of these aspects in the design of wearables that can detect touch and stimulate the sense of touch. Here we demonstrate a method that allows touches to be detected in real time, on multiple devices, using different input technologies, and rendered on the same vibrotactile array.

Demo Outline

In this demo we introduce a method for the detection and generation of touches, including stroking touches. An optimized method for rendering vibrotactile stroking touches, based on previous work [1, 11, 12, 13], is used for output. A linear array of four Precision Microdrives PicoVibe 306-117 vibration motors is worn on the forearm to apply vibrotactile stimulation (see Figure 1).
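As a concrete illustration, the sketch below shows how such a four-motor array might be abstracted in software. This is our own minimal sketch, not the authors' implementation: the driver call is a placeholder, and only the 3 cm actuator spacing (Figure 1) is taken from the paper.

```python
# Minimal abstraction of the four-motor linear array (illustrative sketch;
# the actual motor-driving hardware/firmware is not described in the paper).
NUM_MOTORS = 4
SPACING_CM = 3.0  # actuator spacing on the forearm (Figure 1)

# Position of each motor along the array axis, in cm.
MOTOR_POSITIONS_CM = [i * SPACING_CM for i in range(NUM_MOTORS)]

def set_motor_intensity(index: int, intensity: float) -> None:
    """Drive one vibration motor at a normalized intensity in [0, 1].

    Placeholder: a real implementation would set, e.g., a PWM duty
    cycle on the driver channel of the given actuator.
    """
    assert 0 <= index < NUM_MOTORS and 0.0 <= intensity <= 1.0
    print(f"motor {index} ({MOTOR_POSITIONS_CM[index]:.0f} cm) -> {intensity:.2f}")
```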

For input, we introduce a method of detecting dynamic touches across different devices, including a touch-screen-based mobile device (a Google Nexus 5; see Figure 3) and a novel type of wearable touch sensor based on the work of [10] (see Figure 2).
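Because the two input devices differ in resolution and sensing capabilities, a device-independent representation of a touch is convenient. The sketch below shows one plausible such format; the field names, the fixed nominal pressure for the touch screen, and the index-to-position mapping are our assumptions for illustration, not the demo's actual protocol.

```python
from dataclasses import dataclass

@dataclass
class TouchSample:
    """Device-independent touch event (hypothetical format).

    position:  location along the stroke axis, normalized to [0, 1].
    pressure:  contact pressure estimate in [0, 1]; the touch screen
               cannot sense pressure, so it reports a nominal value.
    timestamp: time in seconds, used to estimate stroke velocity.
    """
    position: float
    pressure: float
    timestamp: float

def from_touchscreen(x_px: float, line_length_px: float, t: float) -> TouchSample:
    # High spatial resolution, but no pressure sensing (see text).
    return TouchSample(position=x_px / line_length_px, pressure=0.5, timestamp=t)

def from_sleeve(sensor_index: int, raw_pressure: float, t: float) -> TouchSample:
    # Four discrete textile sensors aligned with the four actuators;
    # indices 0..3 are mapped onto the same normalized [0, 1] axis.
    return TouchSample(position=sensor_index / 3.0, pressure=raw_pressure, timestamp=t)

# Example: the same stroke location expressed by both devices.
print(from_touchscreen(x_px=720, line_length_px=1080, t=0.0))
print(from_sleeve(sensor_index=2, raw_pressure=0.7, t=0.0))
```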

Touch input from both the mobile device and the wearable touch sensor is synthesized on the vibrotactile array, and can be used for both static (i.e., localized vibrotactile sensations) and dynamic (i.e., vibrotactile stroking sensations) touches. The wearable touch sensor has four textile touch sensors in the same configuration as the vibrotactile array. The sensors combine capacitive and resistive sensing and are thus capable of detecting very gentle contact, as well as different levels of pressure applied to the sensor. Conversely, the mobile device does not detect touch pressure, but has a higher resolution for detecting touch location. The screen of the mobile device shows a line that users can touch and stroke in any way they like (see Figure 3). The precise location of the finger is directly coupled to the vibrotactile array. When a user's finger is at a location on the touch screen that does not correspond to the exact location of an actuator in the array, a vibrotactile sensation is generated using an algorithm for phantom haptic sensations [1, 12].
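To sketch how such phantom sensations can be produced on the four-motor array, the example below applies the energy-summation model described in this line of work [1, 12]: the two motors adjacent to the virtual contact point are driven such that the perceived intensity remains constant as the point moves. The function and parameter names are our own; this is an illustrative sketch, not the demo's source code.

```python
import math

NUM_MOTORS = 4

def phantom_amplitudes(position: float, intensity: float) -> list[float]:
    """Map a normalized touch position in [0, 1] to per-motor amplitudes.

    Energy-summation model for phantom tactile sensations [1, 12]:
    for a virtual actuator at normalized distance beta between two
    physical motors, drive them at A1 = sqrt(1 - beta) * Av and
    A2 = sqrt(beta) * Av, keeping the perceived intensity constant.
    """
    amps = [0.0] * NUM_MOTORS
    x = position * (NUM_MOTORS - 1)    # position in motor-index units
    i = min(int(x), NUM_MOTORS - 2)    # lower neighbouring motor
    beta = x - i                       # 0 at motor i, 1 at motor i + 1
    amps[i] = math.sqrt(1.0 - beta) * intensity
    amps[i + 1] = math.sqrt(beta) * intensity
    return amps

# A touch halfway between motors 1 and 2 drives both at ~0.57, so the
# phantom sensation is felt between the two physical actuators.
print(phantom_amplitudes(position=0.5, intensity=0.8))
```

Streaming successive touch positions from either input device through such a mapping, at the rate they arrive, would then produce the moving stroking sensation.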

We demonstrate how, despite the differences between the input methods, both can be used to synthesize dynamic vibrotactile feedback on the linear array.

Conclusions

We demonstrate the use of tactile interactions across different devices by detecting static and dynamic touches on a touch-screen-based mobile device and a novel wearable touch sensor. These touches are synthesized on a linear array of vibration motors using phantom haptic sensations


[1, 11, 12]. Our method demonstrates that different input methods can be used to generate similar vibrotactile sensations. Touch sensors and actuators may be used in remote social interactions for affective communication [9, 21], but may also be useful in more specialized settings. One such setting is communication with visually impaired or deaf-blind users [3, 7, 8, 18], who may strongly benefit from tactile communication. In such a situation it is plausible that caregivers have access to ergonomically designed wearable touch sensors that do not interfere with their daily routine, for the purpose of communicating with their clients, while family members use standard touch-screen-based mobile devices to communicate with loved ones through tactile feedback. Another example is to extend communication to more task-oriented interactions, such as covert touch communication for military applications [17]. Here, soldiers in the field may have sensors and actuators integrated into their garments, while personnel at a command post may use touch-screen-based input methods for tactile communication with the soldiers in the field.

To conclude, the setup discussed in this paper opens up opportunities for touch-based wearables that can use different input methods and are applicable in diverse situations.

Acknowledgements

This publication was supported by the Dutch national program COMMIT/.

REFERENCES

1. David Alles. 1970. Information Transmission by Phantom Sensations. IEEE Transactions on Man-Machine Systems 11, 1 (March 1970), 85–91.

2. W. S. Boernstein. 1955. Classification of the human senses. The Yale Journal of Biology and Medicine 28, 3-4 (1955), 208–215.

3. Tanay Choudhary, Saurabh Kulkarni, and Pradyumna Reddy. 2015. A Braille-based mobile communication and translation glove for deaf-blind people. In 2015 International Conference on Pervasive Computing (ICPC). IEEE, 1–4.

4. G. K. Essick, A. James, and F. P. McGlone. 1999. Psychophysical assessment of the affective components of non-painful touch. Neuroreport 10, 10 (1999), 2083–2087.

5. Greg K. Essick, Francis McGlone, Chris Dancer, David Fabricant, Yancy Ragin, Nicola Phillips, Therese Jones, and Steve Guest. 2010. Quantitative assessment of pleasant touch. Neuroscience and Biobehavioral Reviews 34, 2 (2010), 192–203.

6. Alberto Gallace, Hong Z. Tan, and Charles Spence. 2007. The Body Surface as a Communication System: The State of the Art after 50 Years. Presence: Teleoperators and Virtual Environments 16, 6 (2007), 655–676.

7. Francine Gemperle, Nathan Ota, and Dan Siewiorek. 2001. Design of a Wearable Tactile Display. In Proceedings of the 5th IEEE International Symposium on Wearable Computers (ISWC ’01). IEEE Computer Society, Washington, DC, USA, 5–.

8. Ulrike Gollner, Tom Bieling, and Gesche Joost. 2012. Mobile Lorm Glove. In Proceedings of the Sixth International Conference on Tangible, Embedded and Embodied Interaction (TEI '12). ACM Press, New York, NY, USA, 127.


9. Gijs Huisman and Aduén Darriba Frederiks. 2013. Towards tactile expressions of emotion through mediated touch. In CHI '13 Extended Abstracts on Human Factors in Computing Systems (CHI EA '13). ACM Press, New York, NY, USA, 1575.

10. Gijs Huisman, Aduén Darriba Frederiks, Betsy van Dijk, Dirk Heylen, and Ben Kröse. 2013. TaSST: Tactile Sleeve for Social Touch. In 2013 World Haptics Conference (WHC 2013). IEEE, 211–216.

11. Gijs Huisman, Aduén Darriba Frederiks, Jan B. F. van Erp, and Dirk K. J. Heylen. 2016. Simulating Affective Touch: Using a Vibrotactile Array to Generate Pleasant Stroking Sensations. In Proceedings of EuroHaptics 2016. Springer International Publishing, 240–250.

12. Ali Israr and Ivan Poupyrev. 2011. Tactile Brush: Drawing on Skin with a Tactile Grid Display. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '11). ACM, 2019–2028.

13. Jongman Seo and Seungmoon Choi. 2010. Initial study for creating linearly moving vibrotactile sensation on mobile device. In 2010 IEEE Haptics Symposium. IEEE, 67–70.

14. Line S. Löken, Johan Wessberg, India Morrison, Francis McGlone, and Håkan Olausson. 2009. Coding of pleasant touch by unmyelinated afferents in humans. Nature Neuroscience 12, 5 (May 2009), 547–548.

15. India Morrison, Line Löken, and Håkan Olausson. 2010. The skin as a social organ. Experimental Brain Research 204 (2010), 305–314. Issue 3.

16. Marcia K. O’Malley and Abhishek Gupta. 2008. Haptic interfaces. In HCI Beyond the GUI: Design for Haptic, Speech, Olfactory, and Other Nontraditional Interfaces, Philip Kortum (Ed.). Morgan Kaufmann, Burlington, 25–65.

17. R. A. Pettitt, E. S. Redden, and C. B. Carstens. 2006. Comparison of Army Hand and Arm Signals to a Covert Tactile Communication System in a Dynamic Environment. Technical Report (August 2006). U.S. Army Research Laboratory, Aberdeen.

18. Frank A. Saunders. 1983. Information Transmission Across the Skin: High-Resolution Tactile Sensory Aids for the Deaf and the Blind. International Journal of Neuroscience 19, 1-4 (Jan. 1983), 21–28.

19. Jan B. F. van Erp. 2002. Guidelines for the use of vibro-tactile displays in human computer interaction. In Proceedings of EuroHaptics 2002. 18–22.

20. Jan B. F. van Erp and Alexander Toet. 2015. Social Touch in Human–Computer Interaction. Frontiers in Digital Humanities 2 (2015), 2.

21. Rongrong Wang and Francis Quek. 2010. Touch & talk. In Proceedings of the Fourth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '10). ACM Press, New York, NY, USA, 13.
