
Handpad: a virtual mouse for controlling laptops in a smart home

Abstract

This study investigates the possibility of equipping fixed furniture with touch screens to control laptops or other smart home devices, creating a holistic smart home control system. Although multi-touch screens have recently been integrated into desktop use, most users still prefer a classical mouse for desktop input. This study introduces Handpad, a new indirect touch input device designed to control laptops in a smart home. Handpad combines the mobility of a touch screen with the precision of a mouse by tracking whole-hand movement on the touch screen and moving the screen pointer accordingly. In a user test, participants showed a high level of acceptance and positive attitudes toward Handpad.

Introduction

Multi-touch screens are becoming increasingly common in daily use, especially given the popularity of smart phones and tablets. However, their main usage remains restricted to mobile applications. Meanwhile, users must deal with a growing number of smart devices in their homes. No single input method currently satisfies users' need for a holistic smart home control solution; it can even be difficult for users to find and distinguish between the different input devices. In a smart home, indirect interaction modes can be used to create a holistic control system.

Are touch screens reserved for mobile usage? This study explores the possibility of adapting touch screen technology for domestic environments. This study provides a method for combining the capabilities of touch screen and mouse interaction in a smart home, demonstrating that touch screen usage is only limited by our imagination. Today, pointing devices are mainly adapted to specific scenarios: the touch screen for better mobility and the mouse for desktop use. Mouse interaction, with its two buttons and scroll wheel, remains the simplest and most efficient input for most desktop users. A mouse outperforms other devices in simplicity, precision, and traceability. Furthermore, the places in which we use desktops have evolved, reaching a variety of environments ranging from our offices to our homes. Can fixed furniture be equipped with touch screens to control a desktop?

This study explores the implementation of a virtual mouse that uses a touch screen to allow new methods of interaction while maintaining the simplicity and precision of a mouse. It also explores the application of touch screens in fixed usage. To this end, this study introduces Handpad, an indirect touch input designed for living room and bedroom environments. We conduct a user test to evaluate Handpad's performance and efficiency.

Related work

In the following sections, we discuss related work in touch screen input and enhanced mouse input, and compare the two.

Touch screen input

Previous research has shown that direct touch screens can enrich desktop interactions. However, users tire easily when interacting with vertical touch screens; interacting with a horizontal surface is faster and less exhausting. When touch screens are used as direct input devices, the finger hides the area of the screen underneath it. As a result, it is difficult to be as precise with a touch screen as one can be with a mouse. This issue is known as the occlusion problem. Some attempts have been made to overcome this issue. For example, Zimmermann et al. [1] enhanced the touch screen experience with haptic feedback for blind interaction.

More than a simple pointing device, a touch screen offers a large range of interaction possibilities. Schmidt et al. [2] proposed an indirect multi-touch approach. Damaraju et al. [3] introduced Multi-Tap Sliders, an innovative interface using a multi-touch screen to control different parameters in image editing software. In a smart home, there are multiple opportunities for touch screens to control devices. Obviously, mobile phones are often used to control smart home devices, and they can combine input display, output display, and remote control. However, in domestic spaces, the availability of a smart phone within a user’s reach has recently been reported to be as low as 46% [4]. Moreover, it takes about 1 s to start interacting with a smart phone, and potentially another 4 s for a user to remove the smart phone from his or her pocket [5]. Thus, it would be better if fixed furniture could be equipped with indirect touch screen input to create a holistic control system in a smart home.

Enhanced mouse input

Even though some laptops are equipped with a touchpad, most users prefer a classical mouse. Several attempts to increase the interactivity of the mouse can be found in the literature, including pressure-sensitive touch sensors [6, 7] and tactile and force feedback [8]. These approaches slightly increase mouse-targeting speed. Additionally, there have been several attempts to use hand motion trackers to increase mouse interactivity. For example, Mistry and Maes [9] introduced Mouseless, which is based on IR sensors: the device lies at the edge of a computer screen or keyboard to track hand movements, and the shapes and movements of the user's fingers are recognized and translated into mouse clicks or movements. Kuribara et al. [10] presented a malleable mouse pad into which the mouse can sink as an additional interaction. Yang et al. [11] presented Magic Finger, a prototype that instruments the index finger with optical mouse sensors to track finger movement on a surface and identify the surface touched. Lv et al. [12] used a 10.1-in. tablet to implement a soft keyboard and touch pointer as an indirect input device.

Comparison of touch screen and mouse

Researchers have compared the touch screen and mouse in single pointing, dragging, crossing, and steering. There are some benefits to using a touch screen instead of a mouse, but these benefits are slight [13,14,15]. Findlater et al. [13] compared a classical mouse and touch screen for the four operations stated above, finding that the touch screen outperformed the mouse by only 16% on average for younger adults (19–51 years old, M = 27.7, SD = 8.9) and by 35% for older adults (61–86 years old, M = 74.3, SD = 6.6). In addition to pointing efficiency, Leftheriotis and Chorianopoulos [14] noted that users rated touch screen input twice as highly as mouse input in terms of preference, speed, and entertainment. In the games industry, Watson et al. [15] showed that touch screen input outperformed mouse input in a 2D shooting game for both vertical and horizontal surfaces.

Even though the touch screen performs comparably or slightly better than the mouse, precision issues due to occlusion and the “fat finger problem” are a thorn in the side of the touch screen. Sambrooks and Wilkinson [16] compared mouse and touch screen input performance during a targeting experience, and found the rate of misses for the touch screen input to be twice as high as that of the mouse. Zabramski [17] compared the accuracy and speed of a mouse, pen, and touch screen in reproducing randomized shapes. The touch screen was the least accurate, but the fastest input method.

Handpad design

Although the touch screen offers more possibilities for interaction, the mouse remains dominant for desktop use. This study introduces Handpad, which brings touch screens or other hand motion trackers to desktop work to enrich interaction and productivity.

Handpad relies on two main hardware components: a large multi-touch surface and a computer. The touch surface is used to detect users’ hand movements. Hand movement data are then transferred to control the computer’s onscreen cursor. In this study, a multi-touch Android tablet is used as a touch surface in implementing Handpad. Figure 1 shows an image of Handpad.

Fig. 1 Handpad image

Key design considerations

Handpad is designed to achieve performance similar to a classical mouse. The key design requirements are listed as follows:

  • Stand-by posture: Users often leave the mouse idle and simply rest their hand on it. It is important that users can rest a hand on the touch surface as comfortably as on a mouse without triggering unwanted cursor operations.

  • Precision: Users must retain the precision-to-movement ratio of a mouse.

  • Accuracy: Users must be able to move the cursor to its intended location as easily as with a mouse.

  • Ease of movement: Moving the cursor on the touch surface must require no more effort than moving a mouse.

  • Interaction: Users must be able to perform mouse behaviors such as clicking, dragging and dropping, scrolling, and hovering as easily as with a mouse.

Handpad interaction

Handpad's basic interaction is as follows: users place one hand on the touch surface and move their fingers, keeping them in contact with the surface, to control the onscreen cursor. The following list clarifies several points of this design:

  • Tracking activation: Users place all five fingers on Handpad to begin interacting with it. This requirement allows the device to distinguish intentional cursor control from an arm simply resting on the touch surface.

  • Finger tracking: The positions of all five fingers are tracked, but only four are used to move the cursor. To initiate cursor control, users first place one hand on the touch surface. The center of the four fingers (excluding the thumb) is used as the cursor's reference position, analogous to the position of a traditional mouse. The thumb is ignored when computing the cursor position, giving the user more freedom of movement without triggering unwanted actions; however, the thumb is essential to the finger identification process.

  • Finger identification: To distinguish the thumb from the other fingers, Handpad uses the method presented by Au and Tai [18]. The center of the contact points is computed first. Then, the angle formed by each pair of spatially consecutive contact points and the center is computed. The spanning angle of a finger, defined as the sum of its two adjacent angles, is used to identify the thumb, which has the largest spanning angle. The remaining fingers are then identified by their proximity to the thumb (a sketch of this step follows the list).

  • Click trigger: To trigger a mouse-down event, users lift one finger from the touch surface; replacing the lifted finger on the touch surface triggers the mouse-up event, completing a click. Although this method may seem counterintuitive, since most users are accustomed to pressing a button to click, the lift-and-place concept is well suited to replacing drag-and-drop.

  • Button mappings: Each finger can be mapped to a button. In this study, the index finger is mapped to the left-click button, and a two-finger touch gesture triggers scrolling.
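
To make the finger identification step concrete, the following is a minimal sketch of the spanning-angle method in Java, written from the description above and Au and Tai [18]; the class and method names are illustrative assumptions, and the actual Handpad implementation may differ in detail.

```java
import java.awt.geom.Point2D;
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

/** Illustrative sketch of spanning-angle finger identification (after Au and Tai [18]). */
public class FingerIdentifier {

    /** Returns the five touch points reordered so that the thumb comes first. */
    public static List<Point2D.Float> identify(List<Point2D.Float> touches) {
        if (touches.size() != 5) {
            throw new IllegalArgumentException("Expected exactly five touch points");
        }
        // 1. Center (centroid) of all contact points.
        float cx = 0, cy = 0;
        for (Point2D.Float p : touches) { cx += p.x; cy += p.y; }
        final float centerX = cx / 5, centerY = cy / 5;

        // 2. Sort points by their polar angle around the center (spatially consecutive order).
        List<Point2D.Float> sorted = new ArrayList<>(touches);
        sorted.sort(Comparator.comparingDouble(p -> Math.atan2(p.y - centerY, p.x - centerX)));

        // 3. Spanning angle of each finger = sum of the angles it forms with its two neighbours.
        int thumbIndex = 0;
        double largestSpan = -1;
        for (int i = 0; i < 5; i++) {
            Point2D.Float prev = sorted.get((i + 4) % 5);
            Point2D.Float cur  = sorted.get(i);
            Point2D.Float next = sorted.get((i + 1) % 5);
            double span = angleAtCenter(prev, cur, centerX, centerY)
                        + angleAtCenter(cur, next, centerX, centerY);
            if (span > largestSpan) { largestSpan = span; thumbIndex = i; }
        }

        // 4. Thumb first, then the remaining fingers in angular order away from the thumb.
        List<Point2D.Float> ordered = new ArrayList<>(5);
        for (int i = 0; i < 5; i++) {
            ordered.add(sorted.get((thumbIndex + i) % 5));
        }
        return ordered;
    }

    /** Cursor reference point: centroid of the four non-thumb fingers. */
    public static Point2D.Float cursorReference(List<Point2D.Float> orderedFingers) {
        float x = 0, y = 0;
        for (int i = 1; i < 5; i++) {           // skip index 0, the thumb
            x += orderedFingers.get(i).x;
            y += orderedFingers.get(i).y;
        }
        return new Point2D.Float(x / 4, y / 4);
    }

    /** Angle between two contact points as seen from the center of all points. */
    private static double angleAtCenter(Point2D.Float a, Point2D.Float b, float cx, float cy) {
        double a1 = Math.atan2(a.y - cy, a.x - cx);
        double a2 = Math.atan2(b.y - cy, b.x - cx);
        double d = Math.abs(a1 - a2);
        return Math.min(d, 2 * Math.PI - d);    // smaller of the two arc angles
    }
}
```

Here, ordering the remaining fingers in angular order away from the thumb is one reasonable reading of "identified by their proximity to the thumb"; cursorReference returns the centroid of the four non-thumb fingers used as the cursor's reference position.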

Handpad software architecture

The Handpad software comprises an Android component and a Windows component, which together allow the tablet and the personal computer to communicate:

  1. The Android application serves as a Handpad client, detecting finger positions from the multi-touch tablet surface and sending these data to the computer.

  2. The Windows computer plays the role of the Handpad server, receiving and processing finger position data from the tablet and controlling the computer's onscreen cursor accordingly.

Figure 2 illustrates the interaction between the user and Handpad. Once the Handpad Android client application and Handpad Windows server are installed, the tablet must be connected to the computer via a USB cable, and the receiver in the Windows program must be started.

Fig. 2 User–Handpad interaction diagram

The receiver then searches for an Android device connected to the computer; if one is found, it triggers the Android Handpad client on the tablet. Communication between the two entities then begins and the user can move the onscreen pointer.

Handpad Android client

The Handpad Android client application detects and records users’ hand motions and transmits them to a computer. It is launched when communication between the Android device and personal computer has been established. The protocol used for this purpose is the Android Open Accessory Protocol 1.0, which has an accessory mode that allows Android devices to communicate and exchange data with another device through USB.

The procedure followed by the Android tablet and the Handpad Android client is as follows: when the tablet is connected to a computer, it starts in the default USB connection mode; when the Handpad Windows server finds the tablet, the tablet is switched to Android accessory mode; the Handpad Android client is then launched automatically and starts transmitting data to the connected computer. Once communication begins, the Handpad Android client transmits data continuously. Data are sent to the Handpad Windows server in 128-byte words. The first 8 bytes carry the number of points currently detected on the device surface. The remainder of the word holds one 8-byte pair per detected point (if any), consisting of the point's x and y float coordinates coded in 4 bytes each.
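
As an illustration of this packet layout, the sketch below encodes the detected touch points into one 128-byte word and writes it to the accessory output stream obtained from UsbManager.openAccessory; the HandpadPacketWriter name, the big-endian byte order, and the use of a 64-bit integer for the point count are assumptions, since the paper only specifies the field widths.

```java
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.List;

import android.graphics.PointF;
import android.os.ParcelFileDescriptor;

/** Illustrative encoder for the 128-byte Handpad packet described above. */
public class HandpadPacketWriter {

    private static final int PACKET_SIZE = 128;
    private static final int MAX_POINTS = (PACKET_SIZE - 8) / 8;   // 15 points fit in one word

    private final FileOutputStream out;

    public HandpadPacketWriter(ParcelFileDescriptor accessoryFd) {
        // File descriptor obtained from UsbManager.openAccessory(...) on the Android side.
        this.out = new FileOutputStream(accessoryFd.getFileDescriptor());
    }

    /** Encodes the current touch points and sends them to the Handpad Windows server. */
    public void send(List<PointF> points) throws IOException {
        ByteBuffer buf = ByteBuffer.allocate(PACKET_SIZE);        // big-endian by default (assumed)
        int count = Math.min(points.size(), MAX_POINTS);
        buf.putLong(count);                                        // first 8 bytes: number of points
        for (int i = 0; i < count; i++) {
            buf.putFloat(points.get(i).x);                         // 4 bytes: x coordinate
            buf.putFloat(points.get(i).y);                         // 4 bytes: y coordinate
        }
        out.write(buf.array());                                    // remaining bytes stay zeroed
        out.flush();
    }
}
```

A word always occupies 128 bytes, so at most (128 - 8) / 8 = 15 points fit in one word; unused bytes remain zero.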

Handpad Windows server

The Handpad Windows server, installed on a computer, moves the onscreen cursor according to the user's hand movements as detected by the Android device and the Handpad Android client. To establish communication, the USB protocol is first used to detect an Android device connected to the computer.

The procedure followed by the Windows computer and the Handpad Windows server is as follows: the Handpad Windows server looks for an Android device through the USB protocol; once an Android device is connected, the server verifies its compatibility by checking whether the device supports accessory mode; if it does, the server attempts to start the device in accessory mode; if the device supports the Android Open Accessory Protocol, the server establishes communication with it; the mouse controller is then started, and the user can control the onscreen pointer.
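
On the receiving side, the matching sketch below shows how the Handpad Windows server might decode one 128-byte word back into touch points; the USB accessory read loop is omitted, and the byte order mirrors the assumption made in the client sketch above.

```java
import java.awt.geom.Point2D;
import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.List;

/** Illustrative decoder for the 128-byte Handpad packet on the Windows side. */
public class HandpadPacketReader {

    /** Decodes one 128-byte word into the list of detected touch points. */
    public static List<Point2D.Float> decode(byte[] packet) {
        ByteBuffer buf = ByteBuffer.wrap(packet);                 // big-endian by default (assumed)
        int count = (int) buf.getLong();                          // first 8 bytes: point count
        List<Point2D.Float> points = new ArrayList<>(count);
        for (int i = 0; i < count; i++) {
            float x = buf.getFloat();                             // 4 bytes: x coordinate
            float y = buf.getFloat();                             // 4 bytes: y coordinate
            points.add(new Point2D.Float(x, y));
        }
        return points;
    }
}
```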

Handpad mouse controller

Once communication between the Android device and Windows computer has been established, the Handpad Windows server receives user hand movement data from the Android device and processes them to produce corresponding onscreen cursor movements. The mouse controller is responsible for processing the received data and subsequently controlling the onscreen pointer.

A state chart of the mouse controller procedure is shown in Fig. 3. State 1 of the mouse controller acquires the basic information used to control the onscreen cursor and arranges it for later use. The number of finger points detected on the Android device and their positions are stored, and the detected points are then sorted. The sorting method identifies the thumb as the point with the largest spanning angle formed by two consecutive fingers and the center; the remaining fingers are then sorted by the spanning angle they form with the thumb and the center.

Fig. 3 State chart diagram of the mouse controller

States 1–4 are used to create a distinction between intentional and unintentional behavior. The system assumes intentional behavior when five fingers are detected on the Android device. Thus, the detection of five fingers is needed to start moving the cursor. This procedure is as follows:

  • Step 0: no fingers are detected;

  • Step 1: five fingers are detected and mouse control begins;

  • Step 2: fewer than three fingers are detected and mouse control ends.

Once State 4 is reached, mouse control begins. Click gestures are first discriminated from ordinary movement; if a click gesture is detected, the click is processed (States 6–8). Afterwards, cursor movement is processed (State 9).
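
To make this control loop concrete, the sketch below combines the activation rule (five fingers to start, fewer than three to stop), the lift-and-place click, and relative cursor movement. java.awt.Robot stands in for whatever Windows cursor API the Handpad server actually uses, and the control-display gain is an illustrative value rather than one reported in the paper; in a full server, update would be fed the point lists produced by the packet decoder sketched earlier.

```java
import java.awt.AWTException;
import java.awt.MouseInfo;
import java.awt.Point;
import java.awt.Robot;
import java.awt.event.InputEvent;
import java.awt.geom.Point2D;
import java.util.List;

/** Illustrative controller: activation on five fingers, lift-and-place click, relative movement. */
public class MouseController {

    private static final float GAIN = 1.5f;    // assumed control-display gain

    private final Robot robot;
    private boolean active = false;            // five fingers have been detected
    private boolean buttonDown = false;        // a finger is currently lifted (mouse-down state)
    private Point2D.Float lastCenter = null;   // previous finger centroid
    private int lastCount = 0;                 // previous number of detected fingers

    public MouseController() throws AWTException {
        this.robot = new Robot();
    }

    /** Called for every decoded packet of touch points. */
    public void update(List<Point2D.Float> points) {
        int n = points.size();

        if (!active) {
            if (n == 5) { active = true; lastCenter = null; lastCount = 5; } // five fingers start control
            return;
        }
        if (n < 3) {                                                          // fewer than three fingers end control
            if (buttonDown) { robot.mouseRelease(InputEvent.BUTTON1_DOWN_MASK); buttonDown = false; }
            active = false;
            return;
        }

        // Lift-and-place click: lifting one finger presses the button, replacing it releases.
        if (n == 4 && !buttonDown) {
            robot.mousePress(InputEvent.BUTTON1_DOWN_MASK);
            buttonDown = true;
        } else if (n == 5 && buttonDown) {
            robot.mouseRelease(InputEvent.BUTTON1_DOWN_MASK);
            buttonDown = false;
        }
        if (n != lastCount) { lastCenter = null; lastCount = n; }             // avoid a cursor jump when a finger lifts

        // Relative cursor movement from the finger centroid
        // (thumb exclusion, as in the FingerIdentifier sketch, is omitted here for brevity).
        float cx = 0, cy = 0;
        for (Point2D.Float p : points) { cx += p.x; cy += p.y; }
        Point2D.Float center = new Point2D.Float(cx / n, cy / n);
        if (lastCenter != null) {
            Point cursor = MouseInfo.getPointerInfo().getLocation();
            robot.mouseMove(cursor.x + Math.round(GAIN * (center.x - lastCenter.x)),
                            cursor.y + Math.round(GAIN * (center.y - lastCenter.y)));
        }
        lastCenter = center;
    }
}
```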

User tests

To evaluate the proposed prototype, we conducted user tests in living room and bedroom environments and collected evaluations from participants after they used Handpad.

Participants

A total of 31 participants (14 male, 17 female), ranging in age from 18 to 30 years, took part in the user study. Participants were recruited through social networking applications and via flyers distributed in several areas on campus. Each was given an incentive of 7 USD for their participation.

Scenarios

Two hedonic usage scenarios were proposed. The first was a living room environment in which the user sat on a sofa and controlled a computer screen projected on a wall, with Handpad placed by the armrest of the sofa. The second was a bedroom environment in which the user lay on a bed and controlled a computer screen projected on the ceiling. The tablet was placed on top of a drawer (living room scenario) or attached to the top of a pillow (bedroom scenario). The experiment room was a glass room designed to reproduce a living room environment, furnished mostly in white. Pictures of the experiment room are shown in Fig. 4. The experiment was recorded with a camera, and the light on the side of the room in which the experiment took place was turned off during the experiment. In both settings, the participant was positioned between 2 and 3 m from the projected screen. Two devices were used to run Handpad: a Samsung Galaxy Tab S T805 running Android 4.4.2 and a Toshiba Satellite L755-1GE running Windows 7.

Fig. 4 Experiment room settings. Left: the experiment room; center: a participant using Handpad while sitting; right: a participant using Handpad while lying down on a bed

Tasks

Participants used Handpad to complete two tasks. In Task A, participants browsed a film database website of their choice to search for films to watch; during this task, they could read synopses, watch short trailers, and get more information about films. In Task B, participants browsed one or several news websites and read news for 10 min. For Task B, participants were first asked which websites they preferred to browse (if any) and were then presented with a web browser with the desired web pages opened in various tabs (or windows, depending on user preference).

For both tasks, participants used the graphical user interface of a desktop operating system (Microsoft Windows 7) with a web browser (Mozilla Firefox).

Questionnaires

A questionnaire was designed to collect user experience feedback. The questionnaire (shown in Table 1) was adapted from the Unified Theory of Acceptance and Use of Technology (UTAUT) model [19]. The items were adapted to fit the hedonic nature of the scenarios: the words "job" and "work" in the original items were replaced with "task," since the scenarios were not related to professional situations. A 7-point Likert scale was used for the closed questions.

Table 1 Questionnaire constructs and items

Procedures

Participants received brief initial instructions on using Handpad and had the opportunity to practice. The order of the living room and bedroom scenarios was random. Participants completed two tasks using Handpad: Task A, browsing a film database and searching for films to watch, and Task B, browsing one or several news websites to read news for 10 min. Each scenario was randomly assigned a different task, and the order of the tasks was also random. After completing both tasks, participants filled out the questionnaire and had a short discussion with the experimenter. The entire procedure took 45 min on average.

Results

Table 2 shows the descriptive statistical results, which revealed that participants had relatively positive attitudes toward Handpad (M = 4.68, SD = 1.61) and relatively high behavioral intentions to use it (M = 4.48, SD = 1.84). Participants' effort expectancy was relatively high (M = 4.59, SD = 1.51), indicating that they found Handpad easy to use. Scores on the social influence construct were also relatively high (M = 5.16, SD = 1.35). Participants did not report strong anxiety when using Handpad (M = 3.92, SD = 1.16), and they felt that facilitating conditions for using Handpad existed in their daily lives (M = 4.87, SD = 1.26). Interestingly, participants' performance expectancy was not very high (M = 3.76, SD = 1.46), suggesting that they did not have high expectations of Handpad's productivity and speed, possibly because performance requirements are low in hedonic scenarios. In sum, participants showed a relatively high level of acceptance of Handpad. The discussions with participants at the end of the experiment reflected this trend: users who had no trouble using Handpad gave more positive feedback, whereas users who experienced some difficulties judged Handpad more strictly.

Table 2 Questionnaire results

Conclusion and future work

This study proposed using a touch surface as an indirect input, since it can be easily and seamlessly embedded into a smart home. We introduced Handpad, a new mouse emulation technique that indirectly controls an onscreen pointer: hand movements detected on the touch surface are used to move the onscreen cursor. We designed a prototype to test user acceptance of Handpad in hedonic scenarios, and the prototype provided good insight into how Handpad works in real-world settings. The user study showed that participants had a high level of acceptance of Handpad. Though Handpad is still a prototype, these results show its potential as a viable alternative in hedonic scenarios. Fixed furniture can be equipped with such touch surfaces to create a holistic smart home control system, a concept that can be refined in future studies.

References

  1. Zimmermann S, Rümelin S, Butz A (2014) I feel it in my fingers: haptic guidance on touch surfaces. In: Proceedings of the 8th international conference on tangible, embedded and embodied interaction. ACM, New York

  2. Kern D, Marshall P, Schmidt A (2010) Gazemarks: gaze-based visual placeholders to ease attention switching. In: Proceedings of the SIGCHI conference on human factors in computing systems. ACM, New York

  3. Damaraju S et al (2013) Multi-tap sliders: advancing touch interaction for parameter adjustment. In: Proceedings of the 2013 international conference on Intelligent user interfaces. ACM, New York

  4. Dey AK et al (2011) Getting closer: an empirical investigation of the proximity of user to their smart phones. In: Proceedings of the 13th international conference on Ubiquitous computing. ACM, New York

  5. Ashbrook DL et al (2008) Quickdraw: the impact of mobility and on-body placement on device access time. In: Proceedings of the SIGCHI conference on human factors in computing systems. ACM, New York

  6. Cechanowicz J, Irani P, Subramanian S (2007) Augmenting the mouse with pressure sensitive input. In: Proceedings of the SIGCHI conference on human factors in computing systems. ACM, New York

  7. Omata M, Kajino M, Imamiya A (2009) A multi-level pressure-sensing two-handed interface with finger-mounted pressure sensors. In: Proceedings of graphics interface 2009. Canadian Information Processing Society, Mississauga

  8. Akamatsu M, Sato S (1992) The multi-modal integrative mouse: a mouse with tactile display. In: Posters and short talks of the 1992 SIGCHI conference on human factors in computing systems. ACM, New York

  9. Mistry P, Maes P (2011) Mouseless: a computer mouse as small as invisible. In: CHI’11 extended abstracts on human factors in computing systems. ACM, New York

  10. Kuribara T, Shizuki B, Tanaka J (2013) Sinkpad: a malleable mouse pad consisted of an elastic material. In: CHI’13 extended abstracts on human factors in computing systems. ACM, New York

  11. Yang X-D et al (2012) Magic finger: always-available input through finger instrumentation. In: Proceedings of the 25th annual ACM symposium on user interface software and technology. ACM, New York

  12. Lv C et al (2013) MK-pad: a Mouse + Keyboard input technique for distance interaction through a mobile tablet device. In: Proceedings of the 11th Asia Pacific conference on computer human interaction. ACM, New York

  13. Findlater L et al (2013) Age-related differences in performance with touchscreens compared to traditional mouse input. In: Proceedings of the SIGCHI conference on human factors in computing systems. ACM, New York

  14. Leftheriotis I, Chorianopoulos K (2011) User experience quality in multi-touch tasks. In: Proceedings of the 3rd ACM SIGCHI symposium on engineering interactive computing systems. ACM, New York

  15. Watson D et al (2013) Deconstructing the touch experience. In: Proceedings of the 2013 ACM international conference on interactive tabletops and surfaces. ACM, New York

  16. Sambrooks L, Wilkinson B (2013) Comparison of gestural, touch, and mouse interaction with Fitts’ law. In: Proceedings of the 25th Australian computer–human interaction conference: augmentation, application, innovation, collaboration. ACM, New York

  17. Zabramski S (2011) Careless touch: a comparative evaluation of mouse, pen, and touch input in shape tracing task. In: Proceedings of the 23rd Australian computer–human interaction conference. ACM, New York

  18. Au OKC, Tai CL, Fu H (2012) Multitouch gestures for constrained transformation of 3D objects. In: Computer Graphics Forum. Wiley, Hoboken

  19. Venkatesh V et al (2003) User acceptance of information technology: toward a unified view. MIS Quarterly 27(3):425–478


Authors’ contributions

TCM and HJH are responsible for conceiving of and writing this paper, and for analyzing the results presented herein. They took suggestions when necessary from PLP. All authors read and approved the final manuscript.

Acknowledgements

This study was funded by a Grant (No. 71661167006) from the National Natural Science Foundation of China.

Competing interests

The authors declare that they have no competing interests.

Availability of data and materials

Data will be made available to all interested researchers upon request.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Author information


Corresponding author

Correspondence to Pei-Luen Patrick Rau.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.


About this article


Cite this article

Taing, CM., Rau, PL.P. & Huang, H. Handpad: a virtual mouse for controlling laptops in a smart home. Hum. Cent. Comput. Inf. Sci. 7, 27 (2017). https://doi.org/10.1186/s13673-017-0108-3


Keywords