Natural User Interface
- 1 Readings
- 2 Reading Critiques
- 2.1 Jonathan Albert 18:28:01 11/13/2017
- 2.2 Kadie Clancy 23:21:49 11/19/2017
- 2.3 Spencer Gray 12:09:40 11/20/2017
- 2.4 Tahereh Arabghalizi 17:35:35 11/20/2017
- 2.5 Mingzhi Yu 19:46:30 11/20/2017
- 2.6 Xiaoting Li 22:18:28 11/20/2017
- 2.7 Yuhuan Jiang 23:52:06 11/20/2017
- 2.8 Ahmed Magooda 0:03:10 11/21/2017
- 2.9 Mehrnoosh Raoufi 1:04:41 11/21/2017
- 2.10 Xingtian Dong 1:20:17 11/21/2017
- 2.11 Charles Smith 2:10:50 11/21/2017
- 2.12 Muneeb Alvi 3:55:26 11/21/2017
- 2.13 Amanda Crawford 4:26:04 11/21/2017
- 2.14 Ronian Zhang 4:37:42 11/21/2017
- 2.15 Akhil Yendluri 8:50:24 11/21/2017
- 2.16 Ruochen Liu 8:58:53 11/21/2017
- Skinput: Appropriating the Body as an Input Surface, Harrison, C., Tan, D., Morris, D., In Proc of CHI 2010
- Distant freehand pointing and clicking on very large high resolution displays, Daniel Vogel, Ravin Balakrishnan, In Proc of UIST 2005
Jonathan Albert 18:28:01 11/13/2017
Distant Pointing: This article discusses ways to handle "clicking" for large and distant screens where users would not have access to mice or touch-based input methods. It details experiments with types of gestures and their recognition for various click actions. After reading the authors' statement that ray-casting off of a pointer finger was the least accurate method, I was surprised to see that they advocate exploring multi-finger gestures in further research. While the two may seem relatively orthogonal, I think the whole system's usage of fine-grained, dexterous movement relied too heavily on the motion-capture glove, and that the ray-casting example only elucidated that point. In other words, the authors' test was idealized--sensing the hand with a full "view" by means of an extra modality that would be unavailable for general-purpose displays, such as those that would be found outdoors. Instead, depending on the distance of the user and the quality of the camera, systems (e.g., Microsoft's Kinect) would not be able to detect single-finger movement. Those types of devices only have a front-facing thumbnail of the user's hand, and might have to make educated guesses as to the location or posture of one's fingers. Nevertheless, I think attempting to detect open or clenched hands would be a suitable avenue for experimentation. ---- Skinput: This paper details a system for sensing taps on a user's arm, in order to expand the types of modalities available for certain devices. It explains how a specific prototype works via acoustic properties of bones, discusses its accuracy under testing, and proposes potential UIs for such devices. While this paper lists several systems in a similar vein, the approach outlined herein is still fascinating. Though armband devices might not ever become mainstream, integrating this into wrist devices like FitBits or smart watches seems promising.
By sensing taps in a smart watch, for example, an integrated system could communicate with a smartphone to control a music player, etc. The system could potentially be expanded to detect multi-finger taps to enhance the range of available inputs. (And this is different from the prior paper, since a peripheral is already attached to the user, whereas the other system is intended for the widest audience possible.) However, I imagine the authors would admit their system could benefit from a smaller form factor than that of their prototype. Another consideration that could improve the system is to gather data about users' range of motion in colder temperatures--where their arms are covered or their range of dexterity is limited--in addition to those with lower overall dexterity, such as arthritic persons. In the latter vein, I think it would be more profitable to pursue tap-detection on other body parts or surfaces, rather than trying to detect finger-to-finger gestures, in order to make the system more accessible to a broader range of people.
Kadie Clancy 23:21:49 11/19/2017
Skinput: Appropriating the Body as an Input Surface: This paper presents Skinput, a technology that allows the body to be appropriated for acoustic transmission, allowing skin to be used as an input surface. Skinput determines the location of finger taps on the arm and hand specifically by analyzing mechanical vibrations that propagate through the body. Skinput uses an array of sensors that are worn on a noninvasive armband. This technology provides an always-available, portable, and easily removable finger input system that can help make up for the limited usability and functionality of small screens on wearable devices. The authors performed experiments to assess the robustness of their system, and results showed that Skinput performs well for a variety of gestures, even when the user is in motion. To illustrate the utility of coupling projection and finger input on the body, the authors constructed interfaces for hierarchical menu selection, scrolling menus, and a numeric keypad. I think that using the skin as input for small-screened (or screenless) devices is very innovative and clever. Further, a device of this type provides evidence for the prediction that in the future, everything will be a screen. I also think skin is an ideal input device in theory due to proprioception, which allows humans to interact with their body in an eyes-free manner, which is not possible with most other forms of input. Distant freehand pointing and clicking on very large high resolution displays: In this paper, the authors developed and evaluated new pointing and clicking techniques that leverage the naturalness of pointing with a human hand. Humans will soon be interacting with very large, high resolution displays that will not only allow users to work up close with detailed information, but also enable them to move back and work with the contents of the entire display space.
The authors identify desirable characteristics for pointing and clicking from a distance for very large displays with only the hand: accuracy, acquisition speed, pointing and selection speed, comfortable use, smooth transition between interaction distances. The authors prototyped different techniques for the mentioned interaction with such a system using motion tracking with reflective gloves. They prototyped two clicking techniques, AirTap and ThumbTrigger, that provide audio and visual feedback to replace the lack of kinesthetic feedback. Three pointing techniques were also developed: absolute position finger ray casting, relative pointing with clutching, and a hybrid technique. The results of their study largely favor relative hand-based pointing techniques and demonstrate error rates similar to those produced with devices like mice. I think that as ubiquitous computing becomes mainstream, it is imperative that all users be able to quickly and easily interact with group or public screens. The ability to interact with such a device naturally, and without the hassle of a separate input device, is central to its usability.
Spencer Gray 12:09:40 11/20/2017
In the first paper, Skinput: Appropriating the Body as an Input Surface, the authors create a method that allows the human body to be used for finger input to a computer device. The authors cite as the reason for creating this system the need to change the interface for small, wearable devices. With ubiquitous computing, computers are becoming smaller and smaller. However, it is difficult to interact with these small devices. This paper is significant because it presents a new interface to devices that will only continue to decrease in size. Devices such as smart watches present a challenge because buttons and task bars can only get so small before they are unusable. By using finger taps on the human body, people can interact with wearable devices in more expressive and natural ways. I found this paper very interesting because the goal of HCI is to make interactions as natural as possible, and using the human body to interact with devices accomplishes this task intuitively. In the second paper, Distant Freehand Pointing and Clicking on Very Large, High Resolution Displays, the authors create a system for a user to interact using direct manipulation. This system tracks the user's hand so that the user needs only hand movements and a few gestures to interact. The authors' motivation for creating this system was to implement a direct manipulation system that reduces the gulf of execution as much as possible. Pointing with a hand is the most natural and easiest way for people to interact with a computer. For this reason, this paper is significant in the HCI field. As displays become larger and cheaper, systems such as the one described in this paper will become more prevalent. It is imperative to study what makes these systems successful or unsuccessful in order to improve them for more natural interaction.
Tahereh Arabghalizi 17:35:35 11/20/2017
Skinput: Appropriating the Body as an Input Surface: In this paper, the authors present Skinput, a technology that appropriates the human body for acoustic transmission, allowing the skin to be used as an input surface. They describe a novel, wearable bio-acoustic sensing array to detect and localize finger taps on the forearm and hand. Results from their experiments show that this system performs very well for a series of gestures, even when the body is in motion. In my opinion the idea of this paper is novel and interesting, but it has some limitations. For instance, wearing the bio-acoustic sensor and using the arm as an input can be inconvenient in some situations. ------------------------------------------------------------------------------------------------ Distant Freehand Pointing and Clicking on Very Large, High Resolution Displays: In this paper, the authors work on the design space of freehand pointing and clicking interaction with very large high resolution displays from a distance. They introduce and develop two hand pointing techniques that differ in the number of recalibration activations and recalibration time. The longer recalibration times for the RayToRelative technique do not affect its overall selection time compared to the Relative technique, indicating that the time overhead due to the ray casting portion of the technique was compensated for by a reduction in subsequent relative movement. There is also a hybrid RayToRelative pointing technique that combines absolute position ray casting and relative pointing with clutching: when the user opens his hand, it performs relative cursor control, and when he points with his finger, it performs absolute ray casting. The current techniques only support actions equivalent to a single-button mouse or touch screen. It would be good to consider additional body movements in the design of future pointing techniques, for example using a second hand or eye gaze.
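The hybrid switching between ray casting and relative pointing lends itself to a short sketch. Below is a minimal, hypothetical version in Python, assuming a tracker that reports a hand posture label plus fingertip position and direction; the posture names, gain value, and class interface are illustrative assumptions, not the paper's implementation:

```python
# Hypothetical sketch of hybrid RayToRelative pointing: an open hand
# drives the cursor with relative (clutched) movement, while a pointing
# posture recalibrates the cursor via absolute ray casting.

def ray_cast(finger_pos, finger_dir, screen_z=0.0):
    """Intersect a ray from the fingertip with the screen plane z = screen_z.
    Returns the (x, y) hit point, or None if the ray misses the plane."""
    fx, fy, fz = finger_pos
    dx, dy, dz = finger_dir
    if dz == 0:
        return None                    # ray parallel to the screen
    t = (screen_z - fz) / dz
    if t < 0:
        return None                    # screen is behind the hand
    return (fx + t * dx, fy + t * dy)

class HybridPointer:
    def __init__(self, gain=2.0):
        self.cursor = (0.0, 0.0)
        self.gain = gain               # control-display gain for relative mode
        self.last_hand = None          # None means the clutch is disengaged

    def update(self, posture, hand_xy=None, finger_pos=None, finger_dir=None):
        if posture == "point" and finger_pos is not None and finger_dir is not None:
            hit = ray_cast(finger_pos, finger_dir)
            if hit is not None:
                self.cursor = hit      # absolute recalibration
            self.last_hand = None      # release the relative clutch
        elif posture == "open" and hand_xy is not None:
            if self.last_hand is not None:
                dx = (hand_xy[0] - self.last_hand[0]) * self.gain
                dy = (hand_xy[1] - self.last_hand[1]) * self.gain
                self.cursor = (self.cursor[0] + dx, self.cursor[1] + dy)
            self.last_hand = hand_xy   # engage or continue the clutch
        else:
            self.last_hand = None      # closed hand: clutch disengaged
        return self.cursor
```

The design choice the sketch illustrates is why the ray-casting overhead can pay for itself: a single absolute recalibration jumps the cursor near the target, so the relative phase needs less movement afterward.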
Mingzhi Yu 19:46:30 11/20/2017
Sentence Summary of Skinput: Appropriating the Body as an Input Surface: The authors present a skin input system in this paper. It uses the skin as an input surface. Beyond that, the system allows tapping on the arm by sensing the vibrations that propagate through the body. The concept of input from the body surface is novel. The authors also conducted a thorough user study that evaluates this system through various facets. In general, this paper provides a fancy way of interacting with the computer. This technology is novel and has a wide range of uses, for example in wearable equipment that previously appeared only in science fiction movies. Although the equipment (sensors) may look ridiculously big to wear, I believe this is only a prototype of the technology. In the future, as electronic engineering becomes more advanced, the size of this equipment can be reduced to a petite size (maybe not even visible). In general, this is a very interesting concept. Distant freehand pointing and clicking on very large, high-resolution displays, Daniel Vogel, Ravin Balakrishnan. In this paper, the authors designed a new pointing system that enables remote control of larger, higher resolution displays. Pointing at a target farther away does not match any model or equipment in use today, so the authors' idea is novel. They mentioned several feasible solutions that approach this distant-pointing problem and came up with their human-hand-only design. The paper demonstrates the idea clearly through both graphs and description. It mentions several techniques, such as using the index finger and moving the fingers. The evaluation mainly focuses on selection time and error analysis. The results show their error rate is in the low range compared with using devices like mice. The paper is well organized and explains each technique very clearly. The user study is thoughtful.
In general, the system is impressive because it tackles a problem for which previous studies have not shown efficient solutions.
Xiaoting Li 22:18:28 11/20/2017
1. Skinput: Appropriating the Body as an Input Surface: In this paper, the authors present a technology called Skinput which allows the human skin to be used as an input surface. This technology aims to solve the problem that users cannot easily find input surfaces in the nearby environment. It also aims to increase users' interactions with small mobile devices. The authors introduce the bio-acoustics behind the technology, and they introduce the sensor design, which overcomes problems such as the capture of irrelevant frequencies and thus achieves a high signal-to-noise ratio. The user study shows that the overall classification rates are high and the average accuracy is 87.6%. This is a novel technology, but there is future work to be done, such as single-handed gestures, taps with different parts of the finger, and so on. 2. Distant Freehand Pointing and Clicking on Very Large, High Resolution Displays: In this paper, the authors explore the design space of freehand pointing and clicking with very large high resolution displays from a distance. The authors designed three technologies for pointing and two for clicking. A merit of this paper is that the authors use many figures to help the audience understand the ideas presented.
Yuhuan Jiang 23:52:06 11/20/2017
Paper Critiques for 11/21/2017 == Skinput: Appropriating the Body as an Input Surface == This paper discusses an input technology that uses the human body as an input surface. The propagation of vibrations on the human arm is detected by sensors on an armband. An interesting aspect of the work is the armband prototype. It has five sensing elements in two sensor packages. With the signals, an SVM model with 186 features is trained. The input locations can be on the fingers, whole arm, and forearm. The user study is a within-subject design with five conditions, each corresponding to a position of the sensors. From the user study, the authors find that the wearable sensing array performs well for various gestures. This applies not only when the subject is still, but also when they are moving. A user interface projected onto the arm is demonstrated; the projected interface can be tapped on the arm wearing the armband. Relating this to today's technology, there are no similar commercial products that utilize bio-vibrations. This new technique may be at the front part of the long tail of innovation. == Distant Freehand Pointing and Clicking on Very Large, High Resolution Displays == In this paper, the design space of freehand pointing and clicking interaction with large, high-resolution displays from a distance is studied. The authors begin by stating what traditional screen input mechanisms assume. They assume that the user is close to the display (which means stylus and touch inputs are possible). They also assume that the screen is a stationary and horizontal surface. This gives the authors the insight to propose five characteristics for pointing and selection devices for very large high-res screens: accuracy for selecting small targets, the speed of acquiring targets, selection speed, the comfort of use, and the smoothness of transition between near and far distances.
Among the examples discussed in the paper, RayCasting was faster in tasks where clutching would have been required or when selecting large targets. However, it has a high error rate. What is meaningful about this work is that it motivates the need for facile pointing and clicking techniques for large high-res displays. This calls for future efforts to enhance users' interaction with large displays. To relate to what we learned in the course lectures, I am very curious to know whether selecting target objects on large, high-res screens also conforms to Fitts' Law.
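For reference, Fitts' Law can be stated concretely. Here is a tiny sketch using the Shannon formulation; the constants a and b below are made-up placeholders, since in practice they are fit empirically per device and user:

```python
import math

def fitts_movement_time(distance, width, a=0.2, b=0.1):
    """Predicted movement time MT = a + b * log2(D/W + 1) (Shannon
    formulation). a and b are device/user-specific constants fit from
    data; the defaults here are placeholders, not measured values."""
    index_of_difficulty = math.log2(distance / width + 1.0)  # in bits
    return a + b * index_of_difficulty

# A larger target (bigger W) lowers the index of difficulty, which is
# consistent with RayCasting doing well on large targets but poorly on
# small ones.
```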
Ahmed Magooda 0:03:10 11/21/2017
Skinput: Appropriating the Body as an Input Surface: this paper talks about a new input method for humans, called Skinput, which is a mobile interface that allows people to use their body parts for input. The signals are collected using an array of sensors worn as an armband. The primary goal of this device is providing an always-available mobile input system. The authors used the acoustic effects of tapping on skin to detect a tap and locate its position. To recognize the tap from the signals, the authors trained an SVM classifier with 186 features computed from the input signals. They also mention the number of false positives in jogging and walking conditions and that higher BMI can cause lower accuracy in their experiment. In the end, this paper seems interesting; while it still seems very far from being part of a real application, it draws attention toward an interface that could be applicable one day. ---------------------------------------------------------------------------------------------------------- Distant freehand pointing and clicking on very large high resolution displays: In this paper the authors discuss the design space of freehand pointing and clicking with a large high-resolution display from a distance. With these big displays, it is good to allow the user to interact from both near and far. At a near distance, the user can change details, whereas at a far distance the user can manipulate things like sorting and arranging the entire workspace. According to the paper, such an interaction requires a set of desirable characteristics: accuracy, acquisition speed, selection speed, comfortable use, and smooth transition between interaction distances. Previous work has focused on using an external device such as a laser pointer or hand-held device to improve precision in distant pointing and to perceive the clicking action.
This work does not use such devices; instead, the authors propose two clicking techniques, one using the index finger and the other using the thumb. Furthermore, three techniques are proposed to enhance pointing precision: absolute position finger ray casting, relative pointing with clutching, and a hybrid technique with relative pointing. These three techniques are then compared by means of task completion time, error rate, recalibration activations, recalibration frequency, and total recalibration time. Ray casting is found to be the fastest technique but with the highest error rate, which means it is suitable for big targets but bad for small ones. In my opinion, we can improve pointing accuracy by enhancing our tracking and computer vision techniques and using better sensors.
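The Skinput classification pipeline mentioned above (features computed from vibration signals, fed to a trained classifier) can be sketched in miniature. This is a toy stand-in, not the paper's system: the paper uses an SVM over 186 brute-force features, while the sketch below uses three simple features per sensor channel and a nearest-centroid classifier; the feature choices, labels, and class names are all illustrative assumptions:

```python
# Toy sketch of a Skinput-style pipeline: extract simple features from
# each sensor channel's vibration waveform, then classify tap location.

def extract_features(channels):
    """channels: list of per-sensor sample lists -> flat feature vector."""
    feats = []
    for samples in channels:
        n = len(samples)
        mean_abs = sum(abs(s) for s in samples) / n   # average amplitude
        peak = max(abs(s) for s in samples)           # peak amplitude
        zero_crossings = sum(
            1 for a, b in zip(samples, samples[1:]) if a * b < 0
        )
        feats.extend([mean_abs, peak, zero_crossings / n])
    return feats

class NearestCentroid:
    """Toy stand-in for the paper's SVM classifier."""
    def fit(self, X, y):
        groups = {}
        for x, label in zip(X, y):
            groups.setdefault(label, []).append(x)
        self.centroids = {
            label: [sum(col) / len(col) for col in zip(*rows)]
            for label, rows in groups.items()
        }
        return self

    def predict(self, x):
        def dist(c):
            return sum((a - b) ** 2 for a, b in zip(x, c))
        return min(self.centroids, key=lambda lbl: dist(self.centroids[lbl]))
```

A real system would train on many labeled taps per location and per user; the per-user calibration the critiques mention corresponds to the `fit` step here.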
Mehrnoosh Raoufi 1:04:41 11/21/2017
Skinput: Appropriating the Body as an Input Surface: In this paper, Skinput was introduced. It is a technology to appropriate the human body as an input interface. The authors explained that they made it possible using an array of sensors in an armband. Their motivation is that the body's surface area is large and always available, so it fits a mobile interface very well. Moreover, the body can be accessed in an eyes-free manner by each person. To my mind, it is a novel idea to use the human body as an input interface. It can bring people a noticeable amount of convenience. Further, they trained their system through a brute-force machine learning approach: they trained an SVM classifier with 186 features. To evaluate the performance they conducted a user study of 13 participants that varied in age, gender, and body mass index (BMI). The experiment was done in multiple positions, and they reported the accuracy of each. Near the elbow is the location where they get the highest accuracy, due to the proximity of the sensors. Also, BMI impacted their results: the accuracy falls as BMI increases. That happens because fatty tissues diminish the acoustic energy transmitted through the body. In this paper, they only considered the arm as an input interface, but they claim it can be applied to other parts as well. ------------------------------------------------------ Distant Freehand Pointing and Clicking on Very Large, High Resolution Displays: This paper explored freehand pointing and clicking from a distance. The authors presented different methods for pointing and clicking. Their motivation was that some tasks are better done from a distance on a large display, e.g. sorting slides/photos/pages spread over the large display, or presenting a large drawing to a group while navigating/panning/highlighting. Unlike previous work that used laser or handheld devices, they used no external devices for pointing. Instead, they tried to implement pointing, clicking, and clutching using only the human hand.
They presented different approaches for detecting hand gestures. In particular, they provided two methods for clicking, one using the index finger and the other using the thumb. They also proposed three pointing techniques: absolute position finger ray casting, relative pointing with clutching, and a hybrid technique with relative pointing. Moreover, due to individual differences, they implemented a calibration method to increase accuracy. The lack of external devices pushed them to provide another kind of feedback; thus, they provided both slight visual and auditory feedback for clicking. All in all, their design supports actions equivalent to a single-button mouse or touchscreen. I am impressed by their work, as I think the idea of distant pointing without external devices can have wide application, especially in group work.
Xingtian Dong 1:20:17 11/21/2017
1. Reading critique for ‘Appropriating the body as an input surface’ This paper is very useful and inspiring. The authors developed a new input technology which uses skin as an input surface. This surface makes use of a set of sensors which determine the location of vibrations that propagate through the body. The new input technique allows users to perform eyes-free input, since users know the locations of their own body parts. The authors also assess the robustness and limitations of the new technique. The experiment is also a good example for us: they have participants perform example finger taps in order to calibrate the system, and the analysis of the results is pretty good. The authors of this paper are really creative. Skin provides a great surface for on-the-go input recognition, and machine learning is used to classify the taps, which gives us an example of how to apply machine learning. One drawback of using skin as input is that it relies on the user having a projection device attached to their body, and their arm must be in a particular orientation. Although some noise still cannot be gotten rid of and creates difficulties for recognition, it is a valuable technique which might bring a revolution in input devices. 2. Reading critique for ‘Distant freehand pointing and clicking on very large high resolution displays’ This paper explored the design space of freehand pointing and clicking interaction with very large high resolution displays from a distance. The authors also develop and evaluate three techniques for pointing and two for clicking. They introduce design characteristics such as accuracy, acquisition speed, pointing and tracking, visual environments, and selection with the hand. The authors provide experiments to examine the techniques, which are really interesting. From this paper I learned that RayCasting is the fastest technique for large targets, but it lacks good control for small targets. This drawback makes it an impractical solution on its own.
The paper is also inspiring in that when it is impossible to provide one kind of feedback, we can enhance another kind of feedback.
Charles Smith 2:10:50 11/21/2017
On: Skinput Skinput proposes a new way of interacting with devices: by tapping on your skin. This is done with a group of sensors on an armband. The paper opens right away with an assumption that has since been seen to be incorrect: that we cannot increase screen sizes of mobile devices. Apple began selling the popular iPhone 6 Plus in 2014, showing that consumers are OK with a larger screen size. Even though this technology could be applied to devices other than cell phones, it seems important to point this out. The fact that the device has to be reprogrammed for every user and for orientation changes sounds like a large drawback to the technology. This could slow down user adoption and could quickly become a large hassle if the programming time were long. On: Freehand pointing This paper takes a look at manipulating extremely large displays, possibly from a distance. They do this by using a person's hand as the input. It is interesting to see the authors trying to solve a problem that does not yet exist. The screens they are preparing for were not a thing when the paper was first published, and while large screens certainly exist today, they are not commonplace and rarely need to be interacted with. However, the authors saw the progression of technology and proposed an idea for the future to work with. The authors of this paper also included in their experiment results how users felt about using their device. It seems too frequent in the papers that we read that only speed or accuracy is measured, but if users do not like a system, then it will never be adopted, regardless of improvements elsewhere.
Muneeb Alvi 3:55:26 11/21/2017
Critique of Skinput Summary: This reading demonstrates the potential of using the body itself as an interactive GUI. The reading then shows several examples of how this can be applied. I am actually surprised by this reading. I remember seeing demos a few years ago of very similar technology. However, seeing as how this reading was published in 2010, I have yet to see a major push from any companies or even the media or social networks to make this idea into a product. I suppose there are limitations such as the hardware and equipment required. However, seeing as how this was in 2010, I would assume that much of the required technology would be smaller and more compact by now. Therefore, there must be other reasons as well that are preventing this from entering the mass market. Perhaps Microsoft themselves did not see enough potential over the last few years. Also, with everyone carrying around smartphones, there is little need to project a GUI on the hand and deal with such complexities when a smartphone can present the GUI in a much simpler way. Another reason for this idea not taking off could be that it would be weird to interact with a GUI on ourselves. Along with the added equipment, maybe this was too much for the average user, who is looking for simpler, not more complex, means to interact with computers and devices. Lastly, today's smartwatches are small enough and simple enough that maybe it's not the case that many of the applications in Skinput would be better or simpler. Critique of Distant Freehand Pointing and Clicking Summary: This paper presents the design space of freehand pointing at distant objects. It also describes the different use cases and ways to make distant pointing feel more interactive. I agree with many points in the paper. I like the potential of the technology that they propose. Like the other useful scenarios presented in the paper, I really see this technology having potential in a classroom.
It would be great if students could directly interact with a smart board at the front of the class without ever leaving their seats. This would allow them to manipulate objects on the board to clarify their question for the professor or teacher. Today, this technology is advancing through products like Kinect, which detect motion gestures from a distance and can allow manipulation of objects. Also, there are devices like Leap Motion's controller which allow finger and hand gestures. Even carmakers like BMW are allowing hand gestures to control things like changing the temperature or turning up the volume. However, I don't see much praise for any of these devices. It seems like the software and hardware are really lacking in terms of what users expect compared to what they actually get. Most of these gesture systems are clunky and sometimes inconsistent, as one gesture can easily be recognized as another, unintended gesture.
Amanda Crawford 4:26:04 11/21/2017
Skinput: Appropriating the Body as an Input Surface, Harrison, C., Tan, D., Morris, D., In Proc of CHI 2010. This paper discusses a wearable technology that uses a person's body to serve as an input instrument. Using a projector, the user can interact with an interface displayed on the skin, with taps sensed through the body's acoustic properties. This paper does a really good job of going beyond the technology and observing all of the random factors that could affect the signal. For example, the discussion of the effect of a person's body mass index shows that sometimes the surface isn't as fixed as we would automatically assume. Distant freehand pointing and clicking on very large high resolution displays, Daniel Vogel, Ravin Balakrishnan, In Proc of UIST 2005. The paper focuses on transforming the task of pointing and manipulating using human gestures and tracking. It introduces the clicking techniques AirTap and ThumbTrigger along with a set of pointing techniques. I believe that this tool could be useful for people who do not enjoy the common setup of sitting at a flat surface to manipulate a desktop. I would very much enjoy a tool that can track my gestures and remove the need for a hard surface. I think that this is a perfect starting point.
Ronian Zhang 4:37:42 11/21/2017
Distant Freehand Pointing and Clicking on Very Large, High Resolution Displays: In this paper, the authors propose solutions for freehand pointing and clicking interaction with a high resolution display from a distance. The paper proposes two clicking techniques: AirTap, which is similar to clicking a mouse button or tapping on a screen, and ThumbTrigger, where the thumb is moved in and out towards the index finger side of the hand. (These methods are all limited to single-button events and don't distinguish between left and right clicks.) It also proposes three pointing techniques: a finger ray casting pointing technique (referred to as absolute position), using the projected motion of the hand for pointing (relative pointing), and a hybrid method which uses ray casting to recalibrate the hand position while repositioning the cursor at the same time. Through experimental evaluations, the authors conclude that the proposed methods can achieve error rates in the same low range as devices like mice. The evaluation results are actually promising (since performance is similar to a mouse), and with the growing need for big high resolution screens, the technologies discussed in the paper are very likely to have a bright future.—————————————————————————————Skinput: Appropriating the Body as an Input Surface: This paper is very interesting; it discusses an input method that allows human skin to serve as an input surface. It uses the sensing of acoustic transmissions through the human body. More specifically, it resolves the location of finger taps on the body by detecting the mechanical vibrations that propagate through it. The work is pioneering since it designs a novel wearable device for acoustic signal acquisition, and even though limitations exist, the experiment shows positive results: the input sensor is close to the contact point for finger taps, the classification has high accuracy, and the overall result is that the system performs well for a series of gestures.
There are also other potential usage of the approach: single hand gesture, taps with different parts of the finger and differentiating object materials. As stated in this paper, there is still rich design space in skin input. If the author could further improves the technique and put it into production, it may be very promising.
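The three pointing techniques in the Vogel and Balakrishnan paper can be contrasted in a few lines of code. This is a minimal toy sketch, not the authors' implementation: the function names, the gain value, and the screen-plane-at-z=0 convention are my own assumptions.

```python
def raycast_point(hand_pos, hand_dir, screen_z=0.0):
    """Absolute pointing: intersect the finger ray with the screen plane z = screen_z."""
    px, py, pz = hand_pos
    dx, dy, dz = hand_dir
    t = (screen_z - pz) / dz          # ray parameter where the ray meets the plane
    return (px + t * dx, py + t * dy)

def relative_move(cursor, hand_delta, gain=3.0):
    """Relative pointing: scale hand motion by a gain, like mouse CD gain."""
    cx, cy = cursor
    ddx, ddy = hand_delta
    return (cx + gain * ddx, cy + gain * ddy)

def ray_to_relative(cursor, hand_pos, hand_dir, hand_delta, recalibrate):
    """Hybrid: on a recalibration (clutch) event, jump the cursor to the
    ray-cast point; otherwise refine the position with relative motion."""
    if recalibrate:
        return raycast_point(hand_pos, hand_dir)
    return relative_move(cursor, hand_delta)
```

The hybrid method trades the speed of ray casting (coarse, direct jumps) against the precision of relative motion (fine adjustment), which matches the paper's finding that ray casting is fast but error-prone at small targets.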
Akhil Yendluri 8:50:24 11/21/2017
Skinput: Appropriating the Body as an Input Surface
This technology appropriates the human body for acoustic transmission, using the skin as an input surface. Vibrations through the body are used to detect finger taps on the arm and hand. The wearable device could be incorporated into future devices, removing the need for a physical device like a smartphone by casting the screen onto the arm or hand and sensing touch through the vibrations generated. The system's capabilities were assessed in a two-part user study with twenty participants. This system provides a new, unique, and futuristic approach compared to present technology.
Distant Freehand Pointing and Clicking on Very Large, High Resolution Displays
This paper explores pointing and clicking interactions from a distance using the hand as the input medium. The authors also incorporate visual and audio feedback to compensate for the lack of kinesthetic feedback. The paper first surveys the pointing and clicking techniques available at present and how they inspired and assisted this project, then describes techniques such as AirTap and ThumbTrigger that are used for distant freehand pointing and clicking. The work is evaluated through a user study with 12 participants. The experiments showed that the RayCasting technique was significantly faster than the Relative and RayToRelative techniques, but it also had a higher error rate. This research is a step toward easy pointing and clicking from a distance, and it also evaluates the performance of hand-based pointing against traditional devices like mice.
Ruochen Liu 8:58:53 11/21/2017
“Skinput: Appropriating the Body as an Input Surface” and “Distant Freehand Pointing and Clicking on Very Large, High Resolution Displays” are two papers about natural interfaces. Personally, I believe natural interfaces are an extension of tangible interfaces, and they will certainly become more and more common in the next age of ubiquitous computing. “Skinput: Appropriating the Body as an Input Surface” presents a method that appropriates the human body for acoustic transmission and enables human skin to be used as a user interface. With this method, the only device needed is an armband worn by the user. By detecting the mechanical vibrations in the body with sensors in the armband, the location of finger taps on the arm or hand can be resolved. This feature enables many possible applications, such as projecting an interface onto the arm. It brings the basic touch-screen interaction style to a far more natural platform: human skin. It may be the right development path for freeing people from external devices, however portable those devices are. Perhaps the only problem is that the armband seems too heavy and bulky for everyday use; it is hard to accept such a large device for everyday interaction, though technical improvements and industrial design could make it better. “Distant Freehand Pointing and Clicking on Very Large, High Resolution Displays” presents a method that allows freehand pointing and clicking on large, high resolution displays. It offers a solution for interaction between remote users and a large display space, and it can be generalized to other situations. Specifically, three techniques for gestural pointing and two techniques for clicking are presented and evaluated. Their error rates are at the same level as a mouse. Visual and auditory feedback are also added to narrow the gulf of evaluation.
I believe this remote clicking and pointing technology has a promising future as a natural interface, since it solves a real problem in a unique way and can be extended to many other applications.
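The Skinput pipeline described above, vibration features in, tap location out, can be sketched at a very high level. The paper trains an SVM over many acoustic features; the sketch below substitutes a nearest-centroid classifier over made-up two-channel amplitude features, so all data, names, and the classifier choice are illustrative assumptions, not the paper's pipeline.

```python
def centroid(vectors):
    """Mean feature vector over one tap location's training examples."""
    n = len(vectors)
    return tuple(sum(v[i] for v in vectors) / n for i in range(len(vectors[0])))

def train(examples):
    """examples: {location: [feature_vector, ...]} -> {location: centroid}."""
    return {loc: centroid(vecs) for loc, vecs in examples.items()}

def classify(model, features):
    """Return the tap location whose centroid is nearest (squared Euclidean)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda loc: dist2(model[loc], features))

# Hypothetical training data: per-sensor-channel vibration amplitudes
# for taps near the wrist vs. near the elbow.
model = train({
    "wrist": [(0.9, 0.1), (0.8, 0.2)],
    "elbow": [(0.1, 0.9), (0.2, 0.8)],
})
```

A new tap's feature vector is then classified by proximity, e.g. `classify(model, (0.85, 0.15))` resolves to the wrist. The real system's accuracy depends, as the paper notes, on sensor placement relative to the tap location.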