Natural User Interface

From CS2610 Fall 2016

slides1 slides2


Readings

Reading Critiques

Haoran Zhang 18:01:23 10/30/2016

Skinput: Appropriating the Body as an Input Surface: In this paper, the authors present Skinput, a technology that appropriates the human body for acoustic transmission, allowing the skin to be used as an input surface. The user wears a device on the arm, a graphical interface is projected onto the arm and hand, and sensors detect finger taps on the arm and hand so the system can determine the user's choice. This must have been a fancy device in 2010, six years ago, but it has limitations. For example, users with a higher BMI lower the system's accuracy, because body fat interferes with the signal reaching the sensors. The user also has to wear a fairly bulky device, which limits the scenarios in which it is useful; it would look a bit silly to wear it while walking down the street. Even setting the size of the device aside, the arm has to be held at a specific angle so the projector can project the interface onto it. It is obviously cool, but it is still hard for it to challenge existing portable input devices such as the cellphone. Distant Freehand Pointing and Clicking on Very Large, High Resolution Displays: In this paper, the authors present an input technique that lets users point and click freehand, from a distance, on very large, high-resolution displays. The title already reveals the first limitation: a VERY large, high-resolution display is required. Beyond that, the user has to wear tracking hardware so the system can detect hand and finger movement. The good part is that it provides gesture recognition with different responses, and users can issue commands to the big screen using multiple gestures. Compared to 2005, though, there are now more ways to achieve this goal: we have the Kinect and the Leap Motion, which can detect not only hand and finger movement but body movement as well. This means such a system could add body-movement detection and provide more powerful tools. Another benefit is that with these newer devices a very large, high-resolution display is no longer necessary, because the sensors themselves reach higher accuracy and can detect tiny movements.

Zhenjiang Fan 18:28:23 11/17/2016

Skinput: Appropriating the Body as an Input Surface: Operations carried out by the hands are far more natural than other kinds of body gestures, and hand movements can define many operations, either with a single hand or with both hands combined. Since hand gestures are an inseparable part of body gestures, I think they could serve as the main input for the paper's proposal. The paper presents an approach to appropriating the human body as an input surface. It describes a novel, wearable bio-acoustic sensing array, built into an armband, that detects and localizes finger taps on the forearm and hand. Results from the experiments show that the system performs very well for a series of gestures, even when the body is in motion. Additionally, the authors present initial results demonstrating other potential uses of the approach, which they hope to explore further in future work. I am not convinced, however, that using other parts of the body as an input surface is a great idea; for example, most of the body surface is covered when the temperature is low. The most promising method is therefore to use one palm as the input surface and the other hand as the operator. ---------- Distant Freehand Pointing and Clicking on Very Large, High Resolution Displays: I like the design the paper proposes, which uses rich, natural hand gestures as the input. RayCasting was faster in tasks where clutching would otherwise have been required, or when selecting large targets, but its high error rate prevents it from being a practical technique. The paper found no significant difference between the Relative and RayToRelative techniques in terms of selection time or error rate. The techniques currently support only actions equivalent to those of a single-button mouse or touch screen; it would be interesting to explore using the thumb and index finger together to "left" and "right" click. The paper motivates the need for facile pointing and clicking techniques for interacting with large displays from a distance; identifies desirable characteristics for such techniques; develops and evaluates new pointing and clutching techniques that leverage the simplicity of, and inherent human ability for, pointing with a hand; provides unobtrusive but effective visualizations to subtly alert the user when postures and gestures are about to become ambiguous; and uses various heuristics to tune the parameters of the clicking mechanisms so that they behave as users implicitly expect. Finally, the evaluations demonstrate the usability of relative hand-based pointing techniques, with error rates in the same low range one typically sees with status-quo devices like mice. There are more than just six or ten gestures that we could explore with the hands: we can use a hand as an operation surface, use both hands as two different input tools, and so on.

Alireza Samadian Zakaria 23:26:08 11/21/2016

The first paper is about Skinput, a mobile interface that allows the skin to be used as an input surface. The signals are collected using an array of sensors worn as an armband. The primary goal of the device is to provide an always-available mobile input system. The authors use the acoustic effects of tapping on the skin to detect a tap and locate where it occurred; this requires multiple sensors, since they found it impossible to do with a single acoustic sensor. One interesting point is that they claim their sensor design is relatively inexpensive, which is promising for mass production. To recognize taps from the signals, the authors trained an SVM classifier on 186 features computed from the input signals. The highest accuracy they achieved, 95 percent, was for the below-elbow armband position. In my opinion, below the elbow is not a good placement, since we rest our arms on tables and this may cause many false positives in real situations. Furthermore, they mostly report accuracy, whereas precision is more important in my opinion: people can tolerate some false negatives by simply tapping again, but a false positive means correcting a spurious input, which may not be easy. They do, however, report the number of false positives while jogging and walking. It is also mentioned that a higher BMI lowered accuracy in their experiment. All in all, the proposed interface could be the start of a new generation of input technology if the armband can be made less noticeable. ------------------ "Distant Freehand Pointing and Clicking on Very Large, High Resolution Displays" is a paper about the design space of freehand pointing and clicking on a large, high-resolution display from a distance. With such big displays, it is good to allow the user to interact from both near and far: up close, the user can change details, whereas from a distance the user can manipulate things like sorting and arranging the entire workspace. According to the paper, such interaction requires a set of desirable characteristics, including accuracy, acquisition speed, selection speed, comfortable use, and smooth transitions between interaction distances. Previous work has focused on using external devices such as laser pointers or handheld devices to improve precision in distant pointing and to sense the clicking action. This work does not use such devices; instead, the authors propose two clicking techniques, one using the index finger and the other using the thumb. Furthermore, three techniques are proposed to enhance pointing precision: absolute-position finger ray casting, relative pointing with clutching, and a hybrid technique combining ray casting with relative pointing. These three techniques are then compared in terms of task completion time, error rate, recalibration activations, recalibration frequency, and total recalibration time. Ray casting is found to be the fastest technique but has the highest error rate, which means it is suitable for big targets but bad for small ones. In my opinion, pointing accuracy could be further improved by enhancing tracking and computer-vision techniques and using better sensors.
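As the critique above notes, Skinput recognizes tap locations by training an SVM on 186 features computed from the bio-acoustic signals. Below is a minimal sketch of that kind of classification pipeline using scikit-learn; the random placeholder data, feature dimensionality, and five tap locations are stand-ins for the per-user training data the paper describes, not the authors' actual code.

```python
# Minimal sketch: classify which body location was tapped, given feature
# vectors already extracted from the armband's acoustic sensors.
# (Illustrative stand-in data; not the authors' pipeline.)
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)

n_taps, n_features, n_locations = 250, 186, 5
X = rng.normal(size=(n_taps, n_features))        # placeholder feature vectors
y = rng.integers(0, n_locations, size=n_taps)    # placeholder tap-location labels

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

clf = SVC(kernel="rbf")      # one model per user, trained on that user's example taps
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```

With real acoustic features (per-channel amplitudes, spectral bins, and so on) in place of the random arrays, the same fit/score loop yields a per-user classifier of the kind the study evaluates.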

Anuradha Kulkarni 23:31:17 11/21/2016

Skinput: Appropriating the Body as an Input Surface: The paper presents a new sensor prototype whose input is the vibration produced by finger taps on human skin. Skinput is a technology that appropriates the human body for acoustic transmission, allowing the skin to be used as an input surface. It uses the waves produced by finger taps on the user's arm and hand and tunes the recognizer through machine learning and data analysis. A prototype armband was built, equipped with sensors for monitoring vibrations on the skin. The idea is very innovative. The authors evaluate it through a two-part, twenty-participant study of the device's capabilities, accuracy, and limitations. The evaluation gives good evidence that the device works well for a series of gestures, even when the user is moving. The authors do not elaborate on the sensor's limitations, which are only briefly mentioned in the introduction. Overall, the idea is very novel and well motivated. --------------------------------------------------------------------------------------------------------------------------------------------------- Distant Freehand Pointing and Clicking on Very Large, High Resolution Displays: This paper proposes replacing the mouse for controlling a very large screen with in-air hand gestures. Seven related techniques (hand-held pointing devices, laser-pointer devices, eye tracking, direct hand pointing, and hand and body tracking in virtual environments, among others) were studied in order to propose a new method that overcomes the problems of mouse pointing on a large screen at a distance. Two clicking techniques have been implemented: AirTap and ThumbTrigger. The first registers a click from the movement of the index finger, whereas the latter tracks the thumb's position relative to the hand. Three different pointing techniques are presented: ray casting (which builds on the natural human ability to point and moves the cursor to the point on the display indicated by the user's finger), relative pointing (which aims to make pointing easier and less tiring for the user), and a hybrid method. The evaluation showed that these more natural techniques performed at roughly the same rate as a mouse.
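The AirTap and ThumbTrigger techniques mentioned above both reduce clicking to watching one tracked point move relative to the hand. The following sketch shows an AirTap-style detector in that spirit; the thresholds, the baseline-drift trick, and the coordinate convention (y grows upward, positions in metres) are assumptions made for illustration, not details taken from the paper.

```python
# Sketch of an AirTap-style click detector: a click fires when the index
# fingertip drops relative to the hand beyond a threshold and re-arms
# once it comes back up. All thresholds are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class AirTapDetector:
    press_threshold: float = 0.02    # fingertip must drop ~2 cm to click
    release_threshold: float = 0.01  # must recover before the next click
    _baseline: Optional[float] = None
    _pressed: bool = False

    def update(self, finger_y: float, hand_y: float) -> bool:
        """Feed one tracking frame; return True on the frame a click fires."""
        rel = finger_y - hand_y                  # fingertip height relative to the hand
        if self._baseline is None:
            self._baseline = rel
            return False
        drop = self._baseline - rel
        clicked = False
        if not self._pressed and drop > self.press_threshold:
            self._pressed, clicked = True, True  # fire on the downward stroke
        elif self._pressed and drop < self.release_threshold:
            self._pressed = False                # finger back up: re-arm
        # Let the baseline drift slowly so posture changes are not read as clicks.
        self._baseline = 0.99 * self._baseline + 0.01 * rel
        return clicked
```

A ThumbTrigger-style variant would watch the thumb's distance from the hand instead of the fingertip's height, which also makes a held-down "drag" state easy to represent.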

Xiaozhong Zhang 23:35:25 11/21/2016

Skinput: Appropriating the Body as an Input Surface: In this paper, the authors present Skinput, a technology that appropriates the human body for acoustic transmission, allowing the skin to be used as an input surface. The system resolves the location of finger taps on the arm and hand by analyzing mechanical vibrations that propagate through the body. The authors assess the capabilities, accuracy, and limitations of their technique through a two-part, twenty-participant user study. Results from the experiments show that the system performs very well for a series of gestures, even when the body is in motion. Additionally, the authors present initial results demonstrating other potential uses of the approach, including single-handed gestures, taps with different parts of the finger, and differentiating between materials and objects. They conclude with descriptions of several prototype applications that demonstrate the rich design space they believe Skinput enables. Distant Freehand Pointing and Clicking on Very Large, High Resolution Displays: In this paper, the authors explore the design space of freehand pointing and clicking interaction with very large, high-resolution displays from a distance. Three techniques for gestural pointing and two for clicking are developed and evaluated. In addition, the authors present subtle auditory and visual feedback techniques to compensate for the lack of kinesthetic feedback in freehand interaction and to promote learning and use of appropriate postures. In summary, this research makes several contributions: it motivates the need for facile pointing and clicking techniques for interacting with large displays from a distance, identifies desirable characteristics for such techniques, and develops and evaluates new pointing and clutching techniques that leverage the simplicity of and inherent human ability for pointing with a hand. Finally, the evaluation shows that the relative hand-based pointing techniques have error rates in the same low range as status-quo devices like mice.

Tazin Afrin 1:05:21 11/22/2016

Critique of "Skinput: Appropriating the Body as an Input Surface": In this paper, the authors develop a new technology called Skinput in which the skin is used as an input surface. An array of sensors is used to determine the location of the vibrations that propagate through the body following a finger tap on the arm or hand. This input is eyes-free, because users can locate points on their own body by feel, without any visual guidance. The user wears an armband that picks up the vibration signal; the armband was placed at multiple positions on the arm and tested in experiments. The authors ran a within-subjects design with five different armband positions, and each participant went through all of the positions in randomized order. To collect data and train the system, each participant performed example finger taps. The results show that classification accuracy was higher when the sensor was close to the location of the finger taps; the authors argue that the faulty classifications were due to the quality of the training data. Overall, the system performs very well for a series of gestures and maintains its performance even when the body is in motion. Other potential uses of this approach include single-handed gestures, taps with different parts of the finger, differentiating object materials, and so on. Overall, this paper is very interesting as an input method and introduces a new modality that future input features could build on. However, more fine-tuning and a more powerful machine-learning model could help improve the classification results. ------------------------------------------------------------------------------------- Critique of "Distant Freehand Pointing and Clicking on Very Large, High Resolution Displays": Daniel Vogel and Ravin Balakrishnan explore the design space of distant freehand pointing and clicking on very large, high-resolution displays. They develop three techniques for gestural pointing and two for clicking. They also present subtle auditory and visual feedback techniques that compensate for the missing kinesthetic feedback in freehand interaction. The AirTap click technique mimics how we move the index finger over a mouse button; the ambiguity and idiosyncrasy of individual finger-movement styles make it challenging. With ThumbTrigger, a click is registered when the thumb is moved in toward and away from the index finger; compared to AirTap, it can represent a held-down state, which AirTap cannot. Detecting and teaching hand postures can be very challenging because of ambiguous positions in which the hand is neither open nor closed. The index finger is also used for ray casting: a ray is extended from the tip of the finger to determine the cursor position. In relative pointing, an open hand is used for cursor control and a closed fist is used for clutching. RayCasting was faster when selecting large targets. The system currently supports only single-button actions, but it would be interesting to explore using the thumb and index finger together. The evaluation results demonstrate the usability of the hand-based pointing techniques, with error rates in a low range.
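Ray casting, as described in the critique above, amounts to intersecting the ray defined by the finger with the plane of the display. A minimal geometric sketch follows; the assumptions that the display lies in the plane z = 0 and that the tracker reports two 3D points along the index finger (tip and base) are mine, chosen only to keep the example small.

```python
# Sketch of ray-casting pointing: extend the ray through the index finger
# and compute where it hits the display plane (assumed here to be z = 0).
import numpy as np

def raycast_cursor(finger_tip, finger_base, display_z=0.0):
    """Return the (x, y) point where the finger's ray meets the display
    plane, or None if the finger points away from (or parallel to) it."""
    tip = np.asarray(finger_tip, dtype=float)
    base = np.asarray(finger_base, dtype=float)
    direction = tip - base                    # pointing direction of the finger
    if abs(direction[2]) < 1e-9:              # parallel to the display plane
        return None
    t = (display_z - tip[2]) / direction[2]
    if t < 0:                                 # display is behind the finger
        return None
    hit = tip + t * direction
    return float(hit[0]), float(hit[1])

# Example: fingertip 2 m from the display, knuckle slightly lower and farther back.
print(raycast_cursor(finger_tip=(0.10, 1.20, 2.0), finger_base=(0.10, 1.15, 2.3)))
```

The same geometry also explains the jitter problem several critiques mention: a small angular tremor at the fingertip is multiplied by the distance to the screen, so tiny hand shake becomes large cursor motion on a far-away display.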

Nannan Wen 1:20:50 11/22/2016

Skinput: Appropriating the Body as an Input Surface, by Harrison, Tan, and Morris. Review: In this paper, the authors mainly present a new input method for humans, called Skinput. With this equipment, people can use parts of their own body for input. The authors implemented a system that uses the body as a keyboard and a finger as the cursor to type sentences into a document, which turns out to be portable and handy. They also argue that there is still a rich design space around Skinput; it makes scenes from science-fiction films feel real. I think the idea is new and interesting, even though there are no real applications that do this yet. But I think the proposal still has open questions that need to be solved. For example, when users use their arm as the input surface, they have to expose their skin, which is inconvenient in winter. Another is that the user needs to wear a sensor, which is also inconvenient, though maybe that will change in the future. ======================================================= Distant Freehand Pointing and Clicking on Very Large, High Resolution Displays, by Daniel Vogel and Ravin Balakrishnan. Review: In this paper, the authors introduce a new way of manipulating a pointer on a huge screen from a distance using the bare hand. They propose three techniques for gestural pointing and two for clicking, implement them, and evaluate the proposed methods. As a replacement for the mouse, I think this paper has a neat idea. From the evaluation, it seems that in some specific cases the new techniques are more suitable than the device we usually consider perfect for the job. I think the invention is good, because screens will keep getting larger in the future and this method can meet that need. With larger screens, new challenges will also emerge; these techniques, though not yet fully understood, are definitely a good starting point.

Keren Ye 2:26:00 11/22/2016

Skinput: Appropriating the Body as an Input Surface: The paper presents Skinput, a technology that appropriates the human body for acoustic transmission, allowing the skin to be used as an input surface. It assesses the capabilities, accuracy, and limitations of the technique through a two-part, twenty-participant user study. First, the authors describe the design of a novel, wearable sensor for bio-acoustic signal acquisition: the sensing array is built into an armband, and the sensing elements detect vibrations transmitted through the body. They then describe an analysis approach that enables the system to resolve the location of finger taps on the body; more specifically, they analyze the mechanical vibrations that propagate through the body, collecting the signals with the sensors worn as an armband. In the next section, the paper establishes the robustness and limitations of the system through a user study. Finally, the authors explore the broader space of bio-acoustic input through prototype applications and additional experimentation. In sum, the paper presents an innovative approach to appropriating the human body as an input surface. I think this could become a trend among the younger generation if it is put into production. Distant Freehand Pointing and Clicking on Very Large, High Resolution Displays: The paper explores the design space of freehand pointing and clicking interaction with very large, high-resolution displays from a distance. In the introduction, the authors describe the desired design characteristics, including accuracy, acquisition speed, pointing and selection speed, comfortable use, and smooth transitions between interaction distances. Next, the paper summarizes previous work such as hand-held indirect pointing devices, laser pointer-style devices, eye tracking, and body and hand tracking. The authors then propose their approach: pointing and clicking using only the hand. They first discuss how to click and clutch without a button, then introduce their system prototype. For clicking techniques, they describe AirTap and ThumbTrigger before settling on their final designs. Relative pointing with clutching is then discussed, and the authors present their methods for evaluating the prototype. In sum, the proposed approach is quite innovative, and the authors outline a lot of future work.

Debarun Das 3:48:41 11/22/2016

"Skinput: Appropriating the Body as an Input Surface": This paper discusses 'Skinput', a technology for using the human skin as an input surface. The technology senses the acoustic transmissions of the human body; more specifically, it resolves the location of a finger tap or flick on the body by studying and analyzing the mechanical vibrations that propagate through it. The main contributions of the paper are i) the description of a novel wearable device for bio-acoustic signal acquisition, and ii) an analysis approach that properly locates a tap on the arm from the mechanical vibrations. In addition, it discusses the limitations of the system. The armband consists of two sensor packages with five sensors each. For experimentation and evaluation, the authors recruited a set of participants and built an SVM to recognize and analyze the vibrations produced by different taps. The results of the experiment were positive. This is an important paper, as it introduces a technology with tremendous future scope. ---------------------------------------------------------------------------------------------- "Distant Freehand Pointing and Clicking on Very Large, High Resolution Displays": This paper does research in the area of pointing and clicking tasks. It mainly discusses freehand pointing and clicking on a very large screen from a distance, used in place of a mouse to control a pointer. The paper starts by discussing previous work in this area and then presents novel techniques for solving the problem at hand. It proposes two clicking techniques, "AirTap" and "ThumbTrigger", and three pointing techniques: "absolute position finger ray casting, relative pointing with clutching, and a hybrid technique using ray casting". The authors then compare and analyze the different techniques; there are several trade-offs for each, and the appropriate technique depends on the situation at hand. In addition, the authors use visual and auditory feedback in the clicking techniques to 'compensate for the lack of kinesthetic feedback typically present when clicking a physical button'. Remarkably, the error rate of these techniques is in a similar range to that of the mouse. Thus, again, this is an important paper that discusses a technology with endless future scope.

Steven Faurie 8:15:17 11/22/2016

Steve Faurie. Skinput: Appropriating the Body as an Input Surface: This paper describes the development of a system that can identify where a user tapped on their arm or hands. Because the system can locate where users tap themselves, it can be used as an input device, with different tap locations corresponding to different actions. The authors relate what they were trying to do to previous work that used tables as an input surface. They note that users don't always have a table handy, but they do always have access to their own body, so being able to use the body as an input surface would be convenient. Another advantage of having people use their own body is that they intuitively know where they are touching; one example the authors give is that it is very easy to touch your thumb to the tip of any of your other fingers without looking. Development of the device included testing its accuracy for different individuals. The device was trained for each individual using an SVM classifier: each individual performed the actions several times, the labeled examples were fed into the classifier, and during testing their inputs were interpreted by the classifier while the authors evaluated how often it was correct. The system was fairly successful. An interesting point is that accuracy could be increased by lumping together groups that the classifier had a hard time distinguishing. The build of the individual also had an effect on accuracy; in general it was harder for the system to distinguish the different inputs on heavier individuals. Another interesting and surprising point was that the system was not so sensitive that it interpreted activity like walking as input, and the running test was also surprisingly good, with relatively few errors. The authors go on to describe several different interface types the system could support. Distant Freehand Pointing and Clicking on Very Large, High Resolution Displays: This paper explores the idea of using gestures to control a large display from a variety of distances. It is an interesting concept, similar to what Microsoft later tried to do with the Kinect. Some of the challenges the authors address include accuracy: the device must be able to select relatively small targets. Acquisition speed means the device must be able to interpret the gestures intended for it quickly and accurately. In the pointing-speed section, the authors explain that you should be able to reach any part of the large display in a single motion; basically, the device should not require the equivalent of picking up your mouse mid-swipe and moving it back to the left side of the mouse pad to continue swiping right. They also focus on simplicity of use and easy transitions between different distances. A unique focus of this paper is the desire to avoid requiring any hand-held pointing device. They used motion-tracking hardware, but noted that computer vision was developing to the point where it might become a viable alternative to the glove with markers on it. They ended up developing a system that tracks the user's motion and translates it into pointer movements; clicking gestures made with a finger are interpreted as clicks on the interface, and a thumb-based clicking variant was implemented as well. The affordances of the system are reinforced by audio and visual cues that help users understand what their gestures mean and are doing.
Tracking cursor movement was another major challenge the authors addressed. One implementation of the pointing used ray casting to draw a line from the index finger to the screen. This technique had many of the same flaws as using a laser pointer: most people have some jitter in their hands when attempting precise pointing from a distance. The authors found that pointing techniques using the whole hand were more accurate, and these techniques also allowed the user to use a hand gesture to disengage from cursor control. They found the relative pointing techniques to be more accurate than the one based entirely on ray casting from the index finger. In the end they implemented a system with the abilities of a single-button mouse. The authors note that the expressiveness of the system could be increased by allowing it to recognize other pointing gestures along with body movements. They also note that future research could investigate whether eye tracking, or tracking of the non-dominant hand, could be used to select a general area of the display, letting the user then make fine-grained movements with their dominant hand.
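The relative pointing with clutching that this critique (and several above) describes is essentially mouse-style pointing without the mouse: hand motion drives the cursor only while the hand is engaged, and closing the fist disengages it so the hand can be repositioned. A small sketch of that control loop follows; the open/closed hand signal, the gain value, and the screen size are illustrative assumptions rather than parameters from the paper.

```python
# Sketch of relative pointing with clutching: an open hand moves the cursor,
# a closed fist "clutches" (the cursor stays put while the hand repositions).
class RelativePointer:
    def __init__(self, gain=800.0, screen=(1920, 1080)):
        self.gain = gain                      # cursor pixels per metre of hand travel
        self.screen = screen
        self.cursor = [screen[0] / 2, screen[1] / 2]
        self._last_hand = None                # last hand position while engaged

    def update(self, hand_xy, hand_open):
        """Feed one tracking frame: hand position (metres) and open/closed state."""
        if not hand_open or self._last_hand is None:
            # Clutched, or first frame after re-engaging: don't move the cursor.
            self._last_hand = hand_xy if hand_open else None
            return tuple(self.cursor)
        dx = hand_xy[0] - self._last_hand[0]
        dy = hand_xy[1] - self._last_hand[1]
        self.cursor[0] = min(max(self.cursor[0] + self.gain * dx, 0), self.screen[0])
        self.cursor[1] = min(max(self.cursor[1] + self.gain * dy, 0), self.screen[1])
        self._last_hand = hand_xy
        return tuple(self.cursor)
```

Because the gain decouples hand travel from cursor travel, small, comfortable hand movements can cover a wall-sized display, which is exactly the trade-off against ray casting's speed that the evaluations above describe.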