Mobile Interfaces - 1

From CS2610 Fall 2017



Reading Critiques

Jonathan Albert 17:12:24 10/7/2017

TinyMotion: This paper details a software-based system for detecting movement via a cell phone's camera. It explains how the system works, how it was tested, and its performance in those tests. I think using a camera and image processing to control an otherwise static screen was a novel approach to providing alternate input modalities for phones. I notice, however, that touch-based (and, if I recall correctly, multi-touch) systems were in-vogue research topics around the time this paper was published. The paper's system was understandably focused on expanding the usefulness of existing technologies; nevertheless, it seems to provide limited value for the tasks examined in its exemplar applications. In the case of mobile gesturing for character recognition, the pointing task seems to involve much more physical movement when compared to mouse- or touch-based input methods. Correctives such as C/D gain could theoretically be employed to mitigate this, but the given phone's screen is too small for pointer acceleration to have any positive effect--whereas deceleration would decrease relative pointing performance. While the method undertaken by the authors was intriguing, and while the system could be used for other unforeseen tasks, I think the system's usefulness was too limited. ---- Sensing: This paper explores the use of sensors to enhance mobile devices, such as proximity, touch, and tilt sensors. It details how these sensors are used, some applications thereof, and associated problems. The discussion of rotation detection and screen adaptation was intriguing. Not only did their auxiliary sensors determine the desired orientation when the phone screen was not perpendicular to the ground, but their interface adjusted how its buttons responded to inputs depending on the mode. They account for putting the device on a flat surface while maintaining its orientation, and even encountered a user who more or less requested rotation lock.
With how prevalent this technology is today, it was interesting to see how so many features attendant with modern devices were implemented in this early prototype. The authors mentioned that making multiple inputs cohere is "hard," yet implied that it may be beneficial to add more modalities to the system. While each of these features, such as tilt detection, worked well on its own, I wonder if adding more functions would decrease usefulness. We have read in prior papers about a user's limited memory and attention span, as well as the need to make functions visible and their affordances clear. Yet, the authors mention that users did not know certain features existed until told. While incorporating many modalities can create improvements, especially as virtuosity increases, the nature of phones nowadays often has users learning the same tasks over again when they purchase a new device--or even when they update its OS. Adding more complex interactions, if they are not intuitive, will just be too much for the average user.
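The C/D-gain corrective mentioned in the critique above can be made concrete with a small velocity-dependent transfer function. This is only an illustrative sketch of pointer acceleration, not anything from either paper; all constants and function names are assumptions.

```python
def cd_gain(device_speed, base_gain=1.0, accel=0.5, speed_floor=2.0):
    # Velocity-dependent control-display gain: slow movements map nearly
    # 1:1 for precision, while fast movements are amplified (pointer
    # acceleration). All constants here are illustrative only.
    if device_speed <= speed_floor:
        return base_gain
    return base_gain + accel * (device_speed - speed_floor)

def displayed_delta(device_delta, device_speed):
    # Display-space movement = device-space movement scaled by current gain.
    return device_delta * cd_gain(device_speed)
```

On a small screen, raising `accel` buys little (there is nowhere for the cursor to go), while lowering gain below 1.0 slows pointing, which is the trade-off the critique points out.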

Kadie Clancy 14:54:07 10/11/2017

Sensing Techniques for Mobile Interaction: In this article, the authors create a prototype for investigating context awareness via sensing techniques by integrating a set of sensors into a handheld device. Sensors were added to detect touch, tilt, proximity, and scrolling to support background interaction with the device. Supported interactions included: recording memos when the device is held up like a phone, automatic switching between landscape and portrait mode, automatically turning the device on when picked up, and tilting the screen to scroll. The authors argue that simple and cheap sensors play an important role in mobile interfaces, but are not a panacea; features should be carefully designed. The authors conduct a usability study to gauge user reactions to their techniques, where the results were largely positive. I think it is interesting that although this paper was published in 2000, and the authors used a crude prototype for testing, sensors with these capabilities are present in modern smartphones. While not all features presented in the prototype are realized in today’s smartphones, the authors correctly predicted that contextual awareness via sensing will be relied upon in future interfaces for mobile devices. Camera Phone Based Motion Sensing: Interaction Techniques, Applications and Performance Study: This paper presents TinyMotion, a software approach that measures cell phone movement in real time by using the built-in camera. The authors implement several applications based on TinyMotion including Motion Menu, Mobile Gesture, and several camera games. The authors then conduct an informal study, receiving largely positive results concerning background and illumination conditions and user satisfaction. The authors then conducted a formal study to quantify human performance using TinyMotion as a basic input control sensor.
Study results determined that TinyMotion can reliably detect movement under most conditions, and that TinyMotion-based pointing follows Fitts’ Law. The results of the study also revealed the scope of applications that can be built on TinyMotion. For example, users can use Vision TiltText faster than MultiTap, and while using a camera phone as a handwriting device is feasible, it may not be directly practical as a user input method. This paper is interesting for a few reasons. First, the authors provide an in-depth study of the theoretical significance of their software, via Fitts’ Law, but also create and test applications utilizing the camera as an input device to show what applications are useful and feasible for this new input method. By demonstrating that TinyMotion-based pointing follows Fitts’ law, the authors provide a benchmark for future improvement in hardware.
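The Fitts’ law benchmark discussed above can be sketched in a few lines. The intercept and slope values below are illustrative placeholders, not the regression coefficients reported in the paper:

```python
import math

def fitts_mt(a, b, distance, width):
    # Shannon formulation of Fitts' law: MT = a + b * log2(D/W + 1),
    # where a (intercept) and b (slope) are fit per device by regression.
    return a + b * math.log2(distance / width + 1)

def throughput(distance, width, movement_time):
    # Index of difficulty divided by movement time (bits per second);
    # this "bandwidth" is what lets different input devices be compared.
    return math.log2(distance / width + 1) / movement_time

# Illustrative parameters: a = 0.1 s, b = 0.2 s/bit; ID = 3 bits.
mt = fitts_mt(0.1, 0.2, distance=7, width=1)
```

Because the model holds for TinyMotion pointing, improved hardware (e.g. a faster camera) can be benchmarked simply by re-fitting `a` and `b`.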

Tahereh Arabghalizi 15:26:37 10/11/2017

Sensing Techniques for Mobile Interaction: This paper discusses methods of interaction with PIMs using sensors embedded in those devices, such as touch, tilt and proximity sensors. The main idea is to change the way we interact with mobile devices and make it simpler and more convenient. The sensors used in this research include a two-axis accelerometer to measure tilt for changing viewing modes, infrared sensors for proximity, and touch sensors for automatic power-on. The authors suggest that the hybrid design of using these sensors would be the future of ubiquitous devices and, as we know today, such design is used very broadly; for instance, proximity sensors are used in touch devices like cell phones to lock the screen while talking on the phone. The goal of creating such devices is to provide users with a more comfortable environment and feeling. While interactive sensing techniques seem to provide many benefits, they also increase opportunities for poor design because the strengths and weaknesses in the design space are not as well understood as traditional GUI design. We need experiments to quantify user performance with these techniques, and we need longitudinal studies to determine if users may find sensing techniques cool at first, but later become annoyed by false positives and negatives. ------------------------------------------------------------------------------------------------ Camera Phone Based Motion Sensing: Interaction Techniques, Applications and Performance Study: In this paper, the authors proposed a method called TinyMotion that measures cell phone movements in real time by analyzing images captured by the built-in camera. Through an informal evaluation and a formal user study they found that TinyMotion can detect camera movement reliably under most background and illumination conditions.
They also found that target acquisition tasks based on TinyMotion follow Fitts’ law, and that the Fitts’ law parameters can be used to benchmark TinyMotion-based pointing tasks. Another outcome was that users can use Vision TiltText to input sentences faster than MultiTap with a few minutes of practice. Using the camera phone as a handwriting capture device for large-vocabulary, multilingual, real-time handwriting recognition on the cell phone is also feasible based on this case study, and finally, TinyMotion-based gaming is available for the current generation of camera phones. In conclusion, the work presented in this paper shows a novel idea on making use of mobile phones' sensors and provides a nice interface for interacting with mobile devices.

Ahmed Magooda 16:18:18 10/11/2017

Camera Phone Based Motion Sensing: Interaction Techniques, Applications and Performance Study: In this paper the authors propose a camera-based framework they call “TinyMotion” to detect various user movement actions and use them in many applications. The main idea of the framework is using image processing algorithms to detect the movement direction; based on that, the actions can be mapped to different outcomes depending on the application. The authors proposed different kinds of applications, like gaming, pointing, and writing-aid applications. The authors then performed an informal evaluation in the form of measuring users' satisfaction after using multiple applications, and a formal one which measured the time and complexity of doing a number of tasks using the TinyMotion camera-based framework. I found performing two different kinds of evaluation a strength of this paper; however, the number of samples is still a factor that needs to be considered if you want an intuition of how the results will generalize. I found the paper interesting, and I think the idea of using the camera was good, especially before phones began depending on accelerometers and gyroscopes. -------------------------------------------------------------------------------------- Sensing Techniques for Mobile Interaction: In this paper the authors shift gears from using the camera to using sensors to perform interface-related tasks. While the previous paper focused on using the camera, this paper focuses on some ways to use sensors to perform tasks like memo recording, changing screen orientation by switching device orientation, and scrolling by tilting the device.
In these applications the authors provided an informal evaluation measuring users' satisfaction and the number of errors in some tasks; while the sample size is small (7 persons), it shows preliminary results that users actually liked using sensors to record voice memos by holding the device in a mobile speaking position. The authors argue that while cheap sensors can actually produce good results, it is still hard to compete against traditional UIs, as these sensors can be tricked by unexpected cases.

Sanchayan Sarkar 19:11:39 10/11/2017

CRITIQUE 1 (Sensing Techniques for Mobile Interaction):----------> In this paper, the authors explore context-sensitive interface design by using three sensors: a linear accelerometer, capacitive touch sensors and proximity sensors. They produce novel interactive mechanisms: voice memo recording, switching between landscape/portrait orientations and tilt-scrolling through menus to demonstrate the application of interactive computing. One of the main strengths of the paper is developing the scenarios for the usability testing of the applications. In the voice memo detection, the authors introduce a strong non-visual feedback mechanism by giving two sharp beeps to indicate the end of recording. It is also interesting to notice the interplay of the proximity and tilt sensors in indicating recording. When it came to evaluation, a striking observation was that even though the statistical difference between the ‘manual’ and ‘sensed’ conditions was marginal, the response from the users was overwhelmingly positive towards the ‘sensed’ interaction. This shows the huge difference when it comes to evaluating interactive devices, as quantitative measures are not good enough. Another feature I liked was the simple application of a FIFO to retain the orientation and solve the “Put-Down Problem” for landscape/portrait detection. The paper uses several time thresholds for inferring passive acknowledgements from the user. One potential flaw, in my opinion, is that the authors did not give the users time to practice, and therefore user habits are not accounted for. It would be interesting to see how a practiced user would affect the qualitative responses when it came to this mode of detection. However, one must commend the paper for explicitly mentioning each and every false positive scenario and how the method could deal with them. It is essential since the solutions to every false positive scenario are not the same.
The final novelty of the paper lies in the use of screen orientation and touch to control the power of the device. It presents and explores several challenges that can arise for this application. The only negative I found in these usability scenarios is the lack of a comprehensive quantitative evaluation model, which makes the usability evaluation harder. Nevertheless, since the authors explore a vast number of challenges tackling distinct false positive scenarios, this paper is critical for a deeper insight into mobile interaction and developing new application scenarios. ******************************************************************************* CRITIQUE 2 (Camera Phone based Motion Sensing: Interaction Techniques, Applications and Performance Study) In this paper, the authors propose a novel method for mobile phone interactions by using camera movement as an input sensor. The authors not only describe the method for detecting motion as a function of the relative positions of a macro-block in consecutive frames but also create interesting validation scenarios like target acquisition, text input and menu selection to show the relevance of the method compared to the usual state-of-the-art methods. One of the strengths of this method is that it shows remarkably high accuracy under different illumination conditions. I find this interesting because I have worked on a project in face recognition where detecting gradient patterns under different lighting conditions was a big challenge. So the fact that TinyMotion is robust to this factor impresses me. Another strength of this method is its high level of efficiency. Image processing problems are computationally intensive and this method does well in mitigating that. Another interesting aspect is the idea of Vision ‘Tilt-Text’. In my opinion, it is quite novel as it introduces more natural mappings into interactions.
However, it will be interesting to see its relevance to today’s virtual keyboards, which are essentially QWERTY in nature. I feel that even though the method was quite novel, the introduction of touch-based smartphones makes Vision TiltText a bit irrelevant. However, even then, the idea of vision-based tilt mappings can be applied to other areas like interacting with augmented reality. The paper also does a great job in its evaluation scenarios. It is interesting to see that all the tasks of pointing, text input and menu selection have the same order of error rates as expected from Fitts’ Law. I believe with today’s processing power, the sampling rate can be increased to improve the transmission rate. The paper also performed a rigorous comparative analysis and places the method in rich context. This can give a deeper insight for potential researchers who are thinking of innovating motion sensing in phone interactions.

Mingzhi Yu 20:49:43 10/11/2017

Sensing Techniques for Mobile Interaction: This paper discussed the sensing techniques that Microsoft used in their mobile design and setting. Judging by the publication date, I believe this is also one of the earliest technologies that used multi-interface design in a mobile phone. Before the smartphone era, I still remember that we used intelligent phones that had an actual keyboard and a small colorful screen. At that time, owning a BlackBerry phone seemed very trendy and fancy. After the iPhone arrived and occupied the mobile phone market, we started to get used to smartphones with a touchscreen and gravity sensor. However, Microsoft was one of the pioneers brave enough to try this kind of multi-sensor technique. Even though nowadays the Windows phone seems to have lost the battle in the mobile phone market compared with the iPhone and Android, the motivation to create a new kind of mobile phone with multiple sensors is very insightful and brilliant. In this paper, the authors reveal many design details and cite many related works. I honestly have no knowledge in the field of EE and do not have much intuition about each of the sensors they talked about. However, I do understand the concepts, have a feeling for where these concepts come from, and see how they could be beneficial for users. In the article, they also said that acceptance of this new technique by existing users is worrisome. But if we look at today's mobile market and how users welcome new technology and design, that is enough to prove the point. Camera Phone Based Motion Sensing: Interaction Techniques, Applications, and Performance Study: Although many previous works have discussed using sensors as the mobile phone input interface, this paper came up with a system that replaces the sensor with the camera itself. The authors designed a gesture/motion detection system using the mobile's camera and evaluated it by measuring how difficult the tasks are. This is innovative work.
However, I was wondering where the camera is located on the mobile phone. If the camera is on the back of the phone, does Fitts' law still apply, since the hand gesture is twisted compared to the normal position? For example, for the texting task, I don't think moving above a small webcam is much easier than moving above the screen. In general, the idea of this paper is innovative and interesting.

Spencer Gray 22:14:10 10/11/2017

In the first paper, Sensing Techniques for Mobile Interaction, the authors used proximity range sensors, touch sensitivity, and tilt sensors to enable special features for the user. Using an informal study, the authors measured how much the users enjoyed these special features, which aimed to reduce the cognitive load of common interactions with mobile devices. This paper is significant in the HCI field because it has led to the sensor-rich smartphones that fill the pockets of millions of users today. These researchers opened the door into the vast field of features that can enrich our interactions with our handheld computers. If I were to rewrite this paper today, I would use a much larger and more formal study. Even as the authors stated, many of the features that they proposed may seem fascinating at first, but could become annoying or distracting. A longer, more formal study could show how certain sensor-based features impact our daily interaction with these devices. In the second paper, Camera Phone Based Motion Sensing, the authors created a software-only approach to using the camera as an input device for pointing, text entry, and gaming. The result, TinyMotion, was well received in both the formal and informal studies that were performed. To me, the most significant part of this approach is that it was software only. It shows us that there are many undiscovered ways to interact with the existing hardware on the smartphones that we carry. It should be used as motivation to innovate on our smartphones using the pieces that we already have put together. If I were to rewrite this paper today, I would not include the handwriting part of TinyMotion. Using the camera to draw characters one by one does not have any practical applications. While the computer vision aspect of it was interesting, it will never be an efficient way to enter text, especially since all smartphones today come with touch screens.
It is much quicker to simply draw a character with a finger than to use the camera.

MuneebAlvi 22:59:02 10/11/2017

Critique for Sensing Techniques for Mobile Interaction Summary: This paper shows early attempts by Microsoft to use tilt, touch, and IR sensors to provide useful functionality to mobile device users. Some of this functionality includes tilt scrolling, switching the screen orientation, and recording voice memos. This paper shows that many features of modern smartphones began over 15 years ago. Every iPhone and Android today has many of the features described in the paper. However, due to the advancement of technology, modern phones obviously are more capable due to having more sensors. Modern phones use their sensors, such as proximity sensors, to dim the screen when the user holds the phone up to their face. Other phones use sensors such as fingerprint scanners to unlock access to the phone or make online payments. The latest Samsung phones can even scan the user’s iris to create a sort of password. The upcoming iPhone X will have an integrated 3D sensor that can be used to detect 3D objects such as people’s faces. I tried some of the same scenarios that were described in the paper. I couldn’t test all of them because many phones today don’t include all the same sensors for the same purpose. For example, many phones today do not have touch sensors on the side to determine if the user is holding the phone. One test I was able to replicate was to put my phone into landscape orientation, lay it flat, and then tilt it to the portrait orientation. The researchers in the paper said that the phone should retain the last stable orientation before the phone was laid flat. This means the phone should stay in landscape. However, my iPhone went to portrait mode. This leads me to believe that this finding by the researchers was either not tried by many companies or companies found that the method was not good for users.
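The put-down behavior tested above (retain the last stable orientation when the device is laid flat) can be sketched as a tiny filter over tilt readings. This is a hypothetical simplification, not the paper's implementation; the window size, default orientation, and class name are all assumptions.

```python
from collections import deque

FLAT, PORTRAIT, LANDSCAPE = "flat", "portrait", "landscape"

class OrientationFilter:
    """Keep the last stable non-flat orientation when the device lies flat."""

    def __init__(self, window=3):
        # Short FIFO of recent tilt readings; an orientation must fill the
        # whole window before it is treated as stable (window size assumed).
        self.recent = deque(maxlen=window)
        self.shown = PORTRAIT  # assumed startup default

    def update(self, tilt_reading):
        self.recent.append(tilt_reading)
        stable = (len(self.recent) == self.recent.maxlen
                  and len(set(self.recent)) == 1)
        # Only a stable, non-flat reading changes the displayed mode;
        # a flat reading leaves the previous orientation on screen.
        if stable and self.recent[0] != FLAT:
            self.shown = self.recent[0]
        return self.shown
```

Under this logic the landscape-then-flat test described above would keep the display in landscape, which is exactly the behavior the critique found missing on a modern iPhone.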
Critique for Camera Phone Based Motion Sensing: Interaction Techniques, Applications and Performance Study Summary: This paper introduces a new way to sense tilting, using the camera on the phone to process the differences in images and detect movement direction. When I started reading about TinyMotion, I immediately thought about the discussion we had in class about the computer mouse. In class we talked about how the mouse uses a small camera to capture images and uses that to sense movement direction. As I kept reading, the paper also mentioned this similarity. Therefore, I appreciate taking such a concept and applying it at a bigger scale, such as controlling gestures and detecting movements on a phone. The paper also mentions that TinyMotion is used to replace the accelerometers that many phones at the time did not have. Most phones today do have such sensors. However, I agree with the paper that neither accelerometers nor TinyMotion are good enough on their own. Accelerometers have slight inaccuracies, and TinyMotion does not work in the dark or when pointed at surfaces that have no distinguishable patterns. Therefore, I believe combining the two could lead to more precise and accurate motion sensing.

Xiaoting Li 23:34:23 10/11/2017

1. Camera Phone Based Motion Sensing: Interaction Techniques, Applications and Performance Study: In this paper, the authors introduce a pure software approach named TinyMotion by presenting several interactive applications based on this approach. Through analyzing image sequences captured by the built-in camera, this software approach is able to detect a mobile phone user’s hand movement in real time. It is interesting to use the camera to indirectly capture the user’s hand movement and help users input text into mobile phones in an easier way. Besides the interesting approach introduced by the authors, the method that the authors used to evaluate the approach is also impressive. The authors first used an informal evaluation and got promising results from it. Then the authors used a formal evaluation to get quantitative results. I think this is a more efficient and cost-effective way to carry out evaluation experiments in research work. Carrying out an informal evaluation is not that expensive. If the result from the informal evaluation is not promising, there’s no need to continue to the formal evaluation. In this way, we can carry out evaluation more efficiently and more cost-effectively. 2. Sensing Techniques for Mobile Interaction: In this paper, the authors introduce how implementing sensing techniques in mobile phones can bring special interactive features and functionalities. The authors mainly introduce three sensors: touch sensors, proximity sensors and a tilt sensor. An informal evaluation was carried out to measure the performance of these sensors when added to handheld devices. In the last section of the paper, the authors introduce how to use these sensors for power management. The sensors can detect the user’s behavior and turn the handheld device on or off automatically according to the detected behavior. However, is it really necessary to detect the user’s behavior and turn off the device automatically for them?
In real life, most of the users don’t turn off their mobile devices. Therefore, I don’t think it’s necessary to add this feature to the handheld device.

Xingtian Dong 23:42:38 10/11/2017

1. Reading critique for ‘Sensing Techniques for Mobile Interaction’: After reading this paper, I think it is amazing. I saw the prototype of today’s smartphone. It is really amazing that the authors add multiple sensors to a device to make it more powerful and environment-aware. Today, we have more kinds of sensors and they are much more powerful. It is a chance to think about how we can add them to mobile devices or tablets. Another interesting part of the paper is that the authors designed experiments to test usability. The comparison among Control, Sensed and Manual gave a clear image of how much more powerful the new technique is than existing ones. And the analysis of why the new technique is more powerful is also very enlightening. If we can use some technique to reduce the operations needed on a device, it will be more acceptable to users. 2. Reading critique for ‘Camera Phone Based Motion Sensing: Interaction Techniques, Applications and Performance Study’: I think this paper is also very enlightening. From the last paper I learned that we should add more sensors to devices. But this paper uses an existing sensor for a different purpose. I think it is because we have more advanced algorithms that we can manipulate the input data to get something different. But to achieve this, it is necessary to learn more advanced algorithms and try to combine them with existing devices. Another important part is how the authors design experiments to examine the new technique and how to evaluate it.

Charles Smith 0:28:19 10/12/2017

On: sensing techniques The authors of this paper attach sensors to a mobile device to provide applications with ‘context’. This is done with a touch sensor, a tilt sensor, and a proximity sensor. Smart devices now rely on this technology for almost every interaction. This is important to give the users a great experience. Users can tilt the display and see a different view. Phones also rely on knowing when they’re being held and when they’re not to know when to conserve battery power more aggressively. Devices today use many more sensors than the ones outlined in the paper. There are more axes on the tilt sensors, GPS is used, NFC readers, and many more. These all provide more context to the device and allow for multi-modal input. On: Camera phone This paper is about using the camera as an interface device. Users could use the camera of their flip phones to play games. Using the camera as an input device is still not widely used today. The paper identified a large drawback, one that is still not solved in today’s hardware: the greatly reduced battery life. Users of modern smartphones already complain of short battery lives, especially as the device reaches its end of life. Using an input on the back of the phone is an intriguing idea. Some Android-based mobile OSes even implement this using the fingerprint reader on the back of some smartphones, though due to the non-standardized hardware implementation, many applications do not take advantage of this method of input. This could be a common input method, should hardware manufacturers standardize this practice.

Krithika Ganesh 2:02:46 10/12/2017

Sensing Techniques for Mobile Interaction: This paper stresses the importance of contextual awareness via sensing by stating that one can deliver a simple and pleasant user experience, while still allowing direct control of a mobile device, through the careful use of sensors. The author is successful in obtaining the context of interaction by parsing the sensor readings from the two accelerometers, an IR receiver/transmitter, and a couple of touch sensors used in the prototype phone. The author uses the design space to explore designs that have been tried before and also to come up with innovative ideas. The voice memo recording was triggered by the position of the phone and its proximity, allowing users to perform visually and cognitively demanding tasks while they were recording a message, with specified sounds as a means of feedback. The auto-detection of phone orientation allowed users to change the orientation of the phone's display based on its position, which made the interaction more natural compared to conventional ways of altering display orientation. The tilt-sensitive scrolling enables users to scroll documents through positioning of the phone. The power management turned the display of the phone on or off based on the phone’s position, orientation, and the user's proximity, input, and duration of input. Today, most of our phones have sensors like a gyroscope, accelerometer and proximity sensor, which help capture the user's context and thereby help the mobile phone adapt to the user’s interaction to suit the current task. It would be interesting to extend the sensing technology to all the objects used by the user (IoT), thereby capturing more contextual data.
------------------------------------------------------------------------------------------------ Camera Phone based Motion Sensing: Interaction techniques, applications and performance study: This paper proposes a method to detect the user's hand movement in real time by capturing image sequences using the built-in camera. What is interesting is that this paper takes a completely different approach compared to the one above. Instead of adding sensors to a phone (a hardware approach), TinyMotion takes an innovative approach of using the existing camera (hardware) and adding software to the phone to adapt to the interaction of the user, making it more natural. The approach that the authors take to test TinyMotion is indeed very impressive. Two rounds of testing (informal and formal) are performed with different users. The informal round uses benchmarks of motion detection in different environments and initial user impressions, and the formal round has a larger number of test subjects examine the features of TinyMotion after going through a training session. The tasks given to the users varied from easy tasks such as pointing, menu selection and text input to complex tasks such as playing games. The success of TinyMotion and its performance is proven by the evaluation, whose pointing results follow Fitts' law. I really like the way the new features were added to the camera, which motivates us to add more functionality (software) to existing hardware.

Ronian Zhang 3:12:26 10/12/2017

Camera Phone Based Motion Sensing: Interaction Techniques, Applications and Performance Study: In this paper, the authors built software that detects the user’s hand movement in real time using the phone’s camera. The detection performs well (it works in real-world environments). The applications lie in 4 areas: target pointing (follows Fitts’ law), menu selection (not able to reduce selection time), text input (no speed improvement, but efficient and interesting to use), and also complex applications like games (some degree of inconsistency with the conceptual model, which could be improved by adapting to the user’s conventions). The paper concentrated on experiments, evaluation methods and evaluation results, which makes the work of the paper more convincing. The reflection on user feedback and the comparison with other methods (pointing out its own weaknesses and limitations) made the paper very objective. I believe if the experiments could be conducted with more people (not limited to college students), the feedback would be more comprehensive and more improvements could be made. In all, the software is real-world based; it’s usable (in terms of usage scenario, recognition speed, battery consumption) and efficient (recognition rate, easy to learn). ————————————————————————————— Sensing Techniques for Mobile Interaction: In this paper, the authors introduced different kinds of sensing techniques on mobile devices, which include tilt sensors (can’t respond to rotation) and proximity sensors (perform badly in sunlight, energy-consuming). The authors also built a prototype which had a proximity range sensor, touch sensor and tilt sensor, and implemented a software context server which could translate the raw sensor data into logical context. They conducted experiments on voice memo detection, landscape/portrait display mode detection and tilt scrolling.
Even though they are common in today's daily life, the thoughts and models from 17 years ago are truly pioneering. The integration of those sensors made the functions usable. The author also pointed out that sensor technologies can't offer a panacea for user interfaces: UI design actually plays an important role (if we look back today, this is real). Even though sensor fusion is a very popular topic nowadays, I believe the usage of basic sensors (like the camera and microphone) hasn't reached its full capacity; they are capable of doing more jobs.
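The portrait/landscape detection and the lay-it-flat-on-a-table case mentioned in the critique above can be sketched roughly as follows. This is a minimal illustration, assuming made-up angle thresholds and a hypothetical function name, not the paper's actual values or code.

```python
# Hypothetical sketch of tilt-based orientation selection, in the spirit of
# Hinckley et al.'s portrait/landscape detection. The thresholds below are
# illustrative assumptions, not the paper's exact parameters.

def select_orientation(tilt_lr_deg, tilt_fb_deg, current="portrait"):
    """Pick a display mode from left/right and forward/back tilt angles.

    If the device is lying nearly flat, keep the current mode so that
    putting it down on a table does not flip the screen.
    """
    FLAT = 10  # below this tilt on both axes, treat the device as flat
    if abs(tilt_lr_deg) < FLAT and abs(tilt_fb_deg) < FLAT:
        return current  # flat on a surface: preserve the orientation
    # Whichever axis is tilted more decides the display mode.
    if abs(tilt_lr_deg) > abs(tilt_fb_deg):
        return "landscape-left" if tilt_lr_deg > 0 else "landscape-right"
    return "portrait"

print(select_orientation(5, 3))    # nearly flat: keeps the current mode
print(select_orientation(40, 10))  # strong left/right tilt: landscape
```

The flat-device branch mirrors the behavior the critique describes: orientation is preserved when the device is set down, rather than being recomputed from a meaningless near-zero tilt.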

Yuhuan Jiang 4:18:50 10/12/2017

Paper Critiques for 10/12/2017 == Sensing Techniques for Mobile Interaction == This paper discusses various sensing techniques in handheld mobile devices. The major argument of the authors is that some adaptations must be made in order to integrate interactive sensing techniques into mobile systems, because there are more factors in a mobile setting that can disrupt the perception of the signals, such as tilting or screen orientation (landscape, portrait). The authors developed prototypes which revealed the power of sensor fusion. Sensor fusion refers to the use of aggregated data from multiple sensors. For example, voice memo detection fires false positives or false negatives when only the device tilt angle is used. The authors further argue that a hybrid design integrating sensors with traditional techniques may be more practical. == Camera Phone Based Motion Sensing: Interaction Techniques, Applications and Performance Study == In this paper, a system named TinyMotion is introduced. TinyMotion is a software solution for detecting the user's hand movements by analyzing the image sequence captured by the camera. The algorithm of TinyMotion consists of four major steps. First, it converts each image captured by the camera to another color space (24-bit RGB to 8-bit grayscale). Second, the image is subsampled with an 8x8 sampling grid. Third, a full-search block matching algorithm is applied to adjacent frames to estimate motion. Lastly, the relative movements are accumulated to provide an absolute measurement from a starting position. In the informal evaluation, the authors found that TinyMotion does not work against certain backgrounds, such as completely dark rooms or surfaces without patterns. In the formal evaluation, Fitts' law was applied in the linear regression of the target acquisition/pointing task. The bandwidth was found to be low; the authors believe the cause is the low sample rate of the camera.
Overall, the system described by the paper is powerful in that it can even be used as an input method for text. However, it also suffers from problems such as environmental lighting and battery efficiency issues.
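The four TinyMotion steps described in the critique above can be sketched in Python. This is a simplified illustration of the pipeline's structure (grayscale conversion, grid subsampling, full-search block matching by sum of absolute differences, accumulation); the frame sizes, sampling step, and search range are assumptions for the sketch, not the paper's exact parameters.

```python
# Simplified sketch of the TinyMotion pipeline: the function names and
# parameters are illustrative, not taken from the paper's implementation.

def to_gray(rgb_frame):
    """Step 1: collapse 24-bit RGB pixels to 8-bit grayscale (simple mean)."""
    return [[(r + g + b) // 3 for (r, g, b) in row] for row in rgb_frame]

def grid_sample(gray, step=8):
    """Step 2: subsample on a coarse grid to cut computation and memory."""
    return [row[::step] for row in gray[::step]]

def block_match(prev, cur, search=2):
    """Step 3: full-search block matching. Try every (dx, dy) shift within
    the search range and keep the one with the lowest sum of absolute
    differences (SAD) between the two subsampled frames."""
    h, w = len(cur), len(cur[0])
    best, best_sad = (0, 0), float("inf")
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            sad = 0
            for y in range(max(0, -dy), min(h, h - dy)):
                for x in range(max(0, -dx), min(w, w - dx)):
                    sad += abs(cur[y][x] - prev[y + dy][x + dx])
            if sad < best_sad:
                best_sad, best = sad, (dx, dy)
    return best

def track(frames):
    """Step 4: accumulate per-frame shifts into an absolute position."""
    pos_x = pos_y = 0
    prev = grid_sample(to_gray(frames[0]))
    for frame in frames[1:]:
        cur = grid_sample(to_gray(frame))
        dx, dy = block_match(prev, cur)
        pos_x, pos_y = pos_x + dx, pos_y + dy
        prev = cur
    return pos_x, pos_y
```

The accumulation in `track` is what turns relative frame-to-frame motion into the absolute measurement the critique mentions, which is what makes pointing and gesture input possible.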

Amanda Crawford 6:48:50 10/12/2017

• Sensing Techniques for Mobile Interaction. Hinckley, K., Pierce, J., Sinclair, M., and Horvitz, E., ACM UIST 2000, pp. 91-100. Sensing Techniques for Mobile Interaction gives us an introduction to the Cassiopeia prototype built for developing new mobile sensing techniques. This study was inspired by Bill Buxton's push for developing contextually aware ubiquitous computing devices. Going beyond the core principle of contextual awareness, the researchers sought new ways to enhance the intimate interaction between devices and their users. Taking one of the most natural and common forms of human interaction, gestures, the researchers explore how to use sensors to decipher these movements as interactive input methods. Some of these gestures were used to create memos, decide the orientation of the screen view, and alert the phone that someone was near so it could make itself available by turning itself on. We can see how this device is able to monitor its presentation techniques on an autonomous level. Many of these ideas would not have been possible without the background interaction architecture. The integration of some of the simplest sensors shown in this article, such as tilt, proximity, and touch, provides a groundbreaking framework for our current line of mobile devices today. One of the impressive points of this paper was the measurement of the user experience and the error correction. Given that deciphering between gestures was a challenging task at that moment, being able to handle failure gracefully should be at the forefront; it is key to keeping the intimate relationship between the device and the user. • Camera Phone Based Motion Sensing: Interaction Techniques, Applications and Performance Study. Wang, J., Zhai, S., and Canny, J., ACM UIST 2006, pp. 101-110. In the paper Camera Phone Based Motion Sensing: Interaction Techniques, Applications, and Performance Study, we are introduced to a computer vision technique called TinyMotion.
Inspired by CyberCode's 2D barcode detection and processing, TinyMotion detects a user's hand motion using the camera. This research seeks to reveal new methods of interaction with the then-developing mobile device. At the time, the phone was moving beyond the purpose of voice calling, and the need for new and faster methods of interaction was growing. One of the solutions proposed in TinyMotion's performance study was to use the camera and spatial detection to create a new pointing mechanism and text input. Through this experimental design, it was observed that this interaction technique abides by Fitts' law. Due to the low-quality camera and processing, the pointing performance was relatively low. However, the promise of this study is that growth in camera performance will result in increased performance of the TinyMotion algorithm.
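The Fitts'-law check that this and other critiques mention can be sketched as a small regression: compute the index of difficulty for each pointing trial and fit movement time against it. The trial numbers below are made up purely to exercise the fit; they are not data from the TinyMotion study, and the function names are this sketch's own.

```python
# Hedged sketch of a Fitts'-law analysis: MT = a + b * ID, with
# ID = log2(D/W + 1) (the Shannon formulation). Throughput is often
# reported as 1/b in bits per second.

import math

def index_of_difficulty(distance, width):
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1)

def fit_fitts(trials):
    """Ordinary least-squares fit of movement time against ID.

    trials: list of (distance, width, movement_time_seconds) tuples.
    Returns (a, b): intercept and slope of MT = a + b * ID.
    """
    ids = [index_of_difficulty(d, w) for (d, w, _) in trials]
    mts = [mt for (_, _, mt) in trials]
    n = len(trials)
    mean_id, mean_mt = sum(ids) / n, sum(mts) / n
    b = (sum((i - mean_id) * (m - mean_mt) for i, m in zip(ids, mts))
         / sum((i - mean_id) ** 2 for i in ids))
    a = mean_mt - b * mean_id
    return a, b

# Illustrative trials only: (target distance px, target width px, time s).
trials = [(100, 20, 0.9), (200, 20, 1.2), (400, 20, 1.5), (400, 10, 1.8)]
a, b = fit_fitts(trials)
print(f"MT = {a:.2f} + {b:.2f} * ID; throughput ~ {1/b:.2f} bits/s")
```

A low throughput (1/b), as the study reportedly found for camera-based pointing, would show up here as a steep slope b.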

Akhil Yendluri 8:23:38 10/12/2017

Sensing Techniques for Mobile Interaction
In this paper the authors integrate sensors such as infrared, touch sensitivity, and tilt sensors into a mobile device and explore what can be accomplished with them. They used them to perform new interactions such as recording a voice memo, scrolling a page using tilt, switching between portrait and landscape mode, and detecting whether the device has been picked up using the proximity sensor. The authors feel that a mobile interface must be aware of the user's context in order to adapt to the user's current task or situation, but that a hybrid combination of sensors and traditional methods is the most optimal way of completing a task. The authors did their research using a prototype device, a Cassiopeia, into which they integrated all the sensors, and performed formal and informal evaluations. They conclude that sensing techniques cannot offer a panacea solution for UI on mobile devices. They also mention some future use cases where these sensors could prove useful and provide a direction for future research in this field.
Camera Phone Based Motion Sensing: Interaction Techniques, Application and Performance Study.
Unlike the previous paper, which uses external sensors to become context aware, TinyMotion uses software to become aware of the user's context. It makes clever use of the camera to detect the user's context. It takes a series of images from the built-in camera, then a) converts them to grayscale, b) applies grid sampling to reduce computational complexity, c) detects motion using a block matching algorithm, and d) post-processes the result to deliver the output. This has many useful applications such as pointing in menus, playing games, using motion to move across maps, gesture recognition, etc. The authors perform formal and informal evaluations to assess the performance of TinyMotion in real-world scenarios. They also conclude that TinyMotion follows Fitts' law for pointing. The authors also discuss drawbacks, such as the drain on the battery when estimating motion. Overall the paper teaches me the correct way of conducting an experiment and helps me understand the need for extensive evaluation.

Mehrnoosh Raoufi 8:27:00 10/12/2017

Sensing Techniques For Mobile Interaction: In this paper, some sensing techniques are introduced. These techniques lead to a mobile device prototype that is context-aware, which the authors built and evaluated. The main sensors used in this device are a proximity range sensor, touch sensors, and a tilt sensor. They also developed an application on their prototype, voice memo recording. They argue that, by the grace of sensing techniques, this device causes minimum distraction and requires minimal visual attention from the user during voice recording. To avoid recording accidentally, they add some conditions under which recording takes place. First of all, the user should be holding the device, which can be recognized via the touch sensors on the back and sides of the device. The second condition is that the user should keep the device close to themselves, which can easily be detected by the proximity sensor. Finally, a specified orientation is needed for recording to begin; the tilt sensors are responsible for determining this one. Tilt sensors are also used for switching between portrait and landscape modes of the screen when the orientation of the device changes, a feature that is very common in today's devices. The other usage of tilt sensors is scrolling the screen. They also performed an informal experiment to evaluate their voice memo application, which they claim requires less cognitive and visual attention than traditional methods.-------------------------------------------------------------------------------------------------------------------------------- Camera Phone Based Motion Sensing: Interaction Techniques, Applications and Performance Study: In this paper, they present software called TinyMotion, a method to detect device movement by capturing images from the built-in camera of the mobile phone. They argue that their approach could become highly popular since almost all phones were tending to become camera phones. They were right.
Today, almost all of them have at least one camera. They proposed that the built-in camera can go beyond taking pictures and can contribute to interface design, and they take advantage of this idea in their TinyMotion software. They introduced the different parts of their algorithm. The first is color space conversion, in which they extract a grayscale image from the captured image via a bit shifting method. The second part is grid sampling; they used an 8x8 sampling window in their implementation. The third part is motion estimation; they detect motion through a technique similar to those used in video encoders. The last part is post processing: in this step, they accumulate the relative movements detected in the previous step to provide an absolute measurement from a starting position. They implemented TinyMotion based on this algorithm. Then they performed an informal user evaluation to test its usability. At last, they performed a formal evaluation with two goals: to quantify human performance using TinyMotion and to evaluate the scope of applications TinyMotion can be applied to. In the end, they suggest that the built-in camera can be used for input actions such as pointing, menu selection, and text input.
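The three-condition gating for voice memo recording described in the critique above is a small example of sensor fusion, and can be sketched as a boolean combination of the touch, proximity, and tilt readings. The function name, thresholds, and orientation range below are illustrative assumptions, not values from the paper.

```python
# Hypothetical sketch of the voice-memo sensor fusion: recording starts
# only when all three sensing conditions agree. Thresholds are made up
# for illustration.

def should_record(holding, proximity_cm, tilt_deg):
    """Fuse touch, proximity, and tilt readings into one decision.

    holding: True if the touch sensors on the back/sides detect a grip.
    proximity_cm: distance to the nearest object (ideally the user's head).
    tilt_deg: device tilt angle; recording requires a speaking posture.
    """
    held = holding                    # condition 1: device is in the hand
    close = proximity_cm < 8          # condition 2: device near the face
    oriented = 20 <= tilt_deg <= 70   # condition 3: speaking orientation
    return held and close and oriented

# A memo starts only when every cue agrees; any single sensor alone
# (e.g. proximity while the phone sits in a bag) is not enough.
print(should_record(True, 5, 45))   # all three cues satisfied
print(should_record(False, 5, 45))  # not held, so no recording
```

Requiring agreement across sensors is exactly what suppresses the accidental recordings the critique mentions: each individual sensor is ambiguous, but their conjunction rarely fires by accident.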

Ruochen Liu 8:57:07 10/12/2017

1. Sensing Techniques for Mobile Interaction: This paper, published in 2000, presents the use of sensing techniques to design and build mobile interaction. Basically, it is about several unique aspects of human-computer interaction with handheld devices in mobile settings. To build a prototype device, the authors built a set of sensors into a handheld device and designed new functions based on these sensors. For the evaluation of the sensing techniques, an informal experiment, initial usability testing results, and user reactions are presented. This paper was written about 17 years ago, far before the popularity of smartphones, but some ideas and designs in this paper still shine. That means the authors were pioneers of the technology and their research led the way for the development of mobile devices. It's interesting to find the similarity between this previous research and the designs of today. Both touch sensors and tilt sensors are common on today's smartphones, and range sensors are also applied in the new generation of iPhone in a different but similar way: the prototype device mentioned in the paper uses an IR receiver and IR emitter, while the iPhone X uses a depth-of-field camera to complete the human-machine interaction. I personally think there is still a promising future for sensing techniques applied in the mobile interaction area, since there are always new generations of sensors coming and the applications based on them are promising. Just like the example in the paper, using a sensor with a simple gravity-activated switch can improve the tilt sensor's performance. What I also learned from the paper is the authors' careful observation and analysis of users' gestures. Since human-machine interaction ultimately serves human beings, the human side is the key to good performance. 2. Camera Phone Based Motion Sensing: Interaction Techniques, Applications and Performance Study: This is a paper co-written by Professor Wang in 2006.
It presents a method to sense the hand movement of a mobile phone user by analyzing the image sequences captured by the built-in camera. The method, called TinyMotion, is a pure software approach that only needs the basic hardware of a mobile phone. Based on an informal evaluation and a formal user study, the conclusion can be made that the TinyMotion method is reliable under most background and illumination conditions and can be widely used in areas such as the design of input methods, the capture of handwriting, large-vocabulary multilingual handwriting recognition, and gaming interaction. The core of TinyMotion is the algorithm. It consists of four steps: color space conversion, grid sampling, motion estimation, and post processing. In the first step, the 24-bit RGB color image is converted to an 8-bit grayscale image. Then, grid sampling is used to reduce the computational complexity and memory bandwidth of the subsequent calculations. Next, motion estimation is applied to detect the movement of the camera; the results of this step are distance changes in the x and y directions. Finally, the post processing step accumulates these to provide an absolute measurement from a starting position. As can be seen from the examples, TinyMotion can be widely applied. Compared with accelerometers, a similar movement sensing technology with a totally different working mechanism, TinyMotion still has certain disadvantages. But as a new method, its future is promising.
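The 24-bit RGB to 8-bit grayscale conversion step mentioned in the critique above can be done with bit shifts instead of multiplications, which matters on the fixed-point phone CPUs of that era. The particular shift weights below (1/4 R + 1/2 G + 1/4 B, a common shift-only luma approximation) are an assumption for illustration; the paper does not spell out its exact shift pattern.

```python
# Illustrative shift-only grayscale conversion; the weights are assumed,
# not taken from the TinyMotion implementation.

def gray_shift(pixel):
    """Approximate luminance of a packed 0xRRGGBB pixel using only shifts."""
    r = (pixel >> 16) & 0xFF
    g = (pixel >> 8) & 0xFF
    b = pixel & 0xFF
    # 0.25*R + 0.5*G + 0.25*B, computed with shifts: no multiply needed.
    return (r >> 2) + (g >> 1) + (b >> 2)

print(gray_shift(0xFFFFFF))  # white -> 253 (shift truncation loses a little)
print(gray_shift(0x000000))  # black -> 0
```

Weighting green more heavily roughly matches the eye's sensitivity, while keeping every operation a shift or an add keeps the per-pixel cost tiny, which is the point of doing this step first in a real-time pipeline.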