Mobile Interfaces - 1

From CS2610 Fall 2015

Slides

Readings

Optional

Reading Critiques

Matthew Barren 17:23:05 10/11/2015

Summary of Sensing Techniques for Mobile Interaction: Hinckley, Pierce, Sinclair, and Horvitz consider the power of using sensors to add low-power automaticity to mobile device functions. The functionalities they focus on are voice memo recording, scrolling, screen orientation, and powering the device on and off. The authors take advantage of technological advances in sensor power consumption and size to use sensing to provide greater automaticity of mobile device functions. Some of the functions they consider are commonly found in handheld devices today. For example, screen orientation detection is available in practically all phones to change how the display is used. An interesting point is made when the authors examine scrolling on a handheld device: when certain actions are occurring, it can be inferred that certain screen output no longer needs to be displayed. They make this discovery by observing the display requirements while a user is scrolling. In this instance, the user benefits most from being able to see more of the text being searched through. Additionally, since the focus of attention is on the text, peripheral elements such as the main bar can be moved off of the screen. Users can bring these elements back when they perform an action that shows they are no longer scrolling. These contextual shifts allow for better performance while completing varying tasks on a mobile device. Bringing automaticity to on and off functions could be extremely beneficial if the rate of false predictions is equivalent to or better than the manual controls currently in use. A false prediction for turning on could lead to the battery dying or to unintended functions being performed while the device is in a piece of luggage or a pocket. A false prediction for turning off could lead to the user losing important unsaved information or becoming frustrated with intermittently available functionality. If an on/off function does not behave as expected, consumers will lose confidence and trust in the device; this is likely why many devices do not utilize automatic on/off functions. Camera Phone Based Motion Sensing Summary: Wang, Zhai, and Canny examine the use of camera phones to sense movement of a mobile device. Their application, TinyMotion, is tested through pointing, menu manipulation, and game applications, and the authors then compare their approach to the use of accelerometers. The authors repurpose the camera on mobile phones to perform tasks that commonly rely on accelerometers, including pointing, menu scrolling, and games. TinyMotion takes many photos and then uses these photos to determine the camera shift, which results in actions occurring on the screen. The idea of using changes in successive images to measure motion is used in most contemporary optical mice. The ideas presented in this research are a new potential way of measuring motion by using a camera. Although the conclusions suggest that the accelerometer is a higher-performing sensor for movement, this does not negate the possibilities of using applications like TinyMotion for other solutions. In comparing TinyMotion and an accelerometer, it is clear that the accelerometer benefits from less computation and from battery life savings. Also, for many applications the accelerometer's blindness to the surrounding environment means that backgrounds that are difficult to differentiate, such as patterns, do not affect accelerometers. That being said, a camera has the benefit of being aware of the background.
In considering this, camera-based sensing is helpful when the application and user need to use the environment to aid performance. Camera-based movement sensing also has the potential to use high resolution to provide very accurate results. The camera may not be the best sensor for small-scale actions, but it can provide great benefits if the context requires situational awareness or environmental considerations.

Zihao Zhao 1:38:16 10/12/2015

The paper “Sensing Techniques for Mobile Interaction” is about sensing techniques motivated by unique aspects of human-computer interaction. The device also uses the touch and tilt sensors to prevent accidental power-off. The work presents interactive sensing techniques; giving computers sensory apparatus to perceive the world is not a new idea when creating smarter interfaces, but there are few examples of interactive sensing techniques. By implementing specific examples, the work explores some new points in the design space, uncovers many design and implementation issues, and reveals some preliminary user reactions as well as specific usability problems. I think this idea is good, since some existing devices provide functionality similar to this automatic power-up through other approaches. This approach provides similar functionality without requiring the user to open or close anything, and it allows easy one-handed operation. Exactly what can or cannot be accomplished with these techniques is not obvious. However, we must recognize that sensing techniques cannot offer a panacea for UI on mobile devices, and careful design and tasteful selection of features will always be required. Only some of the actions that mobile devices support lend themselves to solutions through sensing techniques; other tasks may be too complex or too ambiguous. ------------------------------------------------------------------------------------------------ The paper “Camera Phone Based Motion Sensing: Interaction Techniques, Applications and Performance Study” introduces a software approach, TinyMotion, for detecting a mobile phone user’s hand movement in real time using the built-in camera of the phone. The paper presents an informal evaluation followed by a formal evaluation with 17 participants. In this respect, the paper follows the principle of iterative design, which helps the designers understand the problem deeply. I am quite excited by the idea of taking motion detection as an input for text entry or for playing a mobile game. The TinyMotion algorithm consists of four steps, which I think are quite relevant to pattern recognition. The main idea of the algorithm is to compare the movement of gradients from frame to frame. Although the algorithm is not complicated, it is good enough for TinyMotion, as the experimental statistics show. This work was done before 2007, and cell phone hardware at that time was not as good as it is now. I think if the work were conducted now, the results would be quite different, because 1) we could add more RAM to the cell phone, giving more space for a character dictionary, and 2) the frame rate of the camera is much higher now, exceeding that of a mouse, so the information transmission rate would be much more than 0.9 bits/sec. Regarding the target acquisition part, I am quite confused about the purpose of the warm-up session: since, as the authors mention, they are not studying the learning curve, is the warm-up session necessary? Also, the participants entered only 8 sentences with each input method, while another study had participants enter at least 320 sentences. The authors give the reason that they do not need to measure the participants' learning curve, but does the comparison really work with only 8 sentences? In the evaluation results, Fitts' law was used as a benchmark for the applications built on TinyMotion.
The participants in the study gave much interesting and useful feedback, such as the suggestion to consider patterns for entering whole sentences rather than only entering text word by word.
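Since Fitts' law and the information transmission rate come up in the critique above, here is a minimal sketch of how the index of difficulty, predicted movement time, and throughput are usually computed. The distances, target width, and regression constants in the example are made up for illustration and are not values reported in the paper.

```python
import math

def index_of_difficulty(distance, width):
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1)

def predicted_movement_time(distance, width, a, b):
    """Fitts' law: MT = a + b * ID, with a (sec) and b (sec/bit) fit by regression."""
    return a + b * index_of_difficulty(distance, width)

def throughput(distance, width, movement_time):
    """Throughput (bits/sec) = ID / MT, a common performance summary."""
    return index_of_difficulty(distance, width) / movement_time

# Hypothetical example: a 60 mm movement to a 10 mm target,
# with illustrative regression constants a = 0.4 s, b = 0.9 s/bit.
ID = index_of_difficulty(60, 10)                  # ~2.81 bits
MT = predicted_movement_time(60, 10, 0.4, 0.9)
print(f"ID = {ID:.2f} bits, predicted MT = {MT:.2f} s, "
      f"throughput = {throughput(60, 10, MT):.2f} bits/s")
```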

Ameya Daphalapurkar 18:32:50 10/13/2015

The paper titled ‘Sensing techniques for mobile interaction’ talks about the human-computer interaction involved in handheld devices in mobile settings. The paper tries to bring new aspects to mobile devices by introducing changes in screen orientation when the mobile is held in a specific way and by calibrating auxiliary movements. Consideration of various factors for research with the use of sensors and their various implications is a major part. The authors also describe their sensors and electronics, which are installed on mobile devices like the Cassiopeia. The various sensors include touch sensors, tilt sensors, and proximity sensors. To explore the design space, much testing is done, including informal usability testing such as the detection of portrait or landscape mode. The algorithms also make use of scrolling to detect orientation quickly. Power management and its factors, such as the placement of the buttons and ease of use, are important as well. ************************ The paper titled ‘Camera Phone Based Motion Sensing: Interaction Techniques, Applications and Performance Study’ presents software named TinyMotion that uses the built-in camera to detect the hand movements of the user by analyzing sequences of captured images. The paper also presents the design, implementation, and various applications of TinyMotion. The software handles many movements, such as horizontal, vertical, rotational, and tilt movements. TinyMotion works whether the imaged surface is near or far, unlike earlier engineered camera phone applications that need a suitable flat surface. The paper also explains the algorithm, including its four major steps: color space conversion, in which 24-bit color is converted to 8-bit; grid sampling, through which reductions in computational complexity are achieved; motion estimation, which is similar to a step in MPEG encoding and uses a full-search block matching algorithm; and post-processing, which involves further adjustments to properly calibrate the start position. Motion sensing with the cell phone camera thus gave rise to the idea of mobile gestures. Vision TiltText involves configurations that assign keypad buttons to specific actions. The formal evaluation included tests with participants covering attributes such as target acquisition/pointing, menu selection, and text input. Thus the paper concludes that TinyMotion detects camera movement reliably, that pointing with it follows laws such as Fitts' law, and that it can serve as an input method for TiltText.
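Since the four TinyMotion steps are walked through above, here is a minimal sketch of the middle two, grid sampling plus full-search block matching on grayscale frames. It illustrates the general technique rather than the authors' actual implementation; the block size, search range, and SAD cost are illustrative choices.

```python
import numpy as np

def grid_sample(frame, step=4):
    """Reduce computation by keeping every `step`-th pixel in each direction."""
    return frame[::step, ::step].astype(np.int32)

def estimate_shift(prev, curr, max_shift=3):
    """Full-search block matching: try every (dx, dy) within +/-max_shift
    and keep the one with the smallest sum of absolute differences (SAD)."""
    h, w = prev.shape
    m = max_shift
    best, best_cost = (0, 0), None
    for dy in range(-m, m + 1):
        for dx in range(-m, m + 1):
            a = prev[m:h - m, m:w - m]
            b = curr[m + dy:h - m + dy, m + dx:w - m + dx]
            cost = np.abs(a - b).sum()
            if best_cost is None or cost < best_cost:
                best, best_cost = (dx, dy), cost
    return best  # estimated camera shift between the two frames, in sampled pixels

# Hypothetical frames: the second is the first shifted right by 2 pixels.
prev = np.random.randint(0, 256, (60, 80))
curr = np.roll(prev, 2, axis=1)
# step=1 here (no sampling) so the recovered shift stays in original pixels.
print(estimate_shift(grid_sample(prev, 1), grid_sample(curr, 1)))  # expected: (2, 0)
```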

Long Nguyen 19:08:13 10/13/2015

Read on "Sensing Techniques for Mobile Interaction": This paper is about how sensors in mobile devices can benefit users in the area of contextual awareness, where low-cost sensors can greatly increase the usability of mobile applications. The authors point out that implementations of different sensors can conflict with each other, discuss the challenges, and suggest new solutions in the design process. I believe one of the main contributions of this paper is the idea of using a tilt sensor to make interaction between mobile devices and users possible. Even though the implementation was still naive and simple, since the paper was published in 2000, it has pointed to new opportunities and innovations for mobile device applications that persist to this day.----------------------------Read on "Camera Phone Based Motion Sensing: Interaction Techniques, Applications and Performance Study": This paper and the one above have some commonalities in using sensors to explore new data sources/inputs and create new functionalities. This paper focuses on one particular sensor, the integrated camera on a mobile phone, and implements a pure software approach to evaluate new interaction possibilities between humans and handhelds. I think the ideas were quite creative at that time, and nowadays we can see new applications applying the same idea, like heart rate monitors. However, I think the camera is not a good sensor for control/interaction compared to the front face of today's smartphones/handhelds, because a smartphone is quite big and difficult to hold in one hand while using the index finger to control it through the camera at the same time. If in the future there are new prototypes/devices that are smaller and have a camera integrated in a good position, I think this idea can be applied more widely and effectively.

Manali Shimpi 22:36:49 10/13/2015

Sensing Techniques for Mobile Interaction: The paper integrates sensors into handheld devices and illustrates various new functionalities in these handheld devices using these sensors. The building blocks of interaction with mobile devices are very different from those of desktop computers. It is important to understand the context of interaction with mobile devices so that they can adapt the interaction to suit the current task. The authors describe the new techniques they implemented, like voice memo recording by speaking into the device, switching between portrait and landscape display modes, automatically powering up when the user picks up the device, and so on, and present usability testing results as well as users' reactions to these techniques. The sensors used are a touch sensor, a tilt sensor, and a proximity sensor. The touch sensor detects whether the user is holding the device. The tilt sensor detects the tilt of the device relative to the constant acceleration of gravity. The proximity sensor is calibrated by measuring the actual distance to an author's hand in a normally lit office environment. The authors also made changes to the software architecture by implementing a context information server that acts as a broker between the PIC/sensors and the applications. The authors describe usability issues in the context of each technique. In the end, they say that the use of sensors can help deliver devices that are as simple and pleasant to use as possible while still allowing direct control when necessary. ---Camera Phone Based Motion Sensing: Interaction Techniques, Applications and Performance Study: The paper presents an implementation of TinyMotion, software that detects a mobile phone user's hand movements in real time by analyzing image sequences captured by the built-in camera. The TinyMotion algorithm consists of four major steps: 1. color space conversion, 2. grid sampling, 3. motion estimation, and 4. post-processing. The formal evaluation of TinyMotion was conducted in three sessions: target acquisition/pointing, menu selection, and text input. Target acquisition/pointing was designed to quantify human performance in TinyMotion-based pointing tasks. Menu selection included three tasks: cursor-button-based selection, TinyMotion-based selection, and TinyMotion-based selection with tactile feedback. Text input compared the performance of the most popular mobile text entry method, MultiTap, with the authors' Vision TiltText input method. The paper showed that it is also possible to build higher-level interactive applications based on this sensing method.
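As a rough illustration of how a two-axis tilt (accelerometer) reading like the one described above can be mapped to a display orientation, here is a minimal sketch. The axis conventions, hysteresis threshold, and mode names are assumptions for illustration, not the paper's actual parameters.

```python
def orientation_from_tilt(ax, ay, current="portrait", threshold=0.5):
    """Map two-axis accelerometer readings (in g) to a display orientation.

    ax: gravity component along the short (left-right) axis of the device
    ay: gravity component along the long (top-bottom) axis
    A dead band of `threshold` g keeps the mode from flickering near 45 degrees.
    """
    if abs(ay) > abs(ax) + threshold:
        return "portrait" if ay > 0 else "portrait (flipped)"
    if abs(ax) > abs(ay) + threshold:
        return "landscape (left)" if ax > 0 else "landscape (right)"
    return current  # ambiguous reading: keep the current mode

print(orientation_from_tilt(0.1, 0.9))    # held upright -> portrait
print(orientation_from_tilt(0.95, 0.05))  # turned on its side -> landscape (left)
```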

Chi Zhang 22:55:48 10/13/2015

Critiques on “Sensing Techniques for Mobile Interaction” by Chi Zhang. This paper is about sensing techniques in mobile interaction. It describes how the usability of mobile apps can be greatly boosted by using low-cost sensors. We can easily see the implications of this when we think about the use of smartphones, which utilize all kinds of sensors and achieve great mobility and usability. Though great achievements have been made, there is still room for potential in the utilization of sensors. As more creative usages of sensors are discovered, mobile terminals will become more and more popular in the future. This is a very good paper, and it gives a comprehensive introduction to the important role played by sensors in mobile technologies. ------------------------------------------------ Critiques on “Camera Phone Based Motion Sensing: Interaction Techniques, Applications and Performance Study” by Chi Zhang. In this paper, the concept of TinyMotion is introduced. It is a technique for detecting movement and tilting of mobile devices. It analyzes the image sequences captured by the camera and detects movement from them. It can detect many kinds of movement: horizontal, vertical, tilting, and rotational. It is a very creative paper, as it tries to utilize the functionality of the camera to detect movement. My doubt about this technique is how popular it could be in real life, and how people could integrate it into everyday applications. I believe more persuasive evidence for this technique will be found in the future. All in all, this is a carefully designed research paper with a well-designed set of experiments.

Vineet Raghu 0:06:19 10/14/2015

Camera Phone Based Motion Sensing: Interaction Techniques, Applications, and Performance Study In this paper, the authors describe TinyMotion, a software-based method that uses a mobile phone’s built-in camera to detect the hand movement of the user. They proceed by describing the TinyMotion algorithm, which involves four steps to determine the motion of the user. The usability results described in the informal results section appear to be highly positive, as typical users found it very difficult to fool TinyMotion into making errors. In addition, the technology was very novel at the time, giving users plenty of ideas as to how to make this technology an integral part of the phone experience. The authors then conducted a formal user study to determine the effectiveness of several different input modalities enabled by the camera-based motion sensing. In particular, they found that target acquisition time through tilting was slightly higher for vertical targets than for horizontal ones. It is interesting to note that when testing text-based input, the authors barely allowed users to practice the TiltText system despite the fact that it was brand new to them. To then compare it with MultiTap, which had been in use by nearly all phone manufacturers, seems unfair; as predicted, MultiTap had much better accuracy, but who knows whether this would hold under constant practice? It was a nice addition that the authors gave constructive criticism of their own prototype, mentioning that battery life could be a concern for users with the backlight permanently fixed on. Also, at the time of publication this was a very novel technology that wasn’t employed much in the flip phone era, but with the advent of accelerometers in every phone, the usefulness of this type of software today has to be reconsidered. It could be used as an alternative type of input to enhance current tilting and touch screens, but it would not be a standalone application today. -------------------------------------------------------------------------------------------------------- Sensing Techniques for Mobile Interaction The authors here describe new sensing techniques for mobile interfaces, such as recording memos when holding a device like a phone, switching between screen orientations when the phone is turned on its side, and automatically powering up the device. In addition, the authors describe ways of evaluating these new techniques in a usability setting, and they further their goal of providing a context-sensitive interface. Next, the authors describe the various sensors that they utilize for this, and discuss both the benefits and drawbacks of these sensors. For example, they explain how the tilt sensor determines the orientation of the device based on the gravitational acceleration, but they also mention how this sensor is unable to determine whether the user is holding the device right side up or upside down. Finally, the authors describe some experiments that they ran on each type of sensor modality and discuss their observations from these. One criticism I have of these experiments concerns the voice memo detection results. They give a bar chart describing the error results from users trying to use voice memo detection, but they fail to include error bars on these charts, and they have very few samples, which makes it difficult to take these results as reliable.
The qualitative descriptions appear to be useful to improve the prototype in the future, but a more rigorous experiment would have been a nice addition as well. As with the previous paper, I appreciated how the authors gave an in depth look at both how the sensing techniques are beneficial for the users and the shortcomings of how the prototype could be improved in the future. Overall, I felt that this was a good paper to describe how advances can be made to make mobile devices more context aware, and this field is still being actively pursued and advanced today.

Priyanka Walke 0:40:04 10/14/2015

Reading critique on Camera Phone Based Motion Sensing: Interaction Techniques, Applications and Performance Study This paper introduces the use of mobile phone cameras to capture input for gesture recognition, tilting, and even movement. This is achieved through software named TinyMotion, written in C++ on the BREW platform for the Motorola V710. TinyMotion makes use of the personal cell phone, which is a widely used device today. It uses camera-based phones in an innovative way to propose new interaction modes, thereby giving rise to a new way of exploring existing devices. TinyMotion, along with the other paper on sensors for mobile interaction, investigates new as well as existing ways to make mobile interaction more innovative and convenient; this of course underlines the main concept of the paper. We could also make use of similar grounds to find innovative ways to interact with other devices. The key points of discussion about TinyMotion are the use of the camera for gesture recognition, character recognition for input, camera-based games, TiltText, and finally the Motion Menu. The paper did not conduct a formal user study; however, the informal user study statistics help in understanding the challenges faced. The paper also depicts every scenario that helps in understanding how the system works, along with the process for generating images, processing them, and finally storing them. These kinds of challenges arise when different environments are considered. Reading Critique on Sensing Technologies for Mobile Interaction This paper explores new ways of interacting with PIMs using sensors such as touch, tilt, and proximity embedded in those devices. The paper's main concept is to innovate and simplify the way we interact with mobile devices. It states that interaction should reduce the demand for visual focus; in fact, it should cut down on the need for the user's constant attention. This interaction should occur naturally, as it does with all other tasks in life. The paper mentions sensors like an accelerometer, infrared sensors for proximity, a two-axis tilt sensor to measure tilt for changing view modes, and finally touch for powering on. The authors state that such a combination of sensors working in congruence with each other is definitely the next big thing. Much of what this paper describes is practically implemented in today's world, one example being the proximity sensor that is widely used to lock touch screen phones while answering a call. Some sensors not mentioned in the paper, such as those that detect a person's walking, are present in the iPhone's Health app. As mentioned earlier, the main concern behind creating these devices is to make them more natural to use and adaptive to the user's environment. Good-quality user testing would definitely justify the use of this technology. Adequate feedback, along with experimental data from a varied audience, was also required. Making use of different scenarios would more appropriately explain the practical uses of the devices and also the situations where they are more useful than the older technology. An important aspect of these devices is that users need to get comfortable with them; they have many friendly features, one of them being the option to disable the sensing whenever required.

Shijia Liu 0:47:50 10/14/2015

Section 1: Sensing Techniques for Mobile Interaction.==== First, this paper presents the basic concepts and some examples of related work on mobile interaction. It then discusses the components and structure of the hardware sensors; the authors show a host of different types of sensors through instances and figures explaining how those sensors work and what roles and functions they perform in mobile technologies. The paper then shifts its main focus to the interactive sensing techniques, covering usability testing and the usability of the techniques through charts and examples. Furthermore, it shows the technique for recording a voice memo, called voice memo detection, and illustrates its constraints and performance as well. In addition, the paper presents the technique of portrait/landscape display mode detection, which also illustrates how mobile interaction works and makes life more convenient. Finally, we learn how power is managed in mobile devices. Above all, the authors discuss the current situation of mobile interfaces and future directions for their development.================================Section 2: Camera phone based motion sensing: interaction techniques, applications and performance study:===TinyMotion is standalone software intended to detect hand movement while a mobile device is being used. This is achieved by analyzing image differences with the aid of the built-in camera. The algorithm consists of four major steps: 1. color space conversion, 2. grid sampling, 3. motion estimation, and 4. post-processing. In this paper, four applications (Motion Menu, Vision TiltText, Image/Map Viewer and Mobile Gesture) and three games (Camera Tetris, Camera Snake and Camera Break-Out) are introduced to test the efficacy of TinyMotion. Through an informal evaluation and a formal study (the latter involving 17 participants from a university), five results can be shown as follows. 1. TinyMotion detects camera movement rather reliably under most background and illumination conditions. 2. TinyMotion-based pointing follows Fitts' law, and the parameters of Fitts' law can be used to characterize target acquisition tasks based on TinyMotion. 3. The TinyMotion-enabled input method is faster than MultiTap for users after several minutes of training. 4. TinyMotion makes it feasible to achieve large-vocabulary, multilingual, real-time handwriting recognition on mobile devices by using the built-in camera as a gesture capture device. 5. TinyMotion can potentially make games on current camera phones more enjoyable and immediately available. Although some problems exist, such as a relatively high error rate in TinyMotion-based selection and Vision TiltText, an inconsistent experience across the current games, and sensitivity issues, TinyMotion is a promising standalone software project that can support higher-level interactive applications as phone hardware performance advances.
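As a rough sketch of how a Motion Menu style application, mentioned above, could sit on top of a motion estimate like TinyMotion's, the fragment below accumulates vertical motion and moves a menu highlight once a step threshold is crossed. The threshold, step size, and menu items are invented for illustration and are not values used by the authors.

```python
class MotionMenu:
    """Move a highlight through `items` using accumulated vertical camera motion."""

    def __init__(self, items, pixels_per_step=15):
        self.items = items
        self.index = 0
        self.accum = 0
        self.pixels_per_step = pixels_per_step  # motion needed for one menu step

    def on_motion(self, dy):
        """Feed one frame-to-frame vertical motion estimate (in pixels)."""
        self.accum += dy
        while self.accum >= self.pixels_per_step and self.index < len(self.items) - 1:
            self.accum -= self.pixels_per_step
            self.index += 1
        while self.accum <= -self.pixels_per_step and self.index > 0:
            self.accum += self.pixels_per_step
            self.index -= 1
        return self.items[self.index]

menu = MotionMenu(["Contacts", "Messages", "Camera", "Settings"])
for dy in [4, 6, 7, 5, -20]:          # hypothetical per-frame motion estimates
    print(menu.on_motion(dy))
```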

Samanvoy Panati 1:05:37 10/14/2015

Critique 1: Sensing Techniques for Mobile Interaction The authors in this paper added special features to the mobile device, like detecting changes in orientation and position, voice memo detection, tilt scrolling, and powering up when in use. Some of these ideas are currently implemented in different types of mobile devices. The other features are not yet implemented. The reasons for this might be ease-of-detection issues (the detection of a certain pattern may be difficult and may lead to one pattern being interpreted as another) or design problems (the user may not like the design that comes with these features). Adding certain features may be liked by some users and disliked by others. Rigorous testing must be done and proper care taken before launching a new feature on a mobile device. The authors' experiments show that using different types of sensors gives different features to the device, even ones not expected to be of such importance. This paper was published 15 years ago, and it gave many insights into the use of sensors with handheld devices. ------------------------------------------------------------------------------------------------------------------------------------------ Critique 2: Camera Phone Based Motion Sensing – Interaction Techniques, Applications and Performance Study This paper introduces TinyMotion, software which uses the camera to do amazing things. It can detect camera movement by capturing images continuously. The camera can be used to input text, capture handwriting, and even play games. The authors illustrate the experiments and show that they got very accurate results even in extreme conditions. The experiments were done with cameras dating 10 years back, and evidently there can be many interesting things to do with the cameras used in mobile devices at present. TinyMotion is best in the way that it doesn't make any changes to the underlying hardware and it uses less battery power. The research done in this paper gave many thoughts and ideas about plausible features that could be built using the camera of mobile handheld devices.

Darshan Balakrishna Shetty 1:16:53 10/14/2015

Sensing Techniques for Mobile Interaction : This paper sheds light on new ways of interacting with a mobile device. In the paper, the mobile device is enhanced by adding sensors, and several applications are implemented to take advantage of the sensor readings in order to improve user experience. The prototype phone built for this work features two accelerometers, an IR receiver/transmitter, and a couple of touch sensors. The authors succeed in parsing readings from those sensors in order to identify the context in which the device is operating. In order to identify ways of leveraging sensor readings, they had to explore the design space and identify concepts that had either been tried before or were completely novel. The first application developed was voice memo recording triggered by the position of the phone and its proximity to the user. This application allowed users to perform visually and cognitively demanding tasks while they were recording a message; feedback was provided to them through specific sounds produced by the phone. Another application developed was auto-detecting the phone's orientation; in this application, the display changes based on the orientation and position of the phone. Most of the users found this feature really helpful and preferred it over conventional ways of altering display orientation. Furthermore, another application presented in this paper is tilt-sensitive scrolling, which enables scrolling of documents through the positioning of the phone. Finally, a feature presented is power management, which, based on the phone's position, orientation, proximity to the user, user input, and duration of input, could power the phone's display on or off. --------------------------------------------------------------------------------------- Camera Phone based Motion Sensing: Interaction techniques, applications and performance study : This paper makes use of a common feature of mobile phones and tries to create a new way of taking input from users. TinyMotion innovates in terms of mobile device interfaces and user interaction because it leverages an embedded phone sensor to create input. The authors make a plausible observation, stating that cameras on mobile phones are common and can be used as a natural input interface. TinyMotion borrows ideas from computer vision, and through software engineering the authors succeeded in developing it for mobile devices in a lightweight fashion. In order to test TinyMotion's feasibility and success rate, the authors perform two rounds of tests (informal and formal) with different users. The informal testing involves simple benchmarks of motion detection in different environments and collecting initial impressions from random users. The formal testing examines TinyMotion's features with a larger number of test subjects. The users were given the opportunity to go through a training session before the evaluation began. Testing involved tasks of varying difficulty, ranging from simple tasks, such as target acquisition, to more complex tasks, such as playing games using TinyMotion. The evaluation results demonstrate the success of TinyMotion and its performance, which follows Fitts' law. Overall, the work presented in this paper shows a novel idea for making use of mobile phones' sensors and a nicely engineered interface for interacting with mobile devices.
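Since tilt-sensitive scrolling is mentioned in the critique above, here is a minimal sketch of a rate-control transfer function (tilt angle to scroll speed, with a dead band so small hand tremors do not scroll). The dead-band width, gain, and maximum rate are illustrative assumptions, not the paper's tuned values.

```python
def scroll_rate(tilt_deg, dead_band=5.0, gain=8.0, max_rate=400.0):
    """Map a tilt angle (degrees from the neutral holding posture) to a
    scroll rate in pixels per second. Angles inside the dead band do nothing;
    beyond it the rate grows with the excess angle, capped at max_rate."""
    excess = abs(tilt_deg) - dead_band
    if excess <= 0:
        return 0.0
    rate = min(gain * excess, max_rate)
    return rate if tilt_deg > 0 else -rate

for angle in [2, 10, 30, -45]:
    print(f"{angle:+d} deg -> {scroll_rate(angle):+.0f} px/s")
```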

Sudeepthi Manukonda 1:28:04 10/14/2015

Sensing Techniques for Mobile Interaction is an interesting paper written by Ken Hinckley, Jeff Pierce, Mike Sinclair, and Eric Horvitz. It examines human-computer interaction with handheld devices in mobile settings. Sensing changes in orientation and position are two of the important features. When it comes to the real world, the domain of functions increases further. For sensing, a wide range of sensors is required; in a handheld device, the sensors are combined to provide an integrated device. The paper starts by comparing the mobile device with Cleo. The mobile device is unaware of its environment, oblivious to any presence, selfish, and inconsiderate, whereas Cleo is aware of everything that is happening around her, detects presence, and is not selfish. A smart computer is defined as one that is aware, respectful, etc. The design of a smart computer lies in both hardware and software. Sensors for human interaction aim at meeting reality; for this to happen, the sensors should be able to sense more than just commands. Background sensing can enhance the mobile user interface to an extent. The mobile sensor prototype is introduced with a tilt sensor, a proximity range sensor, and a touch sensor. The tilt sensor measures tilt relative to gravity; its drawback is that it cannot sense rotation around the vertical axis. For voice memo detection, the device should be held close to the face, should be tilted like a receiver, and it must be held in the hand. Power management is another interesting feature this paper talks about that is included in the prototype. It has automatic power-on while being held in the hand, but it does not have automatic power-off; instead, the touch, tilt, and proximity sensors prevent it from switching off accidentally. The result is new user interfaces with better sensitivity to the user and the user's physical environment. There is still more scope to simplify and enhance the user interface, and to include new behaviours and services that users find compelling, useful, engaging, and respectful. —————— The second paper, “Camera Phone Based Motion Sensing: Interaction Techniques, Applications and Performance Study” by Jingtao Wang, Shumin Zhai, and John Canny, talks about something called TinyMotion. TinyMotion is a pure software approach for detecting a mobile phone user's hand movement in real time by analyzing image sequences captured by the built-in camera. The built-in camera can be anything, ranging from the smallest to the largest. There are five important points to note about TinyMotion. It can detect the movement of the camera with great reliability under most background and illumination conditions. It uses Fitts' law and its parameters to characterize the target acquisition tasks. It uses something called Vision TiltText, an input method, to enter sentences faster than MultiTap. It uses the camera phone as a handwriting capture device. Gaming with this device is more enjoyable and happens in real time. The algorithm behind TinyMotion includes color space conversion, grid sampling, motion estimation, and post-processing. That is the algorithm part; implementation is the next step, and it includes Mobile Gesture and Vision TiltText. Then two rounds of evaluations are done and the evaluation results are presented. The evaluation results deal with target acquisition and pointing. After all the tests are done, it is concluded that TinyMotion can detect camera movement under most background and illumination conditions.
It satisfied most of the proposed points, and TinyMotion is purely software.
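A minimal sketch of the kind of sensor fusion described above for voice memo detection (device held, tilted roughly like a handset, and close to the face, for a short dwell time) follows. The angle range, proximity cutoff, dwell length, and sample format are illustrative guesses, not the paper's parameters.

```python
def voice_memo_active(samples, min_dwell=3):
    """Decide whether to start recording from a short history of sensor samples.

    Each sample is a dict with:
      holding:  True if the touch sensors report the device is in hand
      tilt_deg: tilt of the device from horizontal (degrees)
      prox_cm:  distance reported by the proximity sensor (centimeters)
    Recording starts only if the last `min_dwell` samples all look like the
    'speaking into the device' posture, to avoid false triggers.
    """
    def looks_like_memo(s):
        return (s["holding"]
                and 25 <= s["tilt_deg"] <= 80      # tilted like a phone receiver
                and s["prox_cm"] <= 8)             # held close to the face
    recent = samples[-min_dwell:]
    return len(recent) == min_dwell and all(looks_like_memo(s) for s in recent)

# Hypothetical sensor history sampled a few times per second.
history = [
    {"holding": True, "tilt_deg": 10, "prox_cm": 30},
    {"holding": True, "tilt_deg": 40, "prox_cm": 6},
    {"holding": True, "tilt_deg": 45, "prox_cm": 5},
    {"holding": True, "tilt_deg": 50, "prox_cm": 4},
]
print(voice_memo_active(history))  # True: the last three samples match the posture
```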

Xinyue Huang 1:29:55 10/14/2015

Sensing techniques for mobile interaction The paper describes sensing techniques motivated by unique aspects of human-computer interaction with handheld devices in mobile settings. To explore some of the research issues and work towards the design goal of providing context-sensitive interfaces that are responsive to the user and the environment, the authors constructed a prototype sensor-enriched mobile device. They added a two-axis linear accelerometer (tilt sensor), capacitive touch sensors, and an infrared proximity range sensor. These sensors combine low power consumption and cost with the potential to capture natural, informative gestures. The paper introduces the hardware configuration and sensors, which include touch sensors, a tilt sensor, and a proximity sensor. The tilt sensor detects the tilt of the device relative to the constant acceleration of gravity; it also responds to linear accelerations, such as those resulting from shaking the device. The proximity sensor uses an infrared transmitter/receiver pair positioned at the top of the device. For the software architecture, they implemented a software context information server that acts as a broker between the PIC/sensors and the applications. The server continuously receives sensor data packets from the PIC, converts the raw data into logical form, and derives additional information. For voice memo detection, they allow the user to record a voice memo by simply holding the PIM like a cell phone or microphone and speaking into the device, a natural, implicit gesture that requires little cognitive overhead or direct visual attention. The paper also describes an informal experiment that includes several separate conditions and notes some usability problems. For portrait/landscape display mode detection, users of mobile devices can tilt or rotate their displays to look at them from any orientation. For tilt scrolling and portrait/landscape modes, the discussion covers clutching, screen real estate optimization, the transfer function, contrast compensation, and the integration of portrait/landscape modes. For power management, the power button's placement, size, and protrusion must balance the desire to have it easily accessible against the danger of accidentally hitting the power button or having it held down while carrying the device in a pocket. Camera phone based motion sensing: interaction techniques, application and performance study The paper presents TinyMotion, a pure software approach for detecting a mobile phone user's hand movement in real time by analyzing image sequences captured by the built-in camera. Mobile phones have become an indispensable part of our daily life. The compact form has the advantage of portability, but also imposes limitations on the interaction methods that can be used. The paper covers three aspects of related work: emerging camera phone applications, new interaction techniques for mobile devices, and computer vision in interactive systems. The paper next introduces the TinyMotion algorithm, which includes steps such as color space conversion, grid sampling, motion estimation, and post-processing. For the implementation, the paper uses several applications and games to test efficacy. The four applications are Motion Menu, Vision TiltText, Image/Map Viewer, and Mobile Gesture. The three games are Camera Tetris, Camera Snake, and Camera BreakOut. The authors describe two kinds of evaluation. They evaluated the reliability of TinyMotion by two methods.
The first is to benchmark the detection rate of camera movements under four typical conditions and in four directions. The second is to conduct an informal usability test by distributing camera phones installed with TinyMotion-enabled applications to users. For the formal evaluation, the paper established two goals. The first is to quantify human performance using TinyMotion as a basic input control sensor. The other is to evaluate the scope of applications that can be built on the TinyMotion sensor. The experiment contains six parts: overview, target acquisition/pointing, menu selection, text input, more complex applications, and collecting qualitative feedback. The paper gives evaluation results on three aspects: target acquisition/pointing, menu selection, and text input. The authors also describe some future work, which includes considering the use of a clutch that can engage and disengage motion sensing from screen action, and exploring the possibility of applying a marking menu approach using gesture angles rather than movement distance for menu selection.
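Since the context information server is described above as a broker that turns raw PIC/sensor packets into logical values and passes them to applications, here is a minimal sketch of that pattern. The packet fields, derived states, and callback interface are invented for illustration and do not reflect the authors' actual API.

```python
class ContextServer:
    """Toy broker: converts raw sensor packets into logical context values
    and notifies subscribed applications when a value changes."""

    def __init__(self):
        self.context = {}          # logical name -> current value
        self.subscribers = {}      # logical name -> list of callbacks

    def subscribe(self, name, callback):
        self.subscribers.setdefault(name, []).append(callback)

    def on_packet(self, packet):
        """packet: raw readings, e.g. {'touch': 1, 'tilt_x': 0.1, 'prox_raw': 220}"""
        derived = {
            "holding": packet["touch"] == 1,
            "tilt_left_right": round(packet["tilt_x"], 2),
            # pretend calibration: a larger raw IR reading means the hand is closer
            "proximity_close": packet["prox_raw"] > 200,
        }
        for name, value in derived.items():
            if self.context.get(name) != value:
                self.context[name] = value
                for cb in self.subscribers.get(name, []):
                    cb(value)

server = ContextServer()
server.subscribe("holding", lambda v: print("holding ->", v))
server.on_packet({"touch": 1, "tilt_x": 0.12, "prox_raw": 80})   # prints: holding -> True
server.on_packet({"touch": 1, "tilt_x": 0.12, "prox_raw": 250})  # no change in 'holding'
```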

Zinan Zhang 3:09:19 10/14/2015

1. For camera phone based motion sensing: interaction techniques, applications and performance study --------------- This paper mainly talks about interaction techniques and applications based on the phone’s camera; that is, taking full advantage of the phone’s camera to accomplish many other tasks. Among all the applications mentioned in the paper, the one called ‘Vision TiltText’ is interesting. The main idea of Vision TiltText is to type words with the help of the movement captured by the camera. If I want to input the letter ‘A’, I can just press button 2 while moving my phone to the left. As we all know, the keypad-based input method is what most people used. It is easy and convenient to use; however, it cannot always input exactly what people want. Because there are three letters on a single button, the phone has to provide several possible words for the user to select from. Then there is keyboard input on phones, which can provide all of the letters for the user to input. Unfortunately, there is a great deficiency in this kind of input: the keyboard is too big for the size of the phone’s screen, so people often touch a wrong letter that they do not want to input. It is not handy sometimes. But with the technique mentioned in the paper, the deficiency of the keypad input method is eliminated. People can easily input the correct letter with only one simple movement, or sometimes without moving at all. That will speed up input significantly. The one factor that may cause incorrect input is the lighting, because the motion is detected from the pictures captured by the phone’s camera; an environment that is too bright or too dark can cause incorrect capture. But according to the data gathered by the authors, this kind of function performs well under different kinds of lighting. It is robust.------------------------------------------------------------------------------------ 2. For sensing techniques for mobile interaction------- This paper also mainly focuses on some interaction applications based on the phone’s sensing. One application mentioned in the paper interests me a lot: voice memo detection. This function activates the voice memo and starts recording. I think it is very useful. For example, a reporter often needs to do interviews, maybe in the street or in the park. Some interviews need to start at once, so they do not have much time to turn on a recorder, and it is also not convenient to carry a recorder with them every day. Then this new phone comes in handy: the reporter just needs to take out the phone and lean it towards himself or the interviewee, and everything they need to record is captured. And they do not need to worry about the phone recording when it is placed in a pocket or bag, because requirements such as the user having to hold the device in order to start the voice memo ensure that the voice memo otherwise stays inactive.
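A minimal sketch of the key-plus-tilt idea described in the critique above (press a keypad button, and the tilt or motion direction picks which of its letters is meant) follows. The direction-to-letter assignment and thresholds are assumptions for illustration, not necessarily the mapping used in the paper.

```python
# Letters on a standard 12-key phone keypad.
KEYPAD = {
    "2": "abc", "3": "def", "4": "ghi", "5": "jkl",
    "6": "mno", "7": "pqrs", "8": "tuv", "9": "wxyz",
}

def tilt_text(key, dx, dy, threshold=5):
    """Pick a letter from the pressed key using the tilt/motion direction.

    dx, dy: estimated motion while the key is held (e.g. pixels of camera shift).
    Assumed mapping: left -> 1st letter, neutral -> 2nd, right -> 3rd,
    and a forward/down tilt selects the 4th letter on keys 7 and 9.
    """
    letters = KEYPAD[key]
    if dy > threshold and len(letters) == 4:
        return letters[3]
    if dx < -threshold:
        return letters[0]
    if dx > threshold:
        return letters[2]
    return letters[1]

# 'a' = press 2 while tilting left; 'b' = press 2 with no tilt; 's' = press 7 tilting down.
print(tilt_text("2", -12, 0), tilt_text("2", 0, 0), tilt_text("7", 2, 9))
```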

Jesse Davis 7:04:31 10/14/2015

Sensing Techniques for Mobile Interaction As the name of the paper states, this paper focuses on various sensing techniques for mobile interfaces. It goes over many of the techniques that we use in our phones today, such as touch sensors, tilt sensors, proximity sensors, etc. This paper actually explores the physical implementation of these features, as in: how do we implement it (hardware), how will the user activate it, and it then tests these methods in order to find improvements. Camera Phone Based Motion Sensing: Interaction Techniques, Applications and Performance Study This paper covers different techniques that utilize a camera phone to detect different types of motion to run various applications. The beginning of the paper discusses their technique for capturing gestures with the camera phone, and then it dives into some applications that use their technique, such as Mobile Gesture and Camera Snake. They then evaluate their techniques in various environments that varied in illumination and background, with positive results. Then, for the formal evaluations, they tested specific scenarios in their applications such as target acquisition and pointing, menu selection, text input, and some of the games mentioned earlier. To summarize, I would say the results they got overall were a mix between good and diverse (i.e., all the feedback was generally good, but some users had specific preferences that countered other users' preferences). Peephole Displays: Pen Interactions on Spatially Aware Handheld Computers This paper goes over some very interesting and unique user interface manipulation techniques with respect to pen interactions on spatially aware handheld computers (which I knew were a thing, but I was unaware of a lot of the methods/applications presented in this paper). The most interesting and noteworthy implementations in my opinion were: the obvious one-handed selection (quick), the two-handed selection (even quicker, potentially more accurate), and all of the 3-D implementations and applications. The figures that accompany the 3-D applications and implementations helped a lot to visualize what the authors were conveying to the reader, and I thought they were all really cool.

Mingda Zhang 7:54:36 10/14/2015

Sensing Techniques for Mobile Interaction As a pioneering work published in 2000, this paper demonstrated a few basic ideas for using sensing techniques to improve the mobile interaction experience, and proved their feasibility in a few applications. Multiple sensors, together with customized software, were used to detect changes in the environment; for example, proximity range sensors were used for distance detection, tilt sensors were used for angle detection, and accelerometers were used for orientation detection. Considering the limitations of the era, this paper was pretty creative and original, and most of its ideas have since been widely used in industry. Nowadays various inexpensive sensors are added to mobile devices in order to achieve a better user interaction experience at reduced cost. The idea behind these add-on sensors can be traced back to this paper. An interesting issue for me is privacy and security. Maybe in 2000 this was not a big problem, but now mobile devices are everywhere and people have started to care more about their own privacy. Technically, Google Now and Apple's Siri have provided voice-based personal assistant functionality, but they both require explicit instructions to avoid overly affecting the user's life. We have experienced so many privacy leaks with computers; since personal devices have an even closer intimacy with us, undoubtedly more damage would be caused if this happened with our phones. Another concern about this paper is the possibility of false positive signals, but as we can see, in today's mobile phones this problem has been resolved satisfactorily. It is a great lesson never to take current technology for granted, as it may have been a great breakthrough when first discovered. Camera Phone Based Motion Sensing: Interaction Techniques, Applications and Performance Study Different from the first paper, the authors of this paper proposed a purely software-based solution for motion detection. Their approach is based on camera-based motion sensing, which is somewhat similar to what a modern optical mouse does. The ideas are also like the popular concept of pattern recognition, and the results were promising. It is worth pointing out that cameras in 2007 were not as good as the ones we use nowadays on phones, and the CPUs of mobile phones were also pretty weak. I am curious about the power drain of such functionality. This paper is a complete research article, with algorithm development, prototype application tests, and a user experience study. This paper reminds us to use existing sensors to achieve non-trivial tasks.

Adriano Maron 7:56:09 10/14/2015

Sensing Techniques for Mobile Interaction: This paper describes new techniques for interacting with mobile devices through the integration of different sensors that capture movement, orientation, and more. The authors propose that mobile devices augmented with different sensors can identify many activities and provide more interaction possibilities. This work presents a usability test with a prototype phone enhanced with the following sensors: two accelerometers, an IR receiver/transmitter, and a couple of touch sensors. The features provided by the phone, such as memo recording, phone orientation detection, and tilt-sensitive scrolling, were well received by the users. Although very simple, those use cases demonstrate the relevance of contextual information when designing new interaction techniques. Considering the tendency to have many sensors in a single device, it is important to accurately make sense of the data being measured. False positives and incorrect gesture identification might cause frustration to the users. Finally, interactions need to feel natural and not intrusive, similarly to the auto-detected orientation of the display. ================================================= Camera Phone Based Motion Sensing: Interaction Techniques, Applications and Performance Study: This work proposes TinyMotion, a software-based technique that uses the built-in camera to identify gestures and serve as an alternative input channel for mobile devices. The motion sensing techniques are inspired by computer vision algorithms, providing a reliable starting point for image processing. TinyMotion combines the images captured by the camera with keystrokes to provide new input possibilities. For TiltText, pressing a key and tilting the phone will select one of the character options associated with the pressed key. Menu selection is another task enhanced by the camera-based motion sensing; in this case, the completion time was larger than with other techniques, which demonstrates that motion sensing is not applicable to every task. Accelerometers provide functionality similar to TinyMotion's. TinyMotion incurs some overheads in terms of battery usage and processing, which are not present with accelerometers. However, given its software-based nature, TinyMotion can be easily extended to provide a richer set of behaviors that could use the contents of the image as an extra source of information (e.g., facial expressions, light intensity, etc.).

Ankita Mohapatra 8:09:39 10/14/2015

Review of “Camera Phone Based Motion Sensing: Interaction Techniques, Applications and Performance Study” by Jingtao Wang, Shumin Zhai, and John Canny The paper presents a system named “TinyMotion”, software developed for mobile phones that utilizes the camera, in order to test the feasibility of using camera motion detection for menu selection or game playing, and of using the camera for handwriting recognition. Even though the camera was not as high quality as those currently on many smartphones, the paper showed that the camera motion detection and recognition software the authors developed performs accurately even in somewhat extreme conditions. While this paper shows great promise in extending the usefulness of a camera to other features, users will not consider a camera to be anything other than a recording device. In addition, users are unlikely to use a camera for functions that require movement, because it will either cause others to believe the camera is recording them or it will make the user look very strange while tilting the camera to make selections or play games. I recall that there was one Nintendo game that used the camera to provide augmented reality, where animals would fly towards the player and the player would have to point at an animal and shoot a laser to prevent it from colliding with the player. It looked very odd, and some people made comments about it, both to the player and behind his back, and eventually he stopped playing that game (or at least stopped playing it in public). While there is a feeling of novelty associated with using a camera to do other things, it feels like any products that come from augmenting the usefulness of a camera will be dead ends until people are willing to allow cameras to be constantly recording. Along the same vein, I was also surprised that this paper mentioned nothing about the privacy issues that may come about due to the usage of a camera, but it may not have been an issue back in 2006. In hindsight, it does not seem surprising that using a camera for tracking cursor movement works well. Optical mice use practically the same concept, using a low-quality camera to track movement. Using another low-quality camera for the same end should indicate that the results will be similar. However, I wish the paper had mentioned the sensitivity of the camera-to-cursor movement ratio, since it is not clear from the paper how quickly someone could acquire a target (e.g., one inch of movement mapping to ten pixels on the screen). If this information were given, it would be easier to get a sense of the magnitude of the tilt movements needed to move a cursor an approximate distance on the screen. Review of “Sensing Techniques for Mobile Interaction” by Ken Hinckley, Jeff Pierce, Mike Sinclair, and Eric Horvitz The authors present a device, and software for the device, that utilizes accelerometers in two dimensions to detect the orientation of the phone. The software then uses the accelerometer readings to perform multiple services, such as memo recording, portrait and landscape orientation detection, powering up the device, and scrolling. It is interesting to see that only one of these ideas is in widespread use today, namely the automatic detection of portrait or landscape orientation. As the authors comment about their features in the paper, it is possible that the biggest reason the other features are not seen in mobile smartphones today is the lack of easy discoverability.
It is possible that motions that are natural for one event (e.g. holding a phone for a call) may conflict with other functionalities that require the same gesture (e.g. the voice memo recording feature presented in the paper). It may be easier for users to have a one-to-one correspondence in the gesture-to-feature mapping, since a gesture would then, without confusion, lead to a single service. However, the important point may not be how the gestures were used, but rather the importance of using contextual data to help a device assist a user. Adding an accelerometer can provide many more features, four of which were presented in the paper. Notice that without the accelerometer, none of the suggested features can be used, since the device has no data from which to determine the orientation of the device for portrait or landscape mode, for example. While a desktop computer will not have any use for an accelerometer (unless a user is hoping to detect earthquakes?), accelerometers can enhance the utility and convenience of mobile devices. Even though a computer may not use certain sensors, there probably exist many other sensors that could be useful for making the desktop experience better. Overall, this paper is a good argument for the usage of context-aware sensors to improve mobile devices, and possibly other ubiquitous computing devices as well.
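On the point raised in the first review above about the missing sensitivity (control-display) ratio, here is a minimal sketch of how an estimated camera shift is typically turned into cursor movement through a gain factor. The gain value, screen size, and clamping are invented for illustration, since the paper does not report this ratio.

```python
def move_cursor(cursor, camera_shift, gain=2.5, screen=(176, 220)):
    """Turn an estimated camera shift into cursor motion.

    cursor:       current (x, y) position in screen pixels
    camera_shift: estimated (dx, dy) from the motion sensing, in sensor pixels
    gain:         control-display ratio -- screen pixels per sensor pixel
    """
    x = cursor[0] + gain * camera_shift[0]
    y = cursor[1] + gain * camera_shift[1]
    # Clamp to the (assumed) screen size of the handset.
    x = max(0, min(screen[0] - 1, x))
    y = max(0, min(screen[1] - 1, y))
    return (x, y)

pos = (88, 110)
for shift in [(4, 0), (4, 0), (0, -6)]:   # hypothetical per-frame estimates
    pos = move_cursor(pos, shift)
print(pos)  # (108.0, 95.0) with the assumed gain of 2.5
```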

Mahbaneh Eshaghzadeh Torbati 8:15:44 10/14/2015

<Sensing Techniques for Mobile Interaction> The authors present software that uses a two-axis accelerometer to detect the orientation of the phone. The software then uses the accelerometer readings to provide multiple services, such as memo recording, portrait/landscape orientation detection, powering up the device, and scrolling. It is interesting that only one of these ideas is in use today, namely the automatic detection of portrait or landscape orientation. As the authors themselves note, the biggest reason the other features have not appeared in today's smartphones may be their lack of easy discoverability. It is possible that motions that are natural for one activity (e.g. holding the phone for a call) conflict with other functionality triggered by the same gesture (e.g. the voice memo recording feature presented in the paper). It may be easier for users to have a one-to-one mapping of gesture to feature, since a gesture would then lead to a single service without confusion. However, the important point may not be how the gestures were used, but rather that contextual data can help a device assist the user. Adding an accelerometer enables many more features, four of which were presented in the paper. Notice that without the accelerometer, none of the suggested features would work, since the device would have no data from which to determine its orientation for portrait or landscape mode, for example. While a desktop computer has little use for an accelerometer (unless a user is hoping to detect earthquakes?), accelerometers can enhance the utility and convenience of mobile devices. Even though a desktop may not use these particular sensors, there probably exist many other sensors that could make the desktop experience better. Overall, this paper is a good argument for using context-aware sensors to improve mobile devices, and possibly other ubiquitous computing devices as well. <Camera Phone Based Motion Sensing: Interaction Techniques, Applications and Performance Study> by Jingtao Wang, Shumin Zhai, and John Canny. The paper presents a system named “TinyMotion,” software for mobile phones that uses the built-in camera to test the feasibility of camera-based motion detection for menu selection, game playing, and handwriting recognition. Even though the camera was not as high quality as those on current smartphones, the paper shows that the motion detection and recognition software the authors developed performs accurately even in somewhat extreme conditions. While this paper shows great promise for putting the camera to uses beyond photography, users will not consider a camera to be anything other than a recording device. In addition, users are unlikely to use a camera for functions that require movement, because doing so will either cause others to believe the camera is recording them or make the user look very strange while tilting the phone to make selections or play games. I recall someone playing a Nintendo game that used the camera for augmented reality, in which animals would fly toward the player, who had to point at each animal and shoot a laser to prevent it from colliding with him.
It looked very odd, some people commented on it (both to him and behind his back), and eventually he stopped playing that game, or at least stopped playing it in public. While there is a feeling of novelty in using a camera for other purposes, any products that come from augmenting the camera's usefulness feel like dead ends until people are willing to allow cameras to be constantly recording. Along the same vein, I was also surprised that the paper said nothing about the privacy issues the camera's use might raise, but perhaps that was not yet an issue in 2006. In hindsight, it is not surprising that using a camera to track cursor movement works well. Optical mice use practically the same concept, tracking movement with a low-quality camera, so using another low-quality camera for the same purpose should yield similar results. However, I wish the paper had reported the ratio of camera movement to cursor movement, since it is not clear how quickly someone could acquire a target (e.g. one inch of movement to ten pixels on the screen). With that information, it would be easier to get a sense of the magnitude of tilt needed to move the cursor a given distance on the screen.
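To show what such a sensitivity figure would pin down, here is a minimal sketch (not TinyMotion's code) of mapping a per-frame camera motion estimate to cursor movement through a fixed gain; the gain value and screen size are placeholder assumptions:

 # Minimal sketch (not TinyMotion's code): map an estimated per-frame
 # camera shift (in image pixels) to cursor movement (in screen pixels).
 # GAIN is exactly the camera-to-cursor ratio the review asks about;
 # the value used here is a made-up placeholder.
 GAIN = 3.0  # screen pixels of cursor travel per image pixel of camera shift

 def move_cursor(x, y, dx_image, dy_image, screen_w=176, screen_h=220):
     # Scale the motion estimate and clamp the cursor to the screen,
     # assuming a 176x220 display typical of 2006-era phones.
     x = min(max(x + GAIN * dx_image, 0), screen_w - 1)
     y = min(max(y + GAIN * dy_image, 0), screen_h - 1)
     return x, y

 print(move_cursor(88, 110, dx_image=2, dy_image=-1))  # -> (94.0, 107.0)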

Kent W. Nixon 8:35:39 10/14/2015

Sensing Techniques for Mobile Interaction: In this paper, the authors discuss the efficacy of using additional embedded sensors in a PIM device to power new interaction methods. They fit a modern PIM with an IR range detector, capacitive touch sensors on the sides and back, a capacitive touch sensor on the bezel of the screen, and a 2D accelerometer parallel to the screen. Using data from these sensors combined with one another, the authors implement new interaction methods, such as tilt-to-scroll, power-on-when-grabbed, and automatically beginning voice recording when the device is held up to the user's face. The majority of the user studies found these new interaction methods to be preferable to existing ones and to require less mental workload. This paper was very interesting to me, as embedded sensing on modern devices is now very common, but probably was not when this paper was published in 2000. Further, it was interesting to see the concept of sensor fusion applied this early in the history of mobile devices, as it is still a very common theme in modern research. This paper is very much related to my research, but differs in that minimal power performance metrics were reported.

Camera Phone Based Motion Sensing: Interaction Techniques, Applications and Performance Study: In this paper, the authors propose using the back-facing camera on a mobile device as a tilt sensor for various applications. This is accomplished by regularly sampling from the image sensor and performing inter-frame motion analysis similar to that used in modern video compression algorithms. Several test applications are developed that rely on information returned from the camera, such as a contact list, a T9 text entry method, and a handwriting recognition method. Several games are also created, including Tetris and Breakout. User response to this input method was mixed: in some cases users preferred the tilt method over existing ones (T9, handwriting), while in other cases the method was found to be cumbersome (contact-list scrolling). The authors relate the input method back to Fitts' law and find it to be significantly less efficient than mouse input, although this is attributed to the low sampling rate of the device camera. I was surprised to read that tilt sensing worked correctly regardless of the external environment, even when the device was pointed at a clear sky. The power analysis section of this paper was severely lacking, which makes it difficult to compare the power efficiency of this method to existing ones.
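As a rough illustration of the inter-frame motion analysis mentioned above, here is a minimal block-matching sketch (not the paper's implementation); the block size and search range are arbitrary choices:

 # Minimal sketch (not the paper's code): estimate a single (dx, dy)
 # shift between two grayscale frames by full-search block matching
 # with a sum-of-absolute-differences (SAD) criterion.
 import numpy as np

 def estimate_shift(prev, curr, block=16, search=4):
     h, w = prev.shape  # frames must be larger than block + 2 * search
     y0, x0 = (h - block) // 2, (w - block) // 2
     ref = prev[y0:y0 + block, x0:x0 + block].astype(np.int32)
     best_sad, best = None, (0, 0)
     for dy in range(-search, search + 1):
         for dx in range(-search, search + 1):
             cand = curr[y0 + dy:y0 + dy + block,
                         x0 + dx:x0 + dx + block].astype(np.int32)
             sad = np.abs(cand - ref).sum()
             if best_sad is None or sad < best_sad:
                 best_sad, best = sad, (dx, dy)
     return best  # (dx, dy) in pixels

 # Demo: shift a random frame by a known amount and recover the shift.
 rng = np.random.default_rng(0)
 prev = rng.integers(0, 256, size=(120, 160), dtype=np.uint8)
 curr = np.roll(prev, shift=(2, -3), axis=(0, 1))  # down 2 rows, left 3 columns
 print(estimate_shift(prev, curr))  # -> (-3, 2)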

Lei Zhao 9:44:00 10/14/2015

‘Sensing Techniques for Mobile Interaction’ equips a mobile device with a set of sensors and studies how their readings can be combined to infer the user's context. The figures in this reading show the different types of sensors and the data they produce. I also looked up more information about tilt sensors online: a tilt sensor typically measures tilting along two axes of a reference plane, whereas full motion sensing would use at least three axes and often additional sensors. One way to measure the tilt angle with respect to the earth's ground plane is to use an accelerometer; typical applications are found in industry and in game controllers. As a result, future interfaces for mobile devices may rely increasingly on contextual awareness via sensing. Although simple, cheap, and relatively dumb sensors may play an important role in the future of mobile interaction, careful design and tasteful selection of features will always be necessary. What impressed me most is the reminder that interactive sensing techniques also create new opportunities for poor design, so we should always seriously consider the consequences of our design choices.

‘Camera Phone Based Motion Sensing: Interaction Techniques, Applications and Performance Study’ proposes a method called TinyMotion that measures cell phone movement in real time by analyzing images captured by the built-in camera. The authors then present five significant characteristics of TinyMotion. What stands out in this paper is the careful empirical work: to learn in which environments and against which backgrounds TinyMotion works properly, the authors collected data from many typical environments, and after a large user study they analyzed and evaluated the results with respect to the TinyMotion algorithm. By comparing these different aspects, the authors arrive at TinyMotion's five main characteristics. They also show that a “clutch” that can engage and disengage motion sensing from on-screen action could be a good direction for future study. In my opinion, the process shown in this paper is a good example for my future research.
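To make the “clutch” idea concrete, here is a minimal sketch (not from the paper) in which a hypothetical dedicated button engages and disengages motion sensing from on-screen action:

 # Minimal sketch (not from the paper): a "clutch" that only lets motion
 # estimates drive the on-screen cursor while a (hypothetical) dedicated
 # button is held down.
 class MotionClutch:
     def __init__(self):
         self.engaged = False

     def set_button(self, pressed):
         # Holding the clutch button engages motion sensing.
         self.engaged = pressed

     def apply(self, cursor, dx, dy):
         # While disengaged, motion is ignored, so the user can reposition
         # the phone without moving the cursor.
         if not self.engaged:
             return cursor
         return (cursor[0] + dx, cursor[1] + dy)

 clutch = MotionClutch()
 print(clutch.apply((10, 10), 5, 0))  # disengaged -> (10, 10)
 clutch.set_button(True)
 print(clutch.apply((10, 10), 5, 0))  # engaged -> (15, 10)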