Mobile Interfaces - 1

From CS2610 Fall 2016

Slides


In Class demonstration schedule

  • Steven Faurie and Alireza Samadian Zakaria
  • Keren Ye, Haoran Zhang and Xiaozhong Zhang
  • Zhenjiang Fan and Nannan Wen
  • Tazin Afrin and Zuha Agha
  • Anuradha Kulkarni and Debarun Das


Readings

Optional

Reading Critiques

Haoran Zhang 20:28:56 9/28/2016

Sensing Techniques for Mobile Interaction: In this paper, the authors discuss sensing techniques for mobile interaction design. For years, industry has wanted to add more sensors to mobile phones: proximity range sensors, touch-sensitive surfaces, tilt sensors, accelerometers, GPS, and so on, all to improve the phone's functionality. Nowadays a mobile phone is not just a phone anymore; it has become the center of people's lives. This is why the authors set out to explore sensing techniques, even as early as 2000. They introduce a prototype device, a palm-sized PC, equipped with a proximity range sensor, touch sensitivity, and a two-axis tilt sensor. They then used this device to run several experiments, such as voice memo detection, portrait/landscape display mode detection, and tilt scrolling combined with portrait/landscape modes. These functions seem ordinary on modern mobile devices, but back in 2000 they were pioneering ideas, and the authors explored the usability of these functions with the necessary sensors. Furthermore, they even explored power management, for example using these sensors to prevent undesired power-off or screen dimming caused by the default system inactivity time-outs. From their work, they conclude that sensor fusion will be essential to expand a device's capabilities and support additional interactive sensing techniques. I have to say this has come true today, perhaps even beyond their imagination. Camera Phone Based Motion Sensing: Interaction Techniques, Applications and Performance Study: In this paper, the authors introduce TinyMotion, a software approach for detecting a mobile phone user's hand movement in real time by analyzing image sequences captured by the built-in camera. This project is not just a simple HCI project; it also draws on several other technologies, such as emerging camera phone applications and computer vision in interactive systems.
The algorithm itself uses color space conversion, grid sampling, motion estimation, and post processing to estimate the user's movement of the phone. Beyond that, the technique can be used to implement mobile gesture input, vision-based tilt text input, or even games. Mobile phones have become very popular these days; almost everyone has a smartphone, so exploring sensing techniques for mobile interaction has become important as well, because a cell phone is no longer only a phone but a life center. Users want their phones to have more and more functionality, and without the help of different sensors we cannot even get close to this target. This paper offers another inspiration: we can combine sensing techniques with other technologies, such as computer vision and machine learning, to develop and design better ways of interacting with mobile devices.
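The four-step pipeline described above (color space conversion, grid sampling, motion estimation, post processing) can be sketched as follows. This is a minimal illustration of the general technique, not the authors' implementation; all function names, the sampling step, and the search radius are assumptions.

```python
# Hypothetical sketch of a TinyMotion-style pipeline. Not the paper's code:
# function names and parameter values here are illustrative assumptions.

def to_grayscale(frame):
    """Color space conversion: (r, g, b) triples -> grayscale intensities."""
    return [[(r + g + b) // 3 for (r, g, b) in row] for row in frame]

def grid_sample(gray, step=4):
    """Grid sampling: keep every `step`-th pixel to cut computation."""
    return [row[::step] for row in gray[::step]]

def estimate_motion(prev, curr, max_shift=2):
    """Motion estimation by block matching: find the (dx, dy) shift that
    minimizes the sum of absolute differences (SAD) between frames."""
    h, w = len(prev), len(prev[0])
    best, best_sad = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            sad = 0
            # Compare the overlapping interior region under this shift.
            for y in range(max_shift, h - max_shift):
                for x in range(max_shift, w - max_shift):
                    sad += abs(prev[y][x] - curr[y + dy][x + dx])
            if sad < best_sad:
                best_sad, best = sad, (dx, dy)
    return best
```

A post-processing step (not shown) would smooth the per-frame shifts before handing them to an application such as a cursor or a game.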

Tazin Afrin 1:11:52 10/2/2016

Critique of “Sensing Techniques for Mobile Interaction”: In this short article the authors describe different kinds of sensing techniques in mobile devices. When a user uses a mobile device, the interaction includes moving with the device, holding the phone in a particular way while talking, changing its orientation while using it, and so on. The new sensors introduced in this article are proximity range sensors, touch sensors, and tilt sensors, and they serve several new functionalities. In 2000, when this paper was published, the market for mobile devices as personal information managers was starting to grow, so adding sensors to the devices opened the opportunity for a more intimate user experience. In this paper the authors use natural and effective gestures to inform the design of the sensing techniques. For example, a touch sensor tells if the user is holding the device, a tilt sensor may tell if the user is reading something from the screen, and a proximity sensor may tell if the user is scrolling. Through their experiments they found that sensing techniques may be essential in the future for expanding the capability of a device. However, an optimal balance between sensors is yet to be found. The authors suggest, agreeing with other researchers, that a simple sensor may be more useful than a complicated one. They also mention that sensing techniques cannot solve all the problems of mobile interfaces, which we see in today's smartphones: touch sensors and tilt sensors may be widely used, but they do not solve every problem. Still, we cannot imagine mobile devices today without sensors; they are an essential part of handheld devices.
------------------------------------------------------------------------------------- Critique of “Camera Phone Based Motion Sensing: Interaction Techniques, Applications and Performance Study”: This paper presents a pure software implementation of an application called TinyMotion, which analyzes the image sequences captured by the built-in camera of a mobile phone and detects the user's hand movement in real time. After experimenting with different interactive applications and games, the authors found that the technique is quite reliable under different environments, that users can enter sentences faster, that Fitts' law applies, and that camera phones are in fact feasible for these purposes. Uses of the phone camera here include pointing, menu scrolling, and playing games; the authors chose these tasks because they frequently appear in accelerometer-based interaction studies. TinyMotion captures photos while the user is using the phone, calculates the camera shift from these photos, and performs actions on the screen based on the shift. The idea resembles an optical mouse: the main idea of the TinyMotion algorithm is to track the movement from frame to frame. The authors found from the experiment that Fitts' law parameters can be used to benchmark TinyMotion, and the user studies show that gaming is fun with the new technique. This study was done in 2006, so the phones were not as advanced as today's smartphones. I believe that if this study were repeated on new smartphones, the results would be much better.
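The Fitts' law benchmarking mentioned above can be sketched with the standard Shannon formulation. The coefficient values used in any example are placeholders, not the paper's fitted numbers:

```python
import math

def index_of_difficulty(distance, width):
    """Shannon formulation of Fitts' index of difficulty, in bits:
    ID = log2(D / W + 1), where D is the distance to the target
    and W is the target width."""
    return math.log2(distance / width + 1)

def predicted_movement_time(distance, width, a, b):
    """Fitts' law: MT = a + b * ID. The intercept a and slope b are
    fitted per input device from measured movement times; comparing
    fitted b values is one way to benchmark pointing devices."""
    return a + b * index_of_difficulty(distance, width)
```

A device with a smaller fitted slope b needs less extra time per bit of difficulty, which is how TinyMotion-style pointing could be compared against other input methods.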

Zhenjiang Fan 17:44:47 10/2/2016

Sensing Techniques for Mobile Interaction: The work first discusses why we should explore sensing technologies embedded in mobile devices. First of all, because mobile devices lack awareness of their surroundings, people have to adjust the configuration of their devices manually whenever they enter certain environments (contexts), and people move between different environments all day long. Moreover, the set of natural and effective gestures, the tokens that form the building blocks of the interaction design, may be very different for mobile devices than for desktop computers. Sensors embedded in mobile devices may enable new interaction modalities and new types of devices that can sense and adapt to the user's environment, but they also raise many unresolved research issues. Cassiopeia, the device the authors' team assembled, carries multiple sensors that collect information about its surroundings, which makes it context-sensitive: a touch sensor, a tilt sensor, and a proximity sensor. The touch sensor installed on the back surface and sides of the device lets the team detect whether the user is holding it; the tilt sensor detects the tilt of the device relative to the constant acceleration of gravity; and the proximity sensor signals the user's proximity. With these sensors, the team ran different tests and experiments and analyzed the collected data, noting several observations and problems and identifying several technique areas that can be improved. The use of sensors on mobile devices helps deliver devices that are as simple and pleasant to use as possible while still allowing direct control when necessary.
But, as the work mentions, sensing techniques are not a panacea for mobile UIs; basic UI design concepts still play the most important role. The sensors used in the work had not yet been widely marketed, but given that they are quite cheap, they may someday soon be embedded in real-world commercial devices. And I do think the work should have paid more attention to conventional sensors like vibrators, cameras, and audio recorders, which have more research value than the sensors used in the work. ----- Camera Phone Based Motion Sensing: Interaction Techniques, Applications and Performance Study: While the previous reading focused on unconventional sensors like the proximity sensor and tilt sensor, this work mainly uses the camera as its research tool. By analyzing images captured by the device camera, the software tool TinyMotion accomplishes a series of functions: detecting camera movement; target acquisition, acting as a very good interface for other applications; acting as a handwriting capture or handwriting recognition device; and enabling games to gain more context-sensitive features. The TinyMotion algorithm is the core component of TinyMotion and consists of four major steps: 1) color space conversion, 2) grid sampling, 3) motion estimation, and 4) post processing. The first step converts a large color image into a computation-friendly grayscale image; the second reduces the computational complexity; the third decodes the motion between images; and the final step is post processing. The author then discusses how the algorithm is implemented, and uses both informal and formal evaluations to test it.
There are many real-world applications that can exploit or benefit from the algorithm, such as text input, menu selection, and camera movement for gaming. Given that mobile devices have very limited resources, the algorithm achieves a lot, even though it may consume a lot of battery. The work gives us a good example of how sensors can help UI designers explore more useful functions for end users. Sensors will be embedded in a variety of mobile devices in the coming age, because context-sensitive devices can be seen as intelligent agents that may eventually make our mobile devices as capable as robots.

Keren Ye 23:25:49 10/2/2016

Sensing Techniques for Mobile Interaction: The paper describes sensing techniques motivated by unique aspects of human-computer interaction with handheld devices in mobile settings. It introduces and integrates a set of sensors into a handheld device and demonstrates several new functionalities engendered by the sensors. First, the authors introduce the hardware configuration: an experimental setup with sensors such as touch sensors, a tilt sensor, and a proximity sensor. For the software architecture, they implement a context information server that acts as a broker between the PIC/sensors and the applications. The paper then discusses interactive sensing techniques, describing in detail the usability testing, their implementation of voice memo detection, and an informal experiment, along with their observations and usability problems. The authors describe the strategies used in portrait/landscape display mode detection, and they also present the tilt scrolling method. Due to the limitations of both methods, the authors try to integrate tilt scrolling with the portrait/landscape modes. Furthermore, to deal with the power management problem, they provide several innovative approaches. In the conclusion, the authors state their strong belief that such research will ultimately help deliver new and exciting software that fulfills the promise of ubiquitous mobile devices. Camera Phone Based Motion Sensing: Interaction Techniques, Applications and Performance Study: This paper presents a software approach for detecting a mobile phone user's hand movement in real time by analyzing image sequences captured by the built-in camera. The general idea relates to emerging camera phone applications, new interaction techniques for mobile devices, computer vision in interactive systems, and the TinyMotion algorithm proposed by the authors themselves.
The general pipeline is stated in the next chapter: 1) color space conversion, 2) grid sampling, 3) motion estimation, and 4) post processing. The authors then introduce several innovative applications of this processing pipeline. For example, they propose using TinyMotion to implement games based on mobile gestures, and they describe how to apply TinyMotion to Vision TiltText. To evaluate TinyMotion, the authors first analyze informal evaluation results, then describe the experimental setting of the formal evaluation, in which they tested 1) target acquisition/pointing, 2) menu selection, 3) text input, and 4) more complex applications. In sum, the approach is quite innovative, though it suffers from a battery life problem, and the authors are optimistic about the future of TinyMotion.

Steven Faurie 10:27:39 10/3/2016

Sensing Techniques for Mobile Interaction: This paper described integrating sensing capabilities into a Palm Pilot-like device. The authors added several sensors to the device, including proximity sensors, motion sensors, and touch sensors. After making changes to the device, the authors conducted a user study with people whose jobs required them to use similar devices. Many of the users reported liking a lot of the features. I didn't think the results were that surprising: many of the features are very similar to features present on modern smartphones, especially the tilt sensing. I thought it was interesting how they described those types of features as context related. They are things I just take for granted; it is easy to forget that phones and other devices weren't always aware of the surrounding light level and able to adjust the screen accordingly, and the same goes for screen orientation. It was also interesting to see the paper describe the experimental process we had read about in earlier papers. It gave me some ideas for how we might approach our testing for the class project. Camera Phone Based Motion Sensing: Interaction Techniques, Applications and Performance Study: This paper described the development and testing of phone software that used the camera to detect phone movement, which was then translated into "cursor" movement on the phone screen. It was a way to add pointing to phones before touch screens were common, and a clever use of existing hardware already found in many of the phones at the time. The paper went on to describe the experiments conducted with the software: users were recruited and asked to perform several pointing, selection, and text input tasks, and their feedback and scores for the tasks were recorded. It was interesting to see how TinyMotion compared to the accelerometer-based pointing options people had been developing at the time.
Having quantitative results to compare input methods made the results much easier to understand. Much like the previous paper, I thought this one was a good example of how we should plan to conduct our experiments for the class project. It showed the type of data we should collect, both qualitative and quantitative, and how to analyze and present that data in a meaningful way.

Alireza Samadian Zakaria 16:55:03 10/3/2016

Sensing Techniques for Mobile Interaction is a paper about using sensors to make mobile devices context aware. Since the paper is old, we already have these features in our mobile devices and have used them many times. According to the paper, the set of gestures for mobile devices is different from that for desktop computers and can carry many meanings, so it is a missed opportunity not to use them. The authors constructed a prototype sensor-enriched mobile device and used it to conduct their experiments. They used a touch sensor on the back of the device to detect whether the user is holding it, a tilt sensor for finding rotation, and a proximity sensor consisting of an infrared transmitter and receiver pair. They also implemented a software context information server that translates the raw signals into logical form data. They used these hardware sensors for a couple of things. For example, to record a voice memo the user should hold the device, tilt it towards himself, and hold it in close proximity; this way, the device makes sure it is not recording accidentally. Furthermore, they showed that this voice memo method requires less cognitive and visual attention, by asking a participant to record voice this way while simultaneously tracking something with a mouse. One of the main problems with this method is that users cannot tell at first what these gestures do. In addition to the voice memo, the authors implemented two other sensor-based applications: device orientation detection and tilt scrolling. The first is well known nowadays, since we have it in all smartphones; the second is about scrolling a page by tilting the device, which is used in today's smartwatches but not in smartphones, probably because using the touch screen is an easier way to do that. ----- The second paper is about a software system called TinyMotion. It detects a user's hand movement in real time, and this detection is then used in several applications.
According to the paper, the procedure used to detect the movement has four major steps. The first step changes the color space from RGB to grayscale. The second step downsamples the input image to reduce the calculations. The third step is motion estimation, which uses a method similar to those used by video encoders. The fourth step is post-processing of these relative movements. They implemented four applications and three games using TinyMotion. The applications are Mobile Gestures, Vision TiltText, Image/Map Viewer, and Motion Menu. After providing some information about the first two applications, the authors discuss their evaluation. They evaluated the reliability of TinyMotion by benchmarking the detection rate of camera movements and conducting a usability test. Furthermore, they carried out a formal evaluation with 6 stages; in each session, the participants were asked to perform a task such as menu selection or text input. From the results, they conclude that TinyMotion can be used as either a pointing control sensor or a menu selection input device; however, it is not as accurate as MultiTap for text input. Moreover, there is some discussion of battery life, which is a major concern for this application; it is noted that the phone can run for roughly 8 hours with the vibration function turned off.
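The tilt-based orientation detection discussed above can be sketched as a simple threshold rule. This is a hypothetical illustration, not the paper's implementation; the threshold value, the dead-band behavior, and all names are assumptions:

```python
def display_orientation(tilt_lr, tilt_fb, current="portrait", threshold=30):
    """Pick portrait vs. landscape from two tilt angles (degrees).

    tilt_lr: left/right tilt; tilt_fb: forward/back tilt.
    A dead band (both angles below `threshold`) keeps the current
    mode, so the display does not flicker near the boundary.
    """
    if abs(tilt_lr) > abs(tilt_fb) and abs(tilt_lr) > threshold:
        return "landscape"
    if abs(tilt_fb) > abs(tilt_lr) and abs(tilt_fb) > threshold:
        return "portrait"
    return current  # inside the dead band: no change
```

The dead band is one simple answer to the accidental-switching problem the critiques mention: small hand tremors never cross the threshold, so the mode only flips on a deliberate rotation.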

Nannan Wen 22:42:49 10/3/2016

“Camera Phone Based Motion Sensing: Interaction Techniques, Applications and Performance Study” by Jingtao Wang, Shumin Zhai, and John Canny review: In this paper, the authors present a system called TinyMotion, software they developed for users to run on mobile phones. It uses the camera to test the feasibility of camera motion detection for menu selection and game playing, and of using the camera for handwriting recognition. Though the cameras were not as high quality as those on smartphones today, the paper shows that the motion detection and recognition software the authors developed performs accurately even under some extreme conditions. The paper shows how sensors in mobile devices can benefit users in the area of contextual awareness. “Sensing Techniques for Mobile Interaction” by Ken Hinckley, Jeff Pierce, Mike Sinclair, and Eric Horvitz review: In this paper, the authors present a device and software that utilize two-axis accelerometer readings to detect the orientation of the phone. The software then uses these readings to provide multiple services, such as recording memos when the device is held like a phone. I think the approaches they took to implement the features are helpful and insightful. This paper describes well how advances can be made to make mobile devices more context aware and smart.

Anuradha Kulkarni 23:47:22 10/3/2016

Sensing Techniques for Mobile Interaction: The paper describes the use of sensors in mobile devices and elucidates how they can benefit users through contextual awareness. It discusses how the implementations of different sensors can conflict with each other, the challenges this creates, and the solutions applied in the design process. The paper also discusses the special features added to the mobile device, such as detecting changes in orientation and position, voice memo detection, tilt scrolling, and powering up when in use. An important point noted in the paper is that rigorous testing must be done and proper care taken before launching a new feature on a mobile device. The main contribution of this paper is the idea of using a tilt sensor to enrich the interaction between the mobile device and the user. --------------------------------------------------------------------------------------------------------------------------------------------------------------------------- Camera Phone Based Motion Sensing: Interaction Techniques, Applications and Performance Study: This paper focuses on one type of sensor, the camera integrated into the mobile phone, and implements new software to evaluate new interactions between the human and the handheld. The paper introduces TinyMotion, software which uses the camera to detect camera movement by capturing images continuously. The camera can then be used to input text, capture handwriting, and even play games. In my opinion the ideas were creative and could be applied more widely and effectively.

Debarun Das 3:20:32 10/4/2016

“Sensing Techniques For Mobile Interaction”: This paper studies the use of different sensors in mobile devices. The types of sensors discussed and experimented with are touch sensors, tilt sensors, and proximity sensors. It further discusses the benefits and drawbacks of these sensors: one disadvantage of the tilt sensor is that it cannot respond to rotation about the axis parallel to gravity, and the proximity sensor consumes more power than needed, though the paper discusses a way around that. Further, the paper implements voice memo detection, powering up the device when in use, and tilt scrolling. Although most of these features are seen in today's smartphones, there is still scope for harnessing this domain by utilizing sensors to improve the usability of mobile devices. The paper proposes using sensors for more kinds of detection (like walking) as future work; as we know, modern smartphones already have this feature. This paper thus serves as an interesting read and provides essential background for the utilization of sensors in smartphones. ================================ “Camera Phone Based Motion Sensing: Interaction Techniques, Applications and Performance Study”: This paper discusses TinyMotion, software for detecting the motion of the user's hand using a built-in camera. It explains the TinyMotion algorithm, which consists of four major steps: color space conversion, grid sampling, motion estimation, and post processing. The paper then presents the experiments and evaluations (formal and informal) performed, showing quite accurate results even in extreme conditions. Furthermore, TinyMotion is used for handwriting capture, which makes ‘real-time handwriting recognition feasible’. As future work, the authors plan further applications of this technology. This paper presented a novel idea at the time and provides scope for further research in this area.

Zuha Agha 4:43:46 10/4/2016

Sensing Techniques for Mobile Interaction: The theme of this paper revolves around augmenting hand-held devices with sensors. With sensors becoming more affordable, there are plenty of options for integrating new sensing techniques into mobile devices, allowing the user more degrees of freedom to interact with the device. At the same time, a selective choice of sensors is important, as a conflict of sensory modalities can be counterproductive to the goal of improving human-computer interaction. In this paper, the authors present a prototype of a sensor-rich mobile device to analyze some of these sensing techniques, including voice memos and switching between landscape and portrait mode via tilt. The authors also highlight some of the issues with these techniques, such as the interference of gravitational force with tilt orientation and the challenge of false positives and false negatives. To analyze the usability of these sensory modalities, user studies are conducted. However, one of my objections is that some of these user studies are not that sound, such as the study with 7 users and the very high standard deviation shown in the results. Nevertheless, the authors acknowledge this reliability concern themselves in the paper. What I like is that the authors present a holistic picture, with the pros and cons of everything and without bias. Camera Phone Based Motion Sensing: This paper presents a system, TinyMotion, for detecting mobile users' hand movements in real time using images captured by the built-in camera. The underlying approach uses the shift or deviation between sequences of images and then applies mathematical operations to estimate position and acceleration. The method is primarily composed of a color space conversion phase, followed by grid sampling, motion estimation, and post processing. The system is evaluated with a user study of 17 participants.
Results show that the system was robust to variations in background and illumination conditions. Another interesting finding is that target acquisition through tilting was easier for some target orientations than for horizontal targets. The paper also talks openly about the constraints of TinyMotion, including battery life and the drift errors that arise when estimating acceleration from deviation. All in all, what I like most is that the work is available as open source software, which promotes the spirit of sharing research and transparency.
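The drift error mentioned above comes from accumulating per-frame estimates: any small systematic bias in the individual motion estimates adds up in the integrated position. A minimal illustration with hypothetical numbers (not from the paper):

```python
def integrate_motion(frame_shifts, bias=0.0):
    """Accumulate per-frame shift estimates into a 1-D position track.

    A constant per-frame `bias` models a systematic estimation error;
    it grows linearly in the integrated position, which is the drift
    the paper discusses. (Integrating again to get velocity/position
    from acceleration would make the error grow even faster.)
    """
    positions = []
    pos = 0.0
    for shift in frame_shifts:
        pos += shift + bias
        positions.append(pos)
    return positions
```

With a bias of 0.1 pixel per frame, a stationary camera appears to have moved about 10 pixels after only 100 frames, which is why drift correction (or periodic re-anchoring) matters for motion-sensing input.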