
Modality and Depth in Touchless Smartphone Augmented Reality Interactions


Hello everyone, I'm Jing Qian, and today I'm presenting a talk on modality and depth in touchless smartphone augmented reality interactions. Touchless interactions, such as hand tracking or dwell, can be intuitive, entertaining, and immersive to use. These interactions are emerging in augmented and virtual reality environments as supplements to traditional joystick or touchscreen input.

In practice, interaction distance, or depth, can affect touchless interactions. For head-mounted displays (HMDs), the effects of both modality and distance are well studied, but this is not the case for smartphones. Why explore touchless interactions on smartphones? As AR platforms, smartphones have a much larger user base than HMDs, so exploring the effects of modality and distance could potentially benefit more users. Smartphone touchless interactions can be useful when a person is cooking, or when a person is wearing gloves in harsh or soiled environments that make touch interactions difficult or unsanitary to perform.

Furthermore, the screen-dwell interaction on smartphones also enables one-handed interaction, which can be useful in AR. In this paper we want to understand the effect of touchless modality and interaction distance on smartphones, and we explore three hypotheses: users would prefer dwell interactions over hand interactions; users would prefer close-range over distant interactions; and the overall cognitive load would be higher for close-range interactions. For the detailed rationale behind these hypotheses, please refer to our paper.

We built an AR application with iOS ARKit and implemented dwell and hand tracking. The dwell interaction uses ARKit's localization to determine where the phone is pointing, and the user can aim at different virtual objects with a dot-like crosshair. When the user holds the phone still, a progress wheel rotates to completion.
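As a rough illustration, the dwell-to-select behavior described here amounts to a per-frame timer that resets whenever the crosshair leaves the target or the phone moves. Below is a minimal, platform-agnostic Python sketch (the actual system is built on iOS ARKit; the one-second threshold and the stillness flag are illustrative assumptions, not values from the paper):

```python
import time

DWELL_SECONDS = 1.0  # illustrative threshold; the study's value may differ


class DwellSelector:
    """Accumulates dwell time while the crosshair stays on one target and
    the phone is held still; fires a selection when the wheel completes."""

    def __init__(self):
        self.target = None   # object currently under the crosshair
        self.started = None  # wall-clock time the current dwell began

    def progress(self):
        """Fraction of the progress wheel filled, 0.0 to 1.0."""
        if self.started is None:
            return 0.0
        return min((time.time() - self.started) / DWELL_SECONDS, 1.0)

    def update(self, hit_object, phone_is_still):
        """Call once per frame with the crosshair's raycast hit and a
        stillness flag; returns the selected object when dwell completes."""
        if hit_object is None or not phone_is_still or hit_object != self.target:
            # Crosshair left the target or the phone moved: restart the dwell.
            self.target = hit_object
            self.started = time.time() if hit_object is not None else None
            return None
        if self.progress() >= 1.0:
            selected, self.target, self.started = self.target, None, None
            return selected
        return None
```

The `progress()` value maps directly onto the on-screen progress wheel, so the user can always see how close the dwell is to completing.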

For hand tracking, we 3D-printed three color markers that fit the index finger and the thumb to support OpenCV-based color tracking, and we removed objects with similar colors from the lab to avoid tracking issues. Here are two example videos of our system: the left one demonstrates using hand tracking to press virtual pin buttons, where the sphere on the index finger indicates where the finger is; the right one demonstrates the dwell interaction pressing the same pin buttons. For hand tracking, a user moves their index finger over a virtual element and the element changes to a highlighted color; when the user pinches, the element is considered selected. If the target is a button, the selection automatically debounces; if the target is a slider, the user can move the object while it is selected to perform translation, and releasing the fingers deselects it.
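The marker tracking can be approximated with a standard OpenCV pipeline: threshold each camera frame in HSV space and take the centroid of the largest matching blob. A minimal sketch for a single marker (the HSV range below is a made-up example; the study's actual marker colors and calibration are not given in the talk):

```python
import cv2
import numpy as np

# Illustrative HSV range for one marker color (a saturated blue here);
# the real ranges would be calibrated to the 3D-printed markers.
LOWER = np.array([100, 120, 70])
UPPER = np.array([130, 255, 255])


def track_marker(frame_bgr):
    """Return the (x, y) pixel centroid of the largest blob matching the
    marker color, or None if the marker is not visible."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    m = cv2.moments(max(contours, key=cv2.contourArea))
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])
```

With one such tracker per marker, a pinch can be detected when the index and thumb centroids fall within a small distance threshold of each other.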

For screen dwell, once the cursor aims at a virtual object, a progress wheel starts to fill. Once filled, the object is considered selected. If the target is a button, it automatically debounces; if it is a slider, the object remains selected and moves to wherever the cursor points in real time. Once the smartphone becomes still again, the progress wheel refills, and once it is filled again, the object is deselected.

We designed a 2×2 study to understand the effects of interaction distance and modality. We tested interaction at close range, where objects are within reach, and at a distance, where interaction happens at about 24 meters. We recruited 15 participants with an average age of around 29 years, and each session lasted 42 minutes on average. Most participants had prior experience with AR applications, and when asked about familiarity with the concept of dwell interaction, all of them seemed to understand the concept right away; this was not the case for hand tracking.
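The talk later notes that task ordering had no significant effect, which suggests the order of the four conditions was counterbalanced across participants. A common way to do that is a balanced Latin square; here is a small illustrative sketch of the standard construction (not necessarily the paper's exact procedure):

```python
from itertools import product

# The four conditions of the 2x2 within-subjects design: modality x distance.
CONDITIONS = [f"{m}/{d}" for m, d in product(["dwell", "hand"], ["close", "distant"])]


def balanced_latin_square(n):
    """Row orders for a balanced Latin square (n even): every condition
    appears once per position and precedes every other equally often."""
    rows = []
    for first in range(n):
        row, left, right = [first], first, first
        for step in range(1, n):
            if step % 2:  # alternate one step forward, one step back
                right = (right + 1) % n
                row.append(right)
            else:
                left = (left - 1) % n
                row.append(left)
        rows.append(row)
    return rows


# Participant p gets row p % 4, spreading the orders across 15 participants.
rows = balanced_latin_square(4)
for p in range(15):
    print(p + 1, [CONDITIONS[i] for i in rows[p % 4]])
```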

We designed our tasks to reflect real-life scenarios and put participants in an everyday mindset for using touchless interaction. The two tasks are a pin-code task and a TV task, shown in the two figures on the right. In the pin-code task, a user performs four selection subtasks to match a randomly generated password and unlock a virtual painting. In the TV task, the user performs two selection subtasks and four scrubbing subtasks to seek specific moments in a video based on the experimenter's commands, as demonstrated in the right picture.

We collected all these data and ran them through a multilevel mixed-effects linear model. We found that distant interactions are significantly faster than close-range ones, and that screen dwell is significantly faster than hand tracking. No interaction effect was found between modality and distance, and we did not find any significant effect of task ordering.

So here we revisit our three hypotheses. We hypothesized that the overall cognitive load would be higher for close-range interactions, but we found that it was not. However, we found a significant correlation between overall completion time and task load.
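A multilevel (mixed-effects) model of this kind treats repeated measures as grouped within participants. A minimal sketch using statsmodels, with hypothetical column names (participant, modality, distance, time_s, tlx); the paper's exact model specification may differ:

```python
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import pearsonr

# Hypothetical long-format data: one row per completed subtask.
df = pd.read_csv("trials.csv")

# Completion time as a function of modality, distance, and their interaction,
# with a random intercept per participant (the "multilevel" part: repeated
# measures are nested within participants).
model = smf.mixedlm("time_s ~ modality * distance", df, groups=df["participant"])
print(model.fit().summary())

# The reported completion-time/task-load relationship is a simple correlation.
r, p = pearsonr(df["time_s"], df["tlx"])
print(f"r = {r:.2f}, p = {p:.3f}")
```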

The higher the completion time, the higher the overall cognitive load for participants. The figure on the right shows all the subcategories of NASA TLX, a commonly used rating scale for measuring cognitive load; we use the raw scores from 0-100 to measure effort, frustration, mental demand, performance, physical demand, and temporal demand. If we look at hand tracking only, we notice that distant hand tracking is in general less frustrating, less mentally demanding, less temporally demanding, and also has a lower performance rating. We do not see a similar effect for screen dwell, where distant and close-range interactions have similar cognitive-load readings across all the subcategories.

We also hypothesized that users would prefer dwell interactions over hand gestures, and found that participants strongly prefer screen dwell over hand tracking. One possibility is that familiarity plays a part here, as almost all participants were familiar with the concept of screen dwell but far fewer were familiar with hand tracking. We also noticed that hand tracking was not natural for some participants, as they exhibited a different pinch gesture from what the system could recognize. Despite the low preference, hand tracking was still considered by some participants to be cool, futuristic, natural, and fun to use.

We further hypothesized that users would prefer close-range over distant interactions, but the findings were reversed. This is a bit unexpected, since AR content is often situated and meant to be treated like real objects, and we interact with real objects at close range. Participants mentioned that they felt constrained in hand tracking because their hands need to stay in the smartphone's field of view the whole time, as shown in the upper-right figure, where the user's hand becomes invisible on the screen if moved too far to the right side of the smartphone. For some people, a natural forward pinch was used in hand tracking, but this results in the thumb occluding the index finger from the camera's perspective, causing tracking issues. We also noticed that depth-finding issues can lead to timeouts in close-range hand tracking.

In general, from a system design perspective, it is better to tell users at the beginning which gesture should be used; prior work has already shown that users have good learnability for gestures and stick to them once learned. Our findings further suggest that components requiring higher performance, such as UI controls, should be considered for distant interactions. Lastly, on the ergonomic side, both hand tracking and screen dwell have shortcomings: participants mentioned arm soreness in both the dwell and hand-tracking conditions. For hand tracking, a brief arm relaxation can reduce the soreness; for screen dwell, using both hands to hold the phone almost eliminated further soreness.

I would like to mention a few limitations. First, this study was framed in the context of purely touchless interaction, not on the differences or similarities between touch-based and touchless interaction. 3D translation and rotation were also not evaluated, but could be part of future work. The long-term benefits or issues of touchless interactions were likewise outside the scope of this paper.

So here is the takeaway: we ran a study to understand the effect of interaction distance on hand tracking and dwell in a smartphone context. We find that, overall, distant interactions are faster and more preferred than close-range interactions, though for hand tracking participants mentioned that close range can feel futuristic and immersive. When designing AR applications, designers can consider distant interactions for UI components or other interaction components where performance is a higher priority, and at a fixed interaction distance the two modalities could be interchangeable without extra cognitive cost. Thank you for your time, and that concludes my talk. If you have any questions, feel free to email me. Thank you, bye!

Source: YouTube
