Momona Yamagami wins the College of Engineering Student Research Award. Congratulations Momona!

Momona Yamagami stands in front of a cherry blossom tree at the "Quad" on the University of Washington campus. She has short black hair and is wearing a navy blue sweater.

The College of Engineering Awards acknowledge the extraordinary efforts of the college’s teaching and research assistants, staff, and faculty members. Momona Yamagami was selected for the 2021 Student Research Award. Congratulations Momona!

Momona Yamagami is an innovative researcher who focuses on developing novel accessible technologies with translational impact. In her first year, she helped build an interdisciplinary research program that blended neuroengineering, human-computer interaction and rehabilitation at the Amplifying Motion and Performance (AMP) Lab to evaluate and mitigate symptoms of Parkinson’s disease using virtual reality. Dedicated to building accessible and inclusive technology, she is working to apply control theory and artificial intelligence to improve device accessibility for people with and without limited motion.

“Momona is a truly exceptional student with a demonstrated history of leadership in research and education. We cannot wait to see where Momona steers her career trajectory and research contributions.”

Microsoft has partnered with UW to CREATE!

Purple background with a white stick figure holding a light bulb.

During Microsoft’s annual Ability Summit, the company announced a new partnership with the University of Washington to establish the Center for Research and Education on Accessible Technology (CREATE) and kicked off the collaboration with an inaugural investment of $2.5 million. CREATE is an interdisciplinary team whose mission is to make technology, and the world, more accessible.

The CREATE leadership is drawn from six campus departments and three different colleges, including the Steele Lab’s own Heather Feldner and Katherine M. Steele. This fantastic news was featured in The Seattle Times and GeekWire.

Get excited and help us congratulate Heather, Kat, and all those involved and cheer them on to CREATE!

M Yamagami, KM Steele, SA Burden (2020) “Decoding Intent With Control Theory: Comparing Muscle Versus Manual Interface Performance”

Conference Paper in the ACM Conference on Human Factors in Computing Systems (CHI) 2020 Proceedings:

These results suggest that control theory modeling can provide a platform to successfully quantify device performance in the absence of errors arising from motor impairments.

Photos (top and bottom) of a user using a slider (top) and muscles (bottom) to control a cursor on a computer screen.
(Top image) Side view of the user. The user rests their elbow, pinches the slider, and moves it towards and away from their body to control the cursor.
(Bottom image) Side view of the user. The user is strapped to a rigid device, holding a bar with hands supinated towards the ceiling and forearms at a 90-degree angle from the upper arms. Electrodes are placed on the biceps and triceps and labelled. Arrows pointing up and down indicate that the user moves their arm up and down to control the cursor.

 

Background: Manual device interaction requires precise coordination, which may be difficult for users with motor impairments. Muscle interfaces provide alternative interaction methods that may enhance performance, but they have not yet been evaluated for simple (e.g., mouse tracking) and complex (e.g., driving) continuous tasks. Control theory enables us to probe continuous task performance by separating user input into intent and error correction, quantifying how motor impairments impact device interaction.
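To make that decomposition concrete, here is a minimal Python sketch of the frequency-domain idea: when the reference and disturbance are sums of sinusoids at interleaved frequencies, the user's input at reference frequencies reflects intent, while their input at disturbance frequencies reflects error correction. The sample rate, stimulus frequencies, and simulated user input below are illustrative assumptions, not the study's actual signals or analysis code.

```python
import numpy as np

fs = 60.0                        # sample rate in Hz (assumed)
T = 40.0                         # trial length in seconds (assumed)
t = np.arange(0.0, T, 1.0 / fs)

# Interleaved stimulus frequencies (Hz) -- illustrative values only.
ref_freqs = np.array([0.10, 0.25, 0.55])
dist_freqs = np.array([0.15, 0.35, 0.65])

rng = np.random.default_rng(0)
reference = sum(np.sin(2 * np.pi * f * t + rng.uniform(0, 2 * np.pi)) for f in ref_freqs)
disturbance = sum(np.sin(2 * np.pi * f * t + rng.uniform(0, 2 * np.pi)) for f in dist_freqs)

# Stand-in for a recorded user input (e.g., slider position or an EMG-derived command).
user_input = 0.8 * reference - 0.5 * disturbance + 0.05 * rng.standard_normal(t.size)

def coefficients_at(signal, freqs):
    """Magnitude of the Fourier coefficients of `signal` at the given stimulus frequencies."""
    spectrum = np.abs(np.fft.rfft(signal)) / signal.size
    fft_freqs = np.fft.rfftfreq(signal.size, 1.0 / fs)
    return np.array([spectrum[np.argmin(np.abs(fft_freqs - f))] for f in freqs])

# Response at reference frequencies ~ "intent"; at disturbance frequencies ~ "error correction".
intent_gain = coefficients_at(user_input, ref_freqs) / coefficients_at(reference, ref_freqs)
correction_gain = coefficients_at(user_input, dist_freqs) / coefficients_at(disturbance, dist_freqs)
print("gain at reference frequencies:  ", np.round(intent_gain, 2))
print("gain at disturbance frequencies:", np.round(correction_gain, 2))
```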

Aim: Propose and extend a control-theoretic experimental and analytical method to guide future development of accessible interfaces such as muscle interfaces.

Method: We compared the effectiveness of a manual versus a muscle interface for eleven users without and three users with motor impairments performing continuous tasks.
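As a rough illustration of how continuous task performance can be scored, the sketch below computes a time-domain root-mean-square (RMS) tracking error per trial and summarizes it for each interface. The array names and simulated data are assumptions for illustration only; the paper's actual metrics and data are more detailed.

```python
import numpy as np

def rms_tracking_error(cursor: np.ndarray, reference: np.ndarray) -> float:
    """Root-mean-square error between the cursor position and the reference trajectory."""
    return float(np.sqrt(np.mean((cursor - reference) ** 2)))

# Hypothetical per-trial recordings (trials x samples), one array per interface.
rng = np.random.default_rng(1)
reference = np.sin(np.linspace(0, 8 * np.pi, 2000))
manual_trials = reference + 0.25 * rng.standard_normal((5, reference.size))
muscle_trials = reference + 0.15 * rng.standard_normal((5, reference.size))

for name, trials in [("manual", manual_trials), ("muscle", muscle_trials)]:
    errors = [rms_tracking_error(trial, reference) for trial in trials]
    print(f"{name} interface: mean RMS tracking error = {np.mean(errors):.3f}")
```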

Results: Both user groups preferred and performed better with the muscle versus the manual interface for the complex continuous task.

Interpretation: Results suggest muscle interfaces and algorithms that can detect and augment user intent may be especially useful for future design of interfaces for continuous tasks.

 

Momona also gave a phenomenal talk on this paper last week in the University of Washington’s ‘DUB Shorts’ series (video posted below). Nice job Momona!

B Nguyen, N Baicoianu, D Howell, KM Peters, KM Steele (2020) “Accuracy and repeatability of smartphone sensors for measuring shank-to-vertical angle” Prosthetics & Orthotics International

Journal Article in Prosthetics & Orthotics International

Example of how the smartphone app was used for this research. The top images show a black smartphone attached with a running arm band to the side or front of the shank, the two positions tested in this research. The middle figure shows the placement of the reflective markers for 3D motion analysis used to evaluate the accuracy of the smartphone measurements. Markers were placed on the lateral epicondyle of the knee, lateral malleolus of the ankle, tibial tuberosity, and distal tibia. Blacklight markings were used to record the position of each marker while hiding it from the clinicians. The bottom panel shows screenshots from the app. The first screen is used to align the device and has arrows at the top and bottom that remind the clinician which anatomical landmarks should be used to align the device while displaying the shank-to-vertical angle in real time. The second screenshot shows an example of the calculated shank-to-vertical angle while someone was walking. The average is shown with a bold black line, all other trials are shown in blue, and excluded trials (e.g., when someone was stopping or turning) that deviated more than one standard deviation from the other trials are shown in red. Text below the graph provides summary measures, such as shank-to-vertical angle in mid-stance and cadence (steps/min). The results can be exported as a picture or sent via e-mail from the app.
A) Smartphone positioning on the front or side of the shank. B) Reflective markers on the tibial tuberosity (TT) – distal tibia (DT) and lateral epicondyle (LE) – lateral malleolus (LM) were used to compare the accuracy of the smartphone to traditional motion capture. UV markings were used to keep the placement of these markers constant while blinding clinicians. C) Sample screenshots of the mobile application, including the set-up screen and the results automatically produced after a walking trial.
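The trial screening described in the caption (averaging across walking trials and excluding trials that deviate by more than one standard deviation, such as stops or turns) can be sketched as follows. This is only an illustration of that idea, not the app's actual implementation; the function name, data layout, and exact threshold handling are assumptions.

```python
import numpy as np

def screen_trials(sva_trials: np.ndarray):
    """sva_trials: (n_trials, n_samples) SVA curves resampled to a common gait-cycle length."""
    mean_curve = sva_trials.mean(axis=0)
    # How far each trial sits from the across-trial mean, summarized as one number per trial.
    deviations = np.sqrt(((sva_trials - mean_curve) ** 2).mean(axis=1))
    # Keep trials close to the typical deviation (one interpretation of "deviated more than
    # one standard deviation from other trials").
    keep = deviations <= deviations.mean() + deviations.std()
    # Recompute the average using only the retained trials (e.g., dropping stops and turns).
    return sva_trials[keep].mean(axis=0), keep

# Hypothetical example: six trials of 101 samples each, one of which is an outlier.
rng = np.random.default_rng(2)
trials = 10 * np.sin(np.linspace(0, np.pi, 101)) + rng.standard_normal((6, 101))
trials[3] += 8.0   # simulate a trial where the participant stopped or turned
average_curve, keep = screen_trials(trials)
print("kept trials:", keep)
```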

Background

Assessments of human movement are clinically important. However, accurate measurements are often unavailable due to the need for expensive equipment or intensive processing. For orthotists and therapists, the shank-to-vertical angle (SVA) is one critical measure used to assess gait and guide prescriptions. Smartphone-based sensors may provide a widely available platform to expand access to quantitative assessments.

Objectives

Assess accuracy and repeatability of smartphone-based measurement of SVA compared to marker-based 3D motion analysis.

Method

Four licensed clinicians (two physical therapists and two orthotists) measured SVA during gait with a smartphone attached to the anterior or lateral shank surface of unimpaired adults. We compared SVA calculated from the smartphone’s inertial measurement unit to marker-based measurements. Each clinician completed three sessions/day on two days with each participant to assess repeatability.
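For intuition, a marker-based shank-to-vertical angle can be computed as the sagittal-plane angle between the shank segment (proximal to distal marker) and vertical. The sketch below shows one plausible version; the marker names, coordinate convention, and sign convention are assumptions and may differ from those used in the study.

```python
import numpy as np

def shank_to_vertical_angle(proximal: np.ndarray, distal: np.ndarray) -> float:
    """SVA in degrees from two 3D marker positions (x = walking direction, z = vertical)."""
    shank = proximal - distal                  # vector pointing up the shank
    # Project into the sagittal (x-z) plane and measure tilt from vertical;
    # positive when the shank is inclined forward (knee ahead of ankle).
    return float(np.degrees(np.arctan2(shank[0], shank[2])))

# Hypothetical mid-stance marker positions in metres: lateral epicondyle and lateral malleolus.
lateral_epicondyle = np.array([0.63, 0.10, 0.48])
lateral_malleolus = np.array([0.55, 0.10, 0.09])
print(f"SVA ~ {shank_to_vertical_angle(lateral_epicondyle, lateral_malleolus):.1f} degrees")
```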

Results

Average absolute differences in SVA measured with a smartphone versus marker-based 3D motion analysis during gait were 0.67 ± 0.25° and 4.89 ± 0.72° for the anterior and lateral smartphone positions, respectively. The inter- and intra-day repeatability of SVA was within 2° for both smartphone positions.

Conclusions

Smartphone sensors can be used to measure SVA with high accuracy and repeatability during unimpaired gait, providing a widely available tool for quantitative gait assessments.

Try it out!

The app for monitoring shank-to-vertical angle is available for you to download and use on either an Android or iOS smartphone. Please complete THIS SURVEY, which will then send you an e-mail with instructions for installation and use. This app is not an FDA-approved medical device and should be used appropriately.