Spring 2024 CREATE Research Showcase

The Center for Research and Education on Accessible Technology and Experiences (CREATE) hosted its Research Showcase and Community Day 2024 on May 20th. These events brought industry and community partners — leaders working and living in the disability and accessibility space — together with faculty and student researchers, and were co-sponsored by HuskyADAPT. CREATE’s mission is to make technology accessible and to make the world accessible through technology.

Steele Lab members Alexandra (Sasha), Mia, Kate, and Alisha presented posters at the CREATE Research Showcase to highlight the design, development, and research of technology to support individuals with disabilities.

Mia, Kate, and Alisha presented a poster on “The Switch Kit: bridging the gap in therapeutic toys for children with medical complexities”. This research involved the creation and evaluation of a therapeutic toy named the “Switch Kit,” designed for young children with medical complexities. The kit allows family members and clinicians to customize switches tailored to the unique needs of each child.

Alexandra presented a poster on “Camera-Based Interface for Hand Function Assessment”. Currently, hand function assessment (e.g., measuring joint range of motion) in clinical settings is done with low-resolution tools and is often subjective and time-consuming. With a camera-based interface, we aim to speed up the collection of information about a patient’s hand function, improve repeatability and objectivity, and enhance how results are presented to both patients and clinicians.
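As one illustration of the kind of measurement such an interface can automate, the sketch below computes a single joint angle from three 3D hand landmarks using basic vector geometry. The landmark names and values are hypothetical; this is a generic example, not the poster’s actual software.

```python
# Illustrative only: estimate a finger joint angle from three 3D landmarks
# (e.g., MCP, PIP, and fingertip positions reported by a hand-tracking
# camera). This is a generic geometry sketch, not the poster's software.
import numpy as np

def joint_angle_deg(proximal: np.ndarray,
                    joint: np.ndarray,
                    distal: np.ndarray) -> float:
    """Angle at `joint` formed by the two adjacent landmarks, in degrees."""
    u = proximal - joint
    v = distal - joint
    cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0))))

# Hypothetical landmark positions (millimeters, camera frame).
mcp = np.array([0.0, 0.0, 0.0])
pip = np.array([0.0, 40.0, 0.0])
tip = np.array([0.0, 70.0, 25.0])
print(f"Angle at PIP joint: {joint_angle_deg(mcp, pip, tip):.1f} deg")
```

Tracking such angles over a recorded movement would give a repeatable, objective estimate of joint range of motion.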

MR Ebers, JP Williams, KM Steele, JN Kutz (2024) “Leveraging arbitrary mobile sensor trajectories with shallow recurrent decoder networks for full-state reconstruction”

Journal Article in IEEE Access

Sensing is a fundamental task for the monitoring, forecasting, and control of complex systems. In many applications, a limited number of sensors are available and must move with the dynamics. Currently, enabling sparse mobile sensing for state estimation requires optimal path planning or estimation techniques such as Kalman filters. However, we show that arbitrary mobile sensor trajectories can be used instead. By adapting the Shallow REcurrent Decoder (SHRED) network to mobile sensors, their time-history can be used to encode global information about the measured high-dimensional state space.

Summary figure of a shallow recurrent decoder network (SHRED) leveraging mobile sensors to reconstruct full state-space estimates from sparse dynamical trajectories. (Left) Sensor trajectory history encodes global information about the spatio-temporal dynamics of the sparsely measured system. In this work, we evaluate three challenging datasets: forced isotropic turbulence, global sea-surface temperature, and human biomechanics. (Middle) The mobile SHRED architecture can (i) embed the multiscale physics of a system into a compact, low-dimensional latent space and (ii) map the sparse mobile sensor measurements to a full state estimate. (Right) The high-dimensional, complex system states can be reconstructed, provided training data for the dynamical trajectory of the sensor(s) is available.

Aim: We leverage sparse mobile sensor trajectories for full-state estimation, agnostic to sensor path.

Methods: Using modern deep learning architectures, we show that a sequence-to-vector model, such as a long short-term memory (LSTM) network, paired with a shallow decoder network can map dynamic trajectory information to full state-space estimates.
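To make this sequence-to-vector mapping concrete, here is a minimal sketch of a SHRED-style model in PyTorch. It is an illustrative assumption, not the authors’ published implementation: the class name, layer sizes, and dimensions are hypothetical.

```python
# Minimal SHRED-style sketch (assumed architecture, not the paper's code).
# An LSTM encodes the time-history of a few mobile sensor measurements into
# a latent vector; a shallow decoder maps that vector to the full state.
import torch
import torch.nn as nn

class SHREDSketch(nn.Module):
    def __init__(self, num_sensors: int, latent_dim: int, state_dim: int):
        super().__init__()
        # Sequence-to-vector encoder: consumes the sensor time-history.
        self.lstm = nn.LSTM(input_size=num_sensors, hidden_size=latent_dim,
                            num_layers=2, batch_first=True)
        # Shallow decoder: latent vector -> full high-dimensional state.
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 350),
            nn.ReLU(),
            nn.Linear(350, state_dim),
        )

    def forward(self, sensor_history: torch.Tensor) -> torch.Tensor:
        # sensor_history: (batch, time_lags, num_sensors)
        _, (h_n, _) = self.lstm(sensor_history)
        return self.decoder(h_n[-1])  # decode from the final hidden state

# Example: 3 mobile sensors, 52 time lags, a 10,000-dimensional state.
model = SHREDSketch(num_sensors=3, latent_dim=64, state_dim=10_000)
x = torch.randn(8, 52, 3)   # a batch of sensor trajectory histories
full_state = model(x)       # (8, 10000) full-state reconstruction
```

Because the encoder sees only measurement sequences, the sensor path itself can be arbitrary, matching the paper’s central claim.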

Results: We demonstrate that by leveraging mobile sensor trajectories with shallow recurrent decoder networks, we can train the network to (i) accurately reconstruct the full state space from arbitrary dynamical sensor trajectories, (ii) reduce the variance of the reconstruction’s mean-squared error compared with immobile sensors, and (iii) generalize rapidly (parameterizing the dynamics) to data outside the training set. Moreover, the path of the sensor can be chosen arbitrarily, provided training data for the spatial trajectory of the sensor is available.

Interpretation: The time-history of mobile sensors can be used to encode global information of the measured high-dimensional state space.

Steele Lab Presents at NWBS 2024

Members of the Steele Lab traveled to Eugene, OR for the 2024 Northwest Biomechanics Symposium, hosted by the University of Oregon on May 17-18. The Northwest Biomechanics Symposium is a student-friendly conference that brings together research labs from across the Northwest, including Canada.

Charlotte Caskey, Mia Hoffman, Mackenzie Pitts, and Victoria (Tori) Landrum all gave podium presentations at the conference in Eugene. Kate Bokowy gave a poster presentation.

A special congratulations to Charlotte Caskey and Tori Landrum for receiving the Best Podium Honorable Mention Award in the PhD and Non-PhD categories, respectively.

In addition to sharing their research at the conference, the Steele Lab enjoyed connecting with fellow biomechanics researchers and exploring the surrounding Eugene area.

We are looking forward to NWBS 2025 in Vancouver, Canada!

AA Portnova-Fahreeva, M Yamagami, A Robert-Gonzalez, J Mankoff, H Feldner, KM Steele (2024) “Accuracy of Video-Based Hand Tracking for People With Upper-Body Disabilities”

Journal Article in IEEE Transactions on Neural Systems and Rehabilitation Engineering

Utilization of hand-tracking cameras, such as Leap, for hand rehabilitation and functional assessments is an innovative approach to providing affordable alternatives for people with disabilities. However, before deploying these commercially available tools, a thorough evaluation of their performance for disabled populations is necessary.

A graphic showing two hands demonstrating hand gestures and a Leap hand-tracking device, with the note “average accuracy for all hands 0.7-0.9”.

Aim: In this study, we provide an in-depth analysis of the accuracy of Leap’s hand-tracking feature for individuals both with and without upper-body disabilities during common dynamic tasks used in rehabilitation.

Methods: We compare Leap against motion capture using conventional techniques such as signal correlations, mean absolute errors, and digit segment length estimation. We also propose using dimensionality reduction techniques, such as Principal Component Analysis (PCA), to capture the complex, high-dimensional signal spaces of the hand.
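As a sketch of how such a PCA-based comparison could work, the snippet below (using NumPy and scikit-learn, with synthetic stand-in data) projects two synchronized recordings into a shared latent space and correlates the latent trajectories. It is a hedged illustration, not the paper’s actual pipeline.

```python
# Hypothetical PCA-based comparison of two hand-tracking systems.
# Synthetic stand-in data replaces real Leap / motion-capture recordings.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Stand-in data: 1000 time samples x 60 kinematic channels per system.
mocap = rng.standard_normal((1000, 60))
leap = mocap + 0.1 * rng.standard_normal((1000, 60))  # noisy proxy

# Fit PCA on the reference (motion capture) signals, then project both
# systems into the same low-dimensional latent space.
pca = PCA(n_components=5).fit(mocap)
latent_mocap = pca.transform(mocap)
latent_leap = pca.transform(leap)

# Correlate matching latent components; in the paper, high correlations
# between latent projections tracked high accuracy in the signal space.
for k in range(latent_mocap.shape[1]):
    r = np.corrcoef(latent_mocap[:, k], latent_leap[:, k])[0, 1]
    print(f"PC{k + 1}: r = {r:.3f}")
```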

Results: We found that Leap’s hand-tracking performance did not differ between individuals with and without disabilities, yielding average signal correlations between 0.7 and 0.9. Both low and high mean absolute errors (10-80 mm) were observed across participants. Overall, Leap tracked general hand posture well, with the largest errors associated with tracking of the index finger. Leap’s hand model was most inaccurate in the proximal digit segment, underestimating digit lengths with errors as high as 18 mm. Using PCA to quantify differences between the high-dimensional signal spaces of Leap and motion capture showed that high correlations between latent-space projections were associated with high accuracy in the original signal space.

Interpretation: These results point to the potential of low-dimensional representations of complex hand movements to support hand rehabilitation and assessment.

Megan Ebers Presents at 2024 WiDS Puget Sound Conference

On May 14, 2024, Steele Lab members Dr. Megan Ebers, Mackenzie Pitts, and Dr. Kat Steele attended the Women in Data Science (WiDS) Puget Sound conference hosted at Seattle University. WiDS aims to inspire and educate data scientists worldwide, regardless of gender, and to support women in the field.

Among the speakers at the conference, postdoctoral scholar Dr. Megan Ebers gave a presentation titled “Data Expansion to Improve Accuracy and Availability of Digital Biomarkers for Human Health and Performance.”

Dr. Megan Ebers standing in front of a projector screen, delivering her presentation.