
Pose Estimation for Fitness and Physical Therapy Application

Improved pose estimation and error detection by 64%.

Key Details

  • Challenge
    Develop a state-of-the-art pose estimation model that detects human posture in real time and performs error analysis and repetition counting
  • Solution
    Deep learning for accurate human pose estimation and data science algorithms for error detection
  • Technologies and tools
    PyTorch, CoreML, TFLite, OpenCV, Scikit-learn

Client

The client is a US-based startup specializing in human activity recognition and motion analysis.
They have a healthcare mobile app designed to capture and estimate human body movements during workouts and physical therapy. The app is equipped with tracking tools that help the users exercise the right way and meet their fitness goals.

Open-source solutions for mobile posture estimation didn’t work well for the client, so they sought InData Labs’ advice on real-time human pose estimation. It’s also worth mentioning that no common solutions for error detection and repetition counting existed at the time.

Challenge: develop a state-of-the-art pose estimation model to detect a human posture in a real-time scenario

To beat the competition in the mHealth app market, the client wanted to empower their app with AI (artificial intelligence). Their main difficulty was human body pose estimation; the primary task for our team was to improve its accuracy without affecting speed and usability.

The client wanted to scale up their fitness app with real-time error detection during workouts and physical therapy. This functionality can help the user avoid common workout mistakes and significantly reduce the risk of injury.

The InData Labs team took up the challenge of providing the client with robust pose estimation and error detection functionality from scratch.

Solution: deep learning for accurate human pose estimation and data science algorithms for error detection

Pose estimation is a computer vision technique that predicts and tracks not only the location of a person or object but also the positions of individual joints. Remarkable progress has been made in posture estimation so far, yet recognizing human activities in real-life settings remains an open problem. The task becomes even more challenging when approached not from standalone cameras in offline mode, but from smartphones in real time: real-time execution significantly raises the input data throughput and the computation required, while mobile devices remain limited in available computational resources.

Posture estimation can be classified along several axes: single-person or multi-person, 3D or 2D, real-time or offline. After analyzing the client’s needs, we decided that real-time 2D single-person pose estimation would be a good fit across a wide range of physical exercises.

To detect human joints in motion in real time, we applied deep learning approaches suited to complex computer vision problems. Our engineers developed a new neural network architecture that drew on a wide range of insights and ideas to achieve competitive joint-detection quality.

We started by combining all available open datasets for various types of human posture estimation, since large amounts of data are a crucial ingredient of deep neural networks that perform accurately and robustly in specific real-world use cases. In addition, we had to develop our own data augmentation methods to extend the dataset further.
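One common pose-data augmentation is a horizontal flip that also swaps left/right joint labels, since mirroring an image turns the left knee into the right knee. The sketch below is purely illustrative: the joint list and flip pairs are hypothetical, not the client’s actual skeleton definition.

```python
# Hypothetical joint order; the production skeleton used additional key points.
JOINTS = ["nose", "l_shoulder", "r_shoulder", "l_hip", "r_hip",
          "l_knee", "r_knee", "l_ankle", "r_ankle"]
# Index pairs that exchange roles under a horizontal flip.
FLIP_PAIRS = [(1, 2), (3, 4), (5, 6), (7, 8)]

def horizontal_flip(keypoints, image_width):
    """Mirror (x, y) key points across the vertical axis and swap
    left/right joints so labels stay anatomically correct."""
    flipped = [(image_width - 1 - x, y) for x, y in keypoints]
    for a, b in FLIP_PAIRS:
        flipped[a], flipped[b] = flipped[b], flipped[a]
    return flipped
```

The same pattern extends to rotations and crops, as long as every geometric transform is applied consistently to both the image and its key-point annotations.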

After preparing the datasets, we developed an advanced human skeleton model with additional key points that greatly extended our ability to analyze fitness and physical therapy exercises.

The next step was to design an efficient pose estimation neural architecture. The client was concerned about mobile app performance, so we focused on optimizing operational efficiency and reducing load time.

We used PyTorch to provide the client with a streamlined training pipeline, and CoreML models to deploy the deep learning and computer vision models we developed.

Another data science problem was to implement error detection during workouts and physical therapy. Error detection distinguishes correct from incorrect form when performing physical exercises: it tracks human joints and gives the user guidance on how to exercise the right way. For example, head pose estimation is essential when the user is doing a plank; the app estimates the position of the head to help avoid injuries while exercising.
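Form checks like the plank example typically reduce to geometry on the detected key points. The following is a hedged sketch, not the production algorithm: the `plank_head_ok` rule and its tolerance are illustrative assumptions about how collinearity of ear, shoulder, and hip could flag a dropped or craned head.

```python
import math

def joint_angle(a, b, c):
    """Angle at point b (in degrees) between segments b->a and b->c."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(v1[0], v1[1])
    n2 = math.hypot(v2[0], v2[1])
    cos = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos))

def plank_head_ok(ear, shoulder, hip, tolerance_deg=20.0):
    """Illustrative rule: in a good plank the ear, shoulder, and hip
    are roughly collinear, i.e. the shoulder angle is near 180 degrees."""
    return abs(180.0 - joint_angle(ear, shoulder, hip)) <= tolerance_deg
```

With per-frame key points available, each exercise-specific error becomes one such angle or alignment test evaluated continuously during the set.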

It’s worth mentioning that we had to develop custom algorithms for error detection and repetition counting so that the mobile app’s users get instant feedback through an audio interface with a virtual AI coach.

Result: improved pose estimation and error detection implementation

InData Labs successfully applied its computer vision and deep learning experience to deliver an AI-driven solution. Our expertise in mobile neural networks and deep learning helped us meet the human activity recognition challenge on a tight timeline, even though deep learning projects typically take months. Our real-time human pose estimation neural network and error detection algorithms were successfully integrated into the client’s workout and rehabilitation app.

We developed a solution robust to lighting changes, frame-rate drops, other people walking into the frame, occlusions, and similar real-world conditions.

Now the app goes the extra mile to make sure the user exercises the right way and gets the most out of their workout. Recommendations from the AI coach provide valuable insights and help users reach their fitness goals while avoiding needless injury.

Choose us as your Computer Vision Service Provider

Tags:
  • Digital Health
  • Computer Vision
