AG-MAE

AG-MAE: Anatomically Guided Spatio-Temporal Masked Auto-Encoder for Online Hand Gesture Recognition

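At its core, the method trains a masked auto-encoder over spatio-temporal joint tokens: most joint positions in a short window are hidden, and the network learns to reconstruct them from the visible remainder, with the masking guided by hand anatomy. The sketch below illustrates only the token-masking step, with plain random masking standing in for the anatomical guidance; the window length, joint count, and mask ratio are placeholders, and the actual values live in this repository's code and configs.

 import numpy as np

 # One training window: T frames, J hand joints, 3D coordinates.
 T, J, C = 16, 20, 3
 window = np.random.randn(T, J, C).astype(np.float32)

 # Hide a large fraction of joint tokens; the model is trained to
 # reconstruct the hidden tokens from the visible ones. Plain random
 # masking is used here as a stand-in for the anatomical guidance.
 mask_ratio = 0.6
 tokens = T * J
 masked = np.random.permutation(tokens)[: int(mask_ratio * tokens)]
 mask = np.zeros(tokens, dtype=bool)
 mask[masked] = True
 mask = mask.reshape(T, J)

 visible_tokens = window[~mask]   # encoder input, shape (N_visible, 3)
 target_tokens = window[mask]     # reconstruction targets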

Updates

  • Trained models will be available soon!

Installation

  • Create and activate conda environment:
conda create -n agmae python=3.9
conda activate agmae
  • Install all dependencies (a quick environment check follows):
pip install -r requirements.txt
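To verify that the environment resolved correctly, a quick import check can help (this assumes PyTorch is among the pinned dependencies, which is typical for MAE-style models; adjust to the packages actually listed in requirements.txt):

 python -c "import torch; print(torch.__version__)"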

Dataset

Download the SHREC'21 dataset, then set the paths to the training and test data in the shrec21.yaml file in the configs/ folder:

 train_data_dir: './path/to/Train-set/'
 test_data_dir: './path/to/Test-set/'
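For reference, a config structured like the snippet above can be read with PyYAML (a sketch only; the repository's own config handling may differ):

 import yaml

 # Load the dataset paths from the SHREC'21 config.
 with open('configs/shrec21.yaml') as f:
     cfg = yaml.safe_load(f)
 print(cfg['train_data_dir'], cfg['test_data_dir'])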

Training

  • Online training (the online, sliding-window setting is sketched after this list):
bash ./scripts/train_online.sh
  • Offline training:
bash ./scripts/train_offline.sh
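In the online setting, recognition runs over an unsegmented skeleton stream rather than over pre-cut clips. Below is a minimal sketch of that sliding-window loop, with placeholder shapes and a hypothetical model.predict_window call; the real window length, stride, and classifier come from the scripts and configs above.

 import numpy as np

 # A continuous, unsegmented skeleton stream: 300 frames, 20 joints, xyz.
 stream = np.random.randn(300, 20, 3).astype(np.float32)

 window_size, stride = 16, 1   # placeholder values
 for start in range(0, len(stream) - window_size + 1, stride):
     window = stream[start:start + window_size]
     # label = model.predict_window(window)  # hypothetical classifier call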

Evaluation

  • Online evaluation:
bash ./scripts/eval.sh

The predicted results will be stored at ./experiments/shrec21/random_60/results/online_evaluation_results.txt. The evaluation scores can be computed with the official Matlab code provided by the SHREC'21 organizers.
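The official Matlab scripts remain the reference for reported scores, but for a quick local look at predictions a generic confusion-matrix tally also works (a sketch with placeholder labels; it assumes scikit-learn is installed, which may not be in requirements.txt):

 from sklearn.metrics import confusion_matrix

 labels = ["grab", "pinch", "tap"]            # placeholder gesture classes
 y_true = ["grab", "pinch", "tap", "pinch"]   # placeholder ground truth
 y_pred = ["grab", "grab", "tap", "pinch"]    # placeholder predictions
 print(confusion_matrix(y_true, y_pred, labels=labels))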

Visual results

  • Example of a reconstructed window (figure)

  • Example of predictions (figure)

  • SHREC'21 online confusion matrix (figure)

  • SHREC'21 offline confusion matrix (figure)

Acknowledgements

We thank the authors of MAE and STGCN for making their code available.