publications
In the works...
2024
- Towards Explainability of Dimension Reduction Methods for Machine Learning. Tony Enrique Astuhuaman Davila and Tayo Obafemi-Ajayi. IEEE CIBCB, 2024.
Dimension reduction techniques visualize the outcomes of machine learning models on complex data. The objective is to transform high-dimensional input data to a lower-dimensional space (usually 2D or 3D) for better human comprehension. These techniques have parameters that strongly affect the resulting visualization. Though visual inspection of the projected dimensions can be appealing, interpretability of the classes in relation to the input features is usually lacking. This work presents an automated framework for transforming dimension reduction plots into a viable explanation space by embedding significant features onto the projected space. Our approach conducts a grid search of the parameter space of the reduction methods to determine the optimal parameters based on the silhouette score. It then applies an ensemble feature importance score to select the optimal subset of input features to overlay, together with group centroids, on the plots. The aim is to increase the interpretability and utility of these plots in explaining the structure of the groups represented by model outcomes. We demonstrate our approach by applying it to phenotype datasets of neurogenetic diseases. The framework is available on our GitHub page, providing a resource for researchers to explore and implement the methodology.
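The parameter search described above can be illustrated with a minimal sketch (not the paper's implementation): grid-search a reduction method's parameter space and keep the 2D projection with the best silhouette score. The choice of t-SNE, the perplexity grid, and the Iris dataset are illustrative assumptions only.

```python
# Hedged sketch, not the framework itself: pick the projection whose
# 2D embedding best separates the known groups (silhouette score).
from sklearn.datasets import load_iris
from sklearn.manifold import TSNE
from sklearn.metrics import silhouette_score

X, y = load_iris(return_X_y=True)

best = None
for perplexity in (5, 15, 30):                   # hypothetical parameter grid
    emb = TSNE(n_components=2, perplexity=perplexity,
               random_state=0).fit_transform(X)
    score = silhouette_score(emb, y)             # group separation in 2D
    if best is None or score > best[0]:
        best = (score, perplexity, emb)

score, perplexity, emb = best
print(f"best perplexity={perplexity}, silhouette={score:.2f}")
```

The same loop generalizes to any reduction method and parameter grid; the framework additionally overlays the top-ranked input features and group centroids on the winning plot.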
2023
- Real-Time Hand Gesture Recognition for Drone Control Using Deep Learning. Tony Enrique Astuhuaman Davila and Mohammed Y. Belkhouche. CNAS Spring Symposium, 2023.
In this research, we present an approach to human-drone interaction through real-time hand gesture recognition using computer vision and deep learning techniques. Our system employs a Neural Network (NN) model to classify hand gestures captured by a camera mounted on the drone. The system demonstrates real-time performance and recognizes gestures with 98% accuracy. Future work could explore integrating additional gestures for more complex drone maneuvers and improving robustness in diverse environmental conditions.
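The classification step can be sketched in miniature (not the paper's model or data): a small neural network maps per-frame hand-landmark feature vectors to gesture classes. The 21-landmark layout, the synthetic data, and the four gesture classes are assumptions for illustration.

```python
# Hedged sketch, not the paper's pipeline: train a small NN to classify
# hand-landmark feature vectors into gesture classes.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
# Synthetic stand-in: 21 hand landmarks x (x, y) = 42 features per frame,
# 4 hypothetical gesture classes made linearly separable for the demo.
X = rng.normal(size=(400, 42))
y = rng.integers(0, 4, size=400)
X[np.arange(400), y] += 3.0

clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500,
                    random_state=0).fit(X, y)
pred = clf.predict(X)
print(f"train accuracy: {(pred == y).mean():.2f}")
```

In a real deployment the feature vectors would come from a hand-tracking stage on the drone's camera feed, and inference would run per frame to meet the real-time requirement.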