
"Well... how did I get here?" -Talking Heads

I grew up in Florida and earned a BSEE at the University of Florida. I have been fortunate to be part of several great design teams developing products from initial proposal to mass production, including personal computers, point-of-sale terminals, thin clients, and rugged tablets at Compaq, Apple, Hewlett-Packard, and Motion Computing. I enjoy working with both hardware and software, so after many years focused on system-level hardware design and supporting initial production ramps, I enrolled at Texas State University to brush up on my software skills. I completed a Master of Science in December 2016 and my Ph.D. in May 2023, both in Computer Science. In June of 2023 I moved to Colorado Springs. If you are aware of any cool small companies or supportive teaching/research opportunities in the area, please let me know!

Teaching

From Spring 2017 through Spring 2023, I was a lecturer in the Department of Computer Science and in the Ingram School of Engineering at Texas State, where I taught:

Research

My research as part of the Intelligent Multimodal Computing and Sensing (IMICS) lab is in the area of biosignals and machine learning. We use non-invasive physiological sensor data, such as acceleration (movement), heart rate (ECG, PPG), and electroencephalography (EEG), as input to machine learning algorithms that infer a person's current state. My focus has largely been on human activity recognition (HAR) and emotion recognition. Applications include lifestyle improvement (inactivity is a major health issue) and providing feedback for the evaluation of new therapies, such as desensitization through carefully controlled virtual reality sessions.
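As a concrete illustration of the preprocessing this kind of work involves, the sketch below segments a raw accelerometer stream into fixed-length, single-activity windows suitable for a classifier. The sampling rate, window length, and label handling are illustrative assumptions, not the IMICS lab's actual pipeline.

    # Minimal sliding-window segmentation sketch (illustrative assumptions:
    # 3-axis accelerometer at 32 Hz with per-sample activity labels).
    import numpy as np

    def make_windows(signal, labels, fs=32, win_sec=3, overlap=0.5):
        """Split an (n_samples, n_channels) signal into fixed-length labeled windows."""
        win = int(fs * win_sec)               # samples per window
        step = int(win * (1.0 - overlap))     # hop between window starts
        X, y = [], []
        for start in range(0, len(signal) - win + 1, step):
            seg_labels = labels[start:start + win]
            # Keep only windows covering a single activity to avoid mixed labels.
            if np.all(seg_labels == seg_labels[0]):
                X.append(signal[start:start + win])
                y.append(seg_labels[0])
        return np.stack(X), np.array(y)

    # Example with synthetic data: 60 s of 3-axis samples, two activities.
    accel = np.random.randn(60 * 32, 3)
    labels = np.repeat([0, 1], 30 * 32)
    X, y = make_windows(accel, labels)
    print(X.shape, y.shape)                   # (n_windows, 96, 3) (n_windows,)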

Publications

Hinkle, L. B. (2016). "Determination of emotional state through physiological measurement." Master's thesis, available online at the Texas State Digital Library.

Hinkle, L. B., Roudposhti, K. K., Metsis, V. (2019). "Physiological Measurement for Emotion Recognition in Virtual Reality," 2nd International Conference on Data Intelligence and Security (ICDIS 2019), pp. 136-143. DOI: 10.1109/ICDIS.2019.00028.

Hinkle, L. B., Metsis, V. (2021). "Model Evaluation Approaches for Human Activity Recognition from Time-Series Data." In the 19th International Conference on Artificial Intelligence in Medicine (AIME 2021), Lecture Notes in Computer Science, vol. 12721. Springer, Cham. DOI: 10.1007/978-3-030-77211-6_23 [source code]

Byers, M., Hinkle, L. B., Metsis, V. (2022). "Topological Data Analysis of Time-Series as an Input Embedding for Deep Learning Models." In the 18th International Conference on Artificial Intelligence Applications and Innovations (AIAI 2022), IFIP Advances in Information and Communication Technology, vol. 647. Springer, Cham. DOI: 10.1007/978-3-031-08337-2_33

Hinkle, L. B., Welker, M. W., Stevens, J. (2022). "Incorporating Robotics into Electrical Engineering Capstone," Capstone Design Conference 2022, Dallas, Texas. Available online at capstonedesigncommunity.org.

Hinkle, L. B., Metsis, V. (2022). "Individual Convolution of Ankle, Hip, and Wrist Data for Activities-of-Daily-Living Classification." 18th International Conference on Intelligent Environments (IE2022), Biarritz, France. [paper] [source code] [short video]

Hinkle, L. B., Atkinson, G., Metsis, V. (2022). "An End-to-end Methodology for Semi-Supervised HAR Data Collection, Labeling, and Classification Using a Wristband." Volume 31: Workshops at the 18th International Conference on Intelligent Environments (IE2022), WISHWell, Biarritz, France. DOI: 10.3233/AISE220066 [source code] [short video]

Hinkle, L. B., Atkinson, G., Metsis, V. (2022). "Fusion of Learned Representations for Multimodal Sensor Data Classification." Presented at the 19th International Conference on Artificial Intelligence Applications and Innovations (AIAI 2023). [paper] [source code]

Hinkle, L. B., Metsis, V. (2023). "An LLVM-Inspired Framework for Unified Processing of Multimodal Time-Series Data." To be presented at the 16th International Conference on PErvasive Technologies Related to Assistive Environments (PETRA 2023). [paper] [source code]

Hinkle, L. B., Pedro, T., Lynn, T., Atkinson, G., Metsis, V. (2023). "Assisted Labeling Visualizer (ALVI): A Semi-Automatic Labeling System for Time-Series Data." To be presented at the "Signal processing and machine learning to foster accessibility in cultural environments" (SPACE) workshop, held in conjunction with ICASSP 2023. [paper] [source code]

Datasets

TWristAR is a small three-subject human activity recognition dataset recorded using an Empatica E4 wristband. You can run a demo 1D-CNN in Colab (a sketch of a comparable model follows the dataset descriptions below).

Unlabeled E4 Wristband (UE4W) contains over 250 hours of unlabeled Empatica E4 wristband data for self-supervised experiments.

Processed PSG-Audio is a processed version of the PSG-Audio dataset from the scidb repository. It is still a work in progress and is hosted on the sandbox for now.
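As referenced in the TWristAR description above, the demo model is a 1D-CNN; the sketch below shows a comparably small Keras network for windowed wristband accelerometer data. The window length, channel count, and number of classes are placeholders and are not taken from the actual Colab notebook.

    # Small 1D-CNN sketch for windowed accelerometer data.
    # Shapes are placeholders (96-sample windows, 3 axes, 6 activities),
    # not the settings used in the TWristAR demo notebook.
    import tensorflow as tf
    from tensorflow.keras import layers

    def build_1d_cnn(window_len=96, n_channels=3, n_classes=6):
        model = tf.keras.Sequential([
            layers.Input(shape=(window_len, n_channels)),
            layers.Conv1D(32, kernel_size=5, activation="relu"),
            layers.MaxPooling1D(pool_size=2),
            layers.Conv1D(64, kernel_size=5, activation="relu"),
            layers.GlobalAveragePooling1D(),
            layers.Dense(64, activation="relu"),
            layers.Dropout(0.5),
            layers.Dense(n_classes, activation="softmax"),
        ])
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        return model

    model = build_1d_cnn()
    model.summary()
    # Typical usage: model.fit(X_train, y_train, validation_data=(X_val, y_val))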

Links to Jupyter notebooks

These links will open the notebooks in a Google Colab window.
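As a point of reference, Colab opens a notebook hosted on GitHub directly from a URL of the following form (the repository path shown is only a placeholder, not one of the actual notebook locations):

    https://colab.research.google.com/github/<user>/<repository>/blob/<branch>/<notebook>.ipynb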