
Alisa Kunapinun, D.Eng.

Florida Atlantic University, USA

Address: Florida Atlantic University, Harbor Branch Oceanographic Institute
Brief info

Current Status:
 ⦁ Senior Research Fellow at Florida Atlantic University, Harbor Branch Oceanographic Institute (FAU HBOI)
 ⦁ Adjunct Research Professor at the Asian Institute of Technology
 ⦁ PI/Co-PI and Researcher, Faculty of Radiology, Siriraj Hospital, Mahidol University
 ⦁ Webmaster of http://www.bartlab.org/

Education:
 ⦁ Post-Doctoral, Florida Atlantic University, FL, USA (Present)
 ⦁ D.Eng. (Mechatronics), Asian Institute of Technology, Thailand (2022)
 ⦁ Certificate (Machine Learning from Data to Decision), MIT, USA (2019)
 ⦁ Certificate (Applied Machine Learning), Columbia University, USA (2019)
 ⦁ M.Eng. (Mechatronics), Asian Institute of Technology, Thailand (2010)
 ⦁ B.Eng. (Computer Engineering), Mahidol University, Thailand (2008)

Research Interests:
Machine Learning, AI, Machine Vision, MRI/CT/US Imaging, Brain MRI Imaging, Stroke Imaging and Intervention, Thyroid US Imaging, Robotics, Industrial Robotics, System Automation, Deep Reinforcement Learning, Aquaculture Systems

Professional Timeline

2010
R&D Software Engineer

Cal-Comp Electronics (Thailand) Public Co., Ltd.
Bangkok, Thailand
(2010-2014)

2012
System Integration Supervisor

Cal-Comp Electronics (Thailand) Public Co., Ltd.
Bangkok, Thailand
(2010-2014)

2014
Software and System Integration Assistant Manager

Cal-Comp Electronics (Thailand) Public Co., Ltd.
Bangkok, Thailand
(2010-2014)

Software Robot Engineer

Shibaura Machine Co., Ltd.
Shizuoka, Japan (2014-2018)

2017
Software Robot Specialist

Shibaura Machine Co., Ltd.
Shizuoka, Japan (2014-2018)

Vice Director

Cal-Comp Automation and Industrial 4.0 (Thailand) Co., Ltd.
Bangkok, Thailand (2018-2019)

2019
Temporary Lecturer

Online Robotics Course
King Mongkut's University of Technology North Bangkok

Head of System Integration

SOLITECH, Bangkok, Thailand (2019-2020)

2021
Research Assistant
  • Deep Learning on Thyroid Nodule in Ultrasound Images.
  • Deep Learning for Dementia in MRI Images

Asian Institute of Technology, Pathum Thani, Thailand
(2021-2024)

Teaching Assistant
  • Machine Learning
  • Recent Trends of Machine Learning (Deep Learning)
  • Computer Vision
  • Deep Reinforcement Learning

Asian Institute of Technology, Pathum Thani, Thailand (2021-2022)

2022
Temporary Instructor

Machine Learning from Data to Decision
King Mongkut's Institute of Technology Ladkrabang

D.Eng. Graduation

Asian Institute of Technology, Pathum Thani, Thailand
(2019-2023)

2023
Researcher

Asian Institute of Technology, Pathum Thani, Thailand

Post-Doc Research Fellow

Florida Atlantic University, Harbor Branch Oceanographic Institute
Fort Pierce, Florida, USA

2025
Senior Research Fellow

Florida Atlantic University, Harbor Branch Oceanographic Institute
Fort Pierce, Florida, USA

(Promoted from Post-Doc Research Fellow)

Selected publications

Publications

  1. Kunapinun, A.; Suthakorn, J.; Siravaman, D.; Songsaeng, D., Towards AI-Powered Neurovascular Intervention: From Imaging to XR–Robotic Convergence, Stroke, 11 December 2025. (Q1; IF 8.9)
  2. Joochim, C.; Muangjamnong, T.; Kunapinun, A., Development and Evaluation of a Quadruped Leg System with PID and Computed Torque Control: From Real-World Actuation to Simulated Dynamics, IEEE Access, 13 December 2025. (Q1; IF 3.6)
  3. Naing, L.T.; Kunapinun, A.; Dailey, M.N.; Patanasantichai, R.; Buathong, S.; Keatmanee, C.; Wongsripuemtet, J.; Rattanabannakid, C.; Senanarong, V.; Ekpanyapong, M.; Songsaeng, D., AI-Powered Evaluation of Dementia Severity Based on Clinical Data and Visual Scoring Systems (MTA, ERICA, GCA) from MRI, Scientific Reports, In Review. (Q1; IF 4.6) (Co-first author)
  4. Kunapinun, A.; Ellison, P.; Danesh, A.; Cai, X.C.; Levy, X.; Fields, G.; Pelah, A., Joint-wise Spectral Analysis of Balance Responses Using Inverse Kinematics: A Motion Capture Study in Post-Acute COVID vs Non-COVID Individuals, i-CREATe 2025: 18th International Convention on Rehabilitation Engineering and Assistive Technology, Bangkok, Thailand, 24–26 November 2025.
  5. Keatmanee, C.; Songsaeng, D.; Klabwong, S.; Nakaguro, Y.; Kunapinun, A.; Ekpanyapong, M.; Dailey, M.N., Curriculum Learning with Ultrasound Image Dataset for Enhanced Thyroid Nodule Assessment in Deep Learning, QIMS, 18 August 2025. (Q1; IF 2.9)
  6. Keatmanee, C.; Songsaeng, D.; Klabwong, S.; Nakaguro, Y.; Kunapinun, A.; Dailey, M.N.; Ekpanyapong, M., Enhancing Weakly Supervised Data Augmentation Networks for Thyroid Nodule Assessment Using Traditional and Doppler Ultrasound Images, Computers in Biology and Medicine, June 2025. (Q1; IF 7.7)
  7. Kunapinun, A.; Fairman, W.; Wills, P.S.; Ouyang, B., Integrating Bi-LSTM, Attention Mechanism, and Physics Constraint Models for Improved Biomass and Water Quality Prediction in Aquaculture Systems, SPIE Defense + Commercial Sensing, Orlando, FL, USA, 13–17 April 2025.
  8. Kunapinun, A.; Fairman, W.; Wills, P.S.; Dennis, H.; Ouyang, B., Advancing Seaweed Cultivation: Integrating Physics Constraint in Machine Learning Models for Enhanced Biomass Forecasting in IMTA Systems, Remote Sensing, November 2024. (Q1; IF 4.2)
  9. Kunapinun, A.; Fairman, W.; Wills, P.S.; Meiri, S.; Kostelnik, M.; Ouyang, B., Innovative Aquaculture Biometrics Analysis: Harnessing IR Lasers and ToF Cameras for Microscopic Fish Larvae Tracking, SPIE Defense + Commercial Sensing, June 2024.
  10. Kunapinun, A.; Fairman, W.; Wills, P.S.; Dennis, H.; Singh, S.; Ouyang, B., Advancing Seaweed Cultivation Analytics: Integrating Polynomial Regression Models with IoT for Optimized Harvesting in an Integrated Multi-Trophic Aquaculture (IMTA) System, SPIE Defense + Commercial Sensing, June 2024.
  11. Namsena, P.; Songsaeng, D.; Keatmanee, C.; Klabwong, S.; Kunapinun, A.; Soodcheun, S.; Tarathipayakul, T.; Tanasoontrarat, W.; Ekpanyapong, M.; Dailey, M.N., Diagnostic Performance of Artificial Intelligence in Interpreting Thyroid Cancer on Ultrasound Images in the Multi-center Study, QIMS, April 2024. (Q2; IF 2.9)
  12. Kosavinta, S.; Kunapinun, A., CycleGAN Implementation for Creating Modern Thai Houses, ACADIA 2023 Conference, presented on 28 October 2023. (Rank A)
  13. Kunapinun, A.; Songsaeng, D.; Buathong, S.; Dailey, M.N.; Keatmanee, C.; Ekpanyapong, M., Explainable Automated TI-RADS Evaluation of Thyroid Nodules, Sensors 2023, 23, 7289. https://doi.org/10.3390/s23167289. (Q1; IF 3.9)
  14. Kaewumporn, U.; Songsaeng, D.; Kunapinun, A.; Chaisinanunkul, N., FLAIR vascular hyperintensity-DWI mismatch and DWI-FLAIR mismatch ASPECTS for prediction of good outcome after recanalization in anterior circulation stroke: multicenter observational study, Eur J Radiol, 2023 Jun;163:110837. doi: 10.1016/j.ejrad.2023.110837. Epub 2023 Apr 15. PMID: 37084592. (Q1; IF 3.5)
  15. Kunapinun, A.; Dailey, M.N.; Songsaeng, D.; Parnichkun, M.; Keatmanee, C.; Ekpanyapong, M., Improving GAN Learning Dynamics for Thyroid Nodule Segmentation, Ultrasound Med Biol, 2023 Feb;416-430. doi: 10.1016/j.ultrasmedbio.2022.09.010. Epub 2022 Nov 21. PMID: 36424307. (Q1; IF 2.9)
  16. Kunapinun, A.; Dailey, M.N. (2022), COVID-19 X-ray Image Diagnosis Using Deep Convolutional Neural Networks, in Proceedings of the Sixth International Congress on Information and Communication Technology (pp. 733–741), Lecture Notes in Networks and Systems, vol. 236, Springer.
  17. Kaewkorn, S.; Joochim, C.; Keeratiwintakorn, P.; Kunapinun, A. (2020), Development of Pick and Place Delta Robot, in The Impact of the 4th Industrial Revolution on Engineering Education (ICL 2019), Advances in Intelligent Systems and Computing, vol. 1135, Springer.
  18. Kaewkorn, S.; Joochim, C.; Prasertprasasna, S.; Leartrussameejit, C.; Kuhataparuks, H.; Kunapinun, A. (2019), Notifying Problems of a Machine by Using Machine Learning, 2019 Research, Invention, and Innovation Congress (RI2C), IEEE.
  19. Kaewkorn, S.; Joochim, C.; Kunapinun, A. (2019), The 9 Points Calibration Using SCARA Robot, 2019 Research, Invention, and Innovation Congress (RI2C), IEEE.
  20. 西原泰宣; Thanok, A., Introduction to Bin-Picking Support Software (ばら積みピッキング支援ソフトウェアの紹介, in Japanese), Eizojoho Industrial, July 2015.
  21. Kunapinun, A.; Lorsakul, A.; Teerapong, S.; Suratana, J.; Saengsupho, W.; Sagonwimon, F.; Shah, S.S.H.; Direkwatana, C.; Suthakorn, J., Motion Control of Intelligence Vehicle Using Camera and Other Systems, Proceedings of the 2007 TRS Conference on Robotics and Industrial Technology (CRIT 2007), Nakhon Pathom, Thailand, 14–15 June 2007.

PI and Co-PI

Grants and Projects

  ⦁ Development of Sign Language Intelligence and Its Applications (2025-2027), 408,570 USD
    Project Grant: NBTC (Thailand)

  ⦁ Development of Artificial Intelligence (AI) for Diagnostic Neuroradiology and Follow-up in Dementia Diseases (2023-2024), 203,000 USD
    Project Grant: National Research Institute (Thailand) under Contract No. N34A660390; IRB number: 350/2566(IRB3)

  ⦁ Thyroid Nodule Classification and Analysis Using ACR TI-RADS and Artificial Intelligence (2023)
    Project IRB: Siriraj Hospital, Mahidol University, IRB number: COE 403/2023

Selected projects in vision, robotics, and AI systems

Research & Engineering Portfolio

Smart Farming Vision Monitoring System

This project applies vision-based IoT monitoring to smart farming applications.
A movable camera system captures field images, allowing users to monitor crop conditions and analyze vegetation health through image-based green intensity measurements.
The interface supports pan–tilt camera control and region-based analysis.
Users can draw a specific area of interest to calculate green metrics locally, enabling focused evaluation rather than global averaging.
The system shares a common interface design philosophy with earlier industrial monitoring tools, emphasizing clarity, minimal interaction, and practical field usability.
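
As a rough illustration of the region-based analysis, the sketch below computes a mean excess-green index inside a user-drawn polygon using OpenCV and NumPy; the index choice, function names, and file paths are illustrative assumptions, not the production code.

import cv2
import numpy as np

def green_metric(image_bgr, roi_polygon):
    """Mean excess-green index (2G - R - B) inside a user-drawn polygon ROI."""
    mask = np.zeros(image_bgr.shape[:2], dtype=np.uint8)
    cv2.fillPoly(mask, [np.asarray(roi_polygon, dtype=np.int32)], 255)
    b, g, r = cv2.split(image_bgr.astype(np.float32))
    exg = 2.0 * g - r - b                    # per-pixel greenness score
    return float(exg[mask == 255].mean())    # average only over the drawn region

frame = cv2.imread("field.jpg")              # one captured field image (placeholder path)
roi = [(120, 80), (400, 90), (390, 300), (110, 290)]
print(f"ROI green score: {green_metric(frame, roi):.1f}")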

Autonomous Face Drawing System

The system starts with a camera capturing a facial image, which is automatically processed by the computer.
The software extracts facial structures, generates drawing trajectories, and optimizes the stroke sequence before sending the final path to a SCARA robot.
User interaction is deliberately simple: a camera for image capture and a single Start button to initiate the process.
The robot then executes the drawing autonomously, turning visual input into a physical sketch.
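
A minimal sketch of such a pipeline is shown below, assuming Canny edge extraction and a greedy nearest-neighbor stroke ordering to shorten pen-up travel; the actual trajectory generator may differ, and all names here are illustrative.

import cv2
import numpy as np

def image_to_strokes(image_gray):
    """Extract edge contours and order them to reduce pen-up travel."""
    edges = cv2.Canny(image_gray, 80, 160)
    contours, _ = cv2.findContours(edges, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)
    strokes = [c.reshape(-1, 2) for c in contours if len(c) > 10]  # drop tiny specks

    ordered, pos = [], np.array([0.0, 0.0])
    while strokes:                           # greedy: always draw the nearest stroke next
        i = min(range(len(strokes)),
                key=lambda k: np.linalg.norm(strokes[k][0] - pos))
        stroke = strokes.pop(i)
        ordered.append(stroke)
        pos = stroke[-1].astype(float)
    return ordered                           # each stroke becomes one SCARA path segment

face = cv2.imread("face.jpg", cv2.IMREAD_GRAYSCALE)  # placeholder capture
paths = image_to_strokes(face)
print(f"{len(paths)} strokes to send to the robot")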

Autonomous Bin Picking with 3D Vision

This system demonstrates how 3D vision and robotics can work together in a continuous industrial workflow.
A 6-axis robot uses a custom 3D vision system (TSVision3D) to identify and pick randomly oriented bottles from a bin, automatically detecting and correcting inverted objects.
The bottles are placed onto a looping conveyor, where a SCARA robot equipped with 2D vision synchronizes its motion with the moving belt to grip, print, and transfer each bottle—without stopping the conveyor.
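
The belt synchronization can be pictured as a simple intercept calculation: the SCARA aims at where the bottle will be, not where the camera saw it. Below is a hedged sketch of that idea; the belt speed, timing values, and function names are illustrative assumptions.

import time

BELT_SPEED = 0.15          # m/s, e.g. read from the conveyor encoder (assumed value)

def intercept_point(x_detected, t_detected, approach_time):
    """Shift the detected position downstream by the belt travel that occurs
    during vision latency plus the robot's approach motion."""
    latency = time.time() - t_detected
    return x_detected + BELT_SPEED * (latency + approach_time)

# Example: bottle seen at x = 0.40 m, 50 ms ago; robot needs 0.3 s to descend.
target_x = intercept_point(0.40, time.time() - 0.05, approach_time=0.3)
print(f"grip at x = {target_x:.3f} m along the belt")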

I designed the overall system concept and workflow to clearly demonstrate how 3D vision can be applied in real industrial lines, while mechanical implementation and vision conveyor software were developed by separate teams.

TSMonitor: Industrial Robot Monitoring System

This project explores how Industrial IoT concepts can be applied to robotic systems.
Robots are connected to a central server, where their operational status—running, warnings, or errors—is continuously monitored without taking control away from the robot itself.
Each station is equipped with a camera that allows visual inspection of the robot or process area.
Through the TSMonitor interface, cameras can be remotely pan–tilt controlled, enabling quick inspection when abnormal behavior is detected.
The system was designed as a safe, non-intrusive monitoring tool, focusing on visibility rather than control.
The initial prototype supports monitoring of up to four robotic stations simultaneously.
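
The monitoring behavior can be summarized as a read-only poll across the connected stations. The sketch below simulates that loop; the station addresses, the status query, and the poll rate are all illustrative assumptions, since the real robot protocol is not reproduced here.

import random
import time

STATIONS = {1: "192.168.0.11", 2: "192.168.0.12",
            3: "192.168.0.13", 4: "192.168.0.14"}    # placeholder IPs, four stations

def read_status(ip):
    """Stand-in for a read-only status query; it never sends motion commands."""
    return random.choice(["running", "warning", "error"])

for _ in range(3):                                    # a few demo polling cycles
    for station, ip in STATIONS.items():
        status = read_status(ip)
        if status in ("warning", "error"):
            print(f"station {station}: {status} -> point the camera for inspection")
    time.sleep(1.0)                                   # poll roughly once per second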

AGV Forklift Navigation System

This project was developed at a time when line-free AGV navigation was still considered highly challenging.
Instead of using fixed guidance, the system relies on laser scanning and reflector-based localization to build a lightweight map for navigation—avoiding the memory overhead of full SLAM.
The AGV forklift localizes itself within the mapped environment and navigates to predefined storage locations, such as specific rack rows and lift levels.
Vehicle motion is controlled with a Fuzzy PID controller, while lift height is regulated by PD control, supporting lifts of up to 230 cm.
High precision was critical: positional error was required to stay within 5 cm, including rack insertion clearance.
I developed the core navigation and control software running on the industrial PC, while low-level motor command execution was handled by another team member.
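
As a rough illustration of the lift-axis regulation, the sketch below runs a discrete PD update against a toy first-order plant; the gains, loop rate, and plant model are assumptions for illustration, not the controller that ran on the AGV.

def pd_step(target_cm, height_cm, prev_error, dt, kp=4.0, kd=0.8):
    """One discrete PD update; returns the command and the error to carry over."""
    error = target_cm - height_cm
    derivative = (error - prev_error) / dt
    return kp * error + kd * derivative, error

height, prev_err, dt = 0.0, 0.0, 0.02        # 50 Hz loop, lift starting at floor level
for _ in range(1000):
    cmd, prev_err = pd_step(200.0, height, prev_err, dt)
    height += cmd * dt * 0.1                 # toy first-order lift response
print(f"settled at {height:.1f} cm (target 200.0 cm)")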

EEG-Based Assistive Control System

This patented project enables people with severe physical disabilities to control electrical devices using EEG signals.
Electrodes placed on the user’s head capture brain activity, which is processed by software to generate on–off commands for external devices through a relay system.
The interaction is intentionally subtle: the user focuses on an object displayed on the screen and blinks to issue a command.
Importantly, blinking alone does not trigger any action—control is activated only when visual focus and intentional brain activity are detected together, preventing accidental commands.
The system was implemented without deep learning or complex calibration.
By analyzing peak amplitudes and selecting discriminative EEG channels, reliable control was achieved using lightweight signal processing techniques.
I developed the software that interprets EEG signals and converts them into control commands, forming the core link between human intention and physical action.
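
Conceptually, the gate can be expressed in a few lines: a command fires only when a blink peak and a focus condition hold at the same time. The thresholds, window size, and scoring below are illustrative assumptions, not the patented implementation.

import numpy as np

BLINK_THRESHOLD = 120.0   # µV, frontal-channel peak amplitude (assumed value)
FOCUS_THRESHOLD = 4.0     # focus score from the selected channels (assumed scale)

def issue_command(frontal_window_uv, focus_score):
    """Fire only when a blink peak AND visual focus coincide, so an
    ordinary blink alone never triggers the relay."""
    blink = np.max(np.abs(frontal_window_uv)) > BLINK_THRESHOLD
    return blink and focus_score > FOCUS_THRESHOLD

window = np.random.randn(256) * 20.0         # 1 s of simulated frontal EEG at 256 Hz
print("relay ON" if issue_command(window, focus_score=5.2) else "no action")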

MU-Dent Navigation Software

This project is a prototype dental navigation system that simulates guided procedures using infrared LED markers.
Stereo cameras capture the markers with reduced exposure so that only the LED light remains visible, enabling real-time estimation of their 3D coordinates.
I developed the entire software system, including the user interface and 3D visualization.
Detected markers are transformed into corresponding 3D dental instruments, allowing clinicians to intuitively track tool position and orientation.
Marker identification was implemented using a custom vector-based approach, designed without conventional matrix matching methods.
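
A hedged sketch of the two vision steps appears below: bright-blob detection once exposure is reduced, then stereo triangulation to 3D. The projection matrices would come from a prior stereo calibration; thresholds and names are illustrative, and the custom vector-based identification step is not reproduced here.

import cv2
import numpy as np

def led_centroids(image_gray, threshold=200):
    """With exposure reduced, the IR LEDs are the only bright blobs left."""
    _, binary = cv2.threshold(image_gray, threshold, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return np.array([c.reshape(-1, 2).mean(axis=0) for c in contours], np.float32)

def triangulate(P_left, P_right, pts_left, pts_right):
    """Recover 3D marker positions from matched left/right centroids
    (P_left and P_right are 3x4 projection matrices from stereo calibration)."""
    homog = cv2.triangulatePoints(P_left, P_right, pts_left.T, pts_right.T)
    return (homog[:3] / homog[3]).T          # homogeneous -> Euclidean coordinates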