Dr FU, Hong    傅弘 博士
Associate Professor
Department of Mathematics and Information Technology
Contact
ORCiD
0000-0003-2246-7552
Phone
(852) 2948 7535
Email
hfu@eduhk.hk
Address
10 Lo Ping Road, Tai Po, New Territories, Hong Kong
Scopus ID
8586677800
Personal Profile

Ph.D., The Hong Kong Polytechnic University, 2007

M.Eng., Xi'an Jiaotong University, 2003

B.Eng., B.Mgt., Xi'an Jiaotong University, 2000

Research Interests

computer vision algorithms

artificial intelligence in healthcare and education

eye tracking and motion detection

Research Outputs

Journal Publications
Publication in refereed journal
Deng, S., Xu, B., Zhao, J., & Fu, H. (2024). Advanced design for anti-freezing aqueous zinc-ion batteries. Energy Storage Materials, 70 https://doi.org/10.1016/j.ensm.2024.103490
Lo, W.-L., Chung, H. S.-H., Hsung, R. T.-C., Fu, H., & Shen, T.-W. (2024). PV Panel Model Parameter Estimation by Using Particle Swarm Optimization and Artificial Neural Network. Sensors, 24(10) https://doi.org/10.3390/s24103006
Lu, J., Xu, B., Huang, J., Liu, X., & Fu, H. (2024). Charge Transfer and Ion Occupation Induced Ultra-Durable and All-Weather Energy Generation from Ambient Air for Over 200 Days. Advanced Functional Materials, Advance online publication. https://doi.org/10.1002/adfm.202406901
Yin, X., Fu, H., & Xu, B. (2024). Advanced Design of Light-Assisted Nanogenerators with Multifunctional Perovskites. Advanced Energy Materials, 14(13) https://doi.org/10.1002/aenm.202304355
Chung, K. Y., Xu, B., Tan, D., Yang, Q., Li, Z., & Fu, H. (2024). Naturally Crosslinked Biocompatible Carbonaceous Liquid Metal Aqueous Ink Printing Wearable Electronics for Multi-Sensing and Energy Harvesting. Nano-Micro Letters, 16 https://doi.org/10.1007/s40820-024-01362-z
Xie, K., Yin, J., Yu, H., Fu, H., & Chu, Y. (2024). Passive Aggressive Ensemble for Online Portfolio Selection. Mathematics, 12(7) https://doi.org/10.3390/math12070956
Wang, Y., Li, Z., Fu, H., & Xu, B. (2023). Sustainable Triboelectric Nanogenerators Based on Recycled Materials for Biomechanical Energy Harvesting and Self-Powered Sensing. Nano Energy, 115 https://doi.org/10.1016/j.nanoen.2023.108717
Li, B., Zhang, P., Peng, J., & Fu, H. (2023). Non-Contact PPG Signal and Heart Rate Estimation with Multi-Hierarchical Convolutional Network. Pattern Recognition, 139 https://doi.org/10.1016/j.patcog.2023.109421
Li, B., Zhang, W., Li, X., Fu, H., & Xu, F. (2023). ECG Signal Reconstruction Based on Facial Videos via Combined Explicit and Implicit Supervision. Knowledge-Based Systems, 272 https://doi.org/10.1016/j.knosys.2023.110608
Tong, C. Y., Zhu, R. T.-L., Ling, Y. T., Scheeren, E. M., Lam, F. M. H., Fu, H., & Ma, C. Z.-H. (2023). Muscular and Kinematic Responses to Unexpected Translational Balance Perturbation: A Pilot Study in Healthy Young Adults. Bioengineering, 10(7) https://doi.org/10.3390/bioengineering10070831
Wen, J., Pan, X., Fu, H., & Xu, B. (2023). Advanced Designs for Electrochemically Storing Energy from Triboelectric Nanogenerators. Matter, 6(7), 2153-2181. https://doi.org/10.1016/j.matt.2023.04.004
Chen, F., Fu, H., Yu, H., & Chu, Y. (2023). No-Reference Image Quality Assessment Based on a Multitask Image Restoration Network. Applied Sciences, 13(11) https://doi.org/10.3390/app13116802
Li, B., Li, R., Wang, W., & Fu, H. (2023). Serial-parallel multi-scale feature fusion for anatomy-oriented hand joint detection. Neurocomputing, 536, 59-72. https://doi.org/10.1016/j.neucom.2023.02.046
Zhou, R., Zhang, Z., Fu, H., Zhang, L., Li, L., Huang, G., Li, F., Yang, X., Dong, Y., Zhang, Y.-T., & Liang, Z. (2023). PR-PL: A Novel Prototypical Representation Based Pairwise Learning Framework for Emotion Recognition Using EEG Signals. IEEE Transactions on Affective Computing, Early Access, 1-14. https://doi.org/10.1109/TAFFC.2023.3288118
Cao, Y., Xu, B., Li, Z., & Fu, H. (2023). Advanced Design of High-Performance Moist-Electric Generators. Advanced Functional Materials, 33(31) https://doi.org/10.1002/adfm.202301420
Chen, F., Fu, H., Yu, H., & Chu, Y. (2023). Using HVS Dual-Pathway and Contrast Sensitivity to Blindly Assess Image Quality. Sensors, 23(10) https://doi.org/10.3390/s23104974
Li, B., Zhang, W., Fu, H., Liu, H., & Xu, F. (2023). Multi-level constrained intra and inter subject feature representation for facial video based BVP signal measurement. IEEE Journal of Biomedical and Health Informatics, 27(8), 3948-3957. https://doi.org/10.1109/jbhi.2023.3273557
Chu, Y., Chen, F., Fu, H., & Yu, H. (2023). Detection of Air Pollution in Urban Areas Using Monitoring Images. Atmosphere, 14(5) https://doi.org/10.3390/atmos14050772
Gao, D., Zhu, Y., Yan, K., Fu, H., Ren, Z., Kang, W., & Soares, C.G. (2023). Joint learning system based on semi-pseudo-label reliability assessment for weak-fault diagnosis with few labels. Mechanical Systems and Signal Processing, 189 https://doi.org/10.1016/j.ymssp.2022.110089
Li, R., Fu, H., Zheng, Y., Gou, S., Yu, J.J., Kong, X., & Wang, H. (2023). Behavior Analysis With Integrated Visual-Motor Tracking for Developmental Coordination Disorder. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 31, 2164-2173. https://doi.org/10.1109/TNSRE.2023.3270287
Li, Z., Xu, B., Han, J., Tan, D., Huang, J., Gao, Y., & Fu, H. (2023). Surface-modified liquid metal nanocapsules derived multiple triboelectric composites for efficient energy harvesting and wearable self-powered sensing. Chemical Engineering Journal, 460 https://doi.org/10.1016/j.cej.2023.141737
Liu, Y., Fu, H., Wei, Y., & Zhang, H. (2023). Sound Event Classification Based on Frequency-Energy Feature Representation and Two-Stage Data Dimension Reduction. IEEE/ACM Transactions on Audio, Speech, and Language Processing, 31, 1290-1304. https://doi.org/10.1109/TASLP.2023.3260708
Lo, W.L., Chung, H.S.H., Hsung, R.T.-C., Fu, H., & Shen, T.W. (2023). PV Panel Model Parameter Estimation by Using Neural Network. Sensors, 23(7), 1-18. https://doi.org/10.3390/s23073657
Wang, H., Gao, C., Fu, H., Ma, C.Z.H., Wang, Q., He, Z., & Li, M. (2023). Automated Student Classroom Behaviors’ Perception and Identification Using Motion Sensors. Bioengineering, 10(2) https://doi.org/10.3390/bioengineering10020127
Gao, D., Zhu, Y., Kang, W., Fu, H., Yan, K., & Ren, Z. (2022). Weak Fault Detection with a Two-stage Key Frequency Focusing Model. ISA Transactions, 125, 384-399. https://doi.org/10.1016/j.isatra.2021.06.014
Chu, Y., Chen, F., Fu, H., & Yu, H. (2022). Haze Level Evaluation Using Dark and Bright Channel Prior Information. Atmosphere, 13(5) https://doi.org/10.3390/atmos13050683
Ren, Z., Zhu, Y., Kang, W., Fu, H., Niu, Q., Gao, D., Yan, K., & Hong, J. (2022). Adaptive cost-sensitive learning: Improving the convergence of intelligent diagnosis models under imbalanced data. Knowledge-Based Systems, 241 https://doi.org/10.1016/j.knosys.2022.108296
Li, Z., Xu, B., Han, J., Huang, J., & Fu, H. (2021). A Polycation-Modified Nanofillers Tailored Polymer Electrolytes Fiber for Versatile Biomechanical Energy Harvesting and Full-Range Personal Healthcare Sensing. Advanced Functional Materials, 32(6) https://doi.org/10.1002/adfm.202106731
Duan, Y., Cao, H., Wu, B., Wu, Y., Liu, D., Zhou, L., Feng, A., Wang, H., Chen, H., Gu, H., Shao, Y., Huang, Y., Lin, Y., Ma, K., Fu, X., Hong, J., Fu, H., Kong, K., & Xu, Z. (2021). Dosimetric Comparison, Treatment Efficiency Estimation, and Biological Evaluation of Popular Stereotactic Radiosurgery Options in Treating Single Small Brain Metastasis. Frontiers in Oncology, 11 https://doi.org/10.3389/fonc.2021.716152
Lo, W.L., Chung, H.S.H., & Fu, H. (2021). Experimental Evaluation of PSO Based Transfer Learning Method for Meteorological Visibility Estimation. Atmosphere, 12(7), 828.
Li, S.Y., & Fu, H. (2021). Image Analysis and Evaluation for Internal Structural Properties of Cellulosic Yarn. Cellulose, 28, 6739-6756.
Wen, J., Xu, B., Gao, Y., Li, M., & Fu, H. (2021). Wearable Technologies Enable High-performance Textile Supercapacitors with Flexible, Breathable and Wearable Characteristics for Future Energy Storage. Energy Storage Materials, 37, 94-122.
Zheng, Y., Fu, H., Li, R., Hsung, T.-C., Song, Z., & Wen, D. (2021). Deep Neural Network Oriented Evolutionary Parametric Eye Modeling. Pattern Recognition, 113, 107755.
Tang, H.B., Han, Y., Fu, H., & Xu, B.G. (2021). Mathematical Modeling of Linearly-elastic Non-prestrained Cables Based on a Local Reference Frame. Applied Mathematical Modelling, 91, 695-708.
Li, J., Lo, W.L., Fu, H., & Chung, H.S.H. (2021). A Transfer Learning Method for Meteorological Visibility Estimation Based on Feature Fusion Method. Applied Sciences, 11(3), 997.

Conference Papers
Refereed conference paper
Lu, C.K., Li, R.M., Fu, H., Fu, B., Wong, Y.H., & Lo, W.L. (2021, January). Precise Temporal Localization for Complete Actions with Quantified Temporal Structure. Paper presented at the 2020 25th International Conference on Pattern Recognition (ICPR), Milan, Italy.
Wang, W., & Fu, H. (2020, November). A Handwriting Evaluation System with Multi-modal Sensors. Paper presented at the International Conference on Education and Artificial Intelligence 2020 (ICEAI 2020), Hong Kong.
Other conference paper
Fu, H. (2022, June). 智能多模態感知系統與算法及其在特殊教育中的應用 [Intelligent multi-modal sensing systems and algorithms and their applications in special education]. Paper presented at the EDTECH Education Technology Symposium 2022: Innovation and Development of Special Education Technology, Hong Kong, China.

Patents, Agreements, Assignments and Companies
Patents granted
Fu, H., Xu, Y.W., Hou, B., Wang, J., Wang, Y., & Chan, C.C.H. (2024). Machine Vision-based Method and System for Determining a Range of Motion of a Joint of a Hand of a Subject [Patent Granted]. Hong Kong: Intellectual Property Department, HKSAR Government.
Fu, H., Fu, B., Li, R., & Zheng, Y. (2023). An Eye-gaze Tracking Apparatus and a Method of Eye-gaze Tracking [Patent Granted]. Hong Kong: Intellectual Property Department, HKSAR Government.
Fu, H., Zheng, Y., & Song, Y. (2022). A System for Strabismus Assessment and a Method of Strabismus Assessment [Patent Granted]. Hong Kong: Intellectual Property Department, HKSAR Government.

Projects

Computer Vision Empowered Tactile Sensing towards Embodied AI

Project Start Year: 2024, Principal Investigator(s): FU, Hong

 
Reducing English Major Students’ Writing Errors with an Automated Writing Evaluation (AWE) System: Evidence from Eye-tracking Technology

Project Start Year: 2024, Principal Investigator(s): FU, Hong

 
Multi-Modal Handwriting Analysis Platform For Children

Project Start Year: 2024, Principal Investigator(s): FU, Hong

 
Development of AI Algorithms for Automated Strabismus Measurement

Project Start Year: 2023, Principal Investigator(s): FU, Hong

 
Posture Capturing via Wearable Elastic Clothing and AI Algorithms

Project Start Year: 2023, Principal Investigator(s): FU, Hong

 
In-Vehicle Road Surface Condition Detection System based on AI Sensor Fusion
This project focuses on developing AI algorithms for road surface classification and object detection. The algorithms involve image annotation and enhancement techniques to improve accuracy. The classification algorithm identifies different types of road surfaces, while the object detection algorithm recognizes objects on road surfaces such as vehicles, pedestrians, and obstacles. These algorithms aim to enhance road safety and improve autonomous driving systems.
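As a hedged illustration of the classification part only, a road-surface classifier could be sketched roughly as below; the backbone, class names and preprocessing are assumptions made for the example, not the project's actual design.

# A minimal illustrative sketch (not the project's code): classifying one dash-cam frame
# into assumed road-surface types with a fine-tuned pretrained CNN backbone. The class
# names, preprocessing and backbone choice are all hypothetical.
import torch
import torch.nn as nn
from PIL import Image
from torchvision import models, transforms

SURFACE_CLASSES = ["dry_asphalt", "wet_asphalt", "gravel", "snow"]  # hypothetical labels

# ImageNet-pretrained backbone with the final layer replaced for the assumed classes;
# in practice this head would be fine-tuned on annotated road-surface images.
model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, len(SURFACE_CLASSES))
model.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def classify_surface(image_path: str) -> str:
    """Return the predicted road-surface class for a single frame."""
    img = Image.open(image_path).convert("RGB")
    with torch.no_grad():
        logits = model(preprocess(img).unsqueeze(0))
    return SURFACE_CLASSES[int(logits.argmax(dim=1))]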
Project Start Year: 2022, Principal Investigator(s): FU, Hong

 
An Intelligent Multi-Modal System for Boccia Training
This project is dedicated to developing an intelligent multimodal sensing system that effectively identifies, quantifies, and visualizes the training process for Boccia athletes and the important factors related to their performance. The goal is to establish a comprehensive system comprising a multimodal sensing system, intelligent algorithms, and a visualization program. This system will benefit both athletes and coaches by providing insights into upper limb movement, lower limb stability, and eye-hand coordination during Boccia training.
Project Start Year: 2022, Principal Investigator(s): FU, Hong

 
Smart Vest for Improving Behavioral Performance of School-aged Children with Attention Deficit Hyperactivity Disorder (ADHD)
Attention Deficit Hyperactivity Disorder (ADHD) is a common behavioral disorder. Patients typically show hyperactivity, impulsivity, and inattention, which lead to learning and social difficulties. The worldwide prevalence of ADHD among school-age children is currently 6%-8%; a back-of-the-envelope calculation therefore suggests that Hong Kong has about 70,000 school-age children with ADHD. This project aims to develop an innovative intelligent perception vest that can effectively identify and treat school-aged children with ADHD from the perspective of behavioral intervention; there is no similar commercial product on the market.
The smart vest is composed of a textile vest, a signal acquisition module, an intelligent analysis and perception module, and a feedback module. With the help of machine learning and information fusion technology, the vest perceives and analyzes the wearer's behavioral data and gives appropriate feedback when abnormal behavior is detected, prompting the wearer to adjust their state and regulate their emotions so that behavior is maintained within the normal range. The project will also research and design the textile materials, fabric structures, and styles of the vest, so that the garment integrates effectively with the intelligent modules and units, is comfortable to wear, and delivers the best possible feedback-based behavioral intervention. The behavioral data collected can be stored and analyzed to provide objective support for further adjustment of the treatment plan.
The success of this project will have substantial commercialization potential in the development of high-tech and smart textile products. It will not only bring new high value-added opportunities and markets to local enterprises, but also make a beneficial contribution to society.
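In spirit, the behavior-perception step might resemble the rough sketch below; it is only an assumption-laden illustration (the sensor type, window length, features and labels are all hypothetical), not the vest's actual algorithm.

# A purely illustrative sketch of the "perceive and flag abnormal behaviour" step:
# classify fixed-length windows of accelerometer data from the vest with a standard
# classifier. Sensor layout, window length and labels are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

WINDOW = 128  # samples per window (assumed 2.56 s at 50 Hz)

def window_features(acc_xyz: np.ndarray) -> np.ndarray:
    """Simple per-axis statistics for one (WINDOW, 3) accelerometer window."""
    return np.concatenate([acc_xyz.mean(axis=0), acc_xyz.std(axis=0),
                           np.abs(np.diff(acc_xyz, axis=0)).mean(axis=0)])

# Toy training data: label 1 = "abnormal" (e.g., fidgeting), 0 = "normal".
rng = np.random.default_rng(42)
X = np.array([window_features(rng.normal(0, s, (WINDOW, 3)))
              for s in np.r_[np.full(100, 0.2), np.full(100, 1.0)]])
y = np.r_[np.zeros(100), np.ones(100)]

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

def flag_abnormal(acc_window: np.ndarray) -> bool:
    """Return True when the classifier judges the window abnormal (trigger feedback)."""
    return bool(clf.predict(window_features(acc_window)[None, :])[0])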
Project Start Year: 2022, Principal Investigator(s): FU, Hong

 
Action Stage Modeling with Cumulative Finite Automaton (CFA)
Movement skill assessment is a fundamental and essential research area for professionals involved in studying human actions, such as physiotherapists, occupational therapists, pediatricians, and coaches. It benefits many applications, especially the monitoring of motor development in children. However, current action assessment relies largely on human raters, which can be inefficient, expensive, and subjective. This project proposes automated movement skill assessment in which actions are recorded as long videos and assessment is performed by computational models.
To achieve automated movement skill assessment that can support movement professionals in their evaluations, three fundamental problems need to be addressed: complete localization and recognition, domain-knowledge learning, and model generalization. First, most existing action assessment algorithms are designed to classify pre-segmented videos, which limits usability and automation. We aim to create a model that can pick out useful segments from the original videos and then recognize and evaluate them.
In summary, the success of this project will provide objective, faster, more cost-efficient, and more granular tools for professional action evaluation. The outcomes are potentially beneficial for movement-related fields such as physiotherapy, occupational therapy, sports science, and behavioral studies. In terms of computer vision, the project will also make a fundamental contribution to high-level video understanding.

Project Start Year: 2021, Principal Investigator(s): FU, Hong 傅弘

 
Developing an Automated Ocular Misalignment Measurement System

Project Start Year: 2021, Principal Investigator(s): FU, Hong 傅弘

 
Geometric Eye Modeling and its Application in Strabismus Assessment
This proposed work aims to develop a geometric eye modeling method and apply it to intelligent strabismus assessment, with the following specific objectives (a minimal illustrative sketch follows the list):

• to design and develop a general framework for geometric eye image modelling;
• to develop the eye modelling algorithms, including parametric eye models, evaluation criteria, and parameter searching strategies;
• to evaluate the proposed eye models on benchmark datasets and compare them with state-of-the-art methods; and
• to apply the proposed eye models to strabismus detection videos and evaluate the effectiveness of the models.
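The sketch below illustrates the general "parametric model + evaluation criterion + parameter search" idea on a deliberately simplified circular iris model; the project's actual models, criteria and search strategies are assumed to be more elaborate.

# Hypothetical sketch: fit a circular iris model (cx, cy, r) to edge points extracted
# from an eye image by minimising a simple evaluation criterion.
import numpy as np
from scipy.optimize import minimize

def iris_residual(params, edge_points):
    """Evaluation criterion: mean squared distance of edge points to the circle (cx, cy, r)."""
    cx, cy, r = params
    d = np.hypot(edge_points[:, 0] - cx, edge_points[:, 1] - cy) - r
    return float(np.mean(d ** 2))

def fit_iris(edge_points):
    """Parameter search: minimise the criterion, starting from the centroid of the points."""
    cx0, cy0 = edge_points.mean(axis=0)
    r0 = np.mean(np.hypot(edge_points[:, 0] - cx0, edge_points[:, 1] - cy0))
    result = minimize(iris_residual, x0=[cx0, cy0, r0], args=(edge_points,),
                      method="Nelder-Mead")
    return result.x  # fitted (cx, cy, r)

# Toy usage: noisy points on a circle of radius 40 centred at (100, 80).
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 200)
pts = np.c_[100 + 40 * np.cos(theta), 80 + 40 * np.sin(theta)] + rng.normal(0, 1.0, (200, 2))
print(fit_iris(pts))  # approximately [100, 80, 40]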

Project Start Year: 2021, Principal Investigator(s): FU, Hong 傅弘

 
Single image based hand joint detection: dataset and algorithms
This project aims to develop a dataset and algorithms for hand joint detection from a single image.
Project Start Year: 2021, Principal Investigator(s): FU, Hong 傅弘

 
Eye-hand coordination evaluation system and algorithms
The project aims to:
• develop a digital system for eye-hand coordination evaluation;
• study the fundamental algorithms, including eye detection and action recognition; and
• verify the system and algorithms on various datasets.

Project Start Year: 2020, Principal Investigator(s): FU, Hong 傅弘

 
Skeleton-based action recognition with deep learning-assisted action unit representation
This project aims to achieve the following three objectives:
1. to develop an action unit representation with the aid of deep learning methods for skeleton action recognition;
2. to implement the proposed representation scheme; and
3. to validate the proposed algorithm on publicly available datasets for action recognition.

Project Start Year: 2020, Principal Investigator(s): FU, Hong 傅弘

 
Prizes and awards

Silver Medal
Dr FU Hong's "An Intelligent Ocular Misalignment Measurement System" won a Silver Medal from the AEII 2023.
Date of receipt: /12/2023, Conferred by: 3rd Asia Exhibition of Innovations and Inventions
 
Gold Medal
Dr FU Hong's "An Intelligent Ocular Misalignment Measurement System" won a Gold Medal and a Jury's Choice Award from the iCAN 2023.
Date of receipt: /8/2023, Conferred by: International Invention Innovation Competition in Canada (iCAN) 2023
 
Jury's Choice Award
Dr FU Hong's "An Intelligent Ocular Misalignment Measurement System" won a Gold Medal and a Jury's Choice Award from the iCAN 2023.
Date of receipt: /8/2023, Conferred by: International Invention Innovation Competition in Canada (iCAN) 2023
 
Gold Medal

Date of receipt: /4/2023, Conferred by: International Exhibition of Inventions of Geneva 2023
 
Patents

基於機器視覺的手部關節運動範圍測定方法和系統 [Machine Vision-based Method and System for Determining the Range of Motion of Hand Joints]
This invention generally relates to hand function assessment. More specifically, this invention relates to a machine vision-based method for determining the range of motion of the joints of a hand of a subject and a system for implementing the same. - G/G01
 
基於機器視覺的手部關節運動範圍測定方法和系統 [Machine Vision-based Method and System for Determining the Range of Motion of Hand Joints]
This invention relates to the technical field of hand function assessment and specifically discloses a machine vision-based method and system for determining the range of motion of the hand joints of a subject. - A/G01
 
Machine Learning-based Method for Calibrating a Camera with Respect to a Scene
This invention generally relates to camera calibration. More specifically, the present invention relates to a machine learning-based method for calibrating a camera with respect to a large scene. - A/G03
 
基於機器視覺的手部關節運動範圍測定方法和系統 [Machine Vision-based Method and System for Determining the Range of Motion of Hand Joints]
This invention generally relates to hand function assessment. More specifically, this invention relates to a machine vision-based method for determining the range of motion of the joints of a hand of a subject and a system for implementing the same. - A/G01
 
基於機器視覺的手部關節運動範圍測定方法和系統 [Machine Vision-based Method and System for Determining the Range of Motion of Hand Joints]
This invention generally relates to hand function assessment. More specifically, this invention relates to a machine vision-based method for determining the range of motion of the joints of a hand of a subject and a system for implementing the same. - A/G01
 
A System for Strabismus Assessment and a Method of Strabismus Assessment
A System for Strabismus Assessment and a Method of Strabismus Assessment - A/G01
 
An Eye-gaze Tracking Apparatus and a Method of Eye-gaze Tracking
An Eye-gaze Tracking Apparatus and a Method of Eye-gaze Tracking - G/G01
 
An Eye-gaze Tracking Apparatus and a Method of Eye-gaze Tracking
An Eye-gaze Tracking Apparatus and a Method of Eye-gaze Tracking - A/G01
 
A System for Strabismus Assessment and a Method of Strabismus Assessment
A System for Strabismus Assessment and a Method of Strabismus Assessment - A/G01
 
A System for Strabismus Assessment and a Method of Strabismus Assessment
A System for Strabismus Assessment and a Method of Strabismus Assessment - G/G01
 
A System for Strabismus Assessment and a Method of Strabismus Assessment
A System for Strabismus Assessment and a Method of Strabismus Assessment - A/G01
 
An Eye-gaze Tracking Apparatus and a Method of Eye-gaze Tracking
An Eye-gaze Tracking Apparatus and a Method of Eye-gaze Tracking - A/G01