Lower-limb exoskeletons have become increasingly popular in rehabilitation, helping patients with disabilities regain mobility and independence. A brain-computer interface (BCI) offers a natural control method for these exoskeletons, allowing users to operate them through their electroencephalogram (EEG) signals. However, limited EEG decoding performance restricts the application of BCI systems to lower-limb exoskeletons. To address this challenge, we propose an attention-based motor imagery BCI system for lower-limb exoskeletons. The decoding module of the proposed system combines a convolutional neural network (CNN) with a lightweight attention module: the CNN extracts meaningful features from the EEG signals, while the lightweight attention module captures global dependencies among these features. The experiments comprise an offline stage and an online stage. The offline experiment evaluates the effectiveness of different decoding methods, while the online experiment, conducted on a customized lower-limb exoskeleton, evaluates the proposed BCI system as a whole. Eight subjects are recruited for the experiments. The experimental results demonstrate the strong classification performance of the decoding method and validate the feasibility of the proposed BCI system. Our approach establishes a promising BCI system for lower-limb exoskeletons and is expected to enable a more effective and user-friendly rehabilitation process.
© 2024 Author(s). Published under an exclusive license by AIP Publishing.
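To make the described decoding pipeline concrete, the following is a minimal PyTorch sketch of a CNN feature extractor followed by a lightweight attention module for two-class motor imagery EEG classification. The channel count, trial length, layer sizes, and the specific single-head attention mechanism are illustrative assumptions, not the authors' exact architecture.

```python
# Minimal sketch (assumptions: PyTorch, 22 EEG channels, 2-class motor imagery,
# 1000-sample trials). Architecture details are illustrative, not the paper's model.
import torch
import torch.nn as nn


class LightweightAttention(nn.Module):
    """Single-head self-attention over the temporal feature tokens."""

    def __init__(self, dim: int):
        super().__init__()
        self.query = nn.Linear(dim, dim)
        self.key = nn.Linear(dim, dim)
        self.value = nn.Linear(dim, dim)
        self.scale = dim ** -0.5

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, tokens, dim)
        q, k, v = self.query(x), self.key(x), self.value(x)
        attn = torch.softmax(q @ k.transpose(-2, -1) * self.scale, dim=-1)
        return x + attn @ v  # residual connection


class AttentionEEGDecoder(nn.Module):
    """CNN feature extractor + lightweight attention + linear classifier."""

    def __init__(self, n_channels: int = 22, n_classes: int = 2, dim: int = 40):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv2d(1, dim, kernel_size=(1, 25), padding=(0, 12)),  # temporal conv
            nn.Conv2d(dim, dim, kernel_size=(n_channels, 1)),         # spatial conv
            nn.BatchNorm2d(dim),
            nn.ELU(),
            nn.AvgPool2d(kernel_size=(1, 75), stride=(1, 15)),        # temporal pooling
            nn.Dropout(0.5),
        )
        self.attention = LightweightAttention(dim)
        self.classifier = nn.LazyLinear(n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_channels, n_samples) -> add a singleton "image" channel
        feats = self.cnn(x.unsqueeze(1))           # (batch, dim, 1, tokens)
        tokens = feats.squeeze(2).transpose(1, 2)  # (batch, tokens, dim)
        tokens = self.attention(tokens)            # global dependencies over tokens
        return self.classifier(tokens.flatten(1))  # class logits


if __name__ == "__main__":
    model = AttentionEEGDecoder()
    dummy = torch.randn(8, 22, 1000)  # batch of 8 simulated EEG trials
    print(model(dummy).shape)         # torch.Size([8, 2])
```

In this sketch the CNN plays the role of the feature extractor described in the abstract, while the attention layer re-weights the pooled temporal tokens so that the classifier can exploit global dependencies across the trial rather than purely local patterns.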