Artificial intelligence, with its remarkable adaptability, has gradually become integrated into daily life. The emergence of the self-attention mechanism has propelled the Transformer architecture into diverse fields, including a role as an efficient and accurate diagnostic and predictive tool in medicine. To enhance accuracy, we propose the Double-Attention (DA) method, which improves how closely the neural network mimics human visual attention. By incorporating matrices generated from shifted images into the self-attention mechanism, the network gains the ability to preemptively acquire information from surrounding regions. Experimental results demonstrate the superior performance of our approach across various benchmark datasets, validating its effectiveness. Furthermore, the method was applied to patient kidney datasets collected from hospitals for diabetes diagnosis, where it achieved high accuracy with significantly reduced computational demands. This advancement showcases the potential of our method in the field of biomimetics, aligning well with the goal of developing innovative bioinspired diagnostic tools.
Keywords: deep learning; human attention; medical image; self-attention; shifted window.
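A minimal sketch of the shifted-image attention idea described above, assuming a PyTorch-style implementation; the module name DoubleAttention, the cyclic shift via torch.roll, and the simple averaging of the two attention outputs are illustrative assumptions rather than the paper's exact formulation.

```python
import torch
import torch.nn as nn


class DoubleAttention(nn.Module):
    """Hypothetical sketch: self-attention over the original token grid is
    combined with attention against a spatially shifted copy of the same grid,
    so each token also draws information from its surrounding regions."""

    def __init__(self, dim, num_heads=4, shift=1):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.shift = shift  # spatial shift (in tokens) applied to the copy

    def forward(self, x, grid_hw):
        # x: (batch, num_tokens, dim); grid_hw: (H, W) with H * W == num_tokens
        B, N, C = x.shape
        H, W = grid_hw

        # Standard self-attention over the original tokens.
        out_plain, _ = self.attn(x, x, x)

        # Build the shifted copy: reshape to a 2-D grid, roll it, flatten back.
        grid = x.view(B, H, W, C)
        shifted = torch.roll(grid, shifts=(self.shift, self.shift), dims=(1, 2))
        x_shift = shifted.reshape(B, N, C)

        # Attention with queries from the original tokens but keys/values from
        # the shifted copy, injecting context from neighbouring regions.
        out_shift, _ = self.attn(x, x_shift, x_shift)

        # Simple averaging of the two attention outputs (an assumption here).
        return 0.5 * (out_plain + out_shift)


if __name__ == "__main__":
    tokens = torch.randn(2, 7 * 7, 64)  # e.g. a 7x7 patch grid with 64-dim embeddings
    block = DoubleAttention(dim=64, num_heads=4, shift=1)
    print(block(tokens, grid_hw=(7, 7)).shape)  # torch.Size([2, 49, 64])
```

How the two attention branches are fused, and how the shift size is chosen, would follow the paper's actual definition of the DA method; the averaging used here is only a placeholder for illustration.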