
We investigate adaptive, distributed autonomous systems through the lens of anthropomorphic prosthetics, androids, and intelligent robotic platforms. Our research integrates neuroscience, artificial intelligence, neuromorphic computing, and robotics to create responsive, lifelike systems that operate autonomously while adapting to human intent and dynamic environments.
Leveraging neuromorphic architectures and spiking neural networks, we develop control frameworks that enable natural, intuitive interaction between artificial limbs, androids, and biological systems. These brain-inspired models support real-time adaptation, low-power operation, and efficient communication across distributed components.
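As a toy illustration of the spiking dynamics such control frameworks build on, the sketch below simulates a single leaky integrate-and-fire neuron. Every constant (time step, membrane time constant, threshold, drive current) is an illustrative placeholder, not a parameter of our systems.

```python
import numpy as np

# Toy leaky integrate-and-fire (LIF) neuron; all values below are
# illustrative placeholders, not parameters of any deployed controller.
dt = 1e-3         # simulation step (s)
tau = 20e-3       # membrane time constant (s)
v_rest = -65.0    # resting potential (mV)
v_thresh = -50.0  # spike threshold (mV)
v_reset = -70.0   # post-spike reset (mV)
r_m = 10.0        # membrane resistance (MOhm), so input current is in nA

def simulate_lif(input_current_na):
    """Euler-integrate tau * dv/dt = -(v - v_rest) + r_m * I and record spikes."""
    v, spike_times, trace = v_rest, [], []
    for step, i_in in enumerate(input_current_na):
        v += dt * (-(v - v_rest) + r_m * i_in) / tau
        if v >= v_thresh:              # threshold crossing: spike, then reset
            spike_times.append(step * dt)
            v = v_reset
        trace.append(v)
    return np.array(trace), spike_times

# A constant 2 nA drive for 200 ms yields a regular spike train.
trace, spikes = simulate_lif(np.full(200, 2.0))
print(f"{len(spikes)} spikes" + (f", first at {spikes[0] * 1e3:.1f} ms" if spikes else ""))
```

Event-driven models of this kind are what make neuromorphic hardware attractive for low-power distributed control: computation happens only when spikes occur, rather than on every clock tick.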
Our work on non‑invasive neural interfaces allows prosthetic devices to adjust continuously to user intent, improving precision, comfort, and fluidity of motion. In parallel, our research on advanced sensory processing equips androids with human‑like perceptual capabilities, enabling them to interpret complex environmental stimuli, collaborate with humans, and function autonomously within distributed multi‑agent settings.
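To make the adaptation loop concrete, here is a minimal sketch of an online intent decoder, assuming a linear map from surface-EMG-like features to an intended velocity, updated with a least-mean-squares (LMS) rule. The feature dimension, learning rate, and synthetic data are hypothetical stand-ins; a real system would derive its training signal from calibration movements rather than a known ground-truth mapping.

```python
import numpy as np

# Hypothetical online intent decoder: maps a window of surface-EMG-style
# features to an intended joint velocity, adapting continuously via LMS.
class LMSIntentDecoder:
    def __init__(self, n_features, lr=0.05):
        self.w = np.zeros(n_features)  # linear decoder weights
        self.lr = lr                   # LMS step size

    def predict(self, features):
        return float(self.w @ features)

    def update(self, features, target_velocity):
        """One LMS step: nudge the weights along the prediction-error gradient."""
        error = target_velocity - self.predict(features)
        self.w += self.lr * error * features
        return error

rng = np.random.default_rng(0)
true_w = rng.normal(size=8)                 # unknown user-intent mapping (synthetic)
decoder = LMSIntentDecoder(n_features=8)

for _ in range(2000):                       # streaming adaptation loop
    x = rng.normal(size=8)                  # stand-in for one EMG feature window
    y = true_w @ x + 0.05 * rng.normal()    # noisy intended velocity
    decoder.update(x, y)

print("weight error:", np.linalg.norm(decoder.w - true_w))
```

The point of the sketch is the control flow, not the model class: the decoder never stops updating, so it can track slow changes in the signal such as electrode shift or muscle fatigue.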