Company developing AI-driven prosthetics describes real‑time sensing and emotional impact

Nucleus Institute Deep Tech Panel (hosted with Atlassian) · December 12, 2025

AI-Generated Content: All content on this page was generated by AI to highlight key points from the meeting. For complete details and context, we recommend watching the full video.

Summary

Biologic Input Output Systems said it uses AI to translate nerve signals into lifelike movement and sensory feedback, sampling nerve data at high frequency; a company example described a prosthetic enabling a user to feel a spouse's touch.

Tim Miller, chief development officer of Biologic Input Output Systems (Bios), described the company’s work on devices that translate the language of the brain into control signals for the body in real time, using AI to process very high‑frequency neural data.

Miller said the systems sample nerves “at 30,000 times a second” and process tens of thousands of megabytes per minute, a data volume he said would be impossible to analyze without AI. He said those data feed models that generate lifelike movement and can even return sensation to a user.
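For readers curious how the quoted figures relate, a back-of-envelope sketch shows what channel count and sample width would produce that data rate. The 30,000-samples-per-second figure is from Miller's remarks; the channel count and bytes per sample below are hypothetical assumptions, not company specifications.

```python
# Back-of-envelope data-rate check for the figures quoted above.
SAMPLE_RATE_HZ = 30_000   # "30,000 times a second" (quoted by Miller)
CHANNELS = 4_096          # assumed electrode channel count (hypothetical)
BYTES_PER_SAMPLE = 2      # assumed 16-bit samples (hypothetical)

bytes_per_second = SAMPLE_RATE_HZ * CHANNELS * BYTES_PER_SAMPLE
mb_per_minute = bytes_per_second * 60 / 1_000_000

print(f"{mb_per_minute:,.0f} MB/min")  # ~14,746 MB/min, i.e. tens of thousands
```

Under these assumed parameters the stream lands in the "tens of thousands of megabytes per minute" range Miller described; different channel counts or bit depths would scale the figure proportionally.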

“Someone was able to hold his wife's hand and actually feel her touch through a robotic hand,” Miller said, describing an emotional user result that moved people in the room to tears. He framed the work around alleviating suffering for amputees and people with neuropathy and chronic pain, saying the goal is to make disabilities “no longer a disadvantage.”

Miller and other panelists placed the work in a pro‑human AI context: technology that augments human capability and improves quality of life rather than replacing human judgment. The panel did not present clinical trial data or peer‑reviewed studies; Miller’s remarks described device capabilities and company experience to date.