First off, I applaud this effort. Amazing science in progress.
However, look at that diagram...they want to create an awfully complicated design. Tiny optical mirrors are placed along equally microscopic optical fibers, which are then wound around individual nerves. A nerve signal headed for an amputated limb is detected by these microsensors, which transmit it via the optical fiber to a CPU of some sort, which then moves the limb. Theoretically, feedback could go the other way: touching an object could fire off haptic sensors, which would send an infrared laser signal through the fiber-optic cable to the mirror and trigger a nerve pulse, which would then be carried to the brain.
Okay, here's the thing. The article, and the researchers in it, make brain-machine interfaces seem like a really crappy technology with no promise. But practical, functioning prototypes of their own technology are "a decade off," they admit. They claim: "Even a bleeding-edge, brain-based prosthetic would only offer a few degrees of movement, and because electrical signals are relatively slow, you couldn’t move as quickly as someone with a real arm."
The problem with this claim is that anyone with a basic understanding of bioelectric signals knows that electrical signals in copper wires travel vastly faster than signals in nerves: a signal in copper moves at a substantial fraction of the speed of light (around 2x10^8 m/s), while even the fastest nerve fibers top out around 100-120 m/s. In fact, because nerve transmission is SO slow, vertebrates evolved myelin sheaths, which speed up the transmission rate of nerve signals. Myelin segments are spaced along a nerve fiber; the signal travels quickly under each insulated segment, then regenerates at the gaps between segments (the nodes of Ranvier) before shooting through the next one. It's all explained here.
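For a rough sense of scale, here's the arithmetic (the speeds are ballpark textbook figures, not measurements):

```python
# Order-of-magnitude comparison of signal propagation speeds.
# These are ballpark figures for illustration, not measurements.

copper_speed = 2.0e8        # m/s -- signal in a copper wire, roughly 2/3 the speed of light
myelinated_nerve = 100.0    # m/s -- a fast myelinated motor fiber
unmyelinated_nerve = 1.0    # m/s -- a slow unmyelinated fiber

print(f"copper vs. myelinated nerve:   {copper_speed / myelinated_nerve:,.0f}x faster")
print(f"copper vs. unmyelinated nerve: {copper_speed / unmyelinated_nerve:,.0f}x faster")

# Time for a signal to travel 1 meter (roughly shoulder to fingertip):
for name, v in [("copper", copper_speed), ("myelinated nerve", myelinated_nerve)]:
    print(f"1 m over {name}: {1.0 / v * 1000:.6f} ms")
```

Even against the fastest myelinated fibers, copper wins by a factor of about two million, so raw signal velocity is never going to be the bottleneck in a prosthetic.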
The point is that their core argument against brain-implant prosthetics is that signal slowness is what makes simple tasks difficult. That is simply not true. They further claim their fiber-optic method will mitigate this problem...that's probably not true...but we won't know for another decade, right?
The reason, dear readers, that brain-controlled prostheses lack mobility is two-fold. First, the electromechanical design of prosthetics is still limited by our inability to make artificial muscle. We use servos to simulate elbows and knees. In a way, we do it backwards from nature. Nature puts the muscles between the joints and pulls across the joints to move them. We put the "muscle" in the joint and actuate it right there on site.
The second reason for the difficulties in brain-controlled prostheses is that we simply have not developed the software to decode the brain. Put one of these babies on, and you can get a pretty diverse and interesting real-time electrical readout from the brain. But first, no one wants to wear that, and second, the amount of data is simply overwhelming. When I think about typing the letter "t", my brain produces a very specific electrical signal. However, it's lost in the noise of me thinking about moving my eyeballs, thinking about maintaining my posture in this chair (or should I say maintaining my slouch in this chair?), and thinking whatever else I am thinking. When you have billions of signal generators maintaining trillions of interconnections...you simply get a TON of noise. So the difficulty in moving a prosthetic with your thoughts alone has nothing to do with signal velocity and everything to do with signal integrity. If I had a prosthetic arm and wanted to type the letter "t", I would have to think really, really hard about it so that the software in the brain-implant-computer-whatever could see, through all the noise, that I wanted to type that letter. Thinking hard is a lot slower than regular thinking.
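You can see the signal-integrity problem in a toy sketch. This is not real BCI code, and all the numbers are made up for the demonstration; it just shows why one noisy reading is useless while repetition (i.e., "thinking really hard," over and over) lets software pull a tiny signal out of much bigger noise:

```python
# Toy illustration of signal integrity: a small known "signal" buried in
# noise that is several times larger, recovered by averaging repeated trials.
# All numbers are invented for the demonstration.
import random

random.seed(42)

def trial(signal, noise_amp):
    """One noisy observation of the same underlying signal."""
    return [s + random.gauss(0, noise_amp) for s in signal]

signal = [0, 0, 1, 2, 1, 0, 0]   # the "I want to type 't'" blip
noise_amp = 3.0                  # noise much larger than the signal itself

# A single trial is hopeless -- the blip is swamped by the noise.
one = trial(signal, noise_amp)
print("single trial:", [round(x, 2) for x in one])

# Average many trials: the signal adds coherently, the noise cancels.
n_trials = 500
avg = [0.0] * len(signal)
for _ in range(n_trials):
    t = trial(signal, noise_amp)
    avg = [a + x / n_trials for a, x in zip(avg, t)]

print("averaged:    ", [round(x, 2) for x in avg])  # the blip re-emerges near index 3
```

Real decoders are vastly more sophisticated than trial averaging, but the principle stands: the fight is against noise, not speed.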
So I humbly submit: the hardware you use to interface with the amputee is mostly irrelevant. As long as you can intercept and monitor the brain's commands...either at the brain or at the stump...you can move the hardware. What matters is signal processing, and that lives in the software. This is a hard thing, I think, for mechanical engineers like me, and for electrical engineers and neurophotonics researchers, to accept: we're not the important part of this puzzle.
But this goes back to TAE's Law of Bionics, which I have submitted on this blog many times, and it bears repeating now: All You Need Is Drivers. The only thing stopping me from having a functioning USB port on my arm is that we lack the drivers for the two hardware systems to communicate. All you need is drivers. All you need is drivers. All you need is drivers.
Thursday, 3 November 2011