Indeed, it is pretty wild to think how flexible the brain is though. Like you apply a completely new type of input and it learns to figure out how to make sense of the data and use it in a meaningful way. I haven’t really followed up on whether there’s been any progress with that device. I agree the initial version doesn’t sound very comfortable.
Yeah, I know of blind people who were able to use echolocation, listening to the sound of their footsteps echoing to navigate, even doing things like playing basketball that way. Though something like that doesn’t really work in a loud environment like a city street, so using “taste” to replace sight like this would be much more useful in those sorts of situations.