Farisco, M., & Evers, K. (Eds.). (2016). Neurotechnology and direct brain communication: New insights and responsibilities concerning speechless but communicative subjects. Routledge.
Anyways. I'm reading, and I have some reactions from their introduction:
I think the mention of how brain and mind relate (philosophically, they've been considered the same thing, substantially different, or somewhere in between) is a nice touch, as is the admission that science tends to treat the two as pretty much the same. Maybe I'll get to see more discussion of that in later chapters? It'd be cool.
One of my classmates told me that when she did research on "motor imagery" brain-computer interfaces, most participants did start by imagining a certain motion, but they didn't necessarily keep imagining that same motion as they kept using the system. The signal kept coming from the same place, but what they were doing to produce that signal changed. I think that says something about localization.
They ask if it's possible to interpret what someone is thinking directly from brain signals. Speaking as someone who works on brain-computer interfaces: maybe that will be possible someday, but what we have right now is nowhere near it. It's a few characters of text per minute, when things work well, which isn't always the case. (Compare that to the 60+ words per minute I type with my hands, or the 100+ words per minute of typical conversational speech.)
And yay paying attention to the assumptions used in neurotechnology. I like it when people recognize that technologies aren't conceptually neutral.
I wonder what age range they're taking as infants, and how they're determining that infants don't understand language. Because the coordinated movements involved in speaking are an issue, and receptive language tends to be way ahead of expressive language for quite a bit of child development. That is, people understand more than they can say. And I spoke at six months. It's true that speaking at six months is unusual, but if we're doing the "interrogate assumptions" thing, we should interrogate this assumption too. Especially when it's being used to question the use of the word "communication" when applied to babies. And especially when the first chapter goes on to discuss neurological responses that suggest hearing infants do recognize spoken language.
And now the first chapter:
Lagercrantz, H., & Padilla, N. (2016). The emergence of consciousness: From foetal to newborn life. In M. Farisco & K. Evers (Eds.), Neurotechnology and direct brain communication: New insights and responsibilities concerning speechless but communicative subjects (pp. 21-34). Routledge.
The authors ask what it's like to be a baby. I don't know what it's like to be a baby -- I don't remember anything from when I was that young, and even though I was talking a bit at six months, I don't think my parents asked me what it was like to be a baby. Maybe if I meet another six month old who talks, I'll ask them what it's like to be a baby.
Oh no. Oh no. Early identification of "risk" for autism and then early intervention. In a world that used a model more like the Foundation for Divergent Minds one, I'd be totally cool with early identification, and early actions to support people. But I know what model is really used in early intervention, and it increases our risk of PTSD. So, no. Here's another assumption I'd like to see challenged, thank you.
The discussion of whether dreaming during REM sleep is a conscious or unconscious state is interesting. However, I do want to question the assumption that insight and self-reflection are absent during dreams. Lucid dreaming (dreaming while aware that it's a dream) is a thing, and both insight and self-reflection are totally possible in that state.
EEG and NIRS are the same technologies we tend to use in our lab, because they're portable. It's interesting to see them come up in infant studies for similar reasons.
I do wonder how they're deciding certain neuronal connections are required for consciousness, as opposed to being required in order to communicate consciousness to outsiders. Those aren't the same thing. See also, "I heard it all" or "I understood it all" from people who were in comas, as well as from non-speaking autistic folks who get access to communication later.
"Resting" neural activity is definitely a thing. There are always, always neurons firing in alive people. That's why, when we do neuroimaging studies, there's often a comparison between activity at rest and activity during whatever task we're asking people to do. It's because things are still happening when we're resting. Autonomic breathing control, for example, is still a thing. So is sensory processing. I wonder what my rest state looks like compared to that of my neurotypical classmates.
"However, dreaming is tightly linked to the ability to imagine things visually, which is less likely to occur in the foetus and extremely preterm infant." (p. 12).Wait, really? People with minds eyes confuse me. My imagination doesn't get to plug into the monitor anytime other than while I'm asleep. I don't have the ability to imagine things visually when awake, and I can't make an extra layer of intentionally imagining more things visually while dreaming, either. But I do dream, and often in color. My other senses often work in dreams too -- things like taste and touch. I would never have come up with an association between dreaming and visual imagination on my own, even though "do you dream?" is one of the first questions I'm asked when I tell people I'm aphantasiac and explain what that means.
There are some interesting sensory findings here. Apparently typical newborns already have some capacity for facial recognition, though their visual acuity isn't great. (I wonder if they're better at recognizing faces than I am, and at what point developmental/congenital prosopagnosia can be detected, if typical newborns already have some facial recognition. See Meltzoff, A. N., & Moore, M. K. (1977). Imitation of facial and manual gestures by human neonates. Science, 198, 75-8; Farroni, T., Chiarelli, A. M., Lloyd-Fox, S., Massaccesi, S., Merla, A., Di Gangi, V., Mattarello, T., Faraguna, D., & Johnson, M. H. (2013). Infant cortex responds to other humans from shortly after birth. Sci Rep, 3; and stuff citing them for references if I ever try to look more closely at this, I guess?)
I think it's pretty cool that infants can start acquiring another language if someone reads and tells them stories in that other language. It makes sense, considering how many kids are bilingual from a young age due to immersion in multiple languages.
"Even the preterm infant ex utero may open its eyes and establish a minimal eye contact with its mother and show other signs of conciousness like cortical responses to pain." (p. 16)Wait, we're using eye contact as a sign of conciousness now? I'm too autistic for this. Nope.
Part 2 is/will be here!