#ai #llm
Created at 240423
# [Anonymous feedback](https://www.admonymous.co/louis030195)
# [[Epistemic status]]
#shower-thought
Last modified date: 240423
Commit: 0
# Related
- [[Computing/Consciousness]]
- [[Biology/The evolutionary need for consciousness]]
- [[Philosophy/Humans/Compound attention]]
- [[Computing/Intelligence/Machine Learning/Scalable attention]]
# Attention and consciousness in the mind vs in LLMs
>Quite simply, a signal body state or its surrogate may have been activated but not been made the focus of attention. Without attention, neither will be part of consciousness, although either can be part of a covert action on the mechanisms that govern, without willful control, our appetitive (approach) or aversive (withdrawal) attitudes toward the world. While the hidden machinery underneath has been activated, our consciousness will never know it. Moreover, triggering of activity from neurotransmitter nuclei, which I described as one part of the emotional response, can bias cognitive processes in a covert manner and thus influence the reasoning and decision-making mode.
~ [[Antonio Damasio - Descartes' Error Emotion - Reason and the Human Brain]]
"Attention is all you need" popular paper author, and Cohere.ai CEO says that their intuition for exploring attention in [[Artificial intelligence|AI]] came from looking at the human mind, that attention was such an important concept in human software, conviction supported by [[Geoffrey Hinton]] who often creates algorithms based on [[Philosophy/Rationality/Intelligence|organic intelligence]].