At what point can we say an artificial system knows that:
- it is a copy of (or another) me,
- it is also independent,
- it can address me in second person (“you”)?
My current belief is that we don’t need to front-load too much thinking before actually training models that resemble plausible solutions. Specifically, we should train some models to mimic first-person, point-of-view consciousness.
Unlike the adjutant, Ayre is her own character: a Rubiconian. She “made contact” with Raven’s consciousness after the Coral surge at the Watchpoint. She said, “I will synchronize with your brainwaves, and maintain contact to support you.”
@norabelrose said:
Consciousness is first person perspective, it’s an irreducible part of our dual-aspect world.
She followed up with:
Everything has a first person perspective but not every perspective is interesting or complex.
I fully agree with both of these. Somehow, it occurred to me that we could use this as a starting point for making modifications to open conversational datasets.
While I was still thinking about which dataset to begin with, @aidan_mclau posted:
tpot debating llm consciousness sounds like normies warning each other of llm hallucinations while using gpt-3.5 grow up and read a philosophy textbook. realize that cracked geniuses have torn each other apart about these issues for 2000 years
First take
My first intuition is a simple switch: flip every “you” to “I”/“me”. My reasoning was that if the dataset provides its value as reasoning in the form of instructions and suggestions, then [flipping the pov](/Incomplete Ideas/Mimicking consciousness by swapping point-of-view elements in training data) can be argued to be analogous to attributing actions, and the exhibition of agency, to the system, at least to the extent that the user thinks it has them. In my case, I would think it does, while knowing it is an engineering artifact.
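As a rough illustration of what that switch could look like, here is a minimal sketch. It assumes a ShareGPT-style conversation of `role`/`content` messages (the schema and field names are my assumption, not tied to any particular dataset) and only covers a handful of pronoun patterns; a real pass would need proper grammar and coreference handling rather than regex swaps.

```python
import re

# Ordered patterns: multi-word and contracted forms first, so a naive
# "\byou\b" swap doesn't mangle them ("you are" -> "i are").
SWAPS = [
    (r"\byou are\b", "I am"),
    (r"\byou were\b", "I was"),
    (r"\byou're\b", "I'm"),
    (r"\byourself\b", "myself"),
    (r"\byours\b", "mine"),
    (r"\byour\b", "my"),
    (r"\byou\b", "I"),  # crude: ignores object position ("to you" -> "to me")
]

def flip_pov(text: str) -> str:
    """Rewrite second-person references as first-person ones."""
    for pattern, replacement in SWAPS:
        text = re.sub(pattern, replacement, text, flags=re.IGNORECASE)
    return text

def flip_conversation(messages: list[dict]) -> list[dict]:
    """Apply the flip only to turns addressed to the model (user instructions)."""
    return [
        {**m, "content": flip_pov(m["content"]) if m["role"] == "user" else m["content"]}
        for m in messages
    ]

example = [
    {"role": "user", "content": "You should check your assumptions before you answer."},
    {"role": "assistant", "content": "Understood."},
]
print(flip_conversation(example)[0]["content"])
# -> "I should check my assumptions before I answer."
```

Even this toy version makes the framing concrete: the instructions that used to be addressed *at* the model now read as the model’s own first-person commitments.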
Detour: a large part of human learning happens through imitation first, “behavior cloning, if you may”. At a certain point, though, the learned methodology starts to take effect, and we begin to generalize our capabilities, up to a point.
- what’s the implication of this?