11 Jun 2021 |
@inphovore:matrix.org | Quantum computing is closer to consciousness than any automated statistics will ever be. No matter how many googolflops of information | 18:06:22 |
@inphovore:matrix.org | Not however with qubits. Our brain quantum computers are much different | 18:07:26 |
@inphovore:matrix.org | More analogue and holographic than bits | 18:07:42 |
@inphovore:matrix.org | Intelligence is the mitigation of uncertainty. If it does not mitigate uncertainty, it is not Intelligence | 18:08:48 |
@arjix:matrix.org | In reply to @inphovore:matrix.org Firstly, you can think of many things as autonomous if they are self represented. You respect the property of others, a car or even a laptop belonging to someone, and you generally treat these with property-level respect due to laws and customs, regardless of their intelligence or “sentience”. We’ll talk to our toasters and organizers as though they’re sentient, even when we know they are far more limited than a dense human. This is a good point, but I don't think it's the same thing. In ancient times, slaves were also respected, but not for their sentience (moral reasons); rather, because they "belonged" to someone whose sentience was respected. Today we respect property, but it's moral only by proxy, because we don't assume the property has sentience of its own. This could be a very different case with artificial intelligence. | 18:11:22 |
@arjix:matrix.org | In reply to @inphovore:matrix.org Not however with qubits. Our brain quantum computers are much different Ah right, because you assume our brains are quantum computers, which is what makes our consciousness something special. I very much disagree with that; I think our brains are just plain analogue computers. | 18:14:43 |
@arjix:matrix.org | In reply to @inphovore:matrix.org More analogue and holographic than bits I'm not sure what you mean by "holographic", but yes, neurons output digital bits, while the encoding of information is analogue because the circuits are time-sensitive. | 18:17:54 |
@inphovore:matrix.org | In reply to @arjix:matrix.org I'm not sure what you mean by "holographic" but yes, neurons output digital bits but the encoding of information is analogue because the circuits are time sensitive. I believe this to be incorrect. Our neurons are not digital, they are potential | 18:20:47 |
@inphovore:matrix.org | Potential is analogue | 18:21:00 |
@inphovore:matrix.org | Like a sound wave | 18:21:09 |
@arjix:matrix.org | In reply to @inphovore:matrix.org Intelligence is the mitigation of uncertainty. If it does not mitigate uncertainty, it is not Intelligence I think this interpretation of intelligence is fine, yes. I personally interpret intelligence as the ability to abstract data. | 18:21:10 |
@inphovore:matrix.org | Information is that which removes uncertainty. If it does not remove uncertainty, it is not information | 18:21:55 |
@inphovore:matrix.org | Intelligence and information have exactly the relationship you describe | 18:22:15 |
@arjix:matrix.org | In reply to @inphovore:matrix.org I believe this to be incorrect. Our neurons are not digital, they are potential So here's how the logic of neurons works: each neuron has a threshold of potential, and each neuron has a different threshold; that threshold is the only variable that gets optimized (that's how the neuron learns). If the input potential passes the threshold, the neuron becomes "activated" and fires a single spike of signal, which acts as a digital bit. That single bit goes to other neurons, building up their own potential. I think the accumulated potential also drops with time, so some insensitive neurons only get activated by being repeatedly stimulated in a very short amount of time. | 18:26:06 |
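The mechanism described above (a threshold, potential that leaks with time, and a single digital spike on activation) matches a leaky integrate-and-fire model. A minimal sketch in Python; all class names, thresholds, and weights here are illustrative assumptions, not anything stated in the chat:

```python
# A minimal leaky integrate-and-fire sketch of the mechanism described
# above: potential accumulates from incoming spikes, leaks over time,
# and a single binary spike fires once the threshold is crossed.
# All names and numbers are illustrative.
class LIFNeuron:
    def __init__(self, threshold=1.0, leak=0.5):
        self.threshold = threshold  # the per-neuron variable said to be learned
        self.leak = leak            # fraction of potential lost each time step
        self.potential = 0.0

    def step(self, input_spike, weight=0.6):
        """Advance one time step; return 1 (spike) or 0 (silent)."""
        self.potential *= (1.0 - self.leak)     # accumulated potential drops with time
        self.potential += weight * input_spike  # analogue build-up from a digital input
        if self.potential >= self.threshold:
            self.potential = 0.0                # reset after firing
            return 1                            # the single digital spike
        return 0

# Rapid repeated stimulation crosses the threshold on the third step...
n = LIFNeuron()
rapid = [n.step(1) for _ in range(3)]             # [0, 0, 1]

# ...but the same number of spikes spread out never accumulates enough,
# so this "insensitive" neuron stays silent.
m = LIFNeuron()
spaced = [m.step(s) for s in [1, 0, 1, 0, 1, 0]]  # all zeros
```

This also illustrates the point about analogue encoding above: the inputs and outputs are binary, but the information is carried by spike timing.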
@inphovore:matrix.org | The quantum dynamic of consciousness is such that localized singularity acts as a sieve of potential. This behavior is infinitely more energy efficient than a model of linear cognition in which bits hierarchically combine for a cause/effect state machine output | 18:26:06 |
@inphovore:matrix.org | Read the prior comment | 18:26:30 |
@inphovore:matrix.org | Posted simultaneously:/ | 18:26:43 |
@inphovore:matrix.org | In the holographic singularity, the more mirror neurons, the higher the resolution | 18:27:38 |
@inphovore:matrix.org | So splitting the number of mirror clusters would produce a thought half as clear | 18:28:13 |
@inphovore:matrix.org | We’ve all had fuzzy/clear thoughts, and many in different forms at the same time | 18:28:42 |
@inphovore:matrix.org | These are proportional to the quantity (and quality) of the neural interactions | 18:29:46 |
@inphovore:matrix.org | Not binary or sequential, though temporality and sequence are thought technologies | 18:30:35 |
@arjix:matrix.org | In reply to @inphovore:matrix.org Information is that which removes uncertainty. If it does not remove uncertainty, it is not information I don't really like the use of the word uncertainty in the definition of intelligence, because being certain of something implies consciousness, and exactly as you stated, there can be intelligent actors without them being conscious. Maybe that's just my interpretation of the word uncertainty. | 18:31:43 |
@inphovore:matrix.org | Potential is the domain of probability. The reduction of potential is the reduction of uncertainty (entropy) without anthropomorphism | 18:33:10 |
@inphovore:matrix.org | So the super amplitude of potentials is how uncertain you may be that the outcome will be any one specific thing. | 18:33:54 |
@inphovore:matrix.org | Though you may be mostly confident that the outcome will be among potentials, in a well modeled system | 18:34:28 |
@inphovore:matrix.org | YET NEVER CERTAIN! | 18:34:38 |
@inphovore:matrix.org | Only less uncertain | 18:35:08 |
@arjix:matrix.org | In reply to @inphovore:matrix.org The quantum dynamic of consciousness is such that localized singularity acts as a sieve of potential. This behavior is infinitely more energy efficient than a model of linear cognition in which bits hierarchically combine for a cause/effect state machine output I agree that quantum dynamics are able to encode a lot more information, yet what we observe in science points to our brains working as "simple" statistical machines. That is actually well established at this point. | 18:35:36 |