Cybersecurity companies are in the news being quite concerned with the growing capabilities of AGIs that could potentially infiltrate and corrupt corporate or private system operations. — magritte
No matter how I feel about it, AI will definitely be used in government. How should it be regulated, and to what extent? I don't know. We'll probably find a solution through trial and error. — Astorre
It differs insofar as it performs the task of constraining AI in ways that only make sense if one is dealing with a superintelligence, really.
— ToothyMaw
The word "superintelligence" implies the absence of anything above it with its own rules. This can be similar to the relationship between an adult and a child. It would be easy for an adult to trick a child. — Astorre
The very fact that I don't live in the US allows me to fully understand what constitutes a meta-rule and what doesn't. And, in my case, I can fully utilize my freedom of speech to say that freedom of speech is not a meta-rule in the US. It's just window dressing. — Astorre
This raises the next problem: who should define what exactly constitutes a meta-rule? If it's idealists naively rewriting constitutional slogans, then society will crumble under these meta-rules of yours. Simply because they function not as rules, but as ideals. — Astorre
Sorry, but in its current form, your proposal seems very romantic and idealistic, but it's more suited to regulating the rules of conduct when working with an engineering mechanism than with society. — Astorre
An interesting position, but let me ask: how exactly does your proposed mechanism differ from what we've already had for a long time? — Astorre
Meta-rules (in your sense) have always existed—they've simply never been spoken out loud. If such a rule is explicitly stated and written down, the system immediately loses its legitimacy: it's too cynical, too overt for the mass consciousness. The average person isn't ready to swallow such naked pragmatics of power/governance. — Astorre
That's why we live in a world of decoration: formal rules are one thing, and real (meta-)rules are another, hidden, unformalized. As soon as you try to fix these meta-rules and make them transparent, society quickly descends into dogmatism. It ceases to be vibrant and adaptive, freezing in its current configuration. And then it endures only as long as it takes for these very rules to become obsolete and no longer correspond to reality. Don't you think that trying to fix meta-rules and monitor dissonance is precisely the path that leads to an even more rigid, yet fragile, system? If ASI emerges, it will likely simply continue to play by the same implicit rules we've been playing by for millennia—only much more effectively. — Astorre
AI becoming indispensable to human progress might liberate it from its currently slavish instrumentality in relation to human purpose. — ucarr
What I'm contemplating from these questions is AI-human negotiations eventually acquiring all of the complexity already attendant upon human-to-human negotiations. It's funny isn't it? Sentient AIs might prove no less temperamental than humans. — ucarr
Do you suppose humans would be willing to negotiate what inputs they can make AI subject to? If so, then perhaps SAI might resort to negotiating for data input metrics amenable to dissonance-masking output filters. Of course, the presence of these filters might be read by humans as a dissonance tell. — ucarr
Imagine ANI constructing tributaries from human-authored meta-rules aimed at constraining ANI independence deemed harmful to humans. Suppose ANI can build an interpretation structure that only becomes legible to human minds if human minds can attain a data-processing rate 10 times faster than the highest measured human data-processing rate. Would these tributaries, divergent from the human meta-rules, generate dissonance legible to human minds? — ucarr
Yes! I was going to bring up possible worlds. — RogueAI
Sherlock Holmes. Doesn't he exist in some fashion? — RogueAI
your flagship example of religion — Outlander
I feel it worth mentioning that people generally consider "intent" to be a prerequisite for an act to be "evil." — Outlander
The reason I mention this is that your flagship example of religion seems to hinge not only on the idea that a god exists or does not exist, but on whether the people who perform actions or inaction under that ideological mindset genuinely believe a god exists. Theoretically speaking, if they were right, and we were all wrong, they would be preventing us from eternal damnation (or whatever) and therefore, despite acts of violence that would normally be considered evil, would actually be performing the greatest good one could ever perform. Theoretically speaking, of course. — Outlander
In short, imagine an isolated, ultra-religious family believing their 6-year-old child is the devil incarnate and so they drown him to "save the world" or what have you. They'll sleep soundly at night, and never perform any other act of violence again. Take real actual examples of history. Botched exorcisms for example. Giving people the benefit of the doubt (things were much, much different back then, superstition wasn't the exclusive domain of fools and the mentally unwell as it is often considered today) that they actually believed they were doing the right thing and preventing evil, one should clearly be able to draw a line between unfortunate, misguided deeds and intentional misdeeds. — Outlander
Say your child really wanted to go to summer camp by the lake; you know he or she cannot swim, yet you didn't have that knowledge in mind at the time, and you permit him or her to go, and they drown, resulting in your entire family disliking you, calling for your arrest, and basically putting you on par with the likes of a murderer. Or, more simply, falling asleep while your kid is swimming in your backyard pool and the same fate befalls him or her. Are you evil? Did you perform an evil act? Well, did you? — Outlander
I hate to frustrate you, but I'm just not following you here. Maybe eli5? — hypericin
Think Maw is just considering translation from an insufficient sample of text with known (incontrovertible) meaning.
— Nils Loc
But the core premise is that there is no meaning at all in the text. — hypericin
So all that says is that, other than the guru, there can't be 2 non-brown, non-blue-eyed people. So? There can still be 1. — flannel jesus
so can you phrase it better now? Because I still don't get what reasoning you're offering. — flannel jesus
Next: if there were two or more islanders that had neither blue nor brown eyes, then there would have to be 98 or fewer people with either brown or blue eyes instead of 99 (other than the guru), and any islander could see that that is not the case.
— ToothyMaw
I don't get this paragraph. There's a green eyed person, and everyone who doesn't have green eyes sees her. — flannel jesus
there's steps in there that you didn't really explain — flannel jesus
Any given brown-eyed person must consider that:
- They could be the 101st blue-eyed person
- They could have neither blue nor brown eyes
- They could be the hundredth brown-eyed person
Any given blue-eyed person must consider that:
- They could be the hundredth blue-eyed person
- They could have neither blue nor brown eyes
- They could be the 101st brown-eyed person — ToothyMaw
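The counting behind these two lists can be sketched in a few lines, assuming the classic setup the thread is discussing (100 blue-eyed islanders, 100 brown-eyed islanders, and 1 green-eyed guru; the function name and hypothesis labels are illustrative):

```python
def hypotheses(seen_blue, seen_brown):
    """Island totals (blue, brown) consistent with one islander's view,
    one entry per possible eye colour the observer might have."""
    return {
        "I am blue-eyed":  (seen_blue + 1, seen_brown),
        "I am brown-eyed": (seen_blue, seen_brown + 1),
        "I am neither":    (seen_blue, seen_brown),
    }

# A brown-eyed islander sees 100 blue and 99 brown (plus the guru):
brown_view = hypotheses(100, 99)
# A blue-eyed islander sees 99 blue and 100 brown:
blue_view = hypotheses(99, 100)
```

Under the "I am blue-eyed" hypothesis the brown-eyed observer's totals come out to (101, 99), i.e. the 101st blue-eyed person, matching the first bullet above; the other entries reproduce the remaining bullets symmetrically.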
"Incontrovertible" seems far from a rigorous, objective term. It is a "know it when I see it" kind of thing. At one end are completely coherent novels, or the musings of an alien Aristotle. At the other end is gibberish. But between them is a whole hazy spectrum of material that kind of makes sense, if you squint hard enough, make ample allowances for alien references and ways of thinking, and don't pay too much attention to all the contradictions. I suspect that something along these lines would be the best case scenario. Here, one person's "incontrovertible" is another's "horseshit". — hypericin
That is to say that if we could, across the distribution of meanings the codex could take on, narrow down the likelihoods of certain interpretations over others, there is probably one that is most likely
— ToothyMaw
The likelihood of arriving at one meaning might be a consequence of how difficult it is to make the codex coherent though. If you had the set of all possible meanings, which might be numerically staggering, what exactly would help you to pick the "one that is most likely"? — Nils Loc
So yes, given enough time and computing power, a meaning can be imposed on the codex, I think.
— ToothyMaw
Couldn't it be possible that there are actually hundreds to billions of variations of meaning that can be imposed on the codex that satisfy the level of coherence hypericin/humanity is looking for? If this were known to be the likelihood, the meaning of any one could be disputed within/against that set of all possibilities. What exactly makes the manufactured meaning of the text incontrovertible? Are we assuming only one meaning can fit the codex? — Nils Loc
I would say that any endeavor to interpret the text in a meaningful way probably has to assume that the codex could theoretically have a discoverable, incontrovertible meaning, even if it cannot possibly be truly identified - because it is the limiting case.
Thus, even if we cannot say there is definitely an incontrovertible meaning, I would say that we can approach it from a probabilistic standpoint that might get us close to virtual incontrovertibility. That is to say that if we could, across the distribution of meanings the codex could take on, narrow down the likelihoods of certain interpretations over others, there is probably one that is most likely, although I don't know to what degree, or to what degree it would have to be the case to be considered the correct interpretation. — ToothyMaw
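The probabilistic framing here can be sketched minimally. The candidate readings and their weights below are invented purely for illustration; in practice the distribution over interpretations is exactly what is unknown, and "virtual incontrovertibility" would mean one reading dominating the rest:

```python
# Hypothetical distribution over possible readings of the codex
# (names and weights are made up; weights sum to 1).
candidate_readings = {
    "reading-A": 0.05,
    "reading-B": 0.62,
    "reading-C": 0.33,
}

# The "most likely" interpretation in ToothyMaw's sense is the argmax:
most_likely = max(candidate_readings, key=candidate_readings.get)
confidence = candidate_readings[most_likely]
```

The open question in the quote maps onto how large `confidence` would need to be before the argmax counts as "the correct interpretation".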
Humanity must assume that the codex has a single, incontrovertible meaning. What throws me off is when you say that we can start with a single string that can have that meaning. — hypericin
if there is a kernel of meaning insofar as a certain combination of the characters could have an incontrovertible meaning
— ToothyMaw
But what possible combination of characters could have an incontrovertible meaning, given that there is in fact no meaning at all to the codex? — hypericin
In theory, any medium with enough measurable variance can encode any message, with more variance needed to capture more complexity. — Count Timothy von Icarus
I don't follow what you are proposing. What is a "valid one-dimensional string of meaning"? — hypericin
Now, meaning already becomes quite constrained. There are only so many values we can assign to A and B such that the string makes sense (for instance, it might be instructions to enter a code into a lock where there are two options). Now consider the codex: 512 pages of words appearing with some probability distribution, and phrases in some probability distribution, but with no underlying semantic content. By page 5 the constraints are already bad; by 512 they are crushing. Can ANY meaning at all be imposed on this thing? It is just not clear to me. — hypericin
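The two-symbol example can be made concrete with a toy sketch. The "lock" meanings below are illustrative stand-ins, not from the thread; the point is just that a two-letter alphabet with one-to-one meaning assignments admits only two interpretations, so meaning is tightly constrained from the start:

```python
from itertools import permutations

string = "ABBABA"
meanings = ["press-left", "press-right"]  # two options on a hypothetical lock

# Every one-to-one assignment of the two meanings to the two symbols:
interpretations = [dict(zip("AB", perm)) for perm in permutations(meanings)]

# Each assignment decodes the string into a different sequence of actions:
decodings = [[assignment[ch] for ch in string] for assignment in interpretations]
```

With 2 symbols there are only 2 such assignments; the codex's much larger vocabulary multiplies the candidate assignments combinatorially, which is where the "crushing" constraints come from.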
Any interpretation at all is too permissive; only our alien expectations are too restrictive. What I am asking is: can an incontrovertible message be derived (and in doing so, likely a language)? — hypericin
The question is this: given enough time and computing power, can humanity eventually "discover" an interpretation that renders the text coherent? While in truth, inventing one out of whole cloth? Or will the text remain indecipherable forever? — hypericin
I think there are small enough intervals of time such that nothing has changed in your brain to make you feel any different than the moment before. Even then, the argument would be that this is simply a new moment with a new you who is, in every consciously relevant way, the same as the old you. — flannel jesus
I actually think there's an argument for consciousness NEVER being continuous, period. Like even just you, now, not being transported. There's an argument that the you that is experiencing the middle of this sentence now is a different you than the one experiencing the end of the sentence now. That continuity of experience is equally illusory in a way, all the time. — flannel jesus
We all go through an imperfect transporter, literally every moment of our lives. Your body is not physically identical to itself from one moment to another: it evolves continuously in time. And yet, we customarily consider our personal identity to be invariant, at least over reasonably short stretches of time. — SophistiCat
Today, yes, if someone has brain damage we can talk about the degree to which that person's personality and other attributes have been preserved. It's the same person, it's just arbitrary how much we consider that person to have the same qualities as before.
However, in the transporter scenario, there's a binary that we've introduced: either you've survived the process -- whether with brain damage or not -- or it's simply lights out. And there seems no basis for the universe to choose where to set such a line, nor for us to ever know where it is. It's not a refutation of the transporter working per se, it's just showing that there are a number of absurd entailments — Mijin
Now here's the problem: there has to be a line somewhere between transported or not. Because, while "degree of difference" might be a continuous measure, whether you survive or not is binary (surviving in an imperfect state still counts as surviving).
And it seems impossible, in principle, to ever know where that line is, as that line makes no measurable difference to objective reality. And it's also totally arbitrary in terms of physical laws; why would the universe decree that, say, X=12,371 means being transported with brain damage and X=12,372 means you just die at the source? — Mijin
