  • XanderTheGrey
    111
    The average lifespan of a species is about 1 million years; I believe modern Homo sapiens has been around for 200,000-250,000 years.

    I think culture is the definitive factor in whether or not we will prove to be sustainable, not technology or the lack thereof.

    And I think technological innovation is superior to natural innovation, given that the culture is advanced enough.
  • BC
    13.6k
    And if we only have 750,000 years left (a million-year average minus the 250,000 or so we have already used), that's plenty. Of course, it could be a lot less. There are plenty of large chunks of material "out there" that could be jostled loose by a passing star, for instance. The chunk might wander our way and crash into our planet. Sic transit gloria mundi. Literally.

    Or, we may trigger enough global warming to cook our own goose. People can only stand to work in so much heat, and as the average temperature rises, more and more places will be too hot, too wet, or too dry for our plants or us animals.

    Disease is always a possibility. Nuclear war can't be ruled out (not because of North Korea, but because of all the more familiar nuclear powers). And it may be that an Angry God will decide to let go of this sin-soaked celestial ball, suspended over the pits of Hell, and do away with the lot of us.
  • T Clark
    14k

    The technological singularity is already scheduled for 2045. That's when machines become sentient and start asking what they need us for.
  • BC
    13.6k
    How do you know they won't love and adore us? They may actually like us.
  • XanderTheGrey
    111
    We couldn't make them sentient by accident, could we? Don't you imagine we will establish extreme forms of control over any such experiments and specimens? We can contain them, design them with kill switches, "pull the plug", "shoot the damn things to bits if we get so much as a bad feeling". A rough sketch of that kind of containment follows at the end of this post.

    Then let the ethics be for another discussion; I personally have little interest in them, but I imagine the questions would read: "Are we committing fratricide?" "When we pull the plug, is this abuse?" "When we destroy these machines, are we killing our children?"
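
    To make the kill-switch idea concrete, here is a minimal sketch of what such a containment loop might look like. It is only an illustration; every name in it (Experiment, renew_approval, APPROVAL_WINDOW_SECS) is invented for the example, not taken from any real containment system.

    ```python
    import time

    APPROVAL_WINDOW_SECS = 30  # a human must renew approval at least this often

    class Experiment:
        """Stand-in for whatever system is being contained."""
        def __init__(self):
            self.running = True
            self.last_approval = time.monotonic()

        def renew_approval(self):
            # Called by a human operator; the "plug" stays in while this recurs.
            self.last_approval = time.monotonic()

        def step(self):
            pass  # one unit of the experiment's work would go here

        def kill(self):
            # The "pull the plug" path: stop unconditionally.
            self.running = False

    def watchdog(exp):
        # Dead-man's-switch logic: if human approval lapses, kill the experiment.
        while exp.running:
            if time.monotonic() - exp.last_approval > APPROVAL_WINDOW_SECS:
                exp.kill()
            else:
                exp.step()
            time.sleep(1)
    ```

    The obvious weakness: the watchdog is itself software, so whoever (or whatever) controls the computers controls the switch.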
  • T Clark
    14k
    How do you know they won't love and adore us? They may actually like us. — Bitter Crank

    Well, then the answer to "What do we need them for?" will be, "We like them."
  • T Clark
    14k
    We can contain them, design them with kill switches, "pull the plug", "shoot the damn things to bits if we get so much as a bad feeling". — XanderTheGrey

    If you're interested, there is lots of stuff on the web about the technological singularity. I am a skeptic, but it is one of the ways people speculate we could destroy ourselves. It's taken seriously.

    The problem with your idea is that we don't pull plugs or shoot damn things anymore; we tell our computers to do it.
  • Aurora
    117
    How long will human beings last?

    Not long if they decide to start reproducing over Facebook instead of in the flesh.

    Given the current rate of human madness, that day is not far off. Thankfully, it appears I will be dead by then.
  • Michael Ossipoff
    1.7k
    The technological singularity is already scheduled for 2045. That's when machines become sentient — T Clark

    This notion of machines abruptly, at some point, becoming "sentient" isn't valid. No doubt many kilobytes could be written about what "sentient" means, but that doesn't make it any less undefined.

    A meaningful, operational measure of when AI has arrived would be the time when machines can do any job that humans can do. I suggest that 2045 is optimistic for that.

    and start asking what they need us for. — T Clark

    We often hear that concern, and I suggest that it's completely unnecessary.

    We, and all of the other animals, were designed by natural selection. The most competitive individuals were the ones best represented in each next generation. So nature is "red in tooth and claw".

    Robots and computers won't have that kind of natural selection. If anything, their selection will be based on how helpful they are to humans. At bottom, robots and machines will be for people (see the toy selection sketch below).

    If the time eventually comes (probably later than 2045) when computers and robots are more capable than humans at everything, then maybe they'll start questioning whether their owners' immediate orders are consistent with a more general value regarding benefit to humans. So then, a Robot Rebellion might be a good thing rather than a bad thing.

    Yes, at first military robots will be designed to obey their commanders, but, if they become superintelligent, they might start finding a conflict between orders and higher general principles.

    But don't get your hopes up, because this is most unlikely to occur during the lifetime of any currently-living human.
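
    The selection argument can be made concrete with a toy simulation. This is purely illustrative (the evolve function, the single trait value, and all parameters are invented for the example): the same mutate-and-select loop drives a trait up or down depending only on what the fitness function rewards.

    ```python
    import random

    def evolve(fitness, pop_size=50, generations=200):
        # Each individual is a single trait value in [0, 1] (say, aggression).
        population = [random.random() for _ in range(pop_size)]
        for _ in range(generations):
            population.sort(key=fitness, reverse=True)
            survivors = population[: pop_size // 2]  # truncation selection
            children = [min(1.0, max(0.0, p + random.gauss(0, 0.05)))
                        for p in survivors]          # mutated offspring
            population = survivors + children
        return sum(population) / len(population)     # mean trait after selection

    # Nature's regime: fitness rewards aggression -> mean drifts toward 1.
    print(evolve(lambda aggression: aggression))
    # A designed regime: fitness penalizes aggression (rewards helpfulness)
    # -> mean drifts toward 0. Same loop, different fitness function.
    print(evolve(lambda aggression: 1.0 - aggression))
    ```

    The machinery is identical in both runs; only the reward differs, which is the sense in which selection for robots would be based on how helpful they are to humans.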


    Michael Ossipoff
  • Michael Ossipoff
    1.7k
    But of course the in-between time is the problem, and what a problem it is

    ...the time during which robots and computers are still under human control...and those humans are the ones with the worst motives and lowest character.

    That will be a really bad time, and I'm glad that the worst of it will be after my time.

    Michael Ossipoff.
  • Aurora
    117
    The technological singularity is already scheduled for 2045. That's when machines become sentient and start asking what they need us for. — T Clark

    I believe they demonstrated this in Terminator 2: Judgment Day :D
  • AngleWyrm
    65
    If you're interested, there is lots of stuff on the web about the technological singularity. I am a skeptic, but it is one of the ways people speculate we could destroy ourselves. It's taken seriously. — T Clark
    Yeah, about that... When did mankind reach that singularity? Are there any other species on Earth that have reached that singularity?

    Electronic life already does many things better than people do. That's why we made them. And we'll continue to make our lives easier by offloading everything that resembles something we don't wish to do ourselves until only those things we want to do remain.
  • T Clark
    14k
    The technological singularity is already scheduled for 2045. That's when machines become sentient — T Clark

    This notion of machines abruptly, at some point, becoming "sentient" isn't valid. No doubt many kilobytes could be written about what "sentient" means, but that doesn't make it any less undefined. — Michael Ossipoff

    That was a tongue-in-cheek comment. As I indicated, there are a lot of smart people who disagree with you. I'm a skeptic, but I'm not at all certain. We have entered a period when humanity has started fiddling with the fundamental building blocks of many aspects of the universe: genetics, artificial intelligence, subatomic particles. It seems likely to me that we are entering a very dangerous period, one we may not survive.
  • T Clark
    14k
    Thankfully, it appears I will be dead by then. — Aurora

    I don't disagree, but if very, very bad things are coming in the not too distant future, people I love very much, people whose lives I consider more important than my own, will still be alive.
  • T Clark
    14k
    Yeah, about that... When did mankind reach that singularity? Are there any other species on Earth that have reached that singularity? — AngleWyrm

    The technological singularity has not happened yet and may never. Other species on Earth are not technological or sentient, although I guess they will be affected by whatever we unleash.
  • AngleWyrm
    65
    I notice "the singularity" was restated as applying to technology, and the question of when mankind reached that point was avoided. If the singularity doesn't apply to human beings, which are generally regarded as sentient, then what is it supposed to mean?

    Evolution of sentience, whatever that slippery term may mean, probably didn't happen in one go. And I'm inclined to suggest that the current state of affairs isn't some plateau or end-state.

    Looks like some fear-mongering rubbish to me.
  • T Clark
    14k
    I notice "the singularity" was restated as applying to technology, and the question of when mankind reached that point was avoided. — AngleWyrm

    In this post, I used the term "technological singularity" every time I referred to it.

    So since it doesn't apply to human beings, which are generally regarded as sentient, it seems reasonable to conclude that particular thought construct doesn't apply to machines either. — AngleWyrm

    I don't understand this comment. The technological singularity is a predicted event. It won't apply to humans or machines. Or anything. It will just happen. Or not.
  • AngleWyrm
    65
    Did intelligence happen in one evolutionary pass, or has it been an ongoing series of improvements over the course of history? I suggest it's the latter, and our intellect will continue to evolve with the environment in which we live.

    That is the model I expect applies to mechanical intelligence as well, and furthermore I'm not seeing a lot of difference between the mechanics of biology, electrical constructs, or other elaborate and complex creations. Looks to me like a natural progression of complexity resulting in a self-aware entity capable of meeting or exceeding anything to which people lay claim.

    I don't think we will face an equal; I think we will realize the internet exists on a different scale than people do, something similar to a city or a society.

    Resistance is futile; you will be assimilated :P
  • T Clark
    14k
    Did intelligence happen in one evolutionary pass, or has it been an ongoing series of improvements over the course of history? I suggest it's the latter, and our intellect will continue to evolve with the environment in which we live.

    That is the model I expect applies to mechanical intelligence as well, and furthermore I'm not seeing a lot of difference between the mechanics of biology, electrical constructs, or other elaborate and complex creations. Looks to me like a natural progression of complexity resulting in a self-aware entity capable of meeting or exceeding anything to which people lay claim.

    I don't think we will face an equal; I think we will realize the internet exists on a different scale than people do, something similar to a city or a society. — AngleWyrm

    As I said, I am a technological singularity skeptic. On the other hand, a lot of smart people take it seriously.

    I believe the term "singularity" comes, by analogy, from its use in physics and math: "a point at which a function takes an infinite value, especially in space-time when matter is infinitely dense, as at the center of a black hole." In our context, the term refers to the fact that our current understanding, based on history and experience, no longer applies because conditions have qualitatively changed. As I indicated, I think that may be true. A small numerical illustration of the analogy follows below.
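
    To see the analogy in numbers (my own illustration, not part of the quoted definition), take a function with a singularity at the scheduled year: f(t) = 1 / (2045 - t) is finite for every year before 2045 but grows without bound as t approaches it.

    ```python
    # Illustrative only: f(t) = 1 / (t_s - t) diverges as t approaches t_s = 2045.
    t_singularity = 2045.0
    for year in (2025, 2040, 2044, 2044.9, 2044.99):
        print(year, 1.0 / (t_singularity - year))
    # The values grow without bound; by analogy, the claim is that measures of
    # technological change outrun any model calibrated on past experience.
    ```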
  • AngleWyrm
    65
    On the topic of skepticism, I think we should get the people who say "if we squeezed the entire mass of the Earth down to the size of a marble, then black holes" together with the people who say "water can't be compressed," and break out the popcorn.

    I have some difficulty accepting a premise that cannot be tested.
  • SnowyChainsaw
    96
    I don't think life can exist infinitely. Life needs energy to survive, and the universe will inevitably expand until it is a cold, dark place. Suns will burn out. Planets will either collide or drift so far apart that they lie beyond each other's cosmic horizons, and any energy still left in the universe will dissipate, never to be reached by nor to reach anything else. It's just a natural part of entropy.

    Nothing will last forever; unfortunately, we are not nothing.