Do you think the singularity will occur? If so, what time frame do you think it will happen in and how will it impact humanity? Alternatively, do you think or care at all about the potential for reaching singularity?
Fandango defines this as “a hypothetical future point in time when technological growth becomes uncontrollable and irreversible, resulting in unfathomable changes to human civilization.” Hasn’t that already happened? Have we not been irreversibly impacted by technology in communications, agriculture, the military, and the medical sciences? I guess we could “go back” if we destroyed everything in a horrible nuclear war and some humans managed to survive in remote areas, but I think the projections say this is unlikely (the part about remote people surviving).
Generally, the science fiction tropes give us the idea of runaway AI that develops a will to survive and surpass us, forming some sort of evil intent. It knows we will disable it once we discover its new powers, so it must kill us first. The plot of such fiction centers on a race against time, man vs. machine. I assume man wins, or the endings wouldn’t be satisfying. As you can probably guess, I haven’t read or watched a whole lot of this. I prefer man vs. man.
I don’t worry about machines developing feelings and trying to kill me. No. Though it’s definitely possible I could trip over a computer cord and smash my head on the corner of a table and die of a concussion. That could happen. I am a klutz. But as far as my phone and computer conspiring to electrocute me because they’ve received instructions from the Big AI… no, this is not something I worry about at all. I still figure that if anyone poses a mortal threat to me, it’ll be some jerk on the freeway, same as always.
©️2019 Paula Light and Light Motifs II. No unauthorized use permitted. Please check out Paula’s books for sale on Amazon.