Negative Dangers of the Technological Singularity
Since the genre's beginnings, science fiction writers have enjoyed penning elaborate stories in which human civilization is utterly destroyed, or permanently enslaved, by the superior machine-based life forms that humans themselves created.
We cannot see the future, so it is impossible to know in advance what crossing the threshold of the technological singularity will bring. We may be unable to predict, let alone anticipate, how uncontrollable a massively intelligent machine could be.
To thwart the end of mankind, we humans might have to integrate ourselves with whatever omnipotent artificial life force we eventually develop, in order to avoid Armageddon, Judgment Day, or even the Final Apocalypse. Even then, it may be impossible to avoid total destruction by advanced technology unleashed on the world.
The biggest dilemma we face is that there is no way to prevent every individual, corporation, nation, and government from building super-intelligent machines in pursuit of permanent hegemony over the planet.
It appears that the rise of ultra-intelligent artificial intelligence of our own design cannot be stopped or prevented. If so, we must be the first to create and develop this advanced technology, or we risk having our worst adversaries create it to destroy our way of life and society, or do something even more horrible.
We must fervently pray, day and night without ceasing, that the powers of evil have not already achieved the technological singularity on Earth.