The fear that people who like to talk about the singularity tend to propose is that there will be one 'rogue' misaligned ASI that progressively takes over everything - i.e. all the AI in the world working against all the people.
My point is that it's more likely there will be many ASI or AGI systems, not aligned with each other, most of them on the side of the humans.