N-TV: Advanced Artificial Intelligence Could Threaten the Existence of All Humanity
The rapid development of artificial intelligence carries a number of risks, says Professor Nick Bostrom of Oxford University. In an interview with N-TV, Bostrom said that if AI surpasses the human mind in its capabilities, it could threaten the existence of mankind. In addition, advanced AI could be used for highly destructive purposes, such as waging war and oppressing people.

The artificial intelligence systems currently available to humanity are only weak harbingers of the technology to come, Oxford University philosophy professor Nick Bostrom said in an interview with N-TV. According to Bostrom, AI will soon become so advanced that it will be a topic of geopolitical debate at the highest level.

At the same time, Bostrom noted that AI technology can be very dangerous. As an example, he cited systems capable of developing new types of poisonous gases. According to Bostrom, proper AI training remains a major challenge. He added that neural networks like ChatGPT could pose a danger as well, because users can ask them dangerous questions (such as "how to make a bomb").

Bostrom also pointed to the prospect of creating an AI whose thinking abilities match or even exceed those of humans. In this respect he stressed the importance of instilling basic human values in AI, so that it helps humans rather than becoming a threat to them.

According to Bostrom, if the advanced AI of the future gets out of control, the existence of humanity will be threatened. The danger persists even if full control is maintained, because such advanced AI can be used for highly destructive purposes, such as waging war and oppressing other humans. Bostrom noted that beyond the technical challenges of AI there is also a political problem: how can we ensure that everyone uses AI solely for positive, peaceful purposes?