Musk, Experts Urge Pause on Training of AI Systems That Can Outperform GPT-4

Reuters | 2023-03-29

March 28 (Reuters) - Elon Musk and a group of artificial intelligence experts and industry executives are calling for a six-month pause in the training of systems more powerful than GPT-4, they said in an open letter, citing potential risks to society and humanity.

The letter, issued by the non-profit Future of Life Institute and signed by more than 1,000 people including Musk, Apple co-founder Steve Wozniak and Stability AI CEO Emad Mostaque, called for a pause on advanced AI development until shared safety protocols for such designs were developed, implemented and audited by independent experts.

"Powerful AI systems should be developed only once we are confident that their effects will be positive and their risks will be manageable," the letter said.

The letter also detailed potential risks to society and civilization posed by human-competitive AI systems, including economic and political disruption, and called on developers to work with policymakers on governance and regulatory authorities.

The letter comes as EU police force Europol on Monday joined a chorus of ethical and legal concerns over advanced AI like ChatGPT, warning about the potential misuse of the system in phishing attempts, disinformation and cybercrime.

Since its release last year, Microsoft-backed OpenAI's ChatGPT has prompted rivals to launch similar products, and companies to integrate it or similar technologies into their apps and products.


Comments

  • Alfred1007
    2023-03-30
Hope I won't see Skynet in the next few decades...
  • JediGingerNinja
    2023-03-29
Thank goodness there are people at the top of the food chain thinking about people as a whole and not solely about profit and control. I watched Lex's interview with the OpenAI CEO yesterday. Not to mention the underlying technology being sold to a man who has not only spread it to major private players but has also announced that he has disbanded the ethics division of his AI sector! In my opinion... the intelligence agencies are not so incompetent as to be unaware of the risks this poses to the people they have taken oaths to protect! Which means they actually want this kind of behavior! #duckingoaths