Microsoft Bing AI Ends Chat When Prompted About Feelings

Bloomberg · 2023-02-23

Microsoft Corp. appeared to have implemented new, more severe restrictions on user interactions with its “reimagined” Bing internet search engine, with the system going mum after prompts mentioning “...

