Nvidia CEO Jensen Huang Urges Tech Leaders to Avoid Spreading AI Panic Sentiment


Nvidia CEO Jensen Huang said tech industry leaders need to be careful when addressing the dispute between Anthropic and the Pentagon, so as to avoid stoking public panic about artificial intelligence.

“Warning the public about AI is a very good starting point,” Huang said during a panel discussion at Nvidia’s tech conference. “Warnings are good, but scare tactics are not, because this technology is too important to us.”

Huang believes the biggest national security risk related to AI is that the public could become angry, fearful, or suspicious of the technology, causing the U.S. to fall behind its competitors in adopting it. Anthropic, an important Nvidia customer and the developer of the chatbot Claude, opposes certain military uses of its AI tools and is locked in a standoff with the Trump administration.

Last month, Anthropic’s relationship with the Pentagon broke down after CEO Dario Amodei insisted on contract restrictions prohibiting the use of its products for domestic surveillance of Americans and for the development of fully autonomous weapons. The Trump administration then declared Anthropic a supply chain risk and began working to exclude it from government contracts. Anthropic subsequently filed a lawsuit seeking to overturn the supply chain risk designation.

Even so, Huang remains optimistic about Anthropic’s financial prospects: he believes the company’s revenue could exceed $1 trillion by 2030, and he considers Amodei’s own forecasts somewhat conservative.

A spokesperson for Anthropic did not immediately respond to requests for comment.
