OpenAI's near-term roadmap: GPT-4 API costs will be reduced, and open-sourcing GPT-3 is under consideration

According to AI development platform HumanLoop, OpenAI CEO Sam Altman said in a closed-door seminar that OpenAI is currently severely constrained by GPU availability, which has forced it to postpone many short-term plans. Most of ChatGPT's reliability and speed problems are caused by the shortage of GPU resources.

Sam Altman also shared OpenAI's near-term roadmap: GPT-4 API costs will be reduced; ChatGPT will get longer context windows (up to 1 million tokens), and a future API version will remember conversation history; GPT-4's multimodal capabilities will not be publicly available until 2024, and the vision-enabled version of GPT-4 cannot be rolled out to everyone until more GPU resources are obtained.
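To illustrate what a history-remembering API would change: with the Chat Completions API as it exists today, the client must resend the full conversation with every request, which consumes context-window tokens each time. The sketch below is purely illustrative (the API key and messages are placeholders, and it reflects the openai Python library circa 2023), not a description of the planned feature.

```python
import openai  # pip install openai (v0.x style shown)

openai.api_key = "YOUR_API_KEY"  # placeholder

# The API is stateless: the whole conversation so far is resent on every call,
# so long chats grow linearly in tokens until the context window is exhausted.
history = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize the GPU constraints Altman described."},
]

resp = openai.ChatCompletion.create(model="gpt-4", messages=history)
reply = resp["choices"][0]["message"]
history.append(reply)  # the client, not the API, is responsible for memory

# An API that "remembers conversation history" would keep this state server-side,
# so follow-up requests would not need to resend `history`.
```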

OpenAI is also considering open-sourcing GPT-3; part of the reason it has not done so is that it believes few people and companies are capable of properly managing such a large language model. The claim in many recent articles that "the era of giant AI models is over" is not correct: OpenAI's internal data show that the scaling law relating size and performance still holds, and OpenAI's model size may double or triple every year (multiple sources put GPT-4 at roughly 1 trillion parameters), rather than increasing by many orders of magnitude.
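As a back-of-the-envelope illustration of what "doubling or tripling per year, rather than many orders of magnitude" implies, the sketch below projects parameter counts from the roughly 1-trillion figure cited above; the growth factors and the three-year horizon are assumptions for illustration, not OpenAI data.

```python
# Illustrative projection only; the 1-trillion starting point is the figure
# cited in the article, the growth factors and 3-year horizon are assumptions.
start_params = 1e12  # ~1 trillion parameters (reported GPT-4 scale)

for factor, label in [(2, "doubling"), (3, "tripling")]:
    size = start_params
    for year in range(1, 4):
        size *= factor
        print(f"{label}, year {year}: {size / 1e12:.0f} trillion parameters")

# Doubling yearly reaches ~8 trillion after 3 years; tripling reaches ~27 trillion,
# i.e. roughly one order of magnitude of growth, not "many orders of magnitude".
```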
