Nvidia Unveils Advanced AI Chip, Promising Drastic Reduction in Operational Costs

A Strategic Leap in the Competitive AI Hardware Landscape

By Daily AI Watch
18. August 2023

Key Points:

  • Nvidia announces the GH200, a new chip designed to run AI models, aiming to maintain its lead in the AI hardware market.
  • The GH200 features cutting-edge memory and a powerful ARM central processor, enhancing AI model operations.
  • Nvidia’s new chip is tailored for inference, promising to significantly reduce the cost of running large language models.

Nvidia’s Latest Innovation in AI Chips

Nvidia, a dominant player in the AI chip market, has announced the GH200, a new chip designed to run artificial intelligence models. The move is part of Nvidia’s strategy to stay ahead of competitors such as AMD, Google, and Amazon in the AI hardware space. The GH200 pairs the same GPU found in Nvidia’s highest-end AI chip, the H100, with 141 gigabytes of advanced memory and a 72-core ARM central processor.

Enhancing AI Model Performance
Nvidia CEO Jensen Huang emphasized that the GH200 is designed for the scale-out of data centers worldwide. The chip is set to be available from Nvidia’s distributors in the second quarter of next year and will be ready for sampling by the end of this year. While the price remains undisclosed, the GH200 is expected to significantly boost processor performance for AI applications.

Focus on Inference and Large Language Models
The GH200 is specifically designed for inference, the phase of AI model operation in which a trained model continuously performs computation to make predictions or generate content. This focus is significant because inference happens far more often than training, which is only needed when a model is built or updated. The GH200’s increased memory capacity allows larger AI models to fit on a single system, improving efficiency and reducing costs, as the rough sketch below illustrates.
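To make the memory point concrete, here is a minimal back-of-envelope sketch in Python of how much memory a model’s weights alone occupy at 16-bit precision, and whether that fits within the GH200’s 141 gigabytes. The model sizes and the 2-bytes-per-parameter assumption are illustrative, not figures from Nvidia’s announcement.

```python
# Rough illustration of the memory argument: do a model's weights alone fit
# in the GH200's 141 GB? Model sizes and the 16-bit (2 bytes per parameter)
# assumption are illustrative, not Nvidia figures.

GH200_MEMORY_GB = 141  # memory capacity cited in the announcement

def weights_size_gb(params_billions: float, bytes_per_param: int = 2) -> float:
    """Approximate memory needed just to hold the model weights.

    Real deployments also need room for the KV cache, activations, and
    runtime overhead, so this understates the true requirement.
    """
    return params_billions * 1e9 * bytes_per_param / 1e9

for params_b in (7, 70, 175):  # hypothetical model sizes, in billions of parameters
    needed = weights_size_gb(params_b)
    verdict = "fits on one GH200" if needed <= GH200_MEMORY_GB else "needs multiple chips"
    print(f"{params_b}B parameters ~ {needed:.0f} GB of weights -> {verdict}")
```

By this estimate, a 70-billion-parameter model’s weights just fit in 141 GB, while larger models would still have to be split across several accelerators.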

Competitive Landscape and Future Prospects
Nvidia’s announcement comes shortly after AMD introduced its own AI-oriented chip, the MI300X, which offers an even higher memory capacity. However, with its focus on inference and its large memory, the GH200 positions Nvidia as a strong contender in the market. The new chip is expected to significantly lower the cost of running large language models, marking a notable advancement in AI chip technology.


Food for Thought:

  1. How will Nvidia’s GH200 chip impact the development and deployment of AI models in various industries?
  2. What are the implications of Nvidia’s focus on inference for the future of AI model operations?
  3. How might the introduction of the GH200 influence the competitive landscape of AI hardware technology?

Let us know what you think in the comments below!


Author and Source: Article by Kif Leswing for CNBC.

Disclaimer: Summary written by ChatGPT.

Tags: AI Chip, AI News, Hardware, Nvidia