Nvidia Will Spend $26B to Build Open-Weight AI Models

(wired.com)

27 points | by bigwheels 18 hours ago ago

9 comments

  • sriramgonella 8 hours ago ago

    I think this will soon be a game changer, as they already own GPU servers

  • lemonish97 17 hours ago ago

    Some of the Nemotron models are really good. Hope this encourages more open-weight/open-source models from the West

  • gigatexal 17 hours ago ago

    Why would they do this? Ahh to keep the AI bubble afloat. Got it.

    • Havoc 15 hours ago ago

      I think it’s small change for them and they realise tinkering keeps the momentum going.

      There are also some people who have an aversion to Chinese models, so an NVIDIA-backed one is good there.

    • Skyy93 17 hours ago ago

      Do you really think it is still only a bubble? The progress Anthropic has made with Claude Code over the last few weeks is tremendous.

      • owebmaster 16 hours ago ago

        What progress?

        • Skyy93 10 hours ago ago

          A few months ago, I was also skeptical about all of this. However, what we are currently seeing with Claude Code, or Codex, is astonishing: they're building stuff that works and actually improves upon human-coded apps, optimizations, and research. I work in research, and my colleagues are building apps themselves for data acquisition, which usually takes months. They build them in two days. Several companies say their developers aren't coding anymore; they're asking Claude to do it. It feels like an evolution of programming, like the invention of the automatic sewing machine in 19th-century England.

          There are many problems and unresolved issues right now. However, we are getting closer. In my opinion, we do not know what we have unleashed.

        • saulpw 16 hours ago ago

          I mean, are you using it? Things have really moved forward in the past few months.

  • ivanvoid 16 hours ago ago

    I will just leave here an article arguing that open-weight is not open-training.

    When I use a model I want to be able to see and modify it; I don't want another 12 GB black box.

    https://www.workshoplabs.ai/blog/open-weights-open-training