• just_another_person@lemmy.world · 20 points · 2 days ago

    Because they choose not to go full idiot, though. They could make their top-line cards compete if they slammed enough into the pipeline and required a dedicated PSU, but that’s not where their product line is meant to go. That’s why it’s smart.

    For reference: AMD has the most deployed GPUs on the planet right now. There’s a reason it’s in every gaming console except the Switch 1/2, and why OpenAI just partnered with them for chips. The goal shouldn’t just be making a product that churns out results at the cost of everything else; it should be being cost-effective and efficient. Nvidia fails at that on every level.

    • Eager Eagle@lemmy.world · 14 points · 2 days ago

      This OpenAI partnership really stands out, because the server world is dominated by Nvidia even more than the consumer card market is.

      • just_another_person@lemmy.world · 1 point · 2 days ago

        Actually…not true. Nvidia recently got bigger in the datacenter because their terrible inference cards are being bought up, but AMD overtook Intel in CPUs across all the major cloud platforms last year, and their Xilinx chips are slowly overtaking regular CPU sales for special-purpose processing. By the end of this year, I bet AMD will be the most deployed brand in datacenters globally. FPGAs are the only path forward in the architecture world at this point for speed and efficiency in single-purpose processing, and Nvidia doesn’t have a competing product.