• fruitycoder@sh.itjust.works · 1 point · 3 hours ago

    Damn. I thought this thread was being hyperbolic but they really wrote it like Intel will, for the first time in their history, be making GPUs lmao

  • Reygle@lemmy.world · 38 points · 2 days ago

    Am I living in an alternate timeline? They’ve been making GPUs for quite some time, and the B580 was actually pretty good, incredibly good for the price.

  • Itdidnttrickledown@lemmy.world · 8 points · 1 day ago

    The problem with Intel: they never just keep going. They announce some new GPU/graphics product, and when it falls short they don’t or won’t stick with it. They abandon it and use it as a write-off. They have done this multiple times and I have no reason to believe they will do anything different. The last time was just a few years ago: when sales and performance lagged, they just quit.

      • imetators@lemmy.dbzer0.com · 7 points · 1 day ago

        Intel Arc is Intel’s GPU brand, with cards at half the price of a typical Nvidia card at almost the same performance. They’ve been unpopular due to shaky drivers, but they have never been canceled. So stating that Intel will finally enter the GPU market is just plain misleading.

      • treesquid@lemmy.world · 7 points · 2 days ago

        English clearly isn’t their first language, but the intent is pretty obviously “As if they aren’t already making ARC GPUs?”

  • Wioum@lemmy.world · 209 points · 3 days ago

    I had to check the date on the article. They’ve been making GPUs for 3 years now, but I guess this announcement, although weird, is a sign that Arc is here to stay, which is good news.

  • Goodeye8@piefed.social · 36 points · 3 days ago

    Well, that article was a waste of space. Intel has already stepped into the GPU market with their Arc cards, so at the very least the article should contain a clarification on what the CEO meant.

    And I see people shitting on the Arc cards. The cards are not bad. Last time I checked, the B580 had performance comparable to the 4060 for half the cost. The hardware is good, it’s simply meant for budget builds. And of course the drivers have been an issue, but drivers can be improved, and last time I checked Intel is actually getting better with theirs. It’s not perfect, but we can’t expect perfect. Even the gold standard of drivers, Nvidia, has been slipping in the last year.

    All this is to say, I don’t understand the hate. Do we not want competition in the GPU space? Are we supposed to have Nvidia and AMD forever, until AMD gives up because it becomes too expensive to compete with Nvidia? I’d like it to be someone other than Intel, but as long as the price comes down I don’t care who brings it down.

    And to be clear, if Intel’s new strategy is keeping prices as they are, I’m all for “fuck Intel”.

      • gravitas_deficiency@sh.itjust.works · 8 points · 2 days ago

        This is a big part of it, imo. They kissed the ring.

        The other part of it is that, per the article, this is an “AI” pivot. This is not them making more consumer-oriented GPUs. Which is frustrating, because they absolutely could be a viable competitor in the low-to-mid tier if they wanted to. But “AI” is (for now) much more lucrative. We’ll see how long that lasts.

    • ZeDoTelhado@lemmy.world · 1 point · 2 days ago

      The CPU overhead issue is quite well known and really hurts the Arc cards’ position in the budget class.

  • RememberTheApollo_@lemmy.world · 7 points · edited · 23 hours ago

    Oh great, some wildly overpriced and underperforming GPUs.

    Edit: went looking at Intel’s desktop GPUs and found this gem:

    Powerful AI Engines

    Unlock new AI experiences with up to 233 TOPS of AI engine performance for content creation, real-time AI chat, editing, and upscaled gaming.

    And I checked the performance specs of Intel’s top cards (B580/A770) against a basic 3080 (no OC/Ti, whatever): the Intel cards ranked well below the older 3080, and weren’t even in the ballpark against upper-tier 40- and 50-series Nvidia cards. Plus they’re missing features like DLSS, etc.

    Good enough for non-FPS-dependent gaming? Sure. Can’t beat the price, I was wrong about that. Want to play high-FPS, demanding twitch gaming? No.

    • Zetta@mander.xyz · 7 points · edited · 2 days ago

      “oh great, competition in a market with no competition. Horrible.”

      Intel has already been making discrete GPUs for two generations. They’re very cheap, and while they aren’t the most performant, they’re fantastic for the price.

      I’d rather a non-US player like Moore Threads enter the market, but because us capitalist assholes have been handicapping Chinese competition for a long time, they probably aren’t going to be able to make cards that are up to our performance standards till the 2030s.

      • AdrianTheFrog@lemmy.world · 2 points · 2 days ago

        Intel GPU support?

        ZLUDA previously supported Intel GPUs, but not currently. It is possible to revive the Intel backend. The development team is focusing on high‑quality AMD GPU support and welcomes contributions.

        Anyways, no actual AI company is going to buy $100M of AI cards just to run all of their software through an unfinished, community-made translation layer, no matter how good it becomes.

        oneAPI is decent, but apparently fairly cumbersome to work with, and people prefer to write software in CUDA since it’s the industry standard (and the standard in academia).
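
        For a rough idea of what that looks like, here’s a minimal SYCL (oneAPI) vector-add sketch. It’s purely illustrative, not anything from the article: the file name and build line are assumptions, and it presumes an installed DPC++ toolchain.

        ```cpp
        // Minimal SYCL (oneAPI) vector add -- an illustrative sketch, not
        // production code. Assumed build line: icpx -fsycl vadd.cpp -o vadd
        #include <sycl/sycl.hpp>
        #include <iostream>
        #include <vector>

        int main() {
            constexpr size_t N = 1024;
            std::vector<float> a(N, 1.0f), b(N, 2.0f), c(N, 0.0f);

            sycl::queue q;  // default selector: uses a GPU if one is available
            {
                // Buffers manage host<->device transfers implicitly.
                sycl::buffer<float> ba(a.data(), sycl::range<1>(N));
                sycl::buffer<float> bb(b.data(), sycl::range<1>(N));
                sycl::buffer<float> bc(c.data(), sycl::range<1>(N));

                q.submit([&](sycl::handler& h) {
                    sycl::accessor A(ba, h, sycl::read_only);
                    sycl::accessor B(bb, h, sycl::read_only);
                    sycl::accessor C(bc, h, sycl::write_only, sycl::no_init);
                    // The kernel body is the same one-liner a CUDA __global__
                    // function would contain.
                    h.parallel_for(sycl::range<1>(N), [=](sycl::id<1> i) {
                        C[i] = A[i] + B[i];
                    });
                });
            }  // buffer destructors wait for the kernel and copy results into c

            std::cout << "c[0] = " << c[0] << '\n';  // expect 3
        }
        ```

        Nothing exotic, but the buffer/accessor boilerplate wrapped around a one-line kernel is the sort of thing people mean when they call SYCL more cumbersome than firing off the equivalent CUDA kernel.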

        • woelkchen@lemmy.world · 1 point · 2 days ago

          Anyways, no actual AI company is going to buy $100M of AI cards just to run all of their software through an unfinished, community-made translation layer, no matter how good it becomes.

          Good. So prices might actually be reasonable.