• Riskable@programming.dev · 41 points · 2 months ago

    This is sad, actually, because this very technology is absolutely fantastic at identifying things in images. That’s how image generation works behind the scenes!

    [Image: esp32-cam identifying a cat, a bike, and a car in an image]

    ChatGPT screwed this up so badly because it’s programmed to generate images instead of using reference images and then identifying the relevant parts, which is something even a tiny little microcontroller board can do.
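
    To illustrate, here’s a minimal sketch (assuming torchvision is installed and a local file named cat.jpg exists; both are placeholders of mine, not anything these tools actually use) of how an off-the-shelf pretrained classifier identifies what’s in an image:

    ```python
    # Toy sketch: identify what's in an image with an off-the-shelf
    # pretrained classifier. Assumes torchvision is installed and a
    # local "cat.jpg" exists; both are placeholders.
    from PIL import Image
    import torch
    from torchvision.models import resnet50, ResNet50_Weights

    weights = ResNet50_Weights.DEFAULT
    model = resnet50(weights=weights)
    model.eval()

    preprocess = weights.transforms()  # resize/crop/normalize for this model
    batch = preprocess(Image.open("cat.jpg")).unsqueeze(0)

    with torch.no_grad():
        probs = model(batch).softmax(dim=1)

    top_prob, top_idx = probs[0].max(dim=0)
    print(weights.meta["categories"][top_idx.item()], f"{top_prob.item():.1%}")
    ```

    Point being: “what is in this picture” is a cheap, solved problem; it’s the generate-from-scratch step that goes off the rails.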

    If they just paid to license a data set of medical images… Oh wait! They already did that!

    Sigh

    • brucethemoose@lemmy.world · 7 points · 2 months ago (edited)

      Yeah, I mean, this can be done with text too.

      People should mostly be using a RAG (retrieval-augmented generation) setup for reference lookups, not pure LLM slop like this. It just hasn’t really been built at scale because Google Search served that role well enough, and AI Bros seem to think everything should live inside LLM weights instead of proper databases.
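
      For a rough idea of what I mean, here’s a toy sketch (made-up reference snippets and plain TF-IDF retrieval standing in for a real medical corpus and vector database): retrieve trusted reference text first, then hand it to the model instead of trusting its weights.

      ```python
      # Toy sketch of "retrieve, then generate" (RAG): look up reference
      # text first and hand it to the model, instead of hoping the answer
      # is baked into its weights. Plain TF-IDF retrieval stands in for a
      # proper vector database; the snippets are placeholders, not a
      # licensed medical corpus.
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.metrics.pairwise import cosine_similarity

      docs = [
          "The urethra carries urine from the bladder out of the body.",
          "The birth canal is formed by the cervix and the vagina.",
          "The pudendal nerve supplies the pelvic floor muscles.",
      ]
      query = "Label the parts of the female pelvis."

      vectorizer = TfidfVectorizer()
      doc_vecs = vectorizer.fit_transform(docs)
      query_vec = vectorizer.transform([query])

      # Rank the references by similarity to the question and keep the top two.
      scores = cosine_similarity(query_vec, doc_vecs)[0]
      top = sorted(range(len(docs)), key=lambda i: scores[i], reverse=True)[:2]
      context = "\n".join(docs[i] for i in top)

      # This prompt would go to whatever LLM you use; the model is asked
      # to ground its answer in the retrieved references, not vibes.
      prompt = f"Using only these references:\n{context}\n\nAnswer: {query}"
      print(prompt)
      ```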

      I mean… WTF. What if human minds were not allowed to use references?

      WolframAlpha was kinda trying to build this, but stalled.

    • Melvin_Ferd@lemmy.world · 1 point · 2 months ago

      How much of the public outcry against data collection has resulted in us getting an inferior product?

      • Riskable@programming.dev · 2 points · 2 months ago

        For images, it’s not even data collection, because all the images used for these AI image generation tools are already out on the internet for free for anyone to download right now. That’s how they’re obtained: a huge database of (highly categorized) image URLs (e.g. ImageNet) is crawled and downloaded.
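
        As a rough sketch of what that “crawl” amounts to (the file name and layout here are hypothetical; real lists like ImageNet publish label/URL pairs in their own formats):

        ```python
        # Toy sketch of the "crawl a public list of image URLs" step.
        # "image_urls.txt" (one URL per line) is a placeholder; real lists
        # like ImageNet publish label/URL pairs in their own formats.
        import os
        import requests

        os.makedirs("images", exist_ok=True)

        with open("image_urls.txt") as f:
            urls = [line.strip() for line in f if line.strip()]

        for i, url in enumerate(urls):
            try:
                resp = requests.get(url, timeout=10)
                resp.raise_for_status()
                with open(f"images/{i:06d}.jpg", "wb") as out:
                    out.write(resp.content)
            except requests.RequestException as exc:
                print(f"skipping {url}: {exc}")  # dead links are common in old lists
        ```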

        That’s not even remotely the same thing as “data collection”. Data collection is when a company vacuums up everything it can from your private shit, not that photo of an interesting building you uploaded to Flickr over a decade ago.

    • bampop@lemmy.world · 1 point · 2 months ago

      Maybe it knows but doesn’t care. The general idea of meatbag anatomy is that they are full of soft squishy tubes and they die easy. Who cares about the details?

  • DarkFuture@lemmy.world · 10 points · 2 months ago

    I’m on year 5 of a pudendal nerve injury.

    Have come pretty close to blowing my brains out a few times.

    Been in physical therapy for almost 2 years and finally seeing SOME improvement that makes me want to blow my brains out less. Can’t ride bikes anymore. Can’t really jog or run. Going to the bathroom can irritate it. Always on eggshells hoping I don’t relapse, which happens sometimes.

    If you look into these types of injuries, they don’t really do surgery for them cuz the region is such a web of nerves, muscles, and tendons that they can’t really pinpoint the injury, and even if they could, they’d risk causing permanent nerve damage during surgery. So most people just go to PT and cross their fingers that they can get to a place where they can manage the pain for the rest of their lives.

    Pelvic floor injuries are no fucking joke.

  • ZILtoid1991@lemmy.world · 10 points · 2 months ago

    Anal birth would not be impossible, but it would require a lot of training beforehand, not everyone can be expanded to that size, and large expansion comes with its own issues.

    Giving birth through the urethra, however, especially through the penis, would be impossible. The farthest any woman (or any trans man into urethral insertions?) has gone is a bit larger than a penis, often at the cost of continence.

  • ckmnstr@lemmy.world · 10 points · 2 months ago

    You didn’t say FACTUAL diagram! So clearly the godly, superintelligent LLM made no mistake - AGI confirmed

      • wjrii@lemmy.world · 5 points · 2 months ago

      If you haven’t spent as much time refining the prompt as it would have taken to take an art class and do the medical research yourself, have you really used the AI properly at all?