The movie Toy Story needed top-of-the-line computers in 1995 to render its frames, and it took a long time (800,000 machine-hours in total, according to Wikipedia).

Would it be possible to render it in real time on a single home computer with modern (2025) GPUs?

  • Deestan@lemmy.world · 4 days ago

    Things that affect it, with some wild estimates of how much each one cuts the 800,000 hours:

    • Processors are 10-100 times faster. Divide by 100ish.
    • A common laptop CPU has 16 cores. Divide by 16.
    • GPUs and CPUs have more and faster floating-point math units. Divide by 10.
    • RAM is faster and processor caches are larger. Divide by 10.
    • Modern processors have more and stronger SIMD instructions. Divide by 10.
    • Rendering algorithms may be replaced with more efficient ones. Divide by 2.

    Multiplying those out gives a combined speedup of about 3.2 million, which brings 800,000 hours down to roughly 15 minutes of total render time; even if processors are only 10x faster, it's about 2.5 hours, which can be brought to real time by tweaking the resolution (quick sanity check below).
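
    Here is a minimal Python sketch of that arithmetic. The divisors are the wild guesses from the list above; the ~81-minute runtime and 24 fps frame rate are my own assumptions, not from the post:

    ```python
    # Sanity check of the back-of-envelope estimate above.
    TOTAL_HOURS_1995 = 800_000  # machine-hours, per Wikipedia

    # Divisors from the list; processor speedup taken at both ends of 10-100x.
    COMMON = [16, 10, 10, 10, 2]  # cores, math units, RAM/cache, SIMD, algorithms
    for label, cpu in [("conservative (10x CPUs)", 10), ("optimistic (100x CPUs)", 100)]:
        speedup = cpu
        for d in COMMON:
            speedup *= d
        minutes = TOTAL_HOURS_1995 / speedup * 60
        print(f"{label}: {speedup:,}x speedup -> {minutes:.0f} min total render")

    # Per-frame view, assuming an ~81-minute film at 24 fps
    # (runtime and frame rate are my assumptions, not from the post).
    frames = 81 * 60 * 24                # ~116,640 frames
    budget_ms = 81 * 60 * 1000 / frames  # real-time budget: ~41.7 ms per frame
    conservative_ms = 2.5 * 3_600_000 / frames  # the 2.5 h spread over all frames
    print(f"real-time budget: {budget_ms:.1f} ms/frame")
    print(f"conservative estimate: {conservative_ms:.1f} ms/frame "
          f"({conservative_ms / budget_ms:.1f}x over budget)")
    ```

    Even the cautious numbers land within about 2x of the 24 fps budget, which is the gap that resolution tweaks would have to close.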

    So it looks plausible!