Rendering the movie Toy Story in 1995 required top-of-the-line computers for every frame, and it took a huge amount of time (800,000 machine-hours in total, according to Wikipedia).
Would it be possible to render it in real time on a single home computer with modern (2025) GPUs?
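For scale, here's a quick back-of-envelope sketch. The ~114,000 frame count is my assumption (roughly 80 minutes of footage at 24 fps); only the 800,000 machine-hours figure comes from the Wikipedia number above:

```python
# Back-of-envelope: what speedup would real-time rendering need?
TOTAL_MACHINE_HOURS = 800_000      # figure quoted from Wikipedia
FRAMES = 114_000                   # assumed: ~80 min of footage at 24 fps
FPS = 24                           # "real time" playback rate

seconds_per_frame_1995 = TOTAL_MACHINE_HOURS / FRAMES * 3600   # ~25,000 s/frame
realtime_budget = 1 / FPS                                      # ~0.042 s/frame
required_speedup = seconds_per_frame_1995 / realtime_budget

print(f"1995: ~{seconds_per_frame_1995 / 3600:.1f} machine-hours per frame")
print(f"Real-time budget: {realtime_budget * 1000:.1f} ms per frame")
print(f"Required overall speedup: ~{required_speedup:,.0f}x")  # ~600,000x
```

So the question is really whether a single machine in 2025 is on the order of half a million times faster at this workload than one 1995 render-farm machine.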
Things that could affect this, with some wild estimates of how much each one reduces the 800,000 machine-hours:
I think that brings it down to 3-4 hours of total render time, and tweaking the resolution could push it the rest of the way to real time.
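Here's a minimal sketch of how such factors could stack up; the two speedup numbers are placeholders I picked to land near that 3-4 hour figure, not real measurements:

```python
# Placeholder numbers only: swap in your own estimate for each factor above.
TOTAL_MACHINE_HOURS = 800_000
MOVIE_HOURS = 81 / 60        # ~81 min runtime, i.e. the "real time" target

speedups = {
    "one 2025 GPU vs one 1995 render-farm machine": 80_000,  # wild guess
    "better / GPU-friendly rendering algorithms":   3,       # wild guess
}

combined = 1
for factor in speedups.values():
    combined *= factor

render_hours = TOTAL_MACHINE_HOURS / combined
gap_to_realtime = render_hours / MOVIE_HOURS   # what resolution cuts must cover

print(f"Combined speedup: {combined:,}x")
print(f"Estimated render time: ~{render_hours:.1f} h")         # ~3.3 h
print(f"Remaining gap to real time: ~{gap_to_realtime:.1f}x")  # ~2.5x
```

The leftover ~2-3x gap is roughly what you'd recover by rendering at a proportionally lower pixel count, which lines up with the resolution tweak mentioned above.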
So it looks plausible!