
Getting an RTX 5090

My old 3070 still works, but local AI has a way of turning “still fine” into “please stop.” So I built a compact little monster.

· life, tech

So recently I have been playing around with local AI on my trusty old 3070. He is still working. Still loyal. Still boots up like nothing’s wrong.

But local AI is a different kind of workload. It does not feel like gaming where you spike the GPU and then chill. It feels like you are constantly asking the machine to hold its breath for a long time. After a while, you start noticing the little signs. Fans ramping more often. Temps that take longer to come down. That subtle “bro, again?” energy.

And I get it. The 3070 has served. I have pushed it through years of work, side projects, and whatever phase I was in at the time. But lately I’ve been in my “run everything locally” arc. Models, embeddings, image gen, experiments that start small and then suddenly become “why is my whole machine stuttering.”

So I bought a new PC after almost 5 years.

The haul

Here’s the pile before it becomes a machine.

From the photo:

  • Palit GeForce RTX 5090 GameRock (32GB)
    The heart. The whole reason this build exists. 32GB VRAM is not “nice to have” for local AI. It changes what you can do without constant compromises.

  • AORUS X870 motherboard
    Modern platform, solid stability, less drama. I just want things to work.

  • AMD Ryzen 9
    GPU does the heavy lifting for inference, but CPU still carries the system. This box will compile, multitask, and run services too.

  • DeepCool Mystique 360 AIO
    Ok no comment

  • Lian Li case
    Big boy 5090 need big case

  • KLEVV CRAS V RGB DDR5 (4 sticks)
    The only option they have

  • FSP Hydro PTM Pro 1200W (ATX 3.0)
    The adult part of the build. High-end GPUs don’t care about your optimism. Power delivery needs to be stable, and ATX 3.0 is the right move here.
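To make the 32GB VRAM point above concrete: a rough rule of thumb is that model weights alone need roughly params × bytes-per-parameter of VRAM, before you add KV cache and activations. A minimal sketch (the model sizes and quantization levels below are illustrative assumptions, not benchmarks of any specific model):

```python
def weights_vram_gb(params_billion: float, bits_per_param: int) -> float:
    """Rough VRAM estimate (GB) to hold just the model weights.

    Ignores KV cache, activations, and runtime overhead, so real
    usage is always higher. params_billion * 1e9 params, each
    taking bits_per_param / 8 bytes.
    """
    return params_billion * bits_per_param / 8


# Illustrative examples: (model size in billions, quantization bits)
for params, bits in [(7, 16), (13, 8), (32, 8), (70, 4)]:
    print(f"{params}B @ {bits}-bit ≈ {weights_vram_gb(params, bits):.0f} GB")
```

By this back-of-the-envelope math, a 32GB card can hold the weights of something like a 32B model at 8-bit, where an 8GB card like the 3070 is squeezed even by a 7B model at 16-bit. That is the “changes what you can do” part.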