I'd like to start by saying that although I wrote most of this post myself, it was a little messy and all over the place, so I used ChatGPT to tidy it up and make it a bit more professional.
I’ve been seeing a lot of posts lately where people are running or asking about Blue Iris + AI on older PCs and GPUs — and I get it, we all like to reuse hardware and for many years I did the same with a GTX 970 and Intel i7 Gen6.
But if your system runs 24/7 (which most of ours do), it’s worth stepping back and looking at the total cost, not just the upfront cost.
Old hardware = hidden running cost
Older CPUs and GPUs tend to be significantly less power efficient than newer ones. That matters a lot when your machine never turns off.
- Blue Iris is already CPU-heavy by design, especially with multiple cameras and AI processing
- Power consumption directly impacts cost because these systems run continuously
- Some GPU options (especially older GTX 970/980/1080 NVIDIA cards) can actually increase long-term power usage and heat output
The bit people overlook: energy cost over time
Let’s say your older setup pulls an extra 50–100 watts compared to a newer efficient system.
Running 24/7:
- 50W = ~438 kWh/year
- 100W = ~876 kWh/year
- At recent UK unit rates, that works out to roughly £100–£300+ per year (and rising…)
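If you want to sanity-check those numbers against your own setup, here's a quick back-of-the-envelope sketch in Python. The 28p/kWh unit rate is just an assumption on my part; swap in your own tariff and measured wattage.

```python
# Back-of-the-envelope running-cost estimate for an always-on system.
# The unit rate below is an assumption - replace it with your own tariff.

HOURS_PER_YEAR = 24 * 365           # system never turns off
UNIT_RATE_GBP_PER_KWH = 0.28        # assumed UK unit rate

def annual_cost(extra_watts: float) -> tuple[float, float]:
    """Return (kWh per year, cost in GBP) for a constant extra load."""
    kwh = extra_watts * HOURS_PER_YEAR / 1000
    return kwh, kwh * UNIT_RATE_GBP_PER_KWH

for watts in (50, 100):
    kwh, cost = annual_cost(watts)
    print(f"{watts} W extra -> {kwh:.0f} kWh/year, ~£{cost:.0f}/year")
```

For 50 W that comes out at ~438 kWh and roughly £120/year; for 100 W it's ~876 kWh and roughly £245/year, which is where the £100–£300+ range above comes from.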
What newer hardware gives you (besides lower power)
Upgrading doesn’t just save power — you usually get:
- Faster AI inference times
- Lower CPU usage (less pegged at 90–100%)
- Better hardware decoding
- Smoother UI and remote access
- Less heat, less noise, more stability
You don’t have to buy brand new
This is the key point a lot of people miss:
You can often find much newer, far more efficient hardware for cheap if you look around:
- eBay
- Facebook Marketplace
- Refurb office PCs (Dell OptiPlex, HP Elitedesk, Intel NUC, etc.)
Compared to an older setup like a GTX 970 + 6th-gen i7, one of these refurbs is often:
- Faster
- Quieter
- Half the power draw
The reality: upgrades can pay for themselves
If you:
- Spend ~£200–£400 on newer used hardware
- Save ~£150/year in electricity
…you break even in 1–2 years.
After that, it’s just ongoing savings.
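The payback maths is trivial to check. The figures below are just the example numbers from above, so substitute your own:

```python
# Payback period using the example figures above (adjust to your own case).
upgrade_cost_gbp = 300        # somewhere in the £200-£400 range
annual_saving_gbp = 150       # estimated electricity saving per year

payback_years = upgrade_cost_gbp / annual_saving_gbp
print(f"Break-even after ~{payback_years:.1f} years")   # ~2.0 years
```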
Final thought
Running older gear isn’t wrong — especially if it’s lightly loaded.
But if you’re:
- Running multiple cameras
- Using AI continuously
- Seeing high CPU usage
- Or running a system that pulls a lot of power
…then in a lot of cases, the "cheap" option ends up being the expensive one over time.