BI AI on older GPU cards?

EagleEye7

Getting the hang of it
Jul 29, 2024
UK
I am in the process of configuring a new BI server (on some older hardware). I have an Nvidia Quadro 4000 GPU in the system, which I am considering using for AI tasks within BI (or CPAI).

However, I am aware this is an older card, and wondered about compatibility with the AI models commonly used with BI. I think the compute capability is 2.0, and I believe the card has 2 GB of dedicated VRAM.

Have people had success running the sort of AI used in BI on these sorts of cards?

Or do they create compatibility issues?
 
I believe my 750 Ti is older than that card. I say give it a try and see before buying a new one. With large model size and yolo11, I'm getting 130-450ms detection times.
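If you want a rough number outside BI, this is roughly how I'd time detection standalone - just a sketch using the ultralytics package, with the model file and test image as placeholders:

```python
# Rough standalone timing of YOLO detection, outside BI/CPAI.
# Assumes the ultralytics package; "yolo11l.pt" and "test.jpg" are placeholders.
import time
from ultralytics import YOLO

model = YOLO("yolo11l.pt")        # large model; weights download on first use
model("test.jpg", verbose=False)  # warm-up run so the first timing isn't skewed

times = []
for _ in range(20):
    start = time.perf_counter()
    model("test.jpg", verbose=False)
    times.append((time.perf_counter() - start) * 1000)

print(f"min {min(times):.0f} ms / avg {sum(times)/len(times):.0f} ms / max {max(times):.0f} ms")
```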
 
I think I will try it. I am more worried that the models will fail to run due to the age of the card / low compute capability (2.0), rather than just slow detection times.

It does look like used GPUs with a higher compute capability are available quite cheaply...
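Once I have a driver on the card I'll probably start with a quick check of what CUDA actually reports for it - a minimal sketch, assuming a CUDA-enabled PyTorch build is installed on the box:

```python
# Quick probe of what the CUDA stack sees on this machine.
# Assumes a CUDA-enabled PyTorch build; nothing here is BI/CPAI-specific.
import torch

if not torch.cuda.is_available():
    # Either no CUDA build of torch, or the driver is too old for this release.
    print("CUDA not usable from this PyTorch build")
else:
    props = torch.cuda.get_device_properties(0)
    major, minor = torch.cuda.get_device_capability(0)
    print(f"{props.name}: compute capability {major}.{minor}, "
          f"{props.total_memory / 1024**3:.1f} GB VRAM")
    if (major, minor) < (3, 5):
        # I believe Fermi (2.x) support ended around CUDA 8, so most current
        # runtimes won't ship kernels for a card this old.
        print("Below what current CUDA builds typically support")
```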
 
Yes. I am going to try and set up BI to use internal IVS where possible - haven't got that far yet, still installing BI on the server.

I would like to have the capability to run AI on the server though, for cases where the camera doesn't support what I need.
 
I run a 4th gen with no GPU and mainly use camera AI, but I have BI AI running on a few instances and the CPU version works fine.

I have one instance where the snapshot for IVS at night gets headlight shine and no car, so I use CPAI to get the whole vehicle in the alert image snapshot and I run LPR.

Unless you are going overboard with cameras, most find the CPU version to be more stable than the GPU version. Wrong CUDA or Paddle version and the entire thing shuts down.
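A quick sanity check before pointing CPAI at the GPU is to ask ONNX Runtime which execution providers it can actually load - just a sketch assuming the onnxruntime-gpu package is installed; if CUDA doesn't show up, I'd stick with the CPU module rather than fight it:

```python
# Probe whether ONNX Runtime on this box exposes a CUDA path at all.
# Assumes the onnxruntime-gpu package; this is only a probe, not part of
# BI or CPAI itself, and session creation can still fail later if the
# CUDA/cuDNN libraries don't match the build.
import onnxruntime as ort

providers = ort.get_available_providers()
print("Available providers:", providers)

if "CUDAExecutionProvider" in providers:
    print("CUDA path looks present; worth trying the GPU module")
else:
    print("No CUDA provider; stick with the CPU module")
```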
 
It should work; just make sure you have the latest GPU driver installed.
 
OK, will do that, although I can only install the driver after passing the GPU through, obviously (haven't done that yet). Also, the "latest" driver from NVIDIA is still from about 2018, and for a slightly older OS version, but I am hoping it'll work OK.
 
I just upgraded an old Xeon E3 to use a 3060. It was a PITA to get the drivers working correctly on Windows; I had to install Proxmox to get it to play well with the old hardware... but I get speeds maybe 6x what I was doing before. I do mostly ALPR... spilled coffee on my kb so some keys aren't working so hot.
 
Unless you are going overboard with cameras, most find the CPU version to be more stable than the GPU version. Wrong CUDA or Paddle version and the entire thing shuts down.
Fair enough. I will experiment and see what CPU usage looks like first.

I guess unless the CPU is too slow or can't cope, hassling with the GPU may not be worth it...