New CodeProject.AI Object Detection (YOLO11 .NET) Module

I've been successfully running this YOLO11.NET module for a couple of days, but with the various -v5 models (combined, dark and animal). It works fine, but it's slower on average than the YOLO5.NET module on my i5-8500 machine: I was getting roughly 150ms inference times (long-term average) with the older module, and I'm now seeing about 190ms with the newer one. Is this expected to improve once the fully native/optimized YOLO11 models are available?
 
Are you using the YOLO11.NET module in Docker or Windows? Docker will be slower.
 
Just a conventional (bare-metal) Windows 11 installation, no VM or Docker involved. I thought it was slightly odd to see the slower responses too, but wondered whether it might be due to running the non-optimized v5 models under the newer underlying module. Not a huge difference, but slightly disappointing nevertheless.
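If anyone wants to compare the two modules a bit more systematically than eyeballing the long-term average in the dashboard, here is a minimal sketch of a latency benchmark against the server's REST detection endpoint. It assumes the default CodeProject.AI Server port (32168) and the standard /v1/vision/detection route; adjust the URL, image path and run count for your own setup. The measured time includes HTTP overhead, so it will read a little higher than the module's own reported inference time.

Code:
import time
import statistics
import requests

# Assumed defaults: CodeProject.AI Server on localhost:32168 with the standard
# object-detection route. Swap in your own host/route/model if they differ.
URL = "http://localhost:32168/v1/vision/detection"
IMAGE_PATH = "sample.jpg"   # a representative snapshot from your camera
RUNS = 50

times_ms = []
for _ in range(RUNS):
    with open(IMAGE_PATH, "rb") as f:
        start = time.perf_counter()
        resp = requests.post(URL, files={"image": f}, timeout=30)
        elapsed_ms = (time.perf_counter() - start) * 1000.0
    resp.raise_for_status()
    times_ms.append(elapsed_ms)

print(f"runs:   {RUNS}")
print(f"mean:   {statistics.mean(times_ms):.1f} ms")
print(f"median: {statistics.median(times_ms):.1f} ms")

Running the same script with each module enabled (and the same test image) gives a like-for-like comparison of round-trip latency, which should make the 150ms vs 190ms difference easier to pin down.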