Running CPAI on OrangePi/RK3588: RKNN modules are "not available"

I thought I used Joshua's image but I tried so many things that my eyes may have mixed up the files when burning them (could also be lack of sleep LOL).

I'm going to start from scratch. Did you see my last post about the Docker image? Which Docker image did you use?

Thank you both Pete and Mike. I'm probably making user errors but appreciate your help. :)

This is my docker compose file:

services:
  CodeProjectAI:
    image: codeproject/ai-server:arm64
    container_name: codeproject-ai-server-arm64
    privileged: true
    hostname: codeproject-ai-server
    restart: unless-stopped
    ports:
      - "5000:32168"
    environment:
      - TZ=America/Los_Angeles
    volumes:
      - /dev/bus/usb:/dev/bus/usb
      - codeproject_ai_data:/etc/codeproject/ai
      - codeproject_ai_modules:/app/modules
    devices:
      - /dev/bus/usb:/dev/bus/usb

volumes:
  codeproject_ai_data:
  codeproject_ai_modules:
 
Oh, and are you using any cases/fans for your board? I bought a case with heatsinks and a fan. This tiny fan is the loudest thing in my office!! Just curious whether, if you're not using a fan, you're getting any overheating/throttling.

I used some 25x25x7mm pure copper heat sinks from Amazon and an Anvision 40x40x10mm DC 12V fan, wired up to the 5V pins on the board.

And I 3D printed the case: Orange Pi 5 Max Case with Fan and Antenna by Deez | Download free STL model | Printables.com
 
I started from scratch and got everything working!! Thank you @PeteJ and @MikeLud

Had to make sure to use the codeproject/ai-server:arm64-2.9.5 image.
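
For the next person, that just means changing the image line in the compose file above to pin that tag, something like this (everything else stays the same):

services:
  CodeProjectAI:
    # pin the specific 2.9.5 arm64 release instead of the floating arm64 tag
    image: codeproject/ai-server:arm64-2.9.5
    # ...everything else unchanged from the compose file above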

FYI (and for the next person): there must be a bug either in how CPAI detects the OS version or in Joshua's image, because I verified I used Joshua's 24.04 image and the OS version CPAI reports does not match. Look at my screenshot in post #12 to see what I mean.

Pete, you have the same thing if you look at your screenshot in post #17. This was driving me crazy until I finally caught that yours shows the same thing.

For regular Object Detection I am getting inference times of around 40 ms and an API round trip of around 50 ms (unfortunately I just rebooted and forgot to capture the exact averages from CPAI, but those are the numbers I remember).

For LPR I'm getting much longer times than you: around 100-200 ms for inference and an API round trip of around 1,000-1,200 ms. The API round trip climbs if I get a bunch of consecutive cars (e.g., I hit 2,000 ms when I had a few cars back to back). I know that's all the other processing, like resizing, rotating, and OCR, adding up.

I think I averaged 7-9 watts with idle around 4 watts. Amazing!!!!