Thanks, true. Maybe I'm just getting hopeful that the Coral module and models are working better. Going to give it another chance - saving 23-30W with an idling GPU, and more when it's running hard. Plus the heat it gives off in summer (southern hemisphere here). I'll put it back in during...
Is it me or is the Coral module working better? I went back to my GPU a while ago, but tried Coral again with this new version and it appears to be functioning well.
I'll give it a few days chance.
I'm on 2.9.4 though, not .5
Scammed! Bought 50m of Cat5e sold as 100% copper. I always check with a Stanley knife - scrape a little off and it's silver!! (copper-clad aluminium). It broke easily too.
Put a claim in for a refund. The last batch was good copper. Genuine mistake or they took me for a fool... will never know.
Thanks. Will do. I wouldn't use it inside, as I've heard CCA can be a fire hazard with PoE due to the poor conductivity and overheating if it's pulling some current.
Cabling is easy, so if it produces poor results after one season I'll upgrade it. Better than WiFi for sure though.
Hey guys.
In addition to the camera forums nice to see this also.
I have a fairly comprehensive alarm system.
Used to pay a security firm a monthly fee. Realised it was a waste of money.
It was a Paradox panel with a K32 keypad, wired PIRs and 3 exterior Takex sensors.
The company took away the...
I need to cable 3 exterior cameras at a summer house we only use 3 months a year.
They are on WiFi right now as this was only temporary. This month I'll be installing cable and possibly PoE.
Now for the cabling inside from the router to the waterproof housing outside I will be using 100%...
CPAI is still here and works very well - doesn't rely on any cloud service. You can still use Deepstack if you want.
Let me see if I can get it to you later - very easy and well documented.
Hi all,
I have an Ezviz H8c with the PTZ auto-tracking feature. I'd like to control it from BI, since my firewall blocks the cloud.
I can use the ptz controls from BI and that works.
I have set a couple of presets, and the camera is configured as ONVIF in the video settings.
But still can't for the life of me get...
I've tried most configurations.
I have 2 favourite combos.
Both involve using proxmox.
1- Frigate in a Proxmox LXC (Debian 12 LXC with Docker/Portainer) with the OpenVINO model (no TPU required), or with the PCIe TPU - with HAOS in a VM on Proxmox. With the addon 'Frigate Proxy' and also the hacs...
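For the OpenVINO route, the detector section of Frigate's config looks roughly like this - a sketch based on the defaults the Frigate image ships with, so double-check the model path and version against your release:

```yaml
detectors:
  ov:
    type: openvino
    device: CPU   # AUTO also works; CPU matches the no-TPU setup above

model:
  width: 300
  height: 300
  input_tensor: nhwc
  input_pixel_format: bgr
  path: /openvino-model/ssdlite_mobilenet_v2.xml
  labelmap_path: /openvino-model/coco_91cl_bkgr.txt
```

The model files above are bundled inside the Frigate container image, which is why no Coral is needed for this combo.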
Thanks - managed to get online just now and do the same
'You have cancelled your Blue Iris Extended Maintenance and Support subscription. You will also receive a cancellation confirmation by e-mail.'
So I guess we just have to wait for instructions on what to do next? Mine is valid until 2025...
I was just about to ask this.
Looking at the Beelink S12 Pro mini (N100, 16GB) on AliExpress.
Using openvino model on cpu.
What inference speeds can this CPU achieve, do you know?
This model is super efficient and even runs on my old i5 3470T at 14-18ms. Not shabby.
My GTX 970 runs very well with CPAI. But as we know, they are old.
What new GPU would compare to a GTX 970 in terms of inference speeds? (CUDA cores)
Nothing expensive
Thanks
I run BI in a Proxmox VM - one on an AMD 3400G and another on an Intel i5 3470T Lenovo. As a Proxmox VM it's really easy to back up and move to other PCs if required.
All substreams 12 cams.
CPAI on my unraid with a gpu for AI only.
Idles at 9%.
Version 14.1
Testing on an old i5 3470t (which I thought wasn't supported using openvino cpu).
This old underpowered CPU still manages 18ms inference speeds!! Ubuntu distro.
My i7 6th gen 6700: 4ms!! OpenVINO may not be as accurate, but if you don't have a TPU then this is a good option.
Good to know, thanks - my automatic payments section was under Wallet in my app.
But it looks like I didn't use PayPal, which I thought I did. Not sure how, but I know I used a virtual card, so that is frozen. Will erase it just in case.
Thanks
Unfortunately I'm still struggling with this, as the BI integration sets lock every time and doesn't give you an option.
Tried this with no such luck.
```yaml
- switch:
    name: blueiris_profile_daytime
    command_on: 'curl -X GET "http://192.168.1.96:81/admin?profile=-6&lock=2"'
    command_off: 'curl...
```
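For what it's worth, BI's admin endpoint takes a lock parameter alongside profile - as I understand it (worth verifying against the BI help file), lock=0 returns control to the schedule, lock=1 sets the profile temporarily, and lock=2 holds it. A minimal command_line switch sketch using those values - the IP/port and profile number are copied from the snippet above, everything else is an assumption:

```yaml
command_line:
  - switch:
      name: blueiris_profile_daytime
      # lock=2 holds the profile; lock=0 hands control back to the schedule
      command_on: 'curl -s "http://192.168.1.96:81/admin?profile=-6&lock=2"'
      command_off: 'curl -s "http://192.168.1.96:81/admin?profile=-6&lock=0"'
```

This uses the newer top-level `command_line:` syntax Home Assistant moved to in 2023; older installs nest the switch under `switch: - platform: command_line` instead.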
After some weeks of using BI with a Coral TPU/CPAI, I've come to the conclusion that it's not worth the extra power saving over running a GPU. Too much trouble and not very good accuracy.
The GPU had far superior accuracy, as it can run larger models at fast speeds.
TPU still only works well with frigate...
No harm in trying out Frigate - you don't need a Coral when using the OpenVINO model, which works very well even on old-gen Intels. It should apparently work with AMD also.
I have it as a backup on a small mini PC, i3 7th gen - inference speeds of 14ms using the OpenVINO model.
I've been trying to get CPAI working in a Proxmox LXC container (Debian). CPAI runs, but I just can't get the Coral USB mapped. The conf file has it mapped, but the Coral module is just not picking it up...
The cpai container on unraid works fine.
The CPAI on the proxmox windows VM works ok and picks up...
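In case it helps anyone hitting the same wall, the passthrough lines I'd expect in the container's conf (/etc/pve/lxc/&lt;id&gt;.conf) look like this - a sketch, assuming a privileged container; major number 189 covers USB devices:

```
lxc.cgroup2.devices.allow: c 189:* rwm
lxc.mount.entry: /dev/bus/usb dev/bus/usb none bind,optional,create=dir
```

One known gotcha with the USB Coral: it enumerates with one USB ID (1a6e:089a) before the Edge TPU runtime loads its firmware and re-enumerates with another (18d1:9302) afterwards, so any mapping pinned to a single bus/device address or USB ID can stop matching mid-session.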
The AITool (when it works) is a great bit of kit.
Releases · VorlonCD/bi-aidetection (github.com)
You can add more control and more servers to send analysis to.
This may be of interest in the JSON ("PreInstall": false for all EfficientDet models).
MobileNet is set to true.
I am changing this to true to see the effect.
```json
"DownloadableModels": [
  { "Name": "MobileNet Large", "Filename": "objectdetection-mobilenet-large-edgetpu.zip"...
```
Also, the vision detection on the dashboard does not say what size model it is using. This is after I press custom detect several times so it is 'warmed up'.
I'm assuming this is medium or even large as the same pics only take 40+ms on my cameras
```
#   Label    Confidence
0   person   79%...
```
I'm sure in future versions it will improve. I've removed my gpu now as I'm happy so far.
efficientdet-lite whether it is small or medium
But also my 2nd instance with YOLOv8 large, sending via AITool to my Unraid CPAI docker, works well as a failover and double check.
So the blue line indicates the...
Good catch, I see what you mean.
I had mine set to efficientdet-lite medium.
Tested in the dashboard and took 434ms!
Then ran a trigger on my cameras and got this in the logs (went from 434ms to 79ms), with it stating it forced a model reload.
Then I tested the same pic in the dashboard and it was...
I had to go into the modulesettings.json and update several parameters myself and save it.
autostart/model/size etc
Thought they had fixed that in 2.8 though
C:\Program Files\CodeProject\AI\modules\ObjectDetectionCoral\
So many settings in BI also that maybe we have different setups. I still don't fully understand what they all do after 3-4 years of using BI.
Yes, I've always found CPAI more stable in a docker container, either in a Proxmox LXC or on Unraid, which I have in addition.
The AItool on my Blueiris VM is a...
I have 2 instances running, and the one in docker (Unraid app) seems to be much more stable than the CPAI service on Windows. Maybe due to it using the M.2 Coral as opposed to the USB Coral on the Windows box.
Plan to get a couple more PCIe Corals and put them in the M.2 WiFi slots or a PCIe adaptor...