I'm currently using a Local End Point in Blue Iris to control a virtual switch in Hubitat.
It works great, with the exception that I only want alerts when Deepstack identifies a "person", not on every motion trigger.
I've researched, but am clearly out of my depth. Seems like it should be possible, hoping I'm overlooking something simple.
Any ideas? Thanks for any help.
Camera settings, alerts tab, actions, on alert. Specify "person" as required object.
Can you elaborate so I can mimic it? Maybe a screenshot of the rule.
I switched from DeepStack to CodeProject AI a while ago, but the setup should be the same regardless of which one you're using. As @zerosum74 mentioned, you can add the required AI object in the on alert configuration, but if you have your camera configured to only detect a person then this isn't necessary.
...and be sure the highlighted box is checked
I'm using Code Project AI as well but like @Vettester said, the setup is the same.
Thanks guys, can you show a screenshot of the Hubitat side?
While I've got a few people using Blue Iris here, is anybody using their cams on the dash? Do they work outside of the house? I think what I'm looking for is to put my Blue Iris machine and Hubitat hub behind a WireGuard VPN server.
That's what I am doing. Both HE and Blue Iris behind a VPN. I don't even use cloud dashboard.
Same here. VPN. No cloud dashboard. I do have cameras on dash but only using local addresses. You can add them individually or as a group and have it cycle to alerted cameras. Very flexible. HTTP interface for BI is well documented. Outside of the house, I tend to use UI3 more than Hubitat dashboard though. Again, very flexible.
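For anyone wanting camera tiles on a local dashboard, BI's HTTP interface exposes a per-camera snapshot endpoint and an MJPEG stream endpoint you can point an image tile at. A minimal sketch, where the server address, port, and camera short name are made-up examples (substitute your own):

```python
# Sketch of Blue Iris HTTP-interface URLs for dashboard image tiles.
# Host, port, and camera short name below are placeholder examples.

def bi_urls(host: str, port: int, cam: str) -> dict:
    base = f"http://{host}:{port}"
    return {
        # single JPEG snapshot; the dashboard re-fetches it on its own timer
        "snapshot": f"{base}/image/{cam}",
        # continuous MJPEG stream; live but heavier on bandwidth
        "mjpeg": f"{base}/mjpg/{cam}/video.mjpg",
    }

urls = bi_urls("192.168.1.50", 81, "frontdoor")
print(urls["snapshot"])  # http://192.168.1.50:81/image/frontdoor
```

The snapshot URL works well for tiles that refresh every few seconds; save the MJPEG stream for the one camera you actually want live.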
what exactly are you wanting to see?
I'm using VPN to access my cameras (and Hubitat) remotely as well. When I'm away I use the BI app via WireGuard to access my cameras. I don't care for Hubitat's dashboards so I use Home Assistant to create my dashboards. These can be accessed remotely via the HA app without a VPN connection. Here are a few examples of my dashboards in HA where I utilize a live camera feed.
I think that's it, the "Required AI Objects" field is blank on mine. - I'll test it overnight.
I had the AI section configured identically to Vettester below.
I stole this from another site that it won't let me link to.
Use your favorite search and look for "Hubitat Receive Alerts from Blue Iris"
You'll get a site: hubitatapps.shoutwiki with the following instructions.
(Not trying to break rules here, just trying to be helpful - will remove if necessary)
In Rule Machine:
- 'Create New Rule', 'Define a Trigger', 'Name the Trigger'
- 'Select Trigger Events', 'Select capability' = 'Local End Point'
- Note: you will need the 'Local End Point URL' in Blue Iris
- 'Select Actions' to do whatever you like!
In Blue Iris
- Select the Camera to use as a Trigger then 'Right Click' and select 'Camera Properties'.
- Select 'Alerts' and then check off 'Post to a web address...' and then click 'Configure'
- Under the heading 'When Triggered', make sure that "Http://" is selected
- Then next to that box, put in the 'Local End Point' URL from Hubitat. Note: do not include the Http:// ... just start with the IP address (e.g. 192.168.1.123/apps/api/....)
- Click 'OK' and you're done!
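Putting the two halves together: all BI does on alert is issue a single plain GET to that Local End Point URL. A minimal sketch of the same call, assuming the hub IP, app ID, and access token shown are placeholders (copy the real URL from the Rule Machine endpoint screen):

```python
# What Blue Iris does behind the scenes on alert: one GET to the Rule
# Machine Local End Point. Hub IP, app ID, and token are placeholders;
# use the exact URL Hubitat shows you.
from urllib.parse import urlencode

def endpoint_url(hub_ip: str, app_id: int, token: str) -> str:
    query = urlencode({"access_token": token})
    return f"http://{hub_ip}/apps/api/{app_id}/trigger?{query}"

url = endpoint_url("192.168.1.123", 86, "example-token")
# To fire the rule for real: urllib.request.urlopen(url)
print(url)
```

Useful for testing too: paste the URL into a browser on the same LAN and the rule should trigger before you ever involve BI.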
You can also use Hubitat's Maker API as an alternative to the RM endpoint. If all you're doing is flipping a virtual switch, I find Maker API much easier to set up on the Hubitat side. As for BI, it doesn't make much of a difference. You'd just be pasting a different URL into the action.
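To make the Maker API variant concrete, its command URLs follow a fixed pattern: /apps/api/&lt;appId&gt;/devices/&lt;deviceId&gt;/&lt;command&gt;. A sketch with a placeholder app ID, device ID, and token (yours will differ, and Maker API shows you the exact URLs when you add a device):

```python
# Maker API command URL pattern. App ID, device ID, and token below
# are placeholders -- Maker API displays your real values in its UI.

def maker_api_url(hub_ip: str, app_id: int, device_id: int,
                  command: str, token: str) -> str:
    return (f"http://{hub_ip}/apps/api/{app_id}/devices/{device_id}"
            f"/{command}?access_token={token}")

# The URL you'd paste into BI's "Post to a web address" action
# to turn the virtual switch on:
print(maker_api_url("192.168.1.123", 12, 345, "on", "example-token"))
```

The nice part is there's no rule at all on the Hubitat side: Maker API drives the virtual switch directly, and your automations trigger off the switch.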
Join the hub owners group and that restriction will go away. Happy linking.
Anyone doing similar to the above (passing a plate or face AI event from Blue Iris to Hubitat), BUT instead of flipping a switch, actually somehow passing the payload (the plate number or the face name) into Hubitat?
Imagine if I could store it in a hub variable; the possibilities are endless. I just don't know how to get the payload from BI to HE.
That does sound cool. I've never tried to dial in BI that far to know plates and people's names.
Or, you could just quote the post from right here.
Actually that’s the easy part. BI has plate recognition and face recognition partner software built-in. So it’s straightforward (if less than perfect) to get motion events tagged with a license plate number or a person’s name, where relevant. Happy to help here for those who are interested.
What I don’t know how to do is pass that info back to Hubitat, other than flip a boolean.
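One direction that might work (an untested sketch, not something I've run): BI's web-request action supports macros such as &PLATE and &MEMO that expand to the alert's text, and Maker API accepts a secondary value appended to the command path (/devices/&lt;id&gt;/&lt;command&gt;/&lt;value&gt;). So in BI you'd paste a URL ending in .../setPlate/&PLATE, where setPlate is a hypothetical custom command you'd need a virtual device driver to implement. The URL shape would be:

```python
# Untested sketch: forwarding a Blue Iris macro (e.g. &PLATE) to Hubitat
# as a Maker API secondary value. "setPlate" is a hypothetical custom
# command -- a virtual device driver would have to implement it.
from urllib.parse import quote

def command_url(hub_ip: str, app_id: int, device_id: int,
                command: str, value: str, token: str) -> str:
    # Maker API pattern: /devices/<id>/<command>/<secondary value>
    return (f"http://{hub_ip}/apps/api/{app_id}/devices/{device_id}"
            f"/{command}/{quote(value)}?access_token={token}")

# Example with a literal plate; in BI you'd paste the URL with the
# &PLATE macro in the value position and let BI expand it on alert:
print(command_url("192.168.1.123", 12, 345, "setPlate", "ABC123", "tok"))
```

From there the driver could surface the plate as an attribute, which a rule could copy into a hub variable. Again, all hedged: I haven't verified the macro expansion end to end.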
Did you ever figure out a way to pass it back?
Not elegantly, no. Here's what I do. Let's say I'm trying to detect "person" and "dog" off the same camera and zone. I create two alert actions -- one with a Required AI Object = person and the other with Required AI Object = dog. Then, each action calls a different web address via Maker API, one for a "person detector" virtual motion sensor and one for a "dog detector" virtual motion sensor.
In theory I could do the same with individual license plates or individual faces/people. But it doesn't scale well, that's for sure.
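The fan-out above is essentially a lookup table from Required AI Object to a Maker API URL, one virtual motion sensor per object. A sketch of that mapping (device IDs and token are placeholders; "active" is the standard command on Hubitat's virtual motion sensor driver):

```python
# Sketch of the per-object fan-out: each BI alert action carries one
# Required AI Object and posts to its own virtual motion sensor via
# Maker API. Hub IP, app ID, device IDs, and token are placeholders.

HUB = "192.168.1.123"
APP_ID = 12
TOKEN = "example-token"

# one virtual motion sensor per AI object -- this is the part that
# doesn't scale, since every new object/plate/face needs its own row
SENSORS = {"person": 101, "dog": 102}

def active_url(ai_object: str) -> str:
    device = SENSORS[ai_object]
    return (f"http://{HUB}/apps/api/{APP_ID}/devices/{device}"
            f"/active?access_token={TOKEN}")

print(active_url("person"))
```

Each BI alert action gets the URL for its object, so the "person" action fires the person sensor and the "dog" action fires the dog sensor, and Hubitat never has to parse anything.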
Sorry, wish I had a better answer.