I have a camera that is connected to Blue Iris. When motion is detected on the camera, Blue Iris sends an HTTP GET to Vera to run a scene. The scene has some basic startup Lua that checks whether any of a series of virtual switches is active: if at least one VS is on, the code returns true and the scene runs; if none are on, it returns false and the scene aborts.
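For context, here is a sketch of the kind of startup Lua I mean. The device numbers and the exact GET URL are placeholders, not my actual setup:

```lua
-- Blue Iris fires something like (scene number is a placeholder):
--   http://<vera-ip>:3480/data_request?id=action&serviceId=urn:micasaverde-com:serviceId:HomeAutomationGateway1&action=RunScene&SceneNum=NN
-- and this scene-startup code gates the scene on a set of virtual switches.

local SID_SWITCH = "urn:upnp-org:serviceId:SwitchPower1"
local vsDevices = { 101, 102, 103 }  -- device numbers of the virtual switches (example values)

for _, dev in ipairs(vsDevices) do
    local status = luup.variable_get(SID_SWITCH, "Status", dev)
    if status == "1" then
        return true   -- at least one VS is on: let the scene run
    end
end
return false          -- all virtual switches are off: abort the scene
```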
I would like to use the power of Reactor (to build more complex conditions) in place of the Lua code in the scene startup, essentially replacing the scene with a ReactorSensor, but am unsure how to proceed. One thought was to enhance Reactor so that a sensor could not only be triggered via HTTP (currently doable), but could be configured to trigger *only* via HTTP: the sensor would wait for an HTTP GET and never trip unless it received one, and only then evaluate its further conditions, allowing varying activities based on the downstream true group conditions.
Is what I'm trying to do workable with the current Reactor version, and if not, would it be a feasible addition to Reactor? Basically: stop the sensor from evaluating state unless and until it receives an HTTP GET.