Device log?
G
@toggledbits is there a log that will show me what rule is turning on a specific device? I've got a switch that has been kicking on at 2200 ET for several nights now and the reactor.log doesn't have a thing in it that I can see on a device level (it being more rules-based).
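In the MSR logs quoted further down this page, each reaction run produces a "Starting reaction <rule name>" line followed by "perform action power_switch.on on <entity>" lines, so one way to narrow this down is to grep reactor.log for the switch's entity ID around the time it turns on. A rough sketch, assuming a default install with the log at logs/reactor.log and a made-up entity ID:

grep -n 'device_20060' logs/reactor.log | grep 'T02:0'

(22:00 ET is 02:00 or 03:00 UTC depending on daylight saving; substitute your own entity ID and adjust if your log timestamps are not UTC.)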
Multi-System Reactor
Midnight crossing not working in date/time condition (build 25325)
tunnusT
Multi-System Reactor
Error: Command timeout
G
at _ClientAPI._commandTimeout (http://192.168.1.100:8111/client/ClientAPI.js:807:179)
Seeing this randomly when returning to an open browser tab after being away a while. Once, maybe twice a day. "What did you do to trigger it?" Literally nothing; I just walked away, returned, and there it was. Actions taken in reasonably close proximity to this particular instance of it popping up: I'd restarted the MSR container in Portainer. I'll try to grab some logs here shortly.
Multi-System Reactor
Reactor (Multi-System/Multi-Hub) Announcements
toggledbitsT
Build 21228 has been released. Docker images available from DockerHub as usual, and bare-metal packages here.
• Home Assistant up to version 2021.8.6 supported; the online version of the manual will now state the current supported versions.
• Fix an error in OWMWeatherController that could cause it to stop updating.
• Unify the approach to entity filtering on all hub interface classes (controllers); this works for device entities only; it may be extended to other entities later.
• Improve error detail in messages for EzloController during auth phase.
• Add isRuleSet() and isRuleEnabled() functions to expressions extensions.
• Implement set action for lock and passage capabilities (makes them more easily scriptable in some cases).
• Fix a place in the UI where 24-hour time was not being displayed.
Multi-System Reactor
[Solved] Local expression in Rule does not evaluate as they used to do
CrilleC
Multi-System Reactor
Home Assistant 2025.11.2 and latest-25315
CrilleC
Multi-System Reactor
Notice to Docker + ARM Users (RPi 3/4/5 and others)
toggledbitsT
This post does not apply to users of Intel/AMD-based systems. If you are using a Reactor image tagged latest-amd64 or stable-amd64, then this post does not apply to you. It also does not apply to bare-metal installs; it's for users of docker images on ARM-based systems only (principally Raspberry Pi hosts, but could be others).

After January 15, 2026, I will no longer produce the aarch64-tagged docker image for Reactor. The ARM images will be arm64 for 64-bit operating systems, and armv7l for 32-bit operating systems. For those of you running a container from the aarch64 image today, this will be a relatively simple change: you just need to switch the image used for your docker container to a differently-tagged image. If you are using docker-compose, then this is a relatively simple matter of changing the image line in your docker-compose.yaml file and then stopping (docker-compose down) and restarting (docker-compose up -d) your Reactor daemon.

But there's a catch... not all of you can safely just switch from the aarch64 image to the arm64 image. And you can't just trust the output of uname -m, for example, because this exposes the CPU architecture, but not the word size of the OS running on that CPU. For Raspberry Pi systems, the transition to 64-bit operating systems was long (starting in 2016) and not always obvious: although there was a first "official" 64-bit OS for RPis in 2020, it did not become a default recommendation in the Raspberry Pi Imager until 2021, and then that was only the default for Pi 3/4 systems with >4GB RAM; it was 2022 before it was universally recommended for all 64-bit CPUs regardless of RAM size. Depending on when you first imaged your RPi system and what default you may have been offered/chosen, you could today easily have a 64-bit CPU Raspberry Pi running a 32-bit version of the operating system. Upgrades along the way would not change this; changing it to fully 64-bit requires a full reimage of the system.

To establish if your OS is 64- or 32-bit, log in to your Pi and run: sudo dpkg-architecture -q DEB_HOST_ARCH. If the response is arm64 or aarch64, then you are running a 64-bit OS and you should use the arm64-tagged image. If it's anything else, you are running a 32-bit OS, and you should use the armv7l-tagged image.

pi@rpi4-1:~ $ sudo dpkg-architecture -q DEB_HOST_ARCH
armhf
pi@rpi4-1:~ $ uname -m
aarch64
pi@rpi4-1:~ $

In the example above, the uname command reports that the CPU is 64-bit architecture (aarch64), which is true for the host on which I ran these commands, but the DEB_HOST_ARCH value is armhf, indicating a 32-bit operating system. This system has to use the armv7l-tagged image.

Other systems will have their own ways of determining the word size of the running OS. Since the majority of Reactor users running ARM systems are on Raspberry Pis, I am able to supply the above instructions, but if you happen to have a different ARM system, you'll need to do some web searching to figure out how to expose that information. Or, you can just try the arm64 image, and if it doesn't start up, try the armv7l image. Remember to always back up your system before making any changes.

For everyone, please make this change as soon as possible, and if you have any trouble finding a working image, please (1) go back to the current aarch64 image; and (2) let me know in this thread along with as much detail about your host system as you can offer (including the output of the dpkg-architecture command mentioned above).
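For docker-compose users, the whole change is the image tag. A minimal sketch of the relevant part of docker-compose.yaml (the repository name and tag spellings here are illustrative; check your existing file for the exact image name):

services:
  reactor:
    # was: image: toggledbits/reactor:latest-aarch64   (illustrative name)
    image: toggledbits/reactor:latest-arm64            # use the armv7l tag instead on a 32-bit OS

followed by docker-compose pull, docker-compose down, and docker-compose up -d as described above.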
Multi-System Reactor
Requesting a proper ARM64/aarch64 Docker image (Pi 5 support)
M
Hi, I'm in the process of migrating from a Raspberry Pi 4 (ARMv7) to a Raspberry Pi 5 (ARMv8/aarch64), but I've run into an issue: there is no proper ARMv8/aarch64 image available. None of the existing images run on the Pi 5 - they all exit immediately with code 139 (segmentation fault), which typically indicates that the binaries inside the image are not compatible with the ARM64/aarch64 architecture used by the Pi 5.

Would it be possible to publish a correct ARMv8/aarch64 (linux/arm64) image? Building one should be relatively straightforward using docker buildx with multi-arch support. For example, my own Node.js images are built this way:

docker buildx build --push \
  -t <localrepo>/<project>:<tag> \
  --platform=linux/arm64,linux/amd64 \
  --file ./apps/<project>/Dockerfile .

This produces both the AMD64 and ARM64/v8 variants automatically.

Also, as a side note, it may be best to avoid using Alpine as the base image for the ARM64 build, since musl-based builds often cause compatibility issues and unnecessary headaches. A glibc-based base image (e.g., Debian or Ubuntu) tends to work far more reliably on ARM64, especially for Node.js applications.

@toggledbits - tagging you in case you missed this.

Thanks, mgvra
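One quick way to confirm which architecture variant Docker actually pulled before blaming the binaries is to read it from the local image metadata (a sketch; substitute the real Reactor image name and tag for <image>:<tag>):

docker image inspect <image>:<tag> --format '{{.Os}}/{{.Architecture}}'

On a Pi 5 running a 64-bit OS, linux/arm64 is what you would want to see here.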
Multi-System Reactor
Script action and custom timers
therealdbT
Sorry to write here without trying it first, but I'm flying today. Am I correct in saying that a script action with alarm() makes it possible to execute a reaction at a given interval, let's say 15 seconds or 3.5 minutes? That sounds amazing, since I've used weird tricks, including a custom controller, just to do this.
Multi-System Reactor
Help resolve change in behaviour post update
CatmanV2C
Multi-System Reactor
Reactor w/HA 2025.11 error on set_datetime service call setting only time
CrilleC
@toggledbits Do you know if this is related to that PR or is it a change they made in 2025.11.1?

[latest-25310]2025-11-11T13:16:24.319Z <HassController:INFO> HassController#hass perform x_hass_input_datetime.set_datetime on Entity#hass>input_datetime_vvb_dag with { "time": "10:45" }
[latest-25310]2025-11-11T13:16:24.320Z <HassController:INFO> HassController#hass: sending payload for x_hass_input_datetime.set_datetime on Entity#hass>input_datetime_vvb_dag action: { "type": "call_service", "service_data": { "date": (null), "time": "10:45", "datetime": (null), "timestamp": (null) }, "domain": "input_datetime", "service": "set_datetime", "target": { "entity_id": "input_datetime.vvb_dag" } }
[latest-25310]2025-11-11T13:16:24.321Z <HassController:ERR> HassController#hass request 1762866984320<2025-11-11 14:16:24> (call_service) failed: [Error] Not a parseable type for dictionary value @ data['date'] [-]
[latest-25310]2025-11-11T13:16:24.321Z <HassController:WARN> HassController#hass action x_hass_input_datetime.set_datetime({ "time": "10:45" }) on Entity#hass>input_datetime_vvb_dag failed!
[latest-25310]2025-11-11T13:16:24.321Z <HassController:INFO> Service call payload: {"type":"call_service","service_data":{"date":null,"time":"10:45","datetime":null,"timestamp":null},"domain":"input_datetime","service":"set_datetime","target":{"entity_id":"input_datetime.vvb_dag"},"id":1762866984320}
[latest-25310]2025-11-11T13:16:24.322Z <HassController:INFO> Service data: {"fields":{"date":{"example":"\"2019-04-20\"","selector":{"text":{"multiline":false,"multiple":false}}},"time":{"example":"\"05:04:20\"","selector":{"time":{}}},"datetime":{"example":"\"2019-04-20 05:04:20\"","selector":{"text":{"multiline":false,"multiple":false}}},"timestamp":{"selector":{"number":{"min":0,"max":9223372036854776000,"mode":"box","step":1}}}},"target":{"entity":[{"domain":["input_datetime"]}]}}
[latest-25310]2025-11-11T13:16:24.322Z <Engine:ERR> Engine#1 reaction rule-mgb8pfhs:S step 0 perform x_hass_input_datetime.set_datetime failed: [Error] Not a parseable type for dictionary value @ data['date'] [-]
[latest-25310]2025-11-11T13:16:24.322Z <Engine:INFO> Engine#1 action args: { "time": "10:45" }
[latest-25310]2025-11-11T13:16:24.322Z <Engine:INFO> Resuming reaction Sätt Schema VVB i Home Assistant<AKTIV> (rule-mgb8pfhs:S) from step 1
[latest-25310]2025-11-11T13:16:24.323Z <HassController:INFO> HassController#hass perform x_hass_input_datetime.set_datetime on Entity#hass>input_datetime_vvb_natt with { "time": "03:00", "timestamp": 0 }
[latest-25310]2025-11-11T13:16:24.323Z <HassController:INFO> HassController#hass: sending payload for x_hass_input_datetime.set_datetime on Entity#hass>input_datetime_vvb_natt action: { "type": "call_service", "service_data": { "date": (null), "time": "03:00", "datetime": (null), "timestamp": 0 }, "domain": "input_datetime", "service": "set_datetime", "target": { "entity_id": "input_datetime.vvb_natt" } }
[latest-25310]2025-11-11T13:16:24.324Z <HassController:ERR> HassController#hass request 1762866984323<2025-11-11 14:16:24> (call_service) failed: [Error] Not a parseable type for dictionary value @ data['date'] [-]
[latest-25310]2025-11-11T13:16:24.324Z <HassController:WARN> HassController#hass action x_hass_input_datetime.set_datetime({ "time": "03:00", "timestamp": 0 }) on Entity#hass>input_datetime_vvb_natt failed!
[latest-25310]2025-11-11T13:16:24.324Z <HassController:INFO> Service call payload: {"type":"call_service","service_data":{"date":null,"time":"03:00","datetime":null,"timestamp":0},"domain":"input_datetime","service":"set_datetime","target":{"entity_id":"input_datetime.vvb_natt"},"id":1762866984323}
[latest-25310]2025-11-11T13:16:24.324Z <HassController:INFO> Service data: {"fields":{"date":{"example":"\"2019-04-20\"","selector":{"text":{"multiline":false,"multiple":false}}},"time":{"example":"\"05:04:20\"","selector":{"time":{}}},"datetime":{"example":"\"2019-04-20 05:04:20\"","selector":{"text":{"multiline":false,"multiple":false}}},"timestamp":{"selector":{"number":{"min":0,"max":9223372036854776000,"mode":"box","step":1}}}},"target":{"entity":[{"domain":["input_datetime"]}]}}
[latest-25310]2025-11-11T13:16:24.324Z <Engine:ERR> Engine#1 reaction rule-mgb8pfhs:S step 1 perform x_hass_input_datetime.set_datetime failed: [Error] Not a parseable type for dictionary value @ data['date'] [-]
[latest-25310]2025-11-11T13:16:24.324Z <Engine:INFO> Engine#1 action args: { "time": "03:00", "timestamp": 0 }
[latest-25310]2025-11-11T13:16:24.325Z <Engine:INFO> Resuming reaction Sätt Schema VVB i Home Assistant<AKTIV> (rule-mgb8pfhs:S) from step 2
[latest-25310]2025-11-11T13:16:24.325Z <Engine:INFO> Sätt Schema VVB i Home Assistant<AKTIV> all actions completed.
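For comparison, Home Assistant 2025.11 appears to be rejecting the explicit nulls for the unused fields; a hand-built call_service payload with those fields omitted entirely (a sketch for testing against the WebSocket API, not what Reactor currently emits) would look like:

{"id": 1, "type": "call_service", "domain": "input_datetime", "service": "set_datetime", "service_data": {"time": "10:45"}, "target": {"entity_id": "input_datetime.vvb_dag"}}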
Multi-System Reactor
Reactor Version 25310 : Office Light control via rule in reactor no longer working since last update.
P
Hello, I currently have an office light (connected via a Leviton Z-Wave dimmer switch) controlled from a Gen5 Aeotec Z-Wave controller installed on my Synology 720+ NAS. I run HA (2025.11.10) in a virtual machine on the NAS and Reactor in the container manager on the same NAS. Prior to updating to 25304, the rule I had set to turn the light on to a specific dimming value worked correctly. Now the rule appears to follow the decision tree, however the reaction does not trigger: it neither sets the dimming level nor turns on the office light. Strangely, I can still turn the light on and off, as well as dim it, directly from HASS. I have tried using the "try this action" button in the rule's reaction settings; it will not control the light and does not throw an error flag. Please help. P.S. Reactor has been rock steady for me over the last few years and I'm a big fan of this solution.
Multi-System Reactor
[Solved] alarm() in global expression throws error in log.
CrilleC
Multi-System Reactor
[Solved] Define function issue in latest-25304
CrilleC
Multi-System Reactor
No Upgrade Notification for Build 25308?
CatmanV2C
FWIW I'm no longer getting a notification from MSR that there's an update. Just thought I'd mention it C
Multi-System Reactor
Strange behavior in MSR latest-25304 with disabled groups in Reaction
therealdbT
Multi-System Reactor
[Reactor] Variables not updating correctly in latest-25201-2aa18550
therealdbT
Multi-System Reactor
The reaction stopped working (Google Nest max playing a video)
F
Multi-System Reactor
Handling Dead Entities and Renamed Entities
PablaP
Hello all.. been a minute! I recently rebuilt my Z-Wave network and migrated to a new Z-Wave stick. In order to prevent any downtime I kept my original Z-Wave network up and ran a Docker version of Z-Wave JS UI with my new controller. That way I could add devices one by one without having any devices down. I finally moved all the devices over to my new stick today. The final step was to migrate everything from my Docker instance of Z-Wave JS UI to the HA add-on of Z-Wave JS UI. However, during this migration some of the names didn't populate correctly, which I later managed to import back into Z-Wave JS UI.

The issue is that Reactor is stuck on the default names and the entities are not updating. I removed the controller from Reactor, restarted, hard refreshed, and added the controller back, but the new entity names have not updated. It also seems like the old entities from my previous instance of Z-Wave JS UI are lingering and not being marked as dead (I believe a certain amount of time needs to elapse before they're marked as dead in Reactor).

My goal is basically to purge all the entities for the 'ZWaveJS' controller in Reactor so it pulls the updated entity names and only the entities that exist in Z-Wave JS UI. I cannot find a quick way to do this; I know entities can be deleted one by one, but with over 100 entities that would take a long time. I am guessing that if I added the controller under a new name in the Reactor config it would pull the updated entities and names, but I think that would break my rules since the entity IDs would change (I made sure to name all the entities exactly the same as they were previously to prevent this issue).
Multi-System Reactor
Strange behavior for MQTT templates using payload and attributes
therealdbT
Multi-System Reactor

Vera reload detected where there does not appear to be one.

  • CatmanV2 wrote (#1, last edited by CatmanV2)

    I may have posted this in the wrong section. MSR is running on bare-metal Debian Bullseye. Both openLuup and MSR are on the same device (an Intel NUC) at IP 192.168.70.249. Any suggestions as to where I should go to resolve this?

    TIA

    Happy new year, everyone! Hope all are well!

    Looking for some pointers on troubleshooting an issue that's slightly puzzling to me. While digging around on a different issue I noticed this happening regularly in the MSR logs:

    [latest-24366]2025-01-10T19:50:07.630Z <Engine:NOTICE> Starting reaction Garden lights on when the doors are open<SET> (rule-lb2h69nb:S)
    [latest-24366]2025-01-10T19:50:07.630Z <VeraController:INFO> VeraController#vera perform action power_switch.on on Switch#vera>device_20060 with [Object]{  }
    [latest-24366]2025-01-10T19:50:07.630Z <VeraController:INFO> VeraController#vera perform action power_switch.set on Switch#vera>device_20060 with [Object]{ "state": true }
    [latest-24366]2025-01-10T19:50:07.670Z <VeraController:NOTICE> VeraController#vera action power_switch.set([Object]{ "state": true }) on Switch#vera>device_20060 succeeded
    [latest-24366]2025-01-10T19:50:07.671Z <Engine:INFO> Resuming reaction Garden lights on when the doors are open<SET> (rule-lb2h69nb:S) from step 1
    [latest-24366]2025-01-10T19:50:07.672Z <Engine:NOTICE> Garden lights on when the doors are open<SET> delaying until 1736538787672<10/01/2025, 19:53:07>
    [latest-24366]2025-01-10T19:50:19.595Z <Rule:INFO> Garden lights on when the doors are open (rule-lb2h69nb in Outside Lights) evaluated; rule state transition from SET to RESET!
    [latest-24366]2025-01-10T19:52:16.506Z <Rule:INFO> Garden lights on when the doors are open (rule-lb2h69nb in Outside Lights) evaluated; rule state transition from RESET to SET!
    [latest-24366]2025-01-10T19:52:16.515Z <Engine:INFO> [Engine]Engine#1 not enqueueing rule-lb2h69nb:S: already in queue with status 2
    [latest-24366]2025-01-10T19:52:20.823Z <Rule:INFO> Garden lights on when the doors are open (rule-lb2h69nb in Outside Lights) evaluated; rule state transition from SET to RESET!
    [latest-24366]2025-01-10T19:53:07.676Z <Engine:INFO> Resuming reaction Garden lights on when the doors are open<SET> (rule-lb2h69nb:S) from step 2
    [latest-24366]2025-01-10T19:53:07.677Z <VeraController:INFO> VeraController#vera perform action power_switch.off on Switch#vera>device_20060 with [Object]{  }
    [latest-24366]2025-01-10T19:53:07.678Z <VeraController:INFO> VeraController#vera perform action power_switch.set on Switch#vera>device_20060 with [Object]{ "state": false }
    [latest-24366]2025-01-10T19:53:07.719Z <VeraController:NOTICE> VeraController#vera action power_switch.set([Object]{ "state": false }) on Switch#vera>device_20060 succeeded
    [latest-24366]2025-01-10T19:53:07.720Z <Engine:INFO> Resuming reaction Garden lights on when the doors are open<SET> (rule-lb2h69nb:S) from step 3
    [latest-24366]2025-01-10T19:53:07.721Z <Engine:INFO> Garden lights on when the doors are open<SET> all actions completed.
    [latest-24366]2025-01-10T19:55:04.468Z <VeraController:ERR> VeraController#vera update request failed: [FetchError] network timeout at: http://192.168.70.249:3480/data_request?id=status&Timeout=15&DataVersion=416912953&MinimumDelay=50&output_format=json&_r=1736538886459 [-]
    [latest-24366]2025-01-10T19:55:09.646Z <VeraController:WARN> VeraController#vera failed to apply attribute scene_activation.scene_id to Entity#vera>device_20050: [TypeError] Can't set NaN on attribute scene_activation.scene_id (vera>device_20050) [-]
    [latest-24366]2025-01-10T19:55:09.646Z <VeraController:INFO> VeraController#vera class scene_controller meta [Object]{ "source": "urn:micasaverde-com:serviceId:SceneController1/sl_SceneActivated", "expr": "int(value)" } orig  final NaN
    [latest-24366]2025-01-10T19:55:09.646Z <VeraController:CRIT> *Entity#vera>device_20050
    [latest-24366]2025-01-10T19:55:09.656Z <VeraController:WARN> VeraController#vera failed to apply attribute scene_activation.scene_id to Entity#vera>device_20570: [TypeError] Can't set NaN on attribute scene_activation.scene_id (vera>device_20570) [-]
    [latest-24366]2025-01-10T19:55:09.656Z <VeraController:INFO> VeraController#vera class scene_controller meta [Object]{ "source": "urn:micasaverde-com:serviceId:SceneController1/sl_SceneActivated", "expr": "int(value)" } orig  final NaN
    [latest-24366]2025-01-10T19:55:09.656Z <VeraController:CRIT> *Entity#vera>device_20570
    [latest-24366]2025-01-10T19:55:09.678Z <VeraController:WARN> VeraController#vera failed to apply attribute scene_activation.scene_id to Entity#vera>device_20610: [TypeError] Can't set NaN on attribute scene_activation.scene_id (vera>device_20610) [-]
    [latest-24366]2025-01-10T19:55:09.679Z <VeraController:INFO> VeraController#vera class scene_controller meta [Object]{ "source": "urn:micasaverde-com:serviceId:SceneController1/sl_SceneActivated", "expr": "int(value)" } orig  final NaN
    [latest-24366]2025-01-10T19:55:09.679Z <VeraController:CRIT> *Entity#vera>device_20610
    [latest-24366]2025-01-10T19:55:09.744Z <VeraController:WARN> VeraController#vera failed to apply attribute scene_activation.scene_id to Entity#vera>device_20631: [TypeError] Can't set NaN on attribute scene_activation.scene_id (vera>device_20631) [-]
    [latest-24366]2025-01-10T19:55:09.744Z <VeraController:INFO> VeraController#vera class scene_controller meta [Object]{ "source": "urn:micasaverde-com:serviceId:SceneController1/sl_SceneActivated", "expr": "int(value)" } orig  final NaN
    [latest-24366]2025-01-10T19:55:09.744Z <VeraController:CRIT> *Entity#vera>device_20631
    [latest-24366]2025-01-10T19:55:09.889Z <VeraController:NOTICE> VeraController#vera reload detected!
    [latest-24366]2025-01-10T19:55:09.910Z <VeraController:WARN> VeraController#vera failed to apply attribute scene_activation.scene_id to Entity#vera>device_20050: [TypeError] Can't set NaN on attribute scene_activation.scene_id (vera>device_20050) [-]
    [latest-24366]2025-01-10T19:55:09.910Z <VeraController:INFO> VeraController#vera class scene_controller meta [Object]{ "source": "urn:micasaverde-com:serviceId:SceneController1/sl_SceneActivated", "expr": "int(value)" } orig  final NaN
    [latest-24366]2025-01-10T19:55:09.910Z <VeraController:CRIT> *Entity#vera>device_20050
    [latest-24366]2025-01-10T19:55:09.935Z <VeraController:WARN> VeraController#vera failed to apply attribute scene_activation.scene_id to Entity#vera>device_20570: [TypeError] Can't set NaN on attribute scene_activation.scene_id (vera>device_20570) [-]
    [latest-24366]2025-01-10T19:55:09.936Z <VeraController:INFO> VeraController#vera class scene_controller meta [Object]{ "source": "urn:micasaverde-com:serviceId:SceneController1/sl_SceneActivated", "expr": "int(value)" } orig  final NaN
    [latest-24366]2025-01-10T19:55:09.936Z <VeraController:CRIT> *Entity#vera>device_20570
    [latest-24366]2025-01-10T19:55:09.937Z <VeraController:WARN> VeraController#vera failed to apply attribute scene_activation.scene_id to Entity#vera>device_20610: [TypeError] Can't set NaN on attribute scene_activation.scene_id (vera>device_20610) [-]
    [latest-24366]2025-01-10T19:55:09.937Z <VeraController:INFO> VeraController#vera class scene_controller meta [Object]{ "source": "urn:micasaverde-com:serviceId:SceneController1/sl_SceneActivated", "expr": "int(value)" } orig  final NaN
    [latest-24366]2025-01-10T19:55:09.937Z <VeraController:CRIT> *Entity#vera>device_20610
    [latest-24366]2025-01-10T19:55:09.939Z <VeraController:WARN> VeraController#vera failed to apply attribute scene_activation.scene_id to Entity#vera>device_20631: [TypeError] Can't set NaN on attribute scene_activation.scene_id (vera>device_20631) [-]
    [latest-24366]2025-01-10T19:55:09.939Z <VeraController:INFO> VeraController#vera class scene_controller meta [Object]{ "source": "urn:micasaverde-com:serviceId:SceneController1/sl_SceneActivated", "expr": "int(value)" } orig  final NaN
    [latest-24366]2025-01-10T19:55:09.939Z <VeraController:CRIT> *Entity#vera>device_20631
    [latest-24366]2025-01-10T19:55:09.968Z <Controller:INFO> VeraController#vera 0 dead entities older than 86400000s purged
    [latest-24366]2025-01-10T19:55:10.037Z <VeraController:NOTICE> VeraController#vera reload detected!
    
    

    That repeats until something like this:

    [latest-24366]2025-01-10T19:55:10.049Z <VeraController:WARN> VeraController#vera failed to apply attribute scene_activation.scene_id to Entity#vera>device_20050: [TypeError] Can't set NaN on attribute scene_activation.scene_id (vera>device_20050) [-]
    [latest-24366]2025-01-10T19:55:10.049Z <VeraController:INFO> VeraController#vera class scene_controller meta [Object]{ "source": "urn:micasaverde-com:serviceId:SceneController1/sl_SceneActivated", "expr": "int(value)" } orig  final NaN
    [latest-24366]2025-01-10T19:55:10.049Z <VeraController:CRIT> *Entity#vera>device_20050
    [latest-24366]2025-01-10T19:55:10.053Z <VeraController:WARN> VeraController#vera failed to apply attribute scene_activation.scene_id to Entity#vera>device_20570: [TypeError] Can't set NaN on attribute scene_activation.scene_id (vera>device_20570) [-]
    [latest-24366]2025-01-10T19:55:10.053Z <VeraController:INFO> VeraController#vera class scene_controller meta [Object]{ "source": "urn:micasaverde-com:serviceId:SceneController1/sl_SceneActivated", "expr": "int(value)" } orig  final NaN
    [latest-24366]2025-01-10T19:55:10.053Z <VeraController:CRIT> *Entity#vera>device_20570
    [latest-24366]2025-01-10T19:55:10.062Z <VeraController:WARN> VeraController#vera failed to apply attribute scene_activation.scene_id to Entity#vera>device_20610: [TypeError] Can't set NaN on attribute scene_activation.scene_id (vera>device_20610) [-]
    [latest-24366]2025-01-10T19:55:10.062Z <VeraController:INFO> VeraController#vera class scene_controller meta [Object]{ "source": "urn:micasaverde-com:serviceId:SceneController1/sl_SceneActivated", "expr": "int(value)" } orig  final NaN
    [latest-24366]2025-01-10T19:55:10.062Z <VeraController:CRIT> *Entity#vera>device_20610
    [latest-24366]2025-01-10T19:55:10.112Z <VeraController:WARN> VeraController#vera failed to apply attribute scene_activation.scene_id to Entity#vera>device_20631: [TypeError] Can't set NaN on attribute scene_activation.scene_id (vera>device_20631) [-]
    [latest-24366]2025-01-10T19:55:10.112Z <VeraController:INFO> VeraController#vera class scene_controller meta [Object]{ "source": "urn:micasaverde-com:serviceId:SceneController1/sl_SceneActivated", "expr": "int(value)" } orig  final NaN
    [latest-24366]2025-01-10T19:55:10.113Z <VeraController:CRIT> *Entity#vera>device_20631
    [latest-24366]2025-01-10T20:00:05.003Z <Engine:INFO> [Engine]Engine#1 master timer tick, local time "10/01/2025 20:00:05" (TZ offset 0 mins from UTC)
    [latest-24366]2025-01-10T20:13:51.872Z <Rule:INFO> No motion in Cinema (rule-m4ocglke in Cinema Environment) evaluated; rule state transition from SET to RESET!
    [latest-24366]2025-01-10T20:13:51.882Z <Rule:INFO> Cinema Heater On (rule-m4ocf1di in Cinema Environment) evaluated; rule state transition from RESET to SET!
    [latest-24366]2025-01-10T20:13:51.888Z <Engine:INFO> Enqueueing "Cinema Heater On<SET>" (rule-m4ocf1di:S)
    

    And the errors / reloads just stop.

    From openLuup:

    2025-01-10 19:49:56.379   luup_log:63: BroadLink_Mk2 debug: RM3 Mini - IR 1: urn:schemas-micasaverde-com:device:IrTransmitter:1
    2025-01-10 19:50:00.085   luup_log:0: 14Mb, 1.7%cpu, 36.1days
    2025-01-10 19:50:07.591   luup.variable_set:: 20160.urn:micasaverde-com:serviceId:EnergyMetering1.KWH was: 18.6793008 now: 18.6805008 #hooks:0
    2025-01-10 19:50:07.591   luup.variable_set:: 20160.urn:micasaverde-com:serviceId:EnergyMetering1.KWHReading was: 1736538000 now: 1736538600 #hooks:0
    2025-01-10 19:50:07.591   luup.variable_set:: 20160.urn:micasaverde-com:serviceId:EnergyMetering1.Watts was: 7.4 now: 7.3 #hooks:0
    2025-01-10 19:50:07.591   luup.variable_set:: 20170.urn:micasaverde-com:serviceId:EnergyMetering1.KWH was: 32.2417984 now: 32.2470016 #hooks:0
    2025-01-10 19:50:07.591   luup.variable_set:: 20170.urn:micasaverde-com:serviceId:EnergyMetering1.KWHReading was: 1736538000 now: 1736538600 #hooks:0
    2025-01-10 19:50:07.591   luup.variable_set:: 20330.urn:micasaverde-com:serviceId:EnergyMetering1.KWHReading was: 1736538000 now: 1736538600 #hooks:0
    2025-01-10 19:50:07.592   luup.variable_set:: 20770.urn:micasaverde-com:serviceId:SecuritySensor1.Tripped was: 0 now: 1 #hooks:0
    2025-01-10 19:50:07.592   luup.variable_set:: 20770.urn:micasaverde-com:serviceId:SecuritySensor1.LastTrip was: 1736534850 now: 1736538607 #hooks:0
    2025-01-10 19:50:07.593   openLuup.server:: request completed (3392 bytes, 1 chunks, 12875 ms) tcp{client}: 0x55c3299a9cf8
    2025-01-10 19:50:07.618   openLuup.io.server:: HTTP:3480 connection closed openLuup.server.receive closed tcp{client}: 0x55c3299a9cf8
    2025-01-10 19:50:07.624   openLuup.io.server:: HTTP:3480 connection from 192.168.70.249 tcp{client}: 0x55c329d0a5b8
    2025-01-10 19:50:07.624   openLuup.server:: GET /data_request?id=status&Timeout=15&DataVersion=416912906&MinimumDelay=50&output_format=json&_r=1736538607623 HTTP/1.1 tcp{client}: 0x55c329d0a5b8
    2025-01-10 19:50:07.632   openLuup.io.server:: HTTP:3480 connection from 192.168.70.249 tcp{client}: 0x55c3292ed678
    2025-01-10 19:50:07.633   openLuup.server:: GET /data_request?newTargetValue=1&DeviceNum=20060&id=action&serviceId=urn%3Aupnp-org%3AserviceId%3ASwitchPower1&action=SetTarget&output_format=json&_r=1736538607631 HTTP/1.1 tcp{client}: 0x55c3292ed678
    2025-01-10 19:50:07.633   luup.call_action:: 20060.urn:upnp-org:serviceId:SwitchPower1.SetTarget 
    2025-01-10 19:50:07.633   luup.call_action:: action will be handled by parent: 37
    2025-01-10 19:50:07.633   luup.variable_set:: 20060.urn:upnp-org:serviceId:SwitchPower1.Target was: 0 now: 1 #hooks:0
    2025-01-10 19:50:07.669   openLuup.server:: request completed (35 bytes, 1 chunks, 35 ms) tcp{client}: 0x55c3292ed678
    2025-01-10 19:50:07.673   openLuup.io.server:: HTTP:3480 connection closed openLuup.server.receive closed tcp{client}: 0x55c3292ed678
    2025-01-10 19:50:07.776   openLuup.server:: request completed (821 bytes, 1 chunks, 151 ms) tcp{client}: 0x55c329d0a5b8
    2025-01-10 19:50:07.784   openLuup.io.server:: HTTP:3480 connection closed openLuup.server.receive closed tcp{client}: 0x55c329d0a5b8
    2025-01-10 19:50:07.795   openLuup.io.server:: HTTP:3480 connection from 192.168.70.249 tcp{client}: 0x55c3287bc8f8
    2025-01-10 19:50:07.796   openLuup.server:: GET /data_request?id=status&Timeout=15&DataVersion=416912907&MinimumDelay=50&output_format=json&_r=1736538607794 HTTP/1.1 tcp{client}: 0x55c3287bc8f8
    2025-01-10 19:50:08.644   luup.variable_set:: 20060.urn:upnp-org:serviceId:SwitchPower1.Status was: 0 now: 1 #hooks:0
    2025-01-10 19:50:08.950   openLuup.server:: request completed (821 bytes, 1 chunks, 1154 ms) tcp{client}: 0x55c3287bc8f8
    2025-01-10 19:50:08.958   openLuup.io.server:: HTTP:3480 connection closed openLuup.server.receive closed tcp{client}: 0x55c3287bc8f8
    2025-01-10 19:50:08.969   openLuup.io.server:: HTTP:3480 connection from 192.168.70.249 tcp{client}: 0x55c3297e95a8
    2025-01-10 19:50:08.970   openLuup.server:: GET /data_request?id=status&Timeout=15&DataVersion=416912908&MinimumDelay=50&output_format=json&_r=1736538608969 HTTP/1.1 tcp{client}: 0x55c3297e95a8
    2025-01-10 19:50:19.181   luup.variable_set:: 20770.urn:micasaverde-com:serviceId:SecuritySensor1.Tripped was: 1 now: 0 #hooks:0
    2025-01-10 19:50:19.585   openLuup.server:: request completed (832 bytes, 1 chunks, 10615 ms) tcp{client}: 0x55c3297e95a8
    2025-01-10 19:50:19.602   openLuup.io.server:: HTTP:3480 connection closed openLuup.server.receive closed tcp{client}: 0x55c3297e95a8
    2025-01-10 19:50:19.605   openLuup.io.server:: HTTP:3480 connection from 192.168.70.249 tcp{client}: 0x55c328d298a8
    2025-01-10 19:50:19.605   openLuup.server:: GET /data_request?id=status&Timeout=15&DataVersion=416912909&MinimumDelay=50&output_format=json&_r=1736538619604 HTTP/1.1 tcp{client}: 0x55c328d298a8
    2025-01-10 19:50:34.950   openLuup.server:: request completed (593 bytes, 1 chunks, 15344 ms) tcp{client}: 0x55c328d298a8
    2025-01-10 19:50:34.953   openLuup.io.server:: HTTP:3480 connection closed openLuup.server.receive closed tcp{client}: 0x55c328d298a8
    2025-01-10 19:50:34.965   openLuup.io.server:: HTTP:3480 connection from 192.168.70.249 tcp{client}: 0x55c328c48a58
    2025-01-10 19:50:34.966   openLuup.server:: GET /data_request?id=status&Timeout=15&DataVersion=416912909&MinimumDelay=50&output_format=json&_r=1736538634964 HTTP/1.1 tcp{client}: 0x55c328c48a58
    2025-01-10 19:50:34.989   luup.variable_set:: 25019.urn:micasaverde-com:serviceId:SecuritySensor1.Tripped was: 0 now: 1 #hooks:0
    2025-01-10 19:50:34.990   luup.variable_set:: 25019.urn:micasaverde-com:serviceId:SecuritySensor1.LastTrip was: 1736534437 now: 1736538634 #hooks:0
    2025-01-10 19:50:35.094   openLuup.server:: request completed (975 bytes, 1 chunks, 127 ms) tcp{client}: 0x55c328c48a58
    2025-01-10 19:50:35.101   openLuup.io.server:: HTTP:3480 connection closed openLuup.server.receive closed tcp{client}: 0x55c328c48a58
    2025-01-10 19:50:35.113   openLuup.io.server:: HTTP:3480 connection from 192.168.70.249 tcp{client}: 0x55c32985e298
    2025-01-10 19:50:35.113   openLuup.server:: GET /data_request?id=status&Timeout=15&DataVersion=416912911&MinimumDelay=50&output_format=json&_r=1736538635111 HTTP/1.1 tcp{client}: 0x55c32985e298
    2025-01-10 19:50:40.255   luup.variable_set:: 25021.urn:micasaverde-com:serviceId:LightSensor1.CurrentLevel was: 0 now: 30 #hooks:1
    2025-01-10 19:50:40.256   scheduler.watch_callback:: 25021.urn:micasaverde-com:serviceId:LightSensor1.CurrentLevel called [20]DataWatcherCallback() function: 0x55c3288a8d20
    2025-01-10 19:50:40.460   openLuup.server:: request completed (835 bytes, 1 chunks, 5346 ms) tcp{client}: 0x55c32985e298
    2025-01-10 19:50:40.472   openLuup.io.server:: HTTP:3480 connection closed openLuup.server.receive closed tcp{client}: 0x55c32985e298
    2025-01-10 19:50:40.478   openLuup.io.server:: HTTP:3480 connection from 192.168.70.249 tcp{client}: 0x55c329b28238
    2025-01-10 19:50:40.479   openLuup.server:: GET /data_request?id=status&Timeout=15&DataVersion=416912912&MinimumDelay=50&output_format=json&_r=1736538640478 HTTP/1.1 tcp{client}: 0x55c329b28238
    2025-01-10 19:50:44.471   luup.variable_set:: 25007.urn:micasaverde-com:serviceId:SecuritySensor1.Tripped was: 0 now: 1 #hooks:1
    2025-01-10 19:50:44.472   luup.variable_set:: 25007.urn:micasaverde-com:serviceId:SecuritySensor1.LastTrip was: 1736538400 now: 1736538644 #hooks:0
    2025-01-10 19:50:44.472   scheduler.watch_callback:: 25007.urn:micasaverde-com:serviceId:SecuritySensor1.Tripped called [20]DataWatcherCallback() function: 0x55c3288a8d20
    2025-01-10 19:50:44.775   openLuup.server:: request completed (975 bytes, 1 chunks, 4296 ms) tcp{client}: 0x55c329b28238
    2025-01-10 19:50:44.782   openLuup.io.server:: HTTP:3480 connection closed openLuup.server.receive closed tcp{client}: 0x55c329b28238
    2025-01-10 19:50:44.793   openLuup.io.server:: HTTP:3480 connection from 192.168.70.249 tcp{client}: 0x55c328f1e968
    2025-01-10 19:50:44.793   openLuup.server:: GET /data_request?id=status&Timeout=15&DataVersion=416912914&MinimumDelay=50&output_format=json&_r=1736538644791 HTTP/1.1 tcp{client}: 0x55c328f1e968
    2025-01-10 19:51:00.122   openLuup.server:: request completed (593 bytes, 1 chunks, 15328 ms) tcp{client}: 0x55c328f1e968
    2025-01-10 19:51:00.125   openLuup.io.server:: HTTP:3480 connection closed openLuup.server.receive closed tcp{client}: 0x55c328f1e968
    2025-01-10 19:51:00.136   openLuup.io.server:: HTTP:3480 connection from 192.168.70.249 tcp{client}: 0x55c32995b318
    2025-01-10 19:51:00.136   openLuup.server:: GET /data_request?id=status&Timeout=15&DataVersion=416912914&MinimumDelay=50&output_format=json&_r=1736538660134 HTTP/1.1 tcp{client}: 0x55c32995b318
    2025-01-10 19:51:15.481   openLuup.server:: request completed (593 bytes, 1 chunks, 15344 ms) tcp{client}: 0x55c32995b318
    2025-01-10 19:51:15.484   openLuup.io.server:: HTTP:3480 connection closed openLuup.server.receive closed tcp{client}: 0x55c32995b318
    2025-01-10 19:51:15.495   openLuup.io.server:: HTTP:3480 connection from 192.168.70.249 tcp{client}: 0x55c32998b068
    2025-01-10 19:51:15.497   openLuup.server:: GET /data_request?id=status&Timeout=15&DataVersion=416912914&MinimumDelay=50&output_format=json&_r=1736538675493 HTTP/1.1 tcp{client}: 0x55c32998b068
    2025-01-10 19:51:30.869   openLuup.server:: request completed (593 bytes, 1 chunks, 15371 ms) tcp{client}: 0x55c32998b068
    2025-01-10 19:51:30.872   openLuup.io.server:: HTTP:3480 connection closed openLuup.server.receive closed tcp{client}: 0x55c32998b068
    2025-01-10 19:51:30.884   openLuup.io.server:: HTTP:3480 connection from 192.168.70.249 tcp{client}: 0x55c32905bda8
    2025-01-10 19:51:30.885   openLuup.server:: GET /data_request?id=status&Timeout=15&DataVersion=416912914&MinimumDelay=50&output_format=json&_r=1736538690882 HTTP/1.1 tcp{client}: 0x55c32905bda8
    2025-01-10 19:51:32.886   luup.variable_set:: 20380.urn:upnp-org:serviceId:TemperatureSensor1.CurrentTemperature was: 21 now: 22 #hooks:0
    2025-01-10 19:51:33.090   openLuup.server:: request completed (841 bytes, 1 chunks, 2205 ms) tcp{client}: 0x55c32905bda8
    2025-01-10 19:51:33.100   openLuup.io.server:: HTTP:3480 connection closed openLuup.server.receive closed tcp{client}: 0x55c32905bda8
    2025-01-10 19:51:33.112   openLuup.io.server:: HTTP:3480 connection from 192.168.70.249 tcp{client}: 0x55c328de0d58
    2025-01-10 19:51:33.112   openLuup.server:: GET /data_request?id=status&Timeout=15&DataVersion=416912915&MinimumDelay=50&output_format=json&_r=1736538693111 HTTP/1.1 tcp{client}: 0x55c328de0d58
    2025-01-10 19:51:36.064   luup.variable_set:: 25007.urn:micasaverde-com:serviceId:SecuritySensor1.Tripped was: 1 now: 0 #hooks:1
    2025-01-10 19:51:36.065   scheduler.watch_callback:: 25007.urn:micasaverde-com:serviceId:SecuritySensor1.Tripped called [20]DataWatcherCallback() function: 0x55c3288a8d20
    2025-01-10 19:51:36.369   openLuup.server:: request completed (832 bytes, 1 chunks, 3256 ms) tcp{client}: 0x55c328de0d58
    2025-01-10 19:51:36.377   openLuup.io.server:: HTTP:3480 connection closed openLuup.server.receive closed tcp{client}: 0x55c328de0d58
    2025-01-10 19:51:36.387   openLuup.io.server:: HTTP:3480 connection from 192.168.70.249 tcp{client}: 0x55c329054188
    2025-01-10 19:51:36.388   openLuup.server:: GET /data_request?id=status&Timeout=15&DataVersion=416912916&MinimumDelay=50&output_format=json&_r=1736538696386 HTTP/1.1 tcp{client}: 0x55c329054188
    2025-01-10 19:51:37.134   luup.variable_set:: 20380.urn:upnp-org:serviceId:TemperatureSensor1.CurrentTemperature was: 22 now: 21 #hooks:0
    2025-01-10 19:51:37.540   openLuup.server:: request completed (841 bytes, 1 chunks, 1152 ms) tcp{client}: 0x55c329054188
    2025-01-10 19:51:37.553   openLuup.io.server:: HTTP:3480 connection closed openLuup.server.receive closed tcp{client}: 0x55c329054188
    2025-01-10 19:51:37.566   openLuup.io.server:: HTTP:3480 connection from 192.168.70.249 tcp{client}: 0x55c328d97568
    2025-01-10 19:51:37.566   openLuup.server:: GET /data_request?id=status&Timeout=15&DataVersion=416912917&MinimumDelay=50&output_format=json&_r=1736538697564 HTTP/1.1 tcp{client}: 0x55c328d97568
    2025-01-10 19:51:41.367   luup.variable_set:: 20380.urn:upnp-org:serviceId:TemperatureSensor1.CurrentTemperature was: 21 now: 22 #hooks:0
    2025-01-10 19:51:41.874   openLuup.server:: request completed (841 bytes, 1 chunks, 4307 ms) tcp{client}: 0x55c328d97568
    2025-01-10 19:51:41.884   openLuup.io.server:: HTTP:3480 connection closed openLuup.server.receive closed tcp{client}: 0x55c328d97568
    2025-01-10 19:51:41.895   openLuup.io.server:: HTTP:3480 connection from 192.168.70.249 tcp{client}: 0x55c329385678
    2025-01-10 19:51:41.896   openLuup.server:: GET /data_request?id=status&Timeout=15&DataVersion=416912918&MinimumDelay=50&output_format=json&_r=1736538701894 HTTP/1.1 tcp{client}: 0x55c329385678
    2025-01-10 19:51:57.168   openLuup.server:: request completed (593 bytes, 1 chunks, 15272 ms) tcp{client}: 0x55c329385678
    2025-01-10 19:51:57.171   openLuup.io.server:: HTTP:3480 connection closed openLuup.server.receive closed tcp{client}: 0x55c329385678
    2025-01-10 19:51:57.183   openLuup.io.server:: HTTP:3480 connection from 192.168.70.249 tcp{client}: 0x55c329b092b8
    2025-01-10 19:51:57.184   openLuup.server:: GET /data_request?id=status&Timeout=15&DataVersion=416912918&MinimumDelay=50&output_format=json&_r=1736538717182 HTTP/1.1 tcp{client}: 0x55c329b092b8
    2025-01-10 19:52:00.124   luup_log:0: 14Mb, 1.6%cpu, 36.1days
    2025-01-10 19:52:00.476   openLuup.server:: request completed (1841 bytes, 1 chunks, 3292 ms) tcp{client}: 0x55c329b092b8
    2025-01-10 19:52:00.483   openLuup.io.server:: HTTP:3480 connection closed openLuup.server.receive closed tcp{client}: 0x55c329b092b8
    2025-01-10 19:52:00.495   openLuup.io.server:: HTTP:3480 connection from 192.168.70.249 tcp{client}: 0x55c3297be088
    2025-01-10 19:52:00.495   openLuup.server:: GET /data_request?id=status&Timeout=15&DataVersion=416912929&MinimumDelay=50&output_format=json&_r=1736538720494 HTTP/1.1 tcp{client}: 0x55c3297be088
    2025-01-10 19:52:09.867   luup.variable_set:: 25021.urn:micasaverde-com:serviceId:LightSensor1.CurrentLevel was: 30 now: 0 #hooks:1
    2025-01-10 19:52:09.868   scheduler.watch_callback:: 25021.urn:micasaverde-com:serviceId:LightSensor1.CurrentLevel called [20]DataWatcherCallback() function: 0x55c3288a8d20
    2025-01-10 19:52:10.071   openLuup.server:: request completed (834 bytes, 1 chunks, 9575 ms) tcp{client}: 0x55c3297be088
    2025-01-10 19:52:10.079   openLuup.io.server:: HTTP:3480 connection closed openLuup.server.receive closed tcp{client}: 0x55c3297be088
    2025-01-10 19:52:10.088   openLuup.io.server:: HTTP:3480 connection from 192.168.70.249 tcp{client}: 0x55c329c16a08
    2025-01-10 19:52:10.089   openLuup.server:: GET /data_request?id=status&Timeout=15&DataVersion=416912930&MinimumDelay=50&output_format=json&_r=1736538730087 HTTP/1.1 tcp{client}: 0x55c329c16a08
    2025-01-10 19:52:16.194   luup.variable_set:: 20770.urn:micasaverde-com:serviceId:SecuritySensor1.Tripped was: 0 now: 1 #hooks:0
    2025-01-10 19:52:16.195   luup.variable_set:: 20770.urn:micasaverde-com:serviceId:SecuritySensor1.LastTrip was: 1736538607 now: 1736538736 #hooks:0
    2025-01-10 19:52:16.498   openLuup.server:: request completed (976 bytes, 1 chunks, 6409 ms) tcp{client}: 0x55c329c16a08
    2025-01-10 19:52:16.515   openLuup.io.server:: HTTP:3480 connection closed openLuup.server.receive closed tcp{client}: 0x55c329c16a08
    2025-01-10 19:52:16.516   openLuup.io.server:: HTTP:3480 connection from 192.168.70.249 tcp{client}: 0x55c328dbad18
    

    Nothing I can see indicates that openLuup is reloading.

    The .249 IP address is the internal IP of the NUC that hosts both openLuup and MSR.

    Any thoughts as to how I can troubleshoot this? It's not a big deal, but I'd like to get to the bottom of it.

    I should add that all the devices listed in entries like this:

    [latest-24366]2025-01-10T19:55:09.744Z <VeraController:INFO> VeraController#vera class scene_controller meta [Object]{ "source": "urn:micasaverde-com:serviceId:SceneController1/sl_SceneActivated", "expr": "int(value)" } orig  final NaN
    [latest-24366]2025-01-10T19:55:09.744Z <VeraController:CRIT> *Entity#vera>device_20631
    

    are the tamper switches on Fibaro FGMS001 multifunction detectors, of which I have four, and they correspond exactly to the devices listed.

    TIA

    C

    The Ex-Vera abuser known as CatmanV2.....

    • toggledbits wrote (#2, last edited by toggledbits)

      OpenLuup is not Vera. There are differences in the way it operates, and I don't make any representation that I will address those differences in Reactor, and in my view it's better that openLuup addresses them if needed.

      The "reload detected" message is emitted when the "loadtime" value in a user_data or status API response is different from the prior value. On Vera, this indicates that Luup has reloaded (the value is the timestamp of the reload); it remains stable otherwise. If you are seeing this message on openLuup when it hasn't reloaded, that is different from Vera Luup's native behavior.

      That said, the message is informational only. The only action taken is it updates a counter that drives the x_vera_sys.reloads attribute on the system entity for the VeraController instance. Unless you are using this attribute for some purpose, it has no effect on anything and can be safely ignored.
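      If you want to watch the value Reactor is comparing, you can poll openLuup directly; a minimal sketch, assuming openLuup mirrors Vera's LoadTime field in the status response, and using the NUC address from the logs above:

      curl -s 'http://192.168.70.249:3480/data_request?id=status&output_format=json' | grep -o '"LoadTime":[0-9]*'

      If that number changes between polls without an actual openLuup restart, that would account for the repeated "reload detected" messages.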

      Author of Multi-system Reactor and Reactor, DelayLight, Switchboard, and about a dozen other plugins that run on Vera and openLuup.

      • CatmanV2 wrote (#3)

        That's fabulously clear. Thanks, as ever

        C

        The Ex-Vera abuser known as CatmanV2.....
