    [Neuer Adapter] Proxmox VM

    • Neuschwansteini
      Neuschwansteini @arteck last edited by

      @arteck said in [Neuer Adapter] Proxmox VM:

      I'll take a look later

      ... while you're at it... these errors just showed up during the backup of the LXC that ioBroker runs in:

      2023-10-27 12:26:58.713  - error: proxmox.0 (78860) Use Next Proxmox Host because of communication failure https://192.168.0.210:8006/api2/json/nodes/hpg380/storage/VMBackup/content
      2023-10-27 12:26:58.714  - error: proxmox.0 (78860) Error received response from /nodes/hpg380/storage/VMBackup/content
      2023-10-27 12:26:58.832  - info: mqtt.1 (1223) Client [Presence1221] connection closed: closed
      2023-10-27 12:26:58.669  - info: javascript.0 (461) script.js.common.Nuki.mqtt2log: NUKI Hub WT: Querying lock state: Querying lock state: Querying lock state: unlocked
      2023-10-27 12:27:01.065  - warn: tuya.0 (788) bf88c2fc494909d538nhnf.20: A set command is already in progress. Can not issue a second one that also should return a response.. Try to use cloud.
      2023-10-27 12:27:01.121  - info: sonoff.0 (640) Client [S5V-4] connection closed: closed
      2023-10-27 12:27:02.964  - info: javascript.0 (461) script.js.common.Nuki.mqtt2log: NUKI Hub BI: Querying lock battery state: failed
      2023-10-27 12:27:03.716  - error: proxmox.0 (78860) Unable to update ticket: -1
      2023-10-27 12:27:08.116  - info: javascript.0 (461) script.js.common.Nuki.mqtt2log: NUKI Hub BI: Reading config. Result: failed
      2023-10-27 12:27:08.719  - error: proxmox.0 (78860) Use Next Proxmox Host because of communication failure https://192.168.0.210:8006/api2/json/nodes/hpg380/storage/VMBackup/content
      2023-10-27 12:27:08.719  - error: proxmox.0 (78860) Error received response from /nodes/hpg380/storage/VMBackup/content
      2023-10-27 12:27:09.061  - info: mqtt.1 (1223) Client [Presence1221] connected with secret 1698402429044_5996
      2023-10-27 12:27:11.065  - warn: tuya.0 (788) bf88c2fc494909d538nhnf.20: A set command is already in progress. Can not issue a second one that also should return a response.. Try to use cloud.
      2023-10-27 12:27:12.460  - info: javascript.0 (461) script.js.common.Nuki.mqtt2log: NUKI Hub BI: Reading advanced config. Result: failed
      2023-10-27 12:27:13.721  - error: proxmox.0 (78860) Unable to update ticket: -1
      2023-10-27 12:27:18.723  - error: proxmox.0 (78860) Use Next Proxmox Host because of communication failure https://192.168.0.210:8006/api2/json/nodes/hpg380/storage/VMBackup/content
      2023-10-27 12:27:18.723  - error: proxmox.0 (78860) Error received response from /nodes/hpg380/storage/VMBackup/content
      2023-10-27 12:27:21.064  - warn: tuya.0 (788) bf88c2fc494909d538nhnf.20: A set command is already in progress. Can not issue a second one that also should return a response.. Try to use cloud.
      2023-10-27 12:27:23.725  - error: proxmox.0 (78860) Unable to update ticket: -1
      2023-10-27 12:27:28.727  - error: proxmox.0 (78860) Use Next Proxmox Host because of communication failure https://192.168.0.210:8006/api2/json/nodes/hpg380/storage/VMBackup/content
      2023-10-27 12:27:28.727  - error: proxmox.0 (78860) Error received response from /nodes/hpg380/storage/VMBackup/content
      2023-10-27 12:27:31.066  - warn: tuya.0 (788) bf88c2fc494909d538nhnf.20: A set command is already in progress. Can not issue a second one that also should return a response.. Try to use cloud.
      2023-10-27 12:27:32.385  - info: mqtt.1 (1223) Client [Presence1221] connection closed: timeout
      2023-10-27 12:27:33.730  - error: proxmox.0 (78860) Unable to update ticket: -1
      2023-10-27 12:27:38.732  - error: proxmox.0 (78860) Use Next Proxmox Host because of communication failure https://192.168.0.210:8006/api2/json/nodes/hpg380/storage/VMBackup/content
      2023-10-27 12:27:38.732  - error: proxmox.0 (78860) Error received response from /nodes/hpg380/storage/VMBackup/content
      2023-10-27 12:27:43.734  - error: proxmox.0 (78860) Unable to update ticket: -1
      2023-10-27 12:27:48.735  - error: proxmox.0 (78860) Use Next Proxmox Host because of communication failure https://192.168.0.210:8006/api2/json/nodes/hpg380/storage/VMBackup/content
      2023-10-27 12:27:48.736  - error: proxmox.0 (78860) Error received response from /nodes/hpg380/storage/VMBackup/content
      2023-10-27 12:27:53.738  - error: proxmox.0 (78860) Unable to update ticket: -1
      2023-10-27 12:27:56.752  - info: javascript.0 (461) script.js.common.Nuki.mqtt2log: NUKI Hub WT: Querying lock state: Querying lock state: Querying lock state: Querying lock state: Querying lock state: Querying lock state: unlocked
      2023-10-27 12:27:58.740  - error: proxmox.0 (78860) Use Next Proxmox Host because of communication failure https://192.168.0.210:8006/api2/json/nodes/hpg380/storage/VMBackup/content
      2023-10-27 12:27:58.740  - error: proxmox.0 (78860) Error received response from /nodes/hpg380/storage/VMBackup/content
      2023-10-27 12:28:02.713  - info: mqtt.1 (1223) Client [Presence1221] connected with secret 1698402482713_3323
      2023-10-27 12:28:03.321  - info: javascript.0 (461) script.js.common.Nuki.mqtt2log: NUKI Hub WT: Querying lock state: unlocked
      2023-10-27 12:28:03.742  - error: proxmox.0 (78860) Unable to update ticket: -1
      2023-10-27 12:28:08.744  - error: proxmox.0 (78860) Use Next Proxmox Host because of communication failure https://192.168.0.210:8006/api2/json/nodes/hpg380/storage/VMBackup/content
      2023-10-27 12:28:08.744  - error: proxmox.0 (78860) Error received response from /nodes/hpg380/storage/VMBackup/content
      2023-10-27 12:28:13.746  - error: proxmox.0 (78860) Unable to update ticket: -1
      2023-10-27 12:28:17.676  - warn: synochat.0 (5912) Attempt to send message '8b032070-74b3-11ee-8289-9fa88030d194' failed due to too fast sending sequence: 'create post too fast' > Delaying the message for 1 second/s...
      2023-10-27 12:28:18.748  - error: proxmox.0 (78860) Use Next Proxmox Host because of communication failure https://192.168.0.210:8006/api2/json/nodes/hpg380/storage/VMBackup/content
      2023-10-27 12:28:18.748  - error: proxmox.0 (78860) Error received response from /nodes/hpg380/storage/VMBackup/content
      2023-10-27 12:28:23.750  - error: proxmox.0 (78860) Unable to update ticket: -1
      2023-10-27 12:28:28.752  - error: proxmox.0 (78860) Use Next Proxmox Host because of communication failure https://192.168.0.210:8006/api2/json/nodes/hpg380/storage/VMBackup/content
      2023-10-27 12:28:28.752  - error: proxmox.0 (78860) Error received response from /nodes/hpg380/storage/VMBackup/content
      2023-10-27 12:28:33.753  - error: proxmox.0 (78860) Unable to update ticket: -1
      2023-10-27 12:28:38.756  - error: proxmox.0 (78860) Use Next Proxmox Host because of communication failure https://192.168.0.210:8006/api2/json/nodes/hpg380/storage/VMBackup/content
      2023-10-27 12:28:38.756  - error: proxmox.0 (78860) Error received response from /nodes/hpg380/storage/VMBackup/content
      2023-10-27 12:28:41.532  - info: mqtt.1 (1223) Client [Presence1220] connection closed: timeout
      2023-10-27 12:28:43.758  - error: proxmox.0 (78860) Unable to update ticket: -1
      2023-10-27 12:28:48.760  - error: proxmox.0 (78860) Use Next Proxmox Host because of communication failure https://192.168.0.210:8006/api2/json/nodes/hpg380/storage/VMBackup/content
      2023-10-27 12:28:48.760  - error: proxmox.0 (78860) Error received response from /nodes/hpg380/storage/VMBackup/content
      2023-10-27 12:28:48.890  - info: javascript.0 (461) script.js.common.Nuki.mqtt2log: NUKI Hub WT: Querying lock state: Querying lock state: Querying lock state: Querying lock state: Querying lock state: unlocked
      2023-10-27 12:28:53.762  - error: proxmox.0 (78860) Unable to update ticket: -1
      2023-10-27 12:28:58.764  - error: proxmox.0 (78860) Use Next Proxmox Host because of communication failure https://192.168.0.210:8006/api2/json/nodes/hpg380/storage/VMBackup/content
      2023-10-27 12:28:58.764  - error: proxmox.0 (78860) Error received response from /nodes/hpg380/storage/VMBackup/content
      2023-10-27 12:29:01.064  - info: mqtt.1 (1223) Client [Presence1222] connection closed: timeout
      2023-10-27 12:29:02.932  - info: mqtt.1 (1223) Client [Presence1221] connection closed: closed
      2023-10-27 12:29:03.766  - error: proxmox.0 (78860) Unable to update ticket: -1
      2023-10-27 12:29:07.889  - info: javascript.0 (461) script.js.common.Nuki.mqtt2log: NUKI Hub WT: Querying lock state: Querying lock state: unlocked
      2023-10-27 12:29:08.769  - error: proxmox.0 (78860) Use Next Proxmox Host because of communication failure https://192.168.0.210:8006/api2/json/nodes/hpg380/storage/VMBackup/content
      2023-10-27 12:29:08.769  - error: proxmox.0 (78860) Error received response from /nodes/hpg380/storage/VMBackup/content
      2023-10-27 12:29:13.229  - info: mqtt.1 (1223) Client [Presence1220] connected with secret 1698402553186_4082
      2023-10-27 12:29:13.771  - error: proxmox.0 (78860) Unable to update ticket: -1
      2023-10-27 12:29:16.778  - info: bluelink.0 (1435) Read new update for KMHKR81AFNU011963 directly from the car
      2023-10-27 12:29:18.773  - error: proxmox.0 (78860) Use Next Proxmox Host because of communication failure https://192.168.0.210:8006/api2/json/nodes/hpg380/storage/VMBackup/content
      2023-10-27 12:29:18.773  - error: proxmox.0 (78860) Error received response from /nodes/hpg380/storage/VMBackup/content
      2023-10-27 12:29:21.407  - info: sonoff.0 (640) Client [nx4519-1] reconnected. Old secret 1698388565136_7187. New secret 1698402561404_1831
      2023-10-27 12:29:21.506  - warn: sonoff.0 (640) Old client nx4519-1 with secret 1698388565136_7187 sends publish. Ignore! Actual secret is 1698402561404_1831
      2023-10-27 12:29:21.506  - warn: sonoff.0 (640) Old client nx4519-1 with secret 1698388565136_7187 sends pingreq. Ignore! Actual secret is 1698402561404_1831
      2023-10-27 12:29:21.506  - warn: sonoff.0 (640) Old client nx4519-1 with secret 1698388565136_7187 sends publish. Ignore! Actual secret is 1698402561404_1831
      2023-10-27 12:29:21.889  - info: mqtt.1 (1223) Client [Presence1221] connected with secret 1698402561878_5954
      2023-10-27 12:29:23.775  - error: proxmox.0 (78860) Unable to update ticket: -1
      2023-10-27 12:29:28.777  - error: proxmox.0 (78860) Use Next Proxmox Host because of communication failure https://192.168.0.210:8006/api2/json/nodes/hpg380/storage/VMBackup/content
      2023-10-27 12:29:28.777  - error: proxmox.0 (78860) Error received response from /nodes/hpg380/storage/VMBackup/content
      2023-10-27 12:29:31.327  - info: mqtt.1 (1223) Client [Presence1222] connected with secret 1698402571314_8823
      2023-10-27 12:29:32.379  - info: tuya.0 (788) 30530530c45bbedb673e: Connect locally to device
      2023-10-27 12:29:32.380  - info: tuya.0 (788) 30530530c45bbedb673e Init with IP=10.1.12.106, Key=>':x*F6`S#`J14cF, Version=3.3
      2023-10-27 12:29:33.779  - error: proxmox.0 (78860) Unable to update ticket: -1
      2023-10-27 12:29:37.865  - info: bluelink.0 (1435) Update for KMHKR81AFNU011963 successfull
      2023-10-27 12:29:38.781  - error: proxmox.0 (78860) Use Next Proxmox Host because of communication failure https://192.168.0.210:8006/api2/json/nodes/hpg380/storage/VMBackup/content
      2023-10-27 12:29:38.781  - error: proxmox.0 (78860) Error received response from /nodes/hpg380/storage/VMBackup/content
      

      • arteck
        arteck Developer Most Active @Neuschwansteini last edited by arteck

        @ilovegym said in [Neuer Adapter] Proxmox VM:

        /nodes/hpg380/storage/VMBackup/content

        the storage isn't there.. and since you only have one node, or have only configured one, it can't switch

        this message used to be suppressed
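        For illustration, a minimal sketch of the idea behind the "Use Next Proxmox Host" message (not the adapter's actual code; the names here are invented): on a communication failure the adapter rotates to the next configured host, and with a single host there is nothing to rotate to, so the same pair of errors repeats for as long as the storage is unreachable during the backup.

        // Sketch only, assuming a plain list of configured hosts
        const hosts = ['192.168.0.210']; // only one host configured in this setup
        let current = 0;

        function useNextHost(log, path) {
            log.error(`Use Next Proxmox Host because of communication failure https://${hosts[current]}:8006/api2/json${path}`);
            if (hosts.length > 1) {
                current = (current + 1) % hosts.length; // fail over to the next host
            }
            // with a single entry nothing changes, so the next poll hits the
            // same unreachable host and logs the same error again
            return hosts[current];
        }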

        • Neuschwansteini
          Neuschwansteini @arteck last edited by

          @arteck

          yes, because it's writing the backup right now.. hmmpff, so that means the message now always appears whenever I run a backup in Proxmox.. ? with the LXC.. ?
          Can the timeout for that error be raised? It comes exactly once, as in the logfile, then everything is quiet again..

          • Negalein
            Negalein Global Moderator @arteck last edited by

            @arteck said in [Neuer Adapter] Proxmox VM:

            delete it .. and restart the adapter

            Thanks, it works again.

            • arteck
              arteck Developer Most Active @Neuschwansteini last edited by

              @ilovegym then take a look now

              • bahnuhr
                bahnuhr Forum Testing Most Active @arteck last edited by bahnuhr

                @arteck

                I also installed from git.

                Please check one more thing:
                maxmem is not updated in the objects.

                both here:
                9ee736f7-41b3-48f8-9de2-277d157592bd-image.png

                and here:
                77ef60ee-ba2c-47db-8f83-fdfd9aa269e7-image.png

                • arteck
                  arteck Developer Most Active @bahnuhr last edited by arteck

                  @bahnuhr

                  vm:
                  yes it does.. but the machine has to be running..

                  I'll still look at node

                  this is the setting
                  0b00ce71-205e-4b1a-8231-b713ce935bd5-grafik.png

                  • bahnuhr
                    bahnuhr Forum Testing Most Active @arteck last edited by

                    @arteck
                    Of course the machine is on.

                    And yes, the DP is updated when the instance starts.
                    But not after that.

                    The other mem DPs, in contrast, are updated regularly (about every 30 s).

                    • arteck
                      arteck Developer Most Active @bahnuhr last edited by arteck

                      @bahnuhr take a look now

                      besides, it only performs an update when something has actually changed...
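                      In code terms that is roughly this pattern (a sketch using the standard getStateAsync/setStateAsync adapter methods, not the adapter's real implementation):

                      // Sketch: write a datapoint only when the new reading differs from the stored value
                      async function writeIfChanged(adapter, id, value) {
                          const old = await adapter.getStateAsync(id); // last stored state, or null
                          if (!old || old.val !== value) {
                              await adapter.setStateAsync(id, { val: value, ack: true });
                          }
                          // a value that practically never changes (like maxmem) therefore
                          // appears to be updated only once, at instance start
                      }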

                      • ?
                        A Former User @bahnuhr last edited by A Former User

                        @bahnuhr said in [Neuer Adapter] Proxmox VM:

                        And yes, the DP is updated when the instance starts.
                        But not after that.

                        Hi,

                        max mem is more of a static value that doesn't change continuously, whereas mem does change at runtime. Try playing with the MaxMem setting and then check whether the change shows up.

                        Regards,
                        Bernd

                        • David G.
                          David G. last edited by David G.

                          Recently a DP with a JSON containing the backups was added to the adapter.

                          I've made a small table from it for my vis.

                          It's the bottom table in the first post:
                          https://forum.iobroker.net/topic/59661/zusammenstellung-meiner-tabellen?_=1701274643122

                          f2f1282a-fdf4-424b-ad8d-518164d450ce-image.png

                          a26af34e-45dc-439b-9c0f-212dbc408f69-image.png

                          be772d21-c508-4de7-9c94-324880100356-image.png

                          Edit:

                          @Meistertr
                          How often is the file with the backup info updated?
                          Thanks to the table I've noticed that it sometimes takes hours before I get current data.
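                          As an illustration of how such a table can be fed from the backup JSON, here is a small sketch for the javascript adapter; the datapoint IDs and JSON field names are assumptions and have to be adapted to the real object tree:

                          // Sketch: build table rows for vis from the backup JSON datapoint
                          const dpBackups = 'proxmox.0.backupJSON';          // hypothetical ID, adjust to your setup
                          const dpTable   = '0_userdata.0.vis.backupTable';  // target state, must already exist

                          on({ id: dpBackups, change: 'any' }, obj => {
                              const backups = JSON.parse(obj.state.val || '[]');
                              const rows = backups.map(b => ({
                                  // field names (volid, size, ctime) assumed from the Proxmox storage content listing
                                  backup: b.volid,
                                  size:   Math.round(b.size / 1024 / 1024) + ' MB',
                                  time:   new Date(b.ctime * 1000).toLocaleString()
                              }));
                              setState(dpTable, JSON.stringify(rows), true);
                          });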

                          • David G.
                            David G. last edited by David G.

                            Since I've had the table mentioned above in use, I've noticed that the adapter keeps crashing on me.
                            Sometimes after a few minutes, sometimes after a few hours.

                            This was the log before the last crash.
                            Any ideas?

                            2023-12-12 23:15:20.511  - debug: proxmox.0 (395719) found states: [["proxmox.0.storage.pve_tank","active","default_num",1],["proxmox.0.storage.pve_tank","avail","size",633887],["proxmox.0.storage.pve_tank","content","text","rootdir,images"],["proxmox.0.storage.pve_tank","total","size",1841664],["proxmox.0.storage.pve_tank","used_lev","level",65.58],["proxmox.0.storage.pve_tank","used","size",1207777],["proxmox.0.storage.pve_tank","shared","default_num",0],["proxmox.0.storage.pve_tank","type","text","zfspool"],["proxmox.0.storage.pve_tank","enabled","default_num",1]]
                            2023-12-12 23:15:33.369  - info: host.iobroker stopInstance system.adapter.proxmox.0 (force=false, process=true)
                            2023-12-12 23:15:33.373  - info: proxmox.0 (395719) Got terminate signal TERMINATE_YOURSELF
                            2023-12-12 23:15:33.373  - debug: proxmox.0 (395719) clearing request timeout
                            2023-12-12 23:15:33.374  - info: proxmox.0 (395719) terminating
                            2023-12-12 23:15:33.374  - info: proxmox.0 (395719) Terminated (ADAPTER_REQUESTED_TERMINATION): Without reason
                            2023-12-12 23:15:33.417  - info: host.iobroker stopInstance system.adapter.proxmox.0 send kill signal
                            2023-12-12 23:15:33.874  - info: proxmox.0 (395719) terminating
                            2023-12-12 23:15:33.963  - info: host.iobroker instance system.adapter.proxmox.0 terminated with code 11 (ADAPTER_REQUESTED_TERMINATION)
                            2023-12-12 23:15:36.519  - info: host.iobroker instance system.adapter.proxmox.0 started with pid 395797
                            2023-12-12 23:15:37.898  - info: proxmox.0 (395797) starting. Version 2.2.2 in /opt/iobroker/node_modules/iobroker.proxmox, node: v18.19.0, js-controller: 5.0.12
                            2023-12-12 23:15:37.909  - warn: proxmox.0 (395797) Using Proxmox API: https://192.168.99.58:8006/api2/json
                            

                            From this point on, nothing related to Proxmox appears in the log anymore.

                            Adapter v. 2.2.2
                            Proxmox 8.3.1
                            Admin: 6.12.0
                            js-controller: 5.0.12
                            node: 18.19
                            O/S: bookworm

                            • arteck
                              arteck Developer Most Active @David G. last edited by

                              @david-g post the complete log..

                              • David G.
                                David G. @arteck last edited by

                                @arteck
                                 The complete one with all adapters, or only proxmox.0?

                                • arteck
                                  arteck Developer Most Active @David G. last edited by

                                   @david-g the complete one

                                  • David G.
                                    David G. @arteck last edited by

                                    @arteck

                                     The log is 13 MB.
                                     How can I get it to you? I'm not allowed to upload that much here.

                                     I replaced the password of the Proxmox user. I noticed that it appears in plain text in the log.

                                     I hope the other adapters don't do that. Skimming through it, I didn't spot anything.

                                    • arteck
                                      arteck Developer Most Active @David G. last edited by

                                       @david-g what on earth is running there ..13 MB .. dude.

                                       then do it from 11 p.m. onwards

                                      • David G.
                                        David G. @arteck last edited by

                                        @arteck

                                         Attached:
                                         iobroker.2023-12-12.log

                                         Regarding the log size:
                                         50b97342-0c5e-4cb9-93da-058cb9e8f0f9-image.png

                                         The Proxmox adapter was running on the 12th.
                                         So I'm clearly blaming it for that.......
                                         I think it was also set to debug, because I wanted to see whether I could find out why it crashes.
                                         Not sure right now whether that was still the case on the 12th, though.

                                        • arteck
                                          arteck Developer Most Active @David G. last edited by

                                           @david-g it restarts.. and runs fine afterwards.. the only question is why it restarts??
                                           set it to debug,, then post everything from the line below (that's the start)

                                          Using Proxmox API: https://192.168.99.58:8006/api2/json
                                          

                                           up until after the crash
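                                           If the file is large, the relevant slice can also be cut out with a few lines of Node.js instead of scrolling by hand (a sketch; the file name is only an example):

                                           // Sketch: print everything from the last "Using Proxmox API" start line to the end of the log
                                           const fs = require('fs');

                                           const lines = fs.readFileSync('iobroker.2023-12-12.log', 'utf8').split('\n');
                                           let start = 0;
                                           lines.forEach((l, i) => { if (l.includes('Using Proxmox API')) start = i; });
                                           console.log(lines.slice(start).join('\n'));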

                                          • ?
                                            A Former User @arteck last edited by

                                             @arteck said in [Neuer Adapter] Proxmox VM:

                                             set it to debug,,

                                             Hi,

                                             proxmox.0 is already set to debug

                                            2023-12-12 23:12:03.174  - info: proxmox.0 (365461) Got terminate signal TERMINATE_YOURSELF
                                            2023-12-12 23:12:03.174  - info: proxmox.0 (365461) terminating
                                            2023-12-12 23:12:03.174  - info: proxmox.0 (365461) Terminated (ADAPTER_REQUESTED_TERMINATION): Without reason
                                            2023-12-12 23:12:03.303  - info: host.iobroker stopInstance system.adapter.proxmox.0 send kill signal
                                            2023-12-12 23:12:03.676  - info: proxmox.0 (365461) terminating
                                            2023-12-12 23:12:03.811  - info: host.iobroker instance system.adapter.proxmox.0 terminated with code 11 (ADAPTER_REQUESTED_TERMINATION)
                                            2023-12-12 23:12:06.413  - info: host.iobroker instance system.adapter.proxmox.0 started with pid 395336
                                            2023-12-12 23:12:06.842  - debug: proxmox.0 (395336) Redis Objects: Use Redis connection: 127.0.0.1:9001
                                            2023-12-12 23:12:06.875  - debug: proxmox.0 (395336) Objects client ready ... initialize now
                                            2023-12-12 23:12:06.919  - debug: proxmox.0 (395336) Objects create System PubSub Client
                                            2023-12-12 23:12:06.920  - debug: proxmox.0 (395336) Objects create User PubSub Client
                                            2023-12-12 23:12:06.981  - debug: proxmox.0 (395336) Objects client initialize lua scripts
                                            2023-12-12 23:12:06.991  - debug: proxmox.0 (395336) Objects connected to redis: 127.0.0.1:9001
                                            2023-12-12 23:12:07.024  - debug: proxmox.0 (395336) Redis States: Use Redis connection: 127.0.0.1:9000
                                            2023-12-12 23:12:07.091  - debug: proxmox.0 (395336) States create System PubSub Client
                                            2023-12-12 23:12:07.092  - debug: proxmox.0 (395336) States create User PubSub Client
                                            2023-12-12 23:12:07.172  - debug: proxmox.0 (395336) States connected to redis: 127.0.0.1:9000
                                            2023-12-12 23:12:07.548  - info: proxmox.0 (395336) starting. Version 2.2.2 in /opt/iobroker/node_modules/iobroker.proxmox, node: v18.19.0, js-controller: 5.0.12
                                            2023-12-12 23:12:07.562  - warn: proxmox.0 (395336) Using Proxmox API: https://192.168.99.58:8006/api2/json
                                            2023-12-12 23:12:07.688  - debug: proxmox.0 (395336) received 200 response from /access/ticket?username=root@pam&password=%23********** with content: {"data":{"username":"root@pam","cap":{"nodes":{"Sys.Syslog":1,"Sys.Audit":1,"Permissions.Modify":1,"Sys.PowerMgmt":1,"Sys.Modify":1,"Sys.Console":1,"Sys.Incoming":1},"mapping":{"Mapping.Audit":1,"Permissions.Modify":1,"Mapping.Modify":1,"Mapping.Use":1},"access":{"User.Modify":1,"Permissions.Modify":1,"Group.Allocate":1},"sdn":{"Permissions.Modify":1,"SDN.Allocate":1,"SDN.Audit":1,"SDN.Use":1},"vms":{"VM.Config.Cloudinit":1,"Permissions.Modify":1,"VM.Snapshot.Rollback":1,"VM.PowerMgmt":1,"VM.Config.CPU":1,"VM.Backup":1,"VM.Config.Network":1,"VM.Snapshot":1,"VM.Config.Disk":1,"VM.Migrate":1,"VM.Clone":1,"VM.Allocate":1,"VM.Config.HWType":1,"VM.Config.CDROM":1,"VM.Config.Options":1,"VM.Console":1,"VM.Audit":1,"VM.Config.Memory":1,"VM.Monitor":1},"storage":{"Datastore.AllocateTemplate":1,"Datastore.AllocateSpace":1,"Permissions.Modify":1,"Datastore.Allocate":1,"Datastore.Audit":1},"dc":{"Sys.Audit":1,"SDN.Allocate":1,"SDN.Audit":1,"Sys.Modify":1,"SDN.Use":1}},"CSRFPreventionToken":"6578DAB7:JMkHnQHWRxXqCzySEBJxywj2sHYusuv6MkJ24yFkFVM","ticket":"PVE:root@pam:6578DAB7::Y/sZiRygHY6y5zMUWGFTFNBXL6krD62rJ93E3q7Gh9LP1Ww0usgQelwNcS2//vMf5kaVn5sMDvZyXAUFUyoxSionNJo1zg+Ry7KkuHC0kBN2toxxUZjTHaf/3emgrkasCyS1qR12lVB/+6Gc6Vq+eq+Gg+iqV1cdv/Uuv7k6rbg0D4c3Kdf0aCSAAZACRxf0uz3/LdZL2v6gEw7gTaPecjYqexKplEJDNYLRNLtw1D5P/Qc80SYJA2X8Kpmm1qbRARgZg2PrrpS+0yOYfe0luY+X7Dho2c/L8Ukn5jskVoyiyAwhyPu1ZKT28sTPvcpIPdJ/2kNLEy6Nb+uR0XvUAA=="}}
                                            2023-12-12 23:12:07.691  - debug: proxmox.0 (395336) dataticket: {"data":{"username":"root@pam","cap":{"nodes":{"Sys.Syslog":1,"Sys.Audit":1,"Permissions.Modify":1,"Sys.PowerMgmt":1,"Sys.Modify":1,"Sys.Console":1,"Sys.Incoming":1},"mapping":{"Mapping.Audit":1,"Permissions.Modify":1,"Mapping.Modify":1,"Mapping.Use":1},"access":{"User.Modify":1,"Permissions.Modify":1,"Group.Allocate":1},"sdn":{"Permissions.Modify":1,"SDN.Allocate":1,"SDN.Audit":1,"SDN.Use":1},"vms":{"VM.Config.Cloudinit":1,"Permissions.Modify":1,"VM.Snapshot.Rollback":1,"VM.PowerMgmt":1,"VM.Config.CPU":1,"VM.Backup":1,"VM.Config.Network":1,"VM.Snapshot":1,"VM.Config.Disk":1,"VM.Migrate":1,"VM.Clone":1,"VM.Allocate":1,"VM.Config.HWType":1,"VM.Config.CDROM":1,"VM.Config.Options":1,"VM.Console":1,"VM.Audit":1,"VM.Config.Memory":1,"VM.Monitor":1},"storage":{"Datastore.AllocateTemplate":1,"Datastore.AllocateSpace":1,"Permissions.Modify":1,"Datastore.Allocate":1,"Datastore.Audit":1},"dc":{"Sys.Audit":1,"SDN.Allocate":1,"SDN.Audit":1,"Sys.Modify":1,"SDN.Use":1}},"CSRFPreventionToken":"6578DAB7:JMkHnQHWRxXqCzySEBJxywj2sHYusuv6MkJ24yFkFVM","ticket":"PVE:root@pam:6578DAB7::Y/sZiRygHY6y5zMUWGFTFNBXL6krD62rJ93E3q7Gh9LP1Ww0usgQelwNcS2//vMf5kaVn5sMDvZyXAUFUyoxSionNJo1zg+Ry7KkuHC0kBN2toxxUZjTHaf/3emgrkasCyS1qR12lVB/+6Gc6Vq+eq+Gg+iqV1cdv/Uuv7k6rbg0D4c3Kdf0aCSAAZACRxf0uz3/LdZL2v6gEw7gTaPecjYqexKplEJDNYLRNLtw1D5P/Qc80SYJA2X8Kpmm1qbRARgZg2PrrpS+0yOYfe0luY+X7Dho2c/L8Ukn5jskVoyiyAwhyPu1ZKT28sTPvcpIPdJ/2kNLEy6Nb+uR0XvUAA=="}}
                                            2023-12-12 23:12:07.691  - debug: proxmox.0 (395336) Updating ticket to "PVE:root@pam:6578DAB7::Y/sZiRygHY6y5zMUWGFTFNBXL6krD62rJ93E3q7Gh9LP1Ww0usgQelwNcS2//vMf5kaVn5sMDvZyXAUFUyoxSionNJo1zg+Ry7KkuHC0kBN2toxxUZjTHaf/3emgrkasCyS1qR12lVB/+6Gc6Vq+eq+Gg+iqV1cdv/Uuv7k6rbg0D4c3Kdf0aCSAAZACRxf0uz3/LdZL2v6gEw7gTaPecjYqexKplEJDNYLRNLtw1D5P/Qc
                                            

                                             Regards,
                                             Bernd
