iobroker influxdb1.x econnrefused 192.168.178.120:8086
-
@meister-mopper said in iobroker influxdb1.x econnrefused 192.168.178.120:8086:
It's not complete
I think it is, but quite a few things are running off the rails there:
@pepa said in iobroker influxdb1.x econnrefused 192.168.178.120:8086:
Display-Server: true
Desktop: cinnamon
Terminal: x11
Boot Target: graphical.target
@pepa said in iobroker influxdb1.x econnrefused 192.168.178.120:8086:
- system.adapter.influxdb.0 : influxdb : ppmint-ESPRIMO-Q556-2 - enabled, port: 8086
- system.adapter.influxdb.1 : influxdb : ppmint-ESPRIMO-Q556-2 - enabled, port: 8086
- system.adapter.javascript.0 : javascript : ppmint-ESPRIMO-Q556-2 - enabled
- system.adapter.mqtt-client.0 : mqtt-client : ppmint-ESPRIMO-Q556-2 - enabled, port: 1883
- system.adapter.mqtt.0 : mqtt : ppmint-ESPRIMO-Q556-2 - enabled, port: 1883, bind: 0.0.0.0
- system.adapter.ping.0 : ping : ppmint-ESPRIMO-Q556-2 - enabled
- system.adapter.sonoff.0 : sonoff : ppmint-ESPRIMO-Q556-2 - enabled, port: 1883, bind: 0.0.0.0
edit:
Also, please do not set the log level of the instances to SILLY.
-
Hello everyone,
now here as well: after the javascript adapter update there is no connection to influxdb anymore.
Error: connect ECONNREFUSED 192.168.178.58:8086
iob diag:
======== Start marking the full check here =========
Skript v.2024-05-22

*** BASE SYSTEM ***
Static hostname: raspberrypi
Icon name: computer
Operating System: Debian GNU/Linux 12 (bookworm)
Kernel: Linux 6.1.21-v8+
Architecture: arm64
Model: Raspberry Pi 4 Model B Rev 1.5
Docker: false
Virtualization: none
Kernel: aarch64
Userland: 64 bit

Systemuptime and Load:
17:27:50 up 26 min, 2 users, load average: 4.50, 3.25, 2.51
CPU threads: 4

*** RASPBERRY THROTTLING ***
Current issues: No throttling issues detected.
Previously detected issues: No throttling issues detected.

*** Time and Time Zones ***
Local time: Sun 2024-05-26 17:27:50 CEST
Universal time: Sun 2024-05-26 15:27:50 UTC
RTC time: n/a
Time zone: Europe/Berlin (CEST, +0200)
System clock synchronized: yes
NTP service: active
RTC in local TZ: no

*** Users and Groups ***
User that called 'iob diag': ronny
HOME=/home/ronny
GROUPS=ronny adm dialout cdrom sudo audio video plugdev games users input render netdev lpadmin docker gpio i2c spi iobroker
User that is running 'js-controller': iobroker
HOME=/home/iobroker
GROUPS=iobroker tty dialout audio video bluetooth gpio i2c

*** Display-Server-Setup ***
Display-Server: true
Desktop:
Terminal: tty
Boot Target: multi-user.target

*** MEMORY ***
        total  used  free  shared  buff/cache  available
Mem:     4.0G  1.6G  551M     56M        2.0G       2.4G
Swap:    104M    0B  104M
Total:   4.1G  1.6G  656M

Active iob-Instances: 15

3794 M total memory
1516 M used memory
2045 M active memory
684 M inactive memory
526 M free memory
227 M buffer memory
1653 M swap cache
99 M total swap
0 M used swap
99 M free swap

*** top - Table Of Processes ***
top - 17:27:51 up 26 min, 2 users, load average: 4.50, 3.25, 2.51
Tasks: 238 total, 1 running, 237 sleeping, 0 stopped, 0 zombie
%Cpu(s): 0.0 us, 14.3 sy, 0.0 ni, 85.7 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
MiB Mem: 3794.3 total, 525.2 free, 1517.6 used, 1881.1 buff/cache
MiB Swap: 100.0 total, 100.0 free, 0.0 used. 2276.7 avail Mem

*** FAILED SERVICES ***
UNIT               LOAD   ACTIVE SUB    DESCRIPTION
* influxdb.service loaded failed failed InfluxDB is an open-source, distributed, time series database
* rc-local.service loaded failed failed /etc/rc.local Compatibility

LOAD   = Reflects whether the unit definition was properly loaded.
ACTIVE = The high-level unit activation state, i.e. generalization of SUB.
SUB    = The low-level unit activation state, values depend on unit type.
2 loaded units listed.

*** FILESYSTEM ***
Filesystem                               Type      Size  Used Avail Use% Mounted on
/dev/root                                ext4       29G   23G  4.2G  85% /
devtmpfs                                 devtmpfs  1.7G     0  1.7G   0% /dev
tmpfs                                    tmpfs     1.9G     0  1.9G   0% /dev/shm
tmpfs                                    tmpfs     759M  3.3M  756M   1% /run
tmpfs                                    tmpfs     5.0M   16K  5.0M   1% /run/lock
/dev/mmcblk0p1                           vfat      255M   31M  225M  13% /boot
tmpfs                                    tmpfs     380M   36K  380M   1% /run/user/1000
//192.168.178.1/fritz.nas/FRITZ_1/Backup cifs      118G  4.9G  113G   5% /opt/iobroker/backups

Messages concerning ext4 filesystem in dmesg:
[Sun May 26 17:01:35 2024] Kernel command line: coherent_pool=1M 8250.nr_uarts=0 snd_bcm2835.enable_headphones=0 snd_bcm2835.enable_headphones=1 snd_bcm2835.enable_hdmi=1 snd_bcm2835.enable_hdmi=0 smsc95xx.macaddr=D8:3A:DD:16:F4:39 vc_mem.mem_base=0x3ec00000 vc_mem.mem_size=0x40000000 console=ttyS0,115200 console=tty1 root=PARTUUID=e446591d-02 rootfstype=ext4 fsck.repair=yes rootwait quiet splash plymouth.ignore-serial-consoles
[Sun May 26 17:01:36 2024] EXT4-fs (mmcblk0p2): INFO: recovery required on readonly filesystem
[Sun May 26 17:01:36 2024] EXT4-fs (mmcblk0p2): write access will be enabled during recovery
[Sun May 26 17:01:37 2024] EXT4-fs (mmcblk0p2): recovery complete
[Sun May 26 17:01:37 2024] EXT4-fs (mmcblk0p2): mounted filesystem with ordered data mode. Quota mode: none.
[Sun May 26 17:01:37 2024] VFS: Mounted root (ext4 filesystem) readonly on device 179:2.
[Sun May 26 17:01:40 2024] EXT4-fs (mmcblk0p2): re-mounted. Quota mode: none.
[Sun May 26 17:24:58 2024] EXT4-fs warning (device mmcblk0p2): ext4_dirblock_csum_verify:404: inode #324609: comm du: No space for directory leaf checksum. Please run e2fsck -D.
[Sun May 26 17:24:58 2024] EXT4-fs error (device mmcblk0p2): htree_dirblock_to_tree:1072: inode #324609: comm du: Directory block failed checksum
[Sun May 26 17:24:58 2024] EXT4-fs warning (device mmcblk0p2): ext4_dirblock_csum_verify:404: inode #325045: comm du: No space for directory leaf checksum. Please run e2fsck -D.
[Sun May 26 17:24:58 2024] EXT4-fs error (device mmcblk0p2): htree_dirblock_to_tree:1072: inode #325045: comm du: Directory block failed checksum
[Sun May 26 17:24:58 2024] EXT4-fs warning (device mmcblk0p2): ext4_dirblock_csum_verify:404: inode #325090: comm du: No space for directory leaf checksum. Please run e2fsck -D.
[Sun May 26 17:24:58 2024] EXT4-fs error (device mmcblk0p2): htree_dirblock_to_tree:1072: inode #325090: comm du: Directory block failed checksum
[Sun May 26 17:24:58 2024] EXT4-fs warning (device mmcblk0p2): ext4_dirblock_csum_verify:404: inode #325078: comm du: No space for directory leaf checksum. Please run e2fsck -D.
[Sun May 26 17:24:58 2024] EXT4-fs error (device mmcblk0p2): htree_dirblock_to_tree:1072: inode #325078: comm du: Directory block failed checksum
[Sun May 26 17:24:58 2024] EXT4-fs warning (device mmcblk0p2): ext4_dirblock_csum_verify:404: inode #325086: comm du: No space for directory leaf checksum. Please run e2fsck -D.
[Sun May 26 17:24:58 2024] EXT4-fs error (device mmcblk0p2): htree_dirblock_to_tree:1072: inode #325086: comm du: Directory block failed checksum
[Sun May 26 17:24:59 2024] EXT4-fs warning (device mmcblk0p2): ext4_dirblock_csum_verify:404: inode #325082: comm du: No space for directory leaf checksum. Please run e2fsck -D.
[Sun May 26 17:24:59 2024] EXT4-fs error (device mmcblk0p2): htree_dirblock_to_tree:1072: inode #325082: comm du: Directory block failed checksum
[Sun May 26 17:24:59 2024] EXT4-fs warning (device mmcblk0p2): ext4_dirblock_csum_verify:404: inode #325088: comm du: No space for directory leaf checksum. Please run e2fsck -D.
[Sun May 26 17:24:59 2024] EXT4-fs error (device mmcblk0p2): htree_dirblock_to_tree:1072: inode #325088: comm du: Directory block failed checksum
[Sun May 26 17:24:59 2024] EXT4-fs warning (device mmcblk0p2): ext4_dirblock_csum_verify:404: inode #325084: comm du: No space for directory leaf checksum. Please run e2fsck -D.
[Sun May 26 17:24:59 2024] EXT4-fs error (device mmcblk0p2): htree_dirblock_to_tree:1072: inode #325084: comm du: Directory block failed checksum
[Sun May 26 17:24:59 2024] EXT4-fs warning (device mmcblk0p2): ext4_dirblock_csum_verify:404: inode #325080: comm du: No space for directory leaf checksum. Please run e2fsck -D.
[Sun May 26 17:24:59 2024] EXT4-fs error (device mmcblk0p2): htree_dirblock_to_tree:1072: inode #325080: comm du: Directory block failed checksum
[Sun May 26 17:24:59 2024] EXT4-fs warning (device mmcblk0p2): ext4_dirblock_csum_verify:404: inode #324539: comm du: No space for directory leaf checksum. Please run e2fsck -D.
[Sun May 26 17:24:59 2024] EXT4-fs error (device mmcblk0p2): htree_dirblock_to_tree:1072: inode #324539: comm du: Directory block failed checksum Show mounted filesystems: TARGET SOURCE FSTYPE OPTIONS / /dev/mmcblk0p2 ext4 rw,noatime |-/dev devtmpfs devtmpfs rw,relatime,size=1678472k,nr_inodes=419618,mode=755 | |-/dev/shm tmpfs tmpfs rw,nosuid,nodev | |-/dev/pts devpts devpts rw,nosuid,noexec,relatime,gid=5,mode=620,ptmxmode=000 | `-/dev/mqueue mqueue mqueue rw,nosuid,nodev,noexec,relatime |-/proc proc proc rw,relatime | `-/proc/sys/fs/binfmt_misc systemd-1 autofs rw,relatime,fd=29,pgrp=1,timeout=0,minproto=5,maxproto=5,direct | `-/proc/sys/fs/binfmt_misc binfmt_misc binfmt_misc rw,nosuid,nodev,noexec,relatime |-/sys sysfs sysfs rw,nosuid,nodev,noexec,relatime | |-/sys/kernel/security securityfs securityfs rw,nosuid,nodev,noexec,relatime | |-/sys/fs/cgroup cgroup2 cgroup2 rw,nosuid,nodev,noexec,relatime,nsdelegate,memory_recursiveprot | |-/sys/fs/pstore pstore pstore rw,nosuid,nodev,noexec,relatime | |-/sys/fs/bpf bpf bpf rw,nosuid,nodev,noexec,relatime,mode=700 | |-/sys/kernel/debug debugfs debugfs rw,nosuid,nodev,noexec,relatime | |-/sys/kernel/tracing tracefs tracefs rw,nosuid,nodev,noexec,relatime | |-/sys/fs/fuse/connections fusectl fusectl rw,nosuid,nodev,noexec,relatime | `-/sys/kernel/config configfs configfs rw,nosuid,nodev,noexec,relatime |-/run tmpfs tmpfs rw,nosuid,nodev,size=777080k,nr_inodes=819200,mode=755 | |-/run/lock tmpfs tmpfs rw,nosuid,nodev,noexec,relatime,size=5120k | |-/run/credentials/systemd-sysctl.service ramfs ramfs ro,nosuid,nodev,noexec,relatime,mode=700 | |-/run/credentials/systemd-sysusers.service ramfs ramfs ro,nosuid,nodev,noexec,relatime,mode=700 | |-/run/credentials/systemd-tmpfiles-setup-dev.service ramfs ramfs ro,nosuid,nodev,noexec,relatime,mode=700 | |-/run/credentials/systemd-tmpfiles-setup.service ramfs ramfs ro,nosuid,nodev,noexec,relatime,mode=700 | |-/run/rpc_pipefs sunrpc rpc_pipefs rw,relatime | |-/run/user/1000 tmpfs 
tmpfs rw,nosuid,nodev,relatime,size=388536k,nr_inodes=97134,mode=700,uid=1000,gid=1000 | |-/run/docker/netns/baec15910bef nsfs[net:[4026532852]] nsfs rw | |-/run/docker/netns/9ccad93d5340 nsfs[net:[4026532933]] nsfs rw | |-/run/docker/netns/5c56e1cacff4 nsfs[net:[4026532609]] nsfs rw | |-/run/docker/netns/84f9adbecec1 nsfs[net:[4026532422]] nsfs rw | `-/run/docker/netns/cdf8b87c3a2e nsfs[net:[4026533013]] nsfs rw |-/home/iobroker_backup/fritzNAS systemd-1 autofs rw,relatime,fd=47,pgrp=1,timeout=0,minproto=5,maxproto=5,direct |-/boot /dev/mmcblk0p1 vfat rw,relatime,fmask=0022,dmask=0022,codepage=437,iocharset=ascii,shortname=mixed,errors=remount-ro |-/var/lib/docker/overlay2/a5275e136d3b8c31c6b4a6a7f9454b4c49d6fa9ba99ad4303dae39c0448ae262/merged overlay overlay rw,relatime,lowerdir=/var/lib/docker/overlay2/l/ZXOIITVSJWC2J45LRYORORNTX6:/var/lib/docker/overlay2/l/3LB5VQVUZLTIWLHCXYXYVTBYTP:/var/lib/docker/overlay2/l/WDJFTTDVG4RVBG7KEKKW23TPFC:/var/lib/docker/overlay2/l/IWAVT44WFCJBLPUYNWZRDBA2XF,upperdir=/var/lib/docker/overlay2/a5275e136d3b8c31c6b4a6a7f9454b4c49d6fa9ba99ad4303dae39c0448ae262/diff,workdir=/var/lib/docker/overlay2/a5275e136d3b8c31c6b4a6a7f9454b4c49d6fa9ba99ad4303dae39c0448ae262/work |-/var/lib/docker/overlay2/6294d7fe20fc30cf191fd349e80093aea9b7d723428fe061df028c53598e2051/merged overlay overlay 
rw,relatime,lowerdir=/var/lib/docker/overlay2/l/Y4KIDLIINUL4F2DXESC4O5KRBJ:/var/lib/docker/overlay2/l/LXPEYR7VBFQ6BZETH37VPMGFLQ:/var/lib/docker/overlay2/l/RG5KXMBTXWGWYY4H3ISX2NQ2AT:/var/lib/docker/overlay2/l/7SZ7C77PC5ONRYST6QJB4BZYRA:/var/lib/docker/overlay2/l/OJZY7YGXRVAA7QCVIKVEOXJEX3:/var/lib/docker/overlay2/l/JNSS5BO3UKLABDPSVWBUP4DWQ7:/var/lib/docker/overlay2/l/NLDOZSC7FVQY6O25WPNZSDPDTW:/var/lib/docker/overlay2/l/DVOKLYVEJGXBDOGHGVWHIIRT2I:/var/lib/docker/overlay2/l/T54RJWMX2RGJNSLQHKWZ4IYHMN:/var/lib/docker/overlay2/l/KD3VFV5TWC6CQHAQX4GW2EQORH:/var/lib/docker/overlay2/l/63AK5PBC3MFBRJYMHDH6HUWU5P:/var/lib/docker/overlay2/l/SFPY6ZYRB4L6MTPU3TXMK5NCNR:/var/lib/docker/overlay2/l/CK7PHP4NVYM4RC5L5FPXQJOGCN:/var/lib/docker/overlay2/l/5EHULSCTZM6FKKP5MCMB2V63YH:/var/lib/docker/overlay2/l/RKNTNECVBYMERRMUZRRGIISS7C:/var/lib/docker/overlay2/l/5OG5USPEL6SAIDNEUNVQBPKJSM:/var/lib/docker/overlay2/l/3H2TFNRJ7VIFMSWBPRH5JS5MHJ:/var/lib/docker/overlay2/l/GO6JHZDVILQZG57AZ4EG3RAJVI:/var/lib/docker/overlay2/l/PC5SBE6QGP4IZVQQZZLYB54GXX,upperdir=/var/lib/docker/overlay2/6294d7fe20fc30cf191fd349e80093aea9b7d723428fe061df028c53598e2051/diff,workdir=/var/lib/docker/overlay2/6294d7fe20fc30cf191fd349e80093aea9b7d723428fe061df028c53598e2051/work |-/var/lib/docker/overlay2/c65b180a672d4ad649614237d47bba43809101080c5afe9afd38e7721b75f766/merged overlay overlay rw,relatime,lowerdir=/var/lib/docker/overlay2/l/WBOYRJN6BCIHKCZJ6BPHZ37L3Y:/var/lib/docker/overlay2/l/7PMBGDN3W4JVMZFHEA4VVFEWOA:/var/lib/docker/overlay2/l/L4X52U3RRFPAUV6A2FH5HVKUJU:/var/lib/docker/overlay2/l/REZGUYYZ3US7PQUZWSAPSH326P:/var/lib/docker/overlay2/l/3WLQHIHKHYSMWBKJXM6LJYXP3T,upperdir=/var/lib/docker/overlay2/c65b180a672d4ad649614237d47bba43809101080c5afe9afd38e7721b75f766/diff,workdir=/var/lib/docker/overlay2/c65b180a672d4ad649614237d47bba43809101080c5afe9afd38e7721b75f766/work |-/var/lib/docker/overlay2/308ee548ba35785941d368549826341eaa5bd7062275ce3d1ca94137b70b91f6/merged overlay overlay 
rw,relatime,lowerdir=/var/lib/docker/overlay2/l/B7R4UXH2SL2EH2RNJS5S2DU4KQ:/var/lib/docker/overlay2/l/AV3C4K3WX3M2KALWG55MT5CEGZ:/var/lib/docker/overlay2/l/5TZELNBRLQV34FU25JYNQPIU6I:/var/lib/docker/overlay2/l/3ZA2SRHIKDVMBT7HIARZ432R5G:/var/lib/docker/overlay2/l/TDGWKQ4LZHZN66QG672TOA5EVG:/var/lib/docker/overlay2/l/5FLAUAG6OTPPCO7LNPBV4YJRHO:/var/lib/docker/overlay2/l/M4XKCWRYMA3ULE6ASYZ5M7GI53,upperdir=/var/lib/docker/overlay2/308ee548ba35785941d368549826341eaa5bd7062275ce3d1ca94137b70b91f6/diff,workdir=/var/lib/docker/overlay2/308ee548ba35785941d368549826341eaa5bd7062275ce3d1ca94137b70b91f6/work |-/var/lib/docker/overlay2/d54e898deebaef7274983ac69024d2d9354c694df846f106332bc50cfad6750a/merged overlay overlay rw,relatime,lowerdir=/var/lib/docker/overlay2/l/AEAXG5LKPHDFV3ECO2R7VI55WT:/var/lib/docker/overlay2/l/TOYFTJO536Y2CEFDKOGOPZWU6I:/var/lib/docker/overlay2/l/IRTVWCUYRMJT6673HFMWQV5SWX:/var/lib/docker/overlay2/l/HK4YRXKD6HZIFOKUXWK4I2LMLN:/var/lib/docker/overlay2/l/LEVA2AJAJNZIDJL5K4U4Y7AVHX:/var/lib/docker/overlay2/l/VJK7DLUJQVJH2KPMWPQ2ESYY7P:/var/lib/docker/overlay2/l/RKVLUAOKNZUX35Z6ENMAS7F2GI:/var/lib/docker/overlay2/l/LZT6TFXGCTFOXPCIYZCS3KCCI4:/var/lib/docker/overlay2/l/KGKN7LZBUMSQMYVY2L2IZNTOTV:/var/lib/docker/overlay2/l/SNUQZSIGTJ2HAYRJ4SSNSDIMUD:/var/lib/docker/overlay2/l/7EQICGQIM32XJQZRP76XAMWXJE:/var/lib/docker/overlay2/l/XP5HQXJAX2NKBCZNUZMK3YVFOM:/var/lib/docker/overlay2/l/R24LNBXO44TBEBMVZV6X7NSZZE:/var/lib/docker/overlay2/l/SW5OMAT4IVAQMBOWNHBVMPJPVP:/var/lib/docker/overlay2/l/LAZI6DH7DOFEEOWHEUB7YTXMAQ,upperdir=/var/lib/docker/overlay2/d54e898deebaef7274983ac69024d2d9354c694df846f106332bc50cfad6750a/diff,workdir=/var/lib/docker/overlay2/d54e898deebaef7274983ac69024d2d9354c694df846f106332bc50cfad6750a/work `-/opt/iobroker/backups //192.168.178.1/fritz.nas/FRITZ_1/Backup cifs 
rw,relatime,vers=3.1.1,cache=loose,username=iobroker_backup,uid=1002,noforceuid,gid=1002,noforcegid,addr=192.168.178.1,file_mode=0777,dir_mode=0777,soft,nounix,mapposix,rsize=65536,wsize=65536,bsize=1048576,echo_interval=60,actimeo=1,closetimeo=5 Files in neuralgic directories: /var: du: cannot read directory '/var/lib/docker/overlay2/8bd87b394eac97a69f2b7eaf35e0b29fe8ce790ab6d3f2f09ded78c1314fb43f/diff/usr/share/perl/5.36.0/Test2': Bad message du: cannot read directory '/var/lib/docker/overlay2/8bd87b394eac97a69f2b7eaf35e0b29fe8ce790ab6d3f2f09ded78c1314fb43f/diff/usr/share/perl/5.36.0/unicore/lib/Gc': Bad message du: cannot read directory '/var/lib/docker/overlay2/8bd87b394eac97a69f2b7eaf35e0b29fe8ce790ab6d3f2f09ded78c1314fb43f/diff/usr/share/perl/5.36.0/unicore/lib/IDS': Bad message du: cannot read directory '/var/lib/docker/overlay2/8bd87b394eac97a69f2b7eaf35e0b29fe8ce790ab6d3f2f09ded78c1314fb43f/diff/usr/share/perl/5.36.0/unicore/lib/GrBase': Bad message du: cannot read directory '/var/lib/docker/overlay2/8bd87b394eac97a69f2b7eaf35e0b29fe8ce790ab6d3f2f09ded78c1314fb43f/diff/usr/share/perl/5.36.0/unicore/lib/Hyphen': Bad message du: cannot read directory '/var/lib/docker/overlay2/8bd87b394eac97a69f2b7eaf35e0b29fe8ce790ab6d3f2f09ded78c1314fb43f/diff/usr/share/perl/5.36.0/unicore/lib/Hex': Bad message du: cannot read directory '/var/lib/docker/overlay2/8bd87b394eac97a69f2b7eaf35e0b29fe8ce790ab6d3f2f09ded78c1314fb43f/diff/usr/share/perl/5.36.0/unicore/lib/IDC': Bad message du: cannot read directory '/var/lib/docker/overlay2/8bd87b394eac97a69f2b7eaf35e0b29fe8ce790ab6d3f2f09ded78c1314fb43f/diff/usr/share/perl/5.36.0/unicore/lib/Hst': Bad message du: cannot read directory '/var/lib/docker/overlay2/8bd87b394eac97a69f2b7eaf35e0b29fe8ce790ab6d3f2f09ded78c1314fb43f/diff/usr/share/perl/5.36.0/unicore/lib/GrExt': Bad message du: cannot read directory 
'/var/lib/docker/overlay2/8bd87b394eac97a69f2b7eaf35e0b29fe8ce790ab6d3f2f09ded78c1314fb43f/diff/usr/share/perl/5.36.0/TAP/Harness': Bad message du: cannot read directory '/var/lib/docker/overlay2/8bd87b394eac97a69f2b7eaf35e0b29fe8ce790ab6d3f2f09ded78c1314fb43f/diff/usr/share/perl/5.36.0/TAP/Parser/Scheduler': Bad message du: cannot read directory '/var/lib/docker/overlay2/8bd87b394eac97a69f2b7eaf35e0b29fe8ce790ab6d3f2f09ded78c1314fb43f/diff/usr/share/perl/5.36.0/TAP/Parser/Result': Bad message 16G /var/ 13G /var/lib/docker 13G /var/lib 11G /var/lib/docker/overlay2 1.8G /var/log Archived and active journals take up 1.6G in the file system. /opt/iobroker/backups: 1.3G /opt/iobroker/backups/ 47M /opt/iobroker/backups/influxDBtmp /opt/iobroker/iobroker-data: 288M /opt/iobroker/iobroker-data/ 231M /opt/iobroker/iobroker-data/files 138M /opt/iobroker/iobroker-data/files/javascript.admin 118M /opt/iobroker/iobroker-data/files/javascript.admin/static 117M /opt/iobroker/iobroker-data/files/javascript.admin/static/js The five largest files in iobroker-data are: 24M /opt/iobroker/iobroker-data/files/web.admin/static/js/main.135279a0.js.map 13M /opt/iobroker/iobroker-data/objects.jsonl 8.5M /opt/iobroker/iobroker-data/files/web.admin/static/js/main.135279a0.js 7.0M /opt/iobroker/iobroker-data/files/javascript.admin/static/js/675.a9c6d34a.chunk.js.map 7.0M /opt/iobroker/iobroker-data/files/javascript.admin/static/js/61.54b23816.chunk.js.map USB-Devices by-id: USB-Sticks - Avoid direct links to /dev/tty* in your adapter setups, please always prefer the links 'by-id': No Devices found 'by-id' *** NodeJS-Installation *** /usr/bin/nodejs v18.20.3 /usr/bin/node v18.20.3 /usr/bin/npm 10.7.0 /usr/bin/npx 10.7.0 /usr/bin/corepack 0.28.0 nodejs: Installed: 18.20.3-1nodesource1 Candidate: 18.20.3-1nodesource1 Version table: *** 18.20.3-1nodesource1 1001 500 https://deb.nodesource.com/node_18.x nodistro/main arm64 Packages 100 /var/lib/dpkg/status 18.20.2-1nodesource1 1001 500 
https://deb.nodesource.com/node_18.x nodistro/main arm64 Packages 18.20.1-1nodesource1 1001 500 https://deb.nodesource.com/node_18.x nodistro/main arm64 Packages 18.20.0-1nodesource1 1001 500 https://deb.nodesource.com/node_18.x nodistro/main arm64 Packages 18.19.1-1nodesource1 1001 500 https://deb.nodesource.com/node_18.x nodistro/main arm64 Packages 18.19.0+dfsg-6~deb12u1 500 500 http://security.debian.org/debian-security bookworm-security/main arm64 Packages 18.19.0-1nodesource1 1001 500 https://deb.nodesource.com/node_18.x nodistro/main arm64 Packages 18.18.2-1nodesource1 1001 500 https://deb.nodesource.com/node_18.x nodistro/main arm64 Packages 18.18.1-1nodesource1 1001 500 https://deb.nodesource.com/node_18.x nodistro/main arm64 Packages 18.18.0-1nodesource1 1001 500 https://deb.nodesource.com/node_18.x nodistro/main arm64 Packages 18.17.1-1nodesource1 1001 500 https://deb.nodesource.com/node_18.x nodistro/main arm64 Packages 18.17.0-1nodesource1 1001 500 https://deb.nodesource.com/node_18.x nodistro/main arm64 Packages 18.16.1-1nodesource1 1001 500 https://deb.nodesource.com/node_18.x nodistro/main arm64 Packages 18.16.0-1nodesource1 1001 500 https://deb.nodesource.com/node_18.x nodistro/main arm64 Packages 18.15.0-1nodesource1 1001 500 https://deb.nodesource.com/node_18.x nodistro/main arm64 Packages 18.14.2-1nodesource1 1001 500 https://deb.nodesource.com/node_18.x nodistro/main arm64 Packages 18.14.1-1nodesource1 1001 500 https://deb.nodesource.com/node_18.x nodistro/main arm64 Packages 18.14.0-1nodesource1 1001 500 https://deb.nodesource.com/node_18.x nodistro/main arm64 Packages 18.13.0+dfsg1-1 500 500 http://deb.debian.org/debian bookworm/main arm64 Packages 18.13.0-1nodesource1 1001 500 https://deb.nodesource.com/node_18.x nodistro/main arm64 Packages 18.12.0-1nodesource1 1001 500 https://deb.nodesource.com/node_18.x nodistro/main arm64 Packages 18.11.0-1nodesource1 1001 500 https://deb.nodesource.com/node_18.x nodistro/main arm64 Packages 
18.10.0-1nodesource1 1001 500 https://deb.nodesource.com/node_18.x nodistro/main arm64 Packages 18.9.1-1nodesource1 1001 500 https://deb.nodesource.com/node_18.x nodistro/main arm64 Packages 18.9.0-1nodesource1 1001 500 https://deb.nodesource.com/node_18.x nodistro/main arm64 Packages 18.8.0-1nodesource1 1001 500 https://deb.nodesource.com/node_18.x nodistro/main arm64 Packages 18.7.0-1nodesource1 1001 500 https://deb.nodesource.com/node_18.x nodistro/main arm64 Packages 18.6.0-1nodesource1 1001 500 https://deb.nodesource.com/node_18.x nodistro/main arm64 Packages 18.5.0-1nodesource1 1001 500 https://deb.nodesource.com/node_18.x nodistro/main arm64 Packages 18.4.0-1nodesource1 1001 500 https://deb.nodesource.com/node_18.x nodistro/main arm64 Packages 18.3.0-1nodesource1 1001 500 https://deb.nodesource.com/node_18.x nodistro/main arm64 Packages 18.2.0-1nodesource1 1001 500 https://deb.nodesource.com/node_18.x nodistro/main arm64 Packages 18.1.0-1nodesource1 1001 500 https://deb.nodesource.com/node_18.x nodistro/main arm64 Packages 18.0.0-1nodesource1 1001 500 https://deb.nodesource.com/node_18.x nodistro/main arm64 Packages Temp directories causing npm8 problem: 0 No problems detected Errors in npm tree: *** ioBroker-Installation *** ioBroker Status iobroker is running on this host. 
Objects type: jsonl
States type: jsonl

Core adapters versions
js-controller: 5.0.19
admin: 6.13.16
javascript: 8.3.1
nodejs modules from github: 0

Adapter State
+ system.adapter.accuweather.0 : accuweather : raspberrypi - enabled
+ system.adapter.admin.0 : admin : raspberrypi - enabled, port: 8081, bind: 0.0.0.0 (SSL), run as: admin
+ system.adapter.backitup.0 : backitup : raspberrypi - enabled
+ system.adapter.discovery.0 : discovery : raspberrypi - enabled
+ system.adapter.go-e.0 : go-e : raspberrypi - enabled
+ system.adapter.influxdb.0 : influxdb : raspberrypi - enabled, port: 8086
+ system.adapter.javascript.0 : javascript : raspberrypi - enabled
+ system.adapter.mqtt-client.0 : mqtt-client : raspberrypi - enabled, port: 1883
+ system.adapter.mqtt.0 : mqtt : raspberrypi - enabled, port: 1883, bind: 192.168.178.58
+ system.adapter.ping.0 : ping : raspberrypi - enabled
+ system.adapter.solax.0 : solax : raspberrypi - enabled
+ system.adapter.tesla-motors.0 : tesla-motors : raspberrypi - enabled
  system.adapter.vis-icontwo.0 : vis-icontwo : raspberrypi - disabled
+ system.adapter.vis-inventwo.0 : vis-inventwo : raspberrypi - enabled
  system.adapter.vis-timeandweather.0 : vis-timeandweather : raspberrypi - disabled
  system.adapter.vis-weather.0 : vis-weather : raspberrypi - disabled
  system.adapter.vis.0 : vis : raspberrypi - enabled
+ system.adapter.web.0 : web : raspberrypi - enabled, port: 8082, bind: 0.0.0.0, run as: admin

+ instance is alive

Enabled adapters with bindings
+ system.adapter.admin.0 : admin : raspberrypi - enabled, port: 8081, bind: 0.0.0.0 (SSL), run as: admin
+ system.adapter.influxdb.0 : influxdb : raspberrypi - enabled, port: 8086
+ system.adapter.mqtt-client.0 : mqtt-client : raspberrypi - enabled, port: 1883
+ system.adapter.mqtt.0 : mqtt : raspberrypi - enabled, port: 1883, bind: 192.168.178.58
+ system.adapter.web.0 : web : raspberrypi - enabled, port: 8082, bind: 0.0.0.0, run as: admin

ioBroker-Repositories
stable : http://download.iobroker.net/sources-dist.json
beta   : http://download.iobroker.net/sources-dist-latest.json
Active repo(s): stable

Installed ioBroker-Instances
Used repository: stable
Adapter    "accuweather"  : 1.4.0  , installed 1.4.0
Adapter    "admin"        : 6.13.16, installed 6.13.16
Adapter    "backitup"     : 2.11.0 , installed 2.11.0
Adapter    "discovery"    : 4.4.0  , installed 4.4.0
Adapter    "go-e"         : 1.0.29 , installed 1.0.29
Adapter    "influxdb"     : 4.0.2  , installed 4.0.2
Adapter    "javascript"   : 8.3.1  , installed 8.3.1
Controller "js-controller": 5.0.19 , installed 5.0.19
Adapter    "mqtt"         : 5.2.0  , installed 5.2.0
Adapter    "mqtt-client"  : 1.8.0  , installed 1.8.0
Adapter    "ping"         : 1.6.2  , installed 1.6.2
Adapter    "simple-api"   : 2.7.2  , installed 2.7.2
Adapter    "socketio"     : 6.7.0  , installed 6.7.0
Adapter    "solax"        : 0.9.6  , installed 0.9.6
Adapter    "tesla-motors" : 1.3.2  , installed 1.3.2
Adapter    "vis"          : 1.5.4  , installed 1.5.4
Adapter    "vis-icontwo"  : 1.5.0  , installed 1.5.0
Adapter    "vis-inventwo" : 3.3.4  , installed 3.3.4
Adapter    "vis-timeandweather": 1.2.2, installed 1.2.2
Adapter    "vis-weather"  : 2.5.9  , installed 2.5.9
Adapter    "web"          : 6.2.5  , installed 6.2.5
Adapter    "ws"           : 2.6.1  , installed 2.6.1

Objects and States
Please stand by - This may take a while
Objects: 3373
States: 3335

*** OS-Repositories and Updates ***
Hit:1 http://archive.raspberrypi.org/debian bookworm InRelease
Hit:2 http://deb.debian.org/debian bookworm InRelease
Hit:3 http://deb.debian.org/debian bookworm-updates InRelease
Hit:4 http://security.debian.org/debian-security bookworm-security InRelease
Hit:5 https://apt.grafana.com stable InRelease
Hit:6 https://download.docker.com/linux/debian bookworm InRelease
Hit:7 https://deb.nodesource.com/node_18.x nodistro InRelease
Reading package lists...
Pending Updates: 0 *** Listening Ports *** Active Internet connections (only servers) Proto Recv-Q Send-Q Local Address Foreign Address State User Inode PID/Program name tcp 0 0 0.0.0.0:22 0.0.0.0:* LISTEN 0 14112 703/sshd: /usr/sbin tcp 0 0 0.0.0.0:139 0.0.0.0:* LISTEN 0 1936 1043/smbd tcp 0 0 0.0.0.0:445 0.0.0.0:* LISTEN 0 1935 1043/smbd tcp 0 0 192.168.178.58:1883 0.0.0.0:* LISTEN 1002 26464 5190/io.mqtt.0 tcp 0 0 127.0.0.1:631 0.0.0.0:* LISTEN 0 15934 651/cupsd tcp 0 0 0.0.0.0:5900 0.0.0.0:* LISTEN 0 14143 687/vncserver-x11-c tcp 0 0 0.0.0.0:4000 0.0.0.0:* LISTEN 0 19734 1711/docker-proxy tcp 0 0 127.0.0.1:9000 0.0.0.0:* LISTEN 1002 18504 747/iobroker.js-con tcp 0 0 127.0.0.1:9001 0.0.0.0:* LISTEN 1002 16695 747/iobroker.js-con tcp 0 0 0.0.0.0:3100 0.0.0.0:* LISTEN 0 18936 1678/docker-proxy tcp6 0 0 :::3000 :::* LISTEN 116 18497 746/grafana tcp6 0 0 :::22 :::* LISTEN 0 14114 703/sshd: /usr/sbin tcp6 0 0 :::139 :::* LISTEN 0 1934 1043/smbd tcp6 0 0 ::1:631 :::* LISTEN 0 15933 651/cupsd tcp6 0 0 :::445 :::* LISTEN 0 1933 1043/smbd tcp6 0 0 :::5900 :::* LISTEN 0 14142 687/vncserver-x11-c tcp6 0 0 :::8081 :::* LISTEN 1002 27689 3966/io.admin.0 tcp6 0 0 :::8082 :::* LISTEN 1002 29906 5833/io.web.0 tcp6 0 0 :::4000 :::* LISTEN 0 18136 1719/docker-proxy tcp6 0 0 :::3100 :::* LISTEN 0 18116 1685/docker-proxy udp 0 0 0.0.0.0:5353 0.0.0.0:* 108 1650 435/avahi-daemon: r udp 0 0 0.0.0.0:68 0.0.0.0:* 0 13248 512/dhcpcd udp 0 0 172.19.255.255:137 0.0.0.0:* 0 26075 737/nmbd udp 0 0 172.19.0.1:137 0.0.0.0:* 0 26074 737/nmbd udp 0 0 172.17.255.255:137 0.0.0.0:* 0 26071 737/nmbd udp 0 0 172.17.0.1:137 0.0.0.0:* 0 26070 737/nmbd udp 0 0 169.254.255.255:137 0.0.0.0:* 0 26067 737/nmbd udp 0 0 169.254.248.216:137 0.0.0.0:* 0 26066 737/nmbd udp 0 0 169.254.255.255:137 0.0.0.0:* 0 26063 737/nmbd udp 0 0 169.254.225.2:137 0.0.0.0:* 0 26062 737/nmbd udp 0 0 169.254.255.255:137 0.0.0.0:* 0 26059 737/nmbd udp 0 0 169.254.210.46:137 0.0.0.0:* 0 26058 737/nmbd udp 0 0 169.254.255.255:137 
0.0.0.0:* 0 26055 737/nmbd udp 0 0 169.254.141.55:137 0.0.0.0:* 0 26054 737/nmbd udp 0 0 169.254.255.255:137 0.0.0.0:* 0 26051 737/nmbd udp 0 0 169.254.113.152:137 0.0.0.0:* 0 26050 737/nmbd udp 0 0 192.168.178.255:137 0.0.0.0:* 0 14306 737/nmbd udp 0 0 192.168.178.58:137 0.0.0.0:* 0 14305 737/nmbd udp 0 0 0.0.0.0:137 0.0.0.0:* 0 15970 737/nmbd udp 0 0 172.19.255.255:138 0.0.0.0:* 0 26077 737/nmbd udp 0 0 172.19.0.1:138 0.0.0.0:* 0 26076 737/nmbd udp 0 0 172.17.255.255:138 0.0.0.0:* 0 26073 737/nmbd udp 0 0 172.17.0.1:138 0.0.0.0:* 0 26072 737/nmbd udp 0 0 169.254.255.255:138 0.0.0.0:* 0 26069 737/nmbd udp 0 0 169.254.248.216:138 0.0.0.0:* 0 26068 737/nmbd udp 0 0 169.254.255.255:138 0.0.0.0:* 0 26065 737/nmbd udp 0 0 169.254.225.2:138 0.0.0.0:* 0 26064 737/nmbd udp 0 0 169.254.255.255:138 0.0.0.0:* 0 26061 737/nmbd udp 0 0 169.254.210.46:138 0.0.0.0:* 0 26060 737/nmbd udp 0 0 169.254.255.255:138 0.0.0.0:* 0 26057 737/nmbd udp 0 0 169.254.141.55:138 0.0.0.0:* 0 26056 737/nmbd udp 0 0 169.254.255.255:138 0.0.0.0:* 0 26053 737/nmbd udp 0 0 169.254.113.152:138 0.0.0.0:* 0 26052 737/nmbd udp 0 0 192.168.178.255:138 0.0.0.0:* 0 14308 737/nmbd udp 0 0 192.168.178.58:138 0.0.0.0:* 0 14307 737/nmbd udp 0 0 0.0.0.0:138 0.0.0.0:* 0 15971 737/nmbd udp 0 0 0.0.0.0:631 0.0.0.0:* 0 15945 735/cups-browsed udp 0 0 0.0.0.0:49917 0.0.0.0:* 108 1652 435/avahi-daemon: r udp6 0 0 :::5353 :::* 108 1651 435/avahi-daemon: r udp6 0 0 :::47428 :::* 108 1653 435/avahi-daemon: r udp6 0 0 :::546 :::* 0 14259 512/dhcpcd *** Log File - Last 25 Lines *** 2024-05-26 17:27:44.862 - info: influxdb.0 (14866) Connecting http://192.168.178.58:8086/ ... 2024-05-26 17:27:44.863 - info: influxdb.0 (14866) Influx DB Version used: 2.x 2024-05-26 17:27:44.868 - error: influxdb.0 (14866) Error: connect ECONNREFUSED 192.168.178.58:8086 2024-05-26 17:27:54.870 - info: influxdb.0 (14866) Connecting http://192.168.178.58:8086/ ... 
2024-05-26 17:27:54.870 - info: influxdb.0 (14866) Influx DB Version used: 2.x
2024-05-26 17:27:54.873 - error: influxdb.0 (14866) Error: connect ECONNREFUSED 192.168.178.58:8086
2024-05-26 17:28:04.875 - info: influxdb.0 (14866) Connecting http://192.168.178.58:8086/ ...
2024-05-26 17:28:04.876 - info: influxdb.0 (14866) Influx DB Version used: 2.x
2024-05-26 17:28:04.883 - error: influxdb.0 (14866) Error: connect ECONNREFUSED 192.168.178.58:8086
2024-05-26 17:28:14.884 - info: influxdb.0 (14866) Connecting http://192.168.178.58:8086/ ...
2024-05-26 17:28:14.885 - info: influxdb.0 (14866) Influx DB Version used: 2.x
2024-05-26 17:28:14.888 - error: influxdb.0 (14866) Error: connect ECONNREFUSED 192.168.178.58:8086
2024-05-26 17:28:24.890 - info: influxdb.0 (14866) Connecting http://192.168.178.58:8086/ ...
2024-05-26 17:28:24.891 - info: influxdb.0 (14866) Influx DB Version used: 2.x
2024-05-26 17:28:24.894 - error: influxdb.0 (14866) Error: connect ECONNREFUSED 192.168.178.58:8086
2024-05-26 17:28:34.896 - info: influxdb.0 (14866) Connecting http://192.168.178.58:8086/ ...
2024-05-26 17:28:34.896 - info: influxdb.0 (14866) Influx DB Version used: 2.x
2024-05-26 17:28:34.900 - error: influxdb.0 (14866) Error: connect ECONNREFUSED 192.168.178.58:8086
2024-05-26 17:28:44.901 - info: influxdb.0 (14866) Connecting http://192.168.178.58:8086/ ...
2024-05-26 17:28:44.902 - info: influxdb.0 (14866) Influx DB Version used: 2.x
2024-05-26 17:28:44.905 - error: influxdb.0 (14866) Error: connect ECONNREFUSED 192.168.178.58:8086
2024-05-26 17:28:45.236 - info: solax.0 (4715) State value to set for "solax.0.data.json" has to be type "string" but received type "number"
2024-05-26 17:28:54.906 - info: influxdb.0 (14866) Connecting http://192.168.178.58:8086/ ...
2024-05-26 17:28:54.907 - info: influxdb.0 (14866) Influx DB Version used: 2.x
2024-05-26 17:28:54.911 - error: influxdb.0 (14866) Error: connect ECONNREFUSED 192.168.178.58:8086
============ Mark until here for C&P =============
Ran flawlessly; after the javascript adapter update, no connection anymore. Thanks.
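ECONNREFUSED means the TCP connection attempt was actively rejected, i.e. nothing is listening on 192.168.178.58:8086. A quick way to confirm this from the shell, before suspecting the adapter, is a plain port probe (a minimal sketch using bash's `/dev/tcp`; host and port are taken from the log above, adjust to your installation):

```shell
#!/usr/bin/env bash
# Probe a TCP port using bash's /dev/tcp pseudo-device. ECONNREFUSED in
# the adapter log corresponds to the "closed-or-unreachable" case here:
# the host may answer, but nothing accepts connections on the port.
check_port() {
  local host="$1" port="$2"
  if timeout 2 bash -c ">/dev/tcp/${host}/${port}" 2>/dev/null; then
    echo "open"
  else
    echo "closed-or-unreachable"
  fi
}

# Host/port from the adapter log above - adjust to your setup.
check_port 192.168.178.58 8086
```

If this prints "closed-or-unreachable", the problem is on the InfluxDB side (service not running, wrong bind address, firewall), not in ioBroker.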
-
Your Raspberry OS 'Bookworm' is not installed correctly. It was an 'inline upgrade' from a previous release, I suspect...
-
yes, that's right. But it has worked until now... And I just noticed that I can no longer reach the web interface of the influxdb installation either.
-
@ronk said in iobroker influxdb1.x econnrefused 192.168.178.120:8086:
But it has worked until now...
Not correctly, though...
The box is running in the red, and the influxdb.service does not start.
systemctl status influxdb.service
-
Linux raspberrypi 6.1.21-v8+ #1642 SMP PREEMPT Mon Apr 3 17:24:16 BST 2023 aarch64

The programs included with the Debian GNU/Linux system are free software;
the exact distribution terms for each program are described in the
individual files in /usr/share/doc/*/copyright.

Debian GNU/Linux comes with ABSOLUTELY NO WARRANTY, to the extent
permitted by applicable law.
Last login: Sun May 26 17:42:47 2024 from 192.168.178.20
ronny@raspberrypi:~ $ systemctl status influxdb.service
× influxdb.service - InfluxDB is an open-source, distributed, time series database
     Loaded: loaded (/lib/systemd/system/influxdb.service; enabled; preset: enabled)
     Active: failed (Result: exit-code) since Sun 2024-05-26 17:34:08 CEST; 9min ago
       Docs: https://docs.influxdata.com/influxdb/
    Process: 792 ExecStart=/usr/lib/influxdb/scripts/influxd-systemd-start.sh (code=exited, status=2)
        CPU: 208ms
Mai 26 17:34:08 raspberrypi influxd-systemd-start.sh[794]: /go/src/runtime/mgc.go:1407 +0x398 fp=0x4000072fd0 sp=0x4000072f40 pc=0x7f89bfaa58
Mai 26 17:34:08 raspberrypi influxd-systemd-start.sh[794]: runtime.goexit()
Mai 26 17:34:08 raspberrypi influxd-systemd-start.sh[794]: /go/src/runtime/asm_arm64.s:1172 +0x4 fp=0x4000072fd0 sp=0x4000072fd0 pc=0x7f89c4af04
Mai 26 17:34:08 raspberrypi influxd-systemd-start.sh[794]: created by runtime.gcBgMarkStartWorkers
Mai 26 17:34:08 raspberrypi influxd-systemd-start.sh[794]: /go/src/runtime/mgc.go:1199 +0x28
Mai 26 17:34:08 raspberrypi systemd[1]: influxdb.service: Scheduled restart job, restart counter is at 5.
Mai 26 17:34:08 raspberrypi systemd[1]: Stopped influxdb.service - InfluxDB is an open-source, distributed, time series database.
Mai 26 17:34:08 raspberrypi systemd[1]: influxdb.service: Start request repeated too quickly.
Mai 26 17:34:08 raspberrypi systemd[1]: influxdb.service: Failed with result 'exit-code'.
Mai 26 17:34:08 raspberrypi systemd[1]: Failed to start influxdb.service - InfluxDB is an open-source, distributed, time series database.
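The status output above already names the two key facts: the start script exits with status 2 after a Go runtime panic, and systemd then gives up retrying ("Start request repeated too quickly", restart counter at 5). A few read-only follow-up commands to dig further (standard systemd tooling; the unit name is taken from the post above):

```shell
# Full journal of the failed unit from the current boot - the status
# excerpt above truncates the Go panic backtrace.
journalctl -u influxdb.service -b --no-pager 2>/dev/null | tail -n 30 || true

# Root-filesystem usage: a full disk is a classic cause of daemons
# crashing at startup.
df -h /

# Once the underlying cause is fixed, clear systemd's rate limit
# before retrying (needs root):
#   sudo systemctl reset-failed influxdb.service
#   sudo systemctl restart influxdb.service
```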
-
Your filesystem is filling up, and the kernel had to run an fsck.
And then there is some Docker stuff haunting around as well.
-
yes, a Docker installation with TeslaSolarCharger is running there.
-
When it comes to that topic, I'm out.
Make sure you get your base system clean. You will only manage that with a fresh install of the OS, I predict.
-
well, it all ran peacefully side by side. So, reinstall from scratch?
-
Well, not really. As you can see.
The recommendation to reinstall Raspberry OS 'Bookworm' completely was not given without reason.
-
-
That was quick... That cannot have been a proper reinstallation, and the patchwork is still in place.
-
no, of course it was not a reinstallation. As part of a planned move to an SSD (the SD card is still running without issues, fingers crossed) I will probably set everything up from scratch in the autumn.
I did run iob diag again, though; no warnings or errors are visible anymore. I really need the TeslaSolarCharger very often; iobroker is a by-catch, just there to make things look a bit nicer. One more question: what exactly makes (my) InRelease upgrade patchwork (ideally shown using the quotes I posted)? I too am a fan of clean reinstalls. But if everything runs without problems? Why not?
Many thanks
-
@ronk said in iobroker influxdb1.x econnrefused 192.168.178.120:8086:
But if everything runs without problems?
It does not! It only seems to.
Every now and then a problem flashes up.
Those are time bombs. Some day it may well not work, just when you
@ronk said in iobroker influxdb1.x econnrefused 192.168.178.120:8086:
really need the TeslaSolarCharger very often,
-
@ronk said in iobroker influxdb1.x econnrefused 192.168.178.120:8086:
One more question: what exactly makes (my) InRelease upgrade patchwork
The wrong kernel, hooked in the wrong way via the wrong firmware.
Here someone else attempts it, too:
https://gnulinux.ch/pios-auf-bookworm-upgrade-ohne-neuinstallation-und-gewehr
But he also writes:
Please note that this procedure is not officially recommended and may not be suitable for every use case; in particular, upgrading an installation with a graphical desktop was not tested and will most likely not work without extra effort. Upgrading the 32-bit version of PiOS was not tested either.
And here:
The changes from bullseye to bookworm are substantial: the desktop was swapped from a modified LXDE to a Wayland/Wayfire setup, so no longer LXDE; the network stack was changed from dhcpcd to NetworkManager; and a lot of other stuff changed behind the UI.
Use a spare card to do a fresh install of Raspberry Pi OS Bookworm using the Imager; you can then grab anything important to you from your old install.
-
Aha, so that's how it is. Many thanks.
Then I should put this in order as soon as possible.
Can the backups that I have Backitup create alongside for influxdb and Grafana be used for restoring after a reinstall? That is, can all data points and the database be restored from them?
-
@ronk said in iobroker influxdb1.x econnrefused 192.168.178.120:8086:
Can the backups that I have Backitup create alongside for influxdb and Grafana be used for restoring after a reinstall?
Yes
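For reference, this is roughly what a manual restore of an InfluxDB 1.x "portable" backup looks like. `influxd restore -portable` is the stock InfluxDB 1.x restore command; the backup directory below is only an assumed example path, so check where Backitup actually stores its InfluxDB dumps, and note that Backitup's own restore dialog can handle this step for you:

```shell
#!/usr/bin/env bash
# Sketch: restore an InfluxDB 1.x portable backup into a fresh install.
restore_influx_backup() {
  local dir="$1"
  if [ -d "$dir" ]; then
    sudo systemctl stop influxdb
    influxd restore -portable "$dir"   # stock InfluxDB 1.x restore command
    sudo systemctl start influxdb
  else
    echo "backup dir not found: $dir"
  fi
}

# Hypothetical path - check your Backitup configuration for the real one.
restore_influx_backup /opt/iobroker/backups/influxdb_snapshot
```

Grafana keeps its dashboards in its own database, so its backup is restored separately from the InfluxDB data.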