Going to Hass.io
My first Home Assistant setup evolved over time. The Raspberry Pi it runs on was originally acquired for a different purpose and had gone through several uses before being dedicated to home automation.
The home automation setup had evolved, with a number of components added and later removed. The end result was that configuration data for the various active pieces was scattered about, and the only real way to do a backup was to make an image of the entire microSD card. Unfortunately, a full image backup of a storage device retains any file system corruption, and a microSD card on a Raspberry Pi seems to be susceptible to corruption. Recently I went to make a change and discovered that I couldn’t because of corruption issues, and that my backups all had the same corruption problems.
Rebuilding things from scratch, I wanted a setup where I could easily back up the configuration files, so that in the worst-case scenario a full restore would mean creating a new microSD image from scratch and then applying the configuration files.
My first cut at this was Docker, but the Home Assistant Docker container has issues. First, it wants to run as root, which caused a number of problems with my configuration control. The reason for abandoning that effort was different, though: I could not get it to connect to my router to get presence information.
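For what it’s worth, the container setup itself was simple enough. A minimal docker-compose sketch along the lines of what I was running (the image tag and host paths here are illustrative, not exactly what I used):

version: "3"
services:
  homeassistant:
    image: homeassistant/home-assistant:stable
    volumes:
      # keep all configuration in one host directory so it can be backed up
      - ./config:/config
      - /etc/localtime:/etc/localtime:ro
    # host networking for device discovery and router-based presence detection
    network_mode: host
    restart: unless-stopped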
So on to Hass.io. This transition was more work than I hoped, but I’ve got things functioning pretty much as they did on my old setup before my filesystem corruption problems.
And most, if not all, of the configuration files are in one directory tree. In addition, there is a backup facility in the UI that will create tar file backups (placed in a different share). Looking at the contents of a backup file, it seems to have everything I’d hoped it would.
Nextcloud backup
Automatically making a backup and pushing it to my Nextcloud server turned out to be easier than I expected. There is a Hass.io add-on purpose-built for this task. I created a “ha_backup” user on my Nextcloud server and in just a couple of minutes was able to configure the Nextcloud backup add-on to make a manual backup and upload it to my server.
Another couple of clicks on my Nextcloud server’s web interface and the backup folder on the ha_backup account was shared with my personal account. A quick check of the Nextcloud folder on my laptop showed the backup folder and first manual backup already there. I opened it and it appears to contain the same type of data as the backups created via the stock Hass.io backup button.
I configured the Hass.io Nextcloud backup add-on to create periodic configuration backups. They should show up on my Nextcloud server a few seconds after being created and then on my laptop shortly after I fire it up each day. My Nextcloud server is backed up daily by my VPS provider and my laptop is backed up to a local NAS box hourly. This should solve my backup problems.
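The add-on configuration itself amounts to pointing it at the server with the ha_backup credentials. The option names below are illustrative rather than the add-on’s actual schema (the host and folder are placeholders too), so check its documentation for the real keys:

# illustrative only; see the add-on documentation for its actual options
host: https://nextcloud.example.com
username: ha_backup
password: "********"
backup_folder: hassio-backups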
I’ve switched from using consumer-grade microSD cards from my local discount store to industrial-rated, wear-leveling microSD cards, so I hope some of the storage corruption issues will go away.
Node Red
Presence Detection
Previously I used a Home Assistant “group” that held the device tracker presence detection for each member of the family (plus a switch indicating whether guests were present). I was able to use the value of that group within Node Red to decide whether the house was occupied or not. With the Hass.io setup and Node Red add-on I was unable to access group information from within Node Red, so all my alarm system and other automations that required that information broke.
It looked like Node Red could access some of the “person” data. At least it seemed to know about person.administrator, and it seemed I could get home/away status for that person. So I set up a synthetic person I called “occupied” (person.occupied) and set it to be home if all our cell phones were home. But for reasons I still don’t understand, I could not get Node Red to see this Home Assistant object.
Finally, I noticed that the synthetic sensors I created using the template sensor platform were accessible from Node Red. So I created sensor.occupied, which has a value of either “home” or “not-home” based on the device tracker data for everyone. That works.
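In configuration.yaml the template sensor looks roughly like this; the device_tracker entity IDs are placeholders for our actual phones, and whether you want “any phone home” or “all phones home” is a one-line change in the template:

sensor:
  - platform: template
    sensors:
      occupied:
        friendly_name: "Occupied"
        # "home" if any of the (placeholder) phones report home, otherwise "not-home"
        value_template: >-
          {{ 'home'
             if is_state('device_tracker.my_phone', 'home')
             or is_state('device_tracker.spouse_phone', 'home')
             else 'not-home' }}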
Node Red Add-ons
As a note to self: the add-ons I am using within Node Red are node-red-contrib-light-scheduler and node-red-node-email. In my earlier setup I also had an add-on to communicate with Home Assistant, but with Hass.io that is baked in when you add Node Red to the setup.
Z-Wave
Until a few days ago I was unaware that you can access a USB device by ID. So use /dev/serial/by-id/usb-0658_0200-if00 instead of /dev/ttyACM0 (or the /dev/zwave symbolic link I set up on previous incarnations). I don’t know why that isn’t mentioned more often: it really makes connecting to the Z-Wave USB stick easy and reliable.
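In configuration.yaml that just means handing the Z-Wave integration the by-id path via usb_path:

zwave:
  # the by-id path is stable across reboots, unlike /dev/ttyACM0
  usb_path: /dev/serial/by-id/usb-0658_0200-if00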
Z-Wave Antenna
Regarding the Z-Wave USB stick: the antenna is inside the stick, and where I placed my Raspberry Pi isn’t the best spot for RF reasons. There is a router, NAS box, switch, modem, etc. all adjacent, brimming with metal and electronics.
But a spare USB extension cord works well for repositioning the USB stick to a place in my server closet where there is good access to most of the Z-Wave devices in my network.
Graph of Z-Wave network
In my old setup I had a weird mix of things to get a Z-Wave network graph to show up. It is much easier now: the files you need are all on GitHub, but it wasn’t obvious to me how to use them until I read a blog post that made it clear.
And in that same post there is mention of using a long USB cable to move the Z-Wave USB stick to a better location. I really did think about it and relocated my USB stick using a USB extension cable a long while back, not just after reading that blog today. Great minds think alike (dumb ones too).
Samba
Out of the box, Hass.io doesn’t support file sharing or terminal access, but the Samba share add-on works pretty well. I haven’t found an SSH add-on yet, but with the Samba share I don’t really need one at this time since I can now easily edit the files that are not customizable via the web interface.
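Getting the share up only takes a handful of options in the add-on configuration. The option names below are from memory and may differ between add-on versions, so treat this as a sketch:

# sketch of the Samba share add-on options; names may vary by version
workgroup: WORKGROUP
username: hass
password: "********"
interface: ""
allow_hosts:
  - 192.168.1.0/24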
Reducing Storage Usage
Left in its default configuration, the history log database can become huge (it eventually led to an out-of-storage situation on my old setup). This time around I’ve found that you can have older data purged from the history, so I’ve added an entry in configuration.yaml to only keep one day of data:
recorder:
  purge_keep_days: 1
  purge_interval: 1