Automatic backups to ssh'ed computer?

Currently we only have one pioreactor running, so the database can’t be backed up onto any other workers, and it’s only in one place :melting_face:

Would it be sensible to set up my PC as a fake worker so that the syncing happens automatically? Tailscale is normally running on both devices so they tend to be on the same network. This would also have the advantage that I can analyse data locally etc. without having to be connected to the pioreactor.

@rafik.n I think you mentioned you had some automatic backup thing going on?

> set up my PC as a fake worker

I think that would cause more noise than you would like (e.g. lots of "could not connect / execute X" errors).

A simpler solution might just be to set up a weekly (?) cron job, on either computer, to scp the ~/.pioreactor/storage/pioreactor.sqlite.backup file to the remote computer.
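For example, a minimal sketch of such a crontab entry, assuming key-based SSH auth is already configured and the PC is reachable as `my-pc` over Tailscale (the hostname, username, and destination folder are all placeholders):

```
# Hypothetical weekly backup: runs Sundays at 03:00 on the Pioreactor's Pi.
# The destination folder pioreactor-backups/ must already exist in the
# remote user's home directory. Note that % must be escaped as \% in crontab.
0 3 * * 0 scp ~/.pioreactor/storage/pioreactor.sqlite.backup user@my-pc:pioreactor-backups/pioreactor-$(date +\%F).sqlite.backup
```

Dating the filename this way keeps a rolling history of backups rather than overwriting a single file each week.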


(Much) later, we'd like plugins that sync databases to cloud providers, and this could include local computers, too.


Hey @noahsprent,

My apologies that I missed your tag. What I had mentioned previously was not about backing up everything. It was more about exporting custom data directly into a Python script I run on my PC. Shown here: Export custom measurements via UI - #2 by CamDavidsonPilon

Funnily enough, we seem to be working on similar problems at the same time. I have been exploring the exporting/importing scripts, which is similar to backing up. A very easy solution would be to set up a second Pioreactor as a worker on a Raspberry Pi Zero 2 W instead of using your computer.

An intermediate approach before full cloud integration, which @CamDavidsonPilon mentioned, could be to set up a folder on the Pioreactor that is backed up automatically to a cloud folder (such as Google Drive), and have a script similar to an export script run at a set backup frequency to write into that folder. The export script right now is not ideal, as it shuts everything down in order to run, which could interrupt experiments.
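One way this could look, as a sketch rather than a tested setup: use rclone to push a local backups folder to Google Drive. It assumes you have already configured a Google Drive remote via `rclone config` (here named `gdrive`); the folder names and paths are placeholders.

```
#!/bin/bash
# Hypothetical sketch: stage the existing SQLite backup file in a local
# folder, then push that folder to a Google Drive remote with rclone.
# Nothing is stopped or restarted, so running experiments are unaffected.
mkdir -p ~/backups
cp ~/.pioreactor/storage/pioreactor.sqlite.backup ~/backups/pioreactor-"$(date +%F)".sqlite.backup
rclone copy ~/backups gdrive:pioreactor-backups
```

Scheduled with a cron job like the one above, this would run at whatever backup frequency you choose, without interrupting anything on the Pioreactor.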