12/24/2022

Sync folders and files on Linux with rsync and inotify

So: you've got two or more clients and/or servers. They contain files that you want to have automatically synced whenever possible, because that would save a lot of time. Well, I've got the solution for you: with a little bit of innovative thinking, I have found an approach that might bring you onto the right path as well.

Rsync is a great solution, but having to run rsync manually would take a lot of unnecessary time, right? And that is where inotify comes in: real-time monitoring of your filesystem, so that your files can be synced between multiple machines with the power of rsync!

I could see potential for workflow improvement in these situations:

- A development environment, where constant file transfers are taking up a lot of (or too much) time.
- Load-balanced file storage clusters/servers.
- Backup/failover servers with the need for constant replication.

I needed a nice development environment first, so I started off with 3 virtual servers, all running Ubuntu 16.04, my personal favourite. All 3 machines needed to be set up with the following software packages: openssh-server, rsync and inotify-tools. Also noteworthy: these machines are absolutely not connected through a private network, so all rsync traffic is carried over SSH.

First things first: we need to get all the dependencies installed on the 3 servers with this one-line command:

apt update && apt -y install openssh-server rsync inotify-tools

After that, let's create a specific folder that we want to sync. Let's call it SyncFiles:

mkdir /opt/syncfiles

And for secure file transfer, we want a public/private key pair for the transfer link that rsync uses. This is how to configure it:

ssh-keygen -t rsa -f ~/rsync-key -N ''
# Remember to execute this script on all servers separately!
# Removing public key for security purposes.
# Paste the output in your destination servers' ~/.ssh/authorized_keys file:

Then, copy the output of the script into all of your servers' authorized_keys files.

A related question takes this idea to its extreme: why stop at one folder?

I think that the time is ripe to have my whole Ubuntu synchronized just as my Dropbox folder is. Given that we are always talking about files and directories, what's the difference between my Documents folder and my /usr system directory? Almost none, except for their location.

In fact, I think that there is just one big issue that prevents people from having their beloved installations mirrored wherever they go: symlinks. Dropbox, Google Drive, Ubuntu One, SugarSync, SkyDrive: none of these services supports symlinking. This means that if I push a symlink into one of the synced folders, locally the symlink is kept as is, but remotely (in the cloud or on the other synced machines) the symlink is resolved to the actual file that it originally pointed to. This completely breaks Linux installations, so these services can't be used for this purpose.

So the question is: does anybody know a way to achieve a completely synchronized Ubuntu, always synchronized with a remote running copy, but still locally stored on both disks? I know that /boot, /dev, /proc, /run, /tmp and device-specific mount points in /mnt and /media will have to be left out of the sync mechanism. The main difference between Dropbox and NFS is that NFS is a remote filesystem that always forces remote access to the files, while Dropbox pushes modifications to local filesystems (and would thus perform better). Does anybody know whether such a solution could approximate Dropbox in this sense? Can this be done with reasonable performance, given reasonable resources (e.g. ~1 Mbps upload bandwidth and a public IP address)?

An answer:

My inclination is that you are opening a bit of a Pandora's box here.

How do you deal with different architectures? Some parts of the filesystem consistently include the architecture as part of the path (/usr/lib/x86_64-linux-gnu and friends), but I don't think this applies everywhere, and in any case you then have to determine which symlinks (and hardlinks) should be copied, and which are architecture-specific.

How do you detect when significant files are changed? Should a program be restarted if its binary is replaced, or its configuration files updated? Some programs will detect such changes with inotify, some will not.

If you are just synchronising files while the system is running, it is hard to know which running processes might be affected. (You could look at the apt file lists, or at open file descriptors, but there will still be edge cases. Consider what happens if your sync mechanism replaces libc or something similar.)

I am inclined to think that remote management (e.g. using Puppet or Ubuntu's apt synchronisation mechanism) is more appropriate than continuous synchronisation of the whole machine image. You should probably also add /var to the files not to sync (merging logs from multiple systems seems to make little sense).
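The post never shows the actual watch loop that ties inotify and rsync together, so here is a minimal sketch of how inotifywait and rsync are typically combined for this kind of setup. The function name, the peer host and the paths are illustrative assumptions, not from the original post:

```shell
#!/bin/bash
# sync_on_change: watch a directory and push every change to a peer
# with rsync over SSH. Runs forever; start one instance per peer server.
# The remote spec and the ~/rsync-key path are assumptions for illustration.
sync_on_change() {
    local watch_dir="$1" remote="$2"
    while inotifywait -r -e modify,create,delete,move "$watch_dir"; do
        # -a preserves symlinks, permissions and timestamps; -z compresses;
        # --delete mirrors local deletions on the remote side.
        rsync -az --delete -e "ssh -i ~/rsync-key" "$watch_dir" "$remote"
    done
}

# Example invocation (hypothetical peer):
# sync_on_change /opt/syncfiles/ user@peer-server:/opt/syncfiles/
```

Note that plain inotifywait exits after the first event, so the loop re-arms the watch each time; for busier directories, `inotifywait -m` piped into a `while read` loop avoids re-establishing the watches on every change.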
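The exclusion list from the question (plus /var, as the answer recommends) translates directly into rsync `--exclude` options. A sketch under that assumption; the destination host and path are purely hypothetical:

```shell
#!/bin/bash
# Directories the question says must be left out of a whole-system sync,
# plus /var per the answer's advice about merged logs.
EXCLUDES=(--exclude=/boot --exclude=/dev --exclude=/proc --exclude=/run
          --exclude=/tmp --exclude=/mnt --exclude=/media --exclude=/var)

# -a archive mode, -H preserve hardlinks, -A ACLs, -X extended attributes;
# --delete keeps the remote copy an exact mirror. Destination is illustrative:
# rsync -aHAX --delete "${EXCLUDES[@]}" / user@remote:/backup/rootfs/
```

Even with these exclusions, the answer's caveats stand: a live root filesystem can change underneath rsync, so this is closer to a periodic backup than to the Dropbox-style continuous sync the question asks for.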