Backup in Linux Servers - Docker Volumes, and Databases

  • Added Jul 4, 2024
  • Backup in Linux doesn't need to be complicated. I'll show you backup strategies and tools to create a reliable backup for your entire Linux server. You can use this perfectly in combination with Docker Volumes and Databases. #Duplicati #Docker #HomeLab
    DOCS: github.com/christianlempa/vid...
    Linuxserver.io: docs.linuxserver.io/images/do...
    Database Backup Script: github.com/christianlempa/scr...
    Nginx Proxy Manager Tutorial: • Nginx Proxy Manager - ...
    Portainer Tutorial: • Portainer Install Ubun...
    Docker Tutorial: • Docker explained simply
    Follow me:
    TWITTER: / christianlempa
    INSTAGRAM: / christianlempa
    DISCORD: / discord
    GITHUB: github.com/christianlempa
    PATREON: / christianlempa
    MY EQUIPMENT: kit.co/christianlempa
    Timestamps:
    00:00 - Introduction
    01:05 - Backup Strategies
    02:40 - Incremental Backups
    04:32 - Deploy Duplicati on Portainer
    09:19 - Create a simple Backup Job
    12:28 - How to Backup Databases
    14:57 - Database Backup Script
    18:18 - Backup Database Dumps with Duplicati
    19:35 - Automatically run the Script
    ----
    All links with "*" are affiliate links.

Comments • 169

  • @albert21994
    @albert21994 8 months ago +4

    Thank you Christian for going the extra step with the Databases. This kind of content is really missing on YT as most tutorials cover the "easy" stuff.

  • @walking_on_earth
    @walking_on_earth 2 years ago +3

    You’re doing great work man! Your videos are so easy to follow and helpful! Thanks for all your efforts on this channel :)

  • @myr3434
    @myr3434 3 years ago +6

    I was literally contemplating just today how to back up my databases on my home server. Thank you for posting - your timing was perfect, as was the info.

  • @thewestindianboy
    @thewestindianboy 2 months ago +1

    You earned a new subscriber. Your videos are better than a lot of tech guys out there. Respect from India🙏

  • @belaircomputerguyllc4001
    @belaircomputerguyllc4001 3 years ago +7

    SOLID videos. I've watched dozens of yours. I hope your channel explodes!

  • @MrBoydheeres
    @MrBoydheeres 3 years ago +4

    It's amazing how much my home network project with Docker, Docker Compose and all the containers I run lines up with the videos you put out. Sometimes I get the feeling you are seeing my project notes. Just this week I started fiddling with backups, and what do you know, a new TDL video to explain exactly what I need to know.
    Keep up the awesome work man. Your videos really help me enjoy the process of creating and managing my own setup.

    • @christianlempa
      @christianlempa  3 years ago +1

      This is awesome! I hope my next videos will also fit in well with your projects 😍

    • @MrBoydheeres
      @MrBoydheeres 3 years ago +1

      @@christianlempa I am trying to get the last part of the backup implemented with systemd to run the job automatically. Sadly, however I try to set it up on my RPi 4, I get permission error messages. I have no clue why it is failing. Any suggestions?

  • @matthaeusdoerksen6804

    This video is great! I didn't know that I can back up Docker volumes with Duplicati. Thank you Christian!

  • @partretropartnostalgia5439

    Database backups! I’ve been meaning to dive into this, thanks for the overview of how you’re managing this!

    • @christianlempa
      @christianlempa  3 years ago

      Glad you liked it 🙂

    • @mechbear
      @mechbear 2 years ago

      @@christianlempa Not only did we like it, it's really useful. Thanks!

  • @alimohammadi6015
    @alimohammadi6015 a year ago +1

    A lot of thanks go to you for presenting this complete backup solution!

  • @LarsEjaas
    @LarsEjaas a year ago

    Just wow. What a video. Didn't know of Duplicati. This is just what I was looking for. Thanks!

    • @christianlempa
      @christianlempa  a year ago

      Glad it was helpful!

    • @LarsEjaas
      @LarsEjaas a year ago

      @Christian Lempa It's super helpful! Even for a frontend dev like me. I'm just slowly learning DevOps stuff like Docker and Bash - so I need to take things slow. But your video and repo are incredibly helpful! Thanks again 🙏

  • @stevendonaldson1216
    @stevendonaldson1216 2 years ago +6

    As a newbie I am able to understand and follow your explanations. Even though it's hard for me to deeply understand what's going on, I can still pick up on things to learn next. I would love to see a Linux survivor's guide by Christian TDL. I'd watch a nice, long, in-depth Linux introduction course by this guy.

    • @christianlempa
      @christianlempa  2 years ago +2

      Thanks for the great feedback! I'm thinking about more Linux content indeed 😀

  • @arielalejandro6900
    @arielalejandro6900 a year ago

    Very clear and well explained, thanks for taking the time to make this video.

  • @HEWfunkingKNEWit
    @HEWfunkingKNEWit 3 years ago +2

    Thanks for the demystification of backups. Great video.

  • @yachalupson
    @yachalupson 3 years ago +1

    So useful, thank you so much Christian! The DB backup script I can just about read... but couldn't have written it.

  • @JeanFrancoCaringi
    @JeanFrancoCaringi 3 years ago +4

    The db backup script blows my mind! Thanks!
    Please more videos about db restore!

  • @lukestagg2138
    @lukestagg2138 a year ago

    Really love your channel Christian! Thank you

  • @qiuyue4082
    @qiuyue4082 3 years ago +14

    From the docs:
    > The Duplicati project was inspired by Duplicity and had similar functionality until 2008
    I believe Duplicati is an Italian word and it's pronounced du-plee-ka-tee

    • @christianlempa
      @christianlempa  3 years ago +5

      Oh yeah, I knew I was going to mispronounce it :D thanks bro!

  • @randallroach
    @randallroach 3 years ago

    Many thanks Christian, happy to sub!

  • @tarcisio_menezes
    @tarcisio_menezes a year ago

    Awesome video! Thank you!

  • @mohammadshakir1664
    @mohammadshakir1664 3 years ago

    Awesome Christian

  • @patrickp.harvey848
    @patrickp.harvey848 10 months ago

    Thank you for this great post, grateful!

  • @maginos1310
    @maginos1310 2 years ago

    Thank you Christian for this awesome video, it is really helpful!
    With the help of your tutorial, I am now able to backup the docker containers for my Zabbix installation. Nevertheless, during the setup process of Duplicati and your db backup script I had to overcome two hurdles and I would like to share how I solved the different challenges.
    The Zabbix installation has four different containers and the images are called "zabbix/zabbix-agent", "zabbix/zabbix-web-nginx-mysql" "zabbix/zabbix-server-mysql" and "zabbix/zabbix-java-gateway". In addition to these four containers, I run a mariadb container for the db. When the first line of your backup script (docker ps ....) is executed, not only the mariadb docker is in the output, but also "zabbix-web-nginx-mysql" and "zabbix-server-mysql", which both do not contain a database. So I had to adapt this line to the following:
    CONTAINER=$(docker ps --format '{{.Names}}:{{.Image}}' | grep 'mariadb' | cut -d":" -f1)
    With this, the output of the docker ps command is only the mariadb container.
    For the execution of this script, I use a cronjob.
    I also tested the restore of the mysql dump and since it is not explained in detail how the mysql dump is restored, here's how I do it:
    - Recreate the docker container with cli command or docker-compose file.
    - Mount the Backup directory from the backup script in the mariadb container (e.g. /appdata/mariadb/backup:/backup)
    - Go to the backup directory and unzip the dump file: cd /appdata/mariadb/backup && sudo gunzip name-of-dump-file.sql.gz
    - Connect to mariadb docker console: docker exec -it -u 0 name-of-mariadb-docker /bin/bash
    - Restore the mysql db with following command: mysql --user username-of-database --password -p name-of-database < /backup/name-of-dump-file.sql
    After that, I restart all Zabbix containers and also the mariadb container and everything is fine.
    Once again, thank you for this awesome tutorial!
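
    For reference, a minimal sketch that strings the restore steps above together; the container, database and file names are placeholders taken from the comment, not verified values:

    # 1. Unpack the gzip-compressed SQL dump that the backup script wrote.
    cd /appdata/mariadb/backup
    sudo gunzip name-of-dump-file.sql.gz

    # 2. Open a root shell in the mariadb container (the backup directory is mounted at /backup).
    docker exec -it -u 0 name-of-mariadb-docker /bin/bash

    # 3. Inside the container: restore the database from the dump, entering the password when prompted.
    mysql --user username-of-database -p name-of-database < /backup/name-of-dump-file.sql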

    • @christianlempa
      @christianlempa  2 years ago

      Thank you so much! Glad you liked it

    • @maginos1310
      @maginos1310 2 years ago

      @@christianlempa FYI: I've set it up now on two hosts, including Telegram notifications. For this I used the Email2Telegram bot, which generates a mail address to which Duplicati can send reports. It's easy to set up and I like it. 😎

  • @Baker00552
    @Baker00552 2 years ago +2

    You could also use the "--run-script-before" option in Duplicati to run the database backups before the Duplicati backup.
    Also, you pronounced it wrong: Duplicity != Duplicati. They are 2 different products.
    Anyways - informative video.
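
    As a hedged illustration of that suggestion (not the method shown in the video): Duplicati accepts --run-script-before as an advanced option on a backup job, so a pre-backup hook could look roughly like this, assuming the dump script path is reachable from wherever Duplicati runs:

    #!/usr/bin/env bash
    # /scripts/pre-backup.sh -- hypothetical hook that dumps the databases right
    # before Duplicati starts its file backup; the script path is an assumption.
    /home/user/scripts/db-container-backup.sh

    # In the backup job's "Advanced options" you would then add:
    #   --run-script-before=/scripts/pre-backup.sh
    # Inside a Duplicati container the hook must be mounted into the container,
    # and any docker commands it runs need access to the Docker CLI/socket.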

  • @rodrigo.55
    @rodrigo.55 a year ago +1

    Very useful video!

  • @PetritK10
    @PetritK10 3 years ago

    Amazing, more videos about Linux administration bro, thank you :D

  • @benjamink7311
    @benjamink7311 3 years ago +1

    Very good video. I'd like to see other open-source options, preferably Bacula in Docker, please.

  • @vergil3656
    @vergil3656 a year ago

    Hello Christian, great video!
    Thanks for this detailed explanation of database backups.
    I have an integrity check enabled on my NAS, and I read that this should not be enabled for folders that contain virtual machines and databases (maybe it's the same reason you explained in your video).
    So does that mean I can not protect my VMs and databases from bit-rot, or is there another solution?
    What would you do?

  • @mattiavadala7870
    @mattiavadala7870 a year ago

    Awesome tutorial! I made some changes to the SH file because it's Vaultwarden now. I haven't tried yet to restore the corresponding DB with the SQL file to another Vaultwarden container (in order to simulate a disaster or just the breakdown of my Pi). I note that the SQL file is encrypted, so I think I will have to decrypt it somehow, right? :)

  • @laurentiusjudhianto6631

    Hi Christian, it's an awesome explanation and video. However, I think it's best to add --single-transaction --quick to the database dump, especially on a database that keeps changing frequently, for example when there are cronjob processes running (CMIIW)
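
    For context, --single-transaction and --quick are standard mysqldump flags; a hedged sketch of where they would go (container, database and password variable are placeholders):

    # --single-transaction dumps InnoDB tables from a consistent snapshot without locking them;
    # --quick streams rows instead of buffering whole tables in memory.
    # ROOT_PW is assumed to hold the database root password.
    docker exec -e MYSQL_PWD="$ROOT_PW" my-db \
      mysqldump --single-transaction --quick -u root mydb | gzip > mydb.sql.gz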

  • @flobow8446
    @flobow8446 a year ago

    Great work and clear instructions. I use Duplicati with Backblaze on my server and am currently reworking my infrastructure with new servers. Good idea to grep the root password out of the container itself so you don't have it in the script at all; I will probably use your bash script for MariaDB. Learned something new about using systemd timers instead of a regular crontab.
    As an addition, a quick guide on stopping containers (like Gitea) before doing a volume backup would be great for others. Otherwise I assume you run into lock errors or data consistency errors, too. Examples I have are Gitea and Home Assistant (at least the .storage folder).

    • @christianlempa
      @christianlempa  a year ago

      Thank you :) Good point, I might give my scripts a rework soon, as I haven't used and updated them for a long time

    • @flobow8446
      @flobow8446 a year ago

      @@christianlempa I gave the Duplicati container Docker access (like you would do for Portainer) and then I use a shared script folder where I have some pre and post scripts that run docker stop containerxyz and docker start containerxyz. With args you could even use just one script 😀
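
      A minimal sketch of that idea, with container names as placeholders - one hook script that takes stop or start as its argument and is called before and after the Duplicati job:

      #!/usr/bin/env bash
      # backup-hook.sh -- hypothetical pre/post hook: "backup-hook.sh stop" before
      # the volume backup, "backup-hook.sh start" after it.
      set -euo pipefail
      ACTION="${1:?usage: backup-hook.sh stop|start}"
      CONTAINERS="gitea homeassistant"   # assumption: containers holding locks/open files
      for c in $CONTAINERS; do
        docker "$ACTION" "$c"
      done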

  • @PaulChauvat
    @PaulChauvat a year ago

    Super video! Thanks a lot. I just have a question: what would be the fastest protocol to back up between two servers? (Rsync is decent, but for terabytes of data it gets tricky, and for replication there are a lot of options: SFTP/FTP etc...)
    Also, a next video idea would be disaster recovery, after a breach or a physical loss :)

  • @pavelperina7629
    @pavelperina7629 a year ago +1

    Good video. In my opinion it misses one part: an example of how to restore the backup.
    Also, some backup scenarios can be done using btrfs filesystem snapshots (likely works with zfs, possibly some other filesystems).
    I already had some fun migrating a few docker volumes and part of a docker-compose setup from openSUSE (docker) to Fedora (using podman). Hopefully podman has a command for importing a volume from a tar file.

  • @EvolverDX
    @EvolverDX a year ago

    Duplicati/Duplicati is the officially maintained Docker image on Docker Hub.

  • @lauerdk
    @lauerdk 3 years ago +2

    Great video which helped me to optimize my backup. However, you could add something about the 3-2-1 strategy, e.g. how to use it to save offline backups in other places.

    • @christianlempa
      @christianlempa  3 years ago +1

      Thanks, that's probably a good idea to include in my Proxmox videos

  • @streambarhoum4464
    @streambarhoum4464 a year ago +1

    Please do bash scripting tutorials, we need those a lot.
    Thanks for the great job you are doing on your wonderful channel, Christian 😍😊✨😎🙏

    • @christianlempa
      @christianlempa  a year ago +1

      Thank you so much :) great suggestion by the way!

  • @randallroach
    @randallroach 3 years ago +4

    Hi Christian, will you please consider a companion video on restoring these database types (mariadb, mysql, sqlite) from the compressed file? There are different methods out there, but it would be great to hear your preference.

    • @christianlempa
      @christianlempa  3 years ago +1

      Hmm, it's probably not a topic for an entire video, but I usually back up and restore everything with SQL files compressed with gzip, as the performance is good but you can still easily decompress and access them. The SQL files are mostly the same for all database types, with just a few exceptions.
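
      As a hedged illustration of that workflow (container, database and variable names are placeholders), a dump-and-restore round trip with gzip might look like this:

      # ROOT_PW is assumed to hold the database root password.
      # Dump: run mysqldump inside the db container and compress on the host.
      docker exec -e MYSQL_PWD="$ROOT_PW" my-db mysqldump -u root mydb | gzip > mydb.sql.gz
      # Restore: decompress on the host and pipe it straight back into the container.
      gunzip -c mydb.sql.gz | docker exec -i -e MYSQL_PWD="$ROOT_PW" my-db mysql -u root mydb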

    • @randallroach
      @randallroach 3 years ago +1

      @@christianlempa Makes sense, thank you!

  • @xornei
    @xornei 2 years ago +1

    Yes please, can you do that video comparing the different solutions :'(

    • @christianlempa
      @christianlempa  2 years ago

      Thank you! I'm going to revisit my backup strategy sometime next year with my new server; until then I'm testing a few different apps. So hopefully that will be what you're looking for :)

  • @alexanderneunsinger
    @alexanderneunsinger 4 months ago

    Very, very cool video. I want to suggest a video about automatic backup monitoring, because it's one thing to have automatic backups, but another thing to monitor whether the backup has run completely without errors. I would love to see a tutorial about that, maybe with suggestions for backup monitoring methods (email, SNMP, log file check...). Keep up the amazing work! 🎉

  • @LungenStrudel
    @LungenStrudel a year ago

    Thanks for this video, Christian! Two questions: 1) Why are you setting up individual volumes for your services? Wouldn't it be easier to just back up the entire /var/lib/docker/volumes/ directory and exclude stuff you do not want? 2) Does it make sense to also back up Duplicati's own data this way? I'm asking because when my system breaks completely, which is not unlikely for a Raspberry Pi running off an SD card, I would need a running instance of Duplicati to restore data from my backup.

    • @maiastrillo
      @maiastrillo a year ago +1

      I completely agree with you! What if the system where Duplicati is installed crashes and we don't have a working copy of the database? Found the answer: install a new instance of Duplicati, then restore using the previously saved Duplicati archives as the backup source.

    • @rajeshjsl
      @rajeshjsl 8 months ago

      I agree with point 1 and I think that's the way to go. (Just deselect the database files to avoid duplication.)

  • @guguge
    @guguge 2 years ago

    I seem to have a problem with your script for restoring containers from backups. I used the scp command to send a backup of the bitwarden file to the /root/data/docker-data/bitwarden/backup.tar path on my other server, and then executed the following command,
    docker run --rm --volumes-from bitwarden -v $(pwd):/backup busybox bash -c "cd /data && tar xvf /root/data/docker-data/bitwarden/backup.tar/backup/backup.tar --strip 1"
    which reports an error,
    docker: Error response from daemon: OCI runtime create failed: container_linux.go:380: starting container process caused: exec: "bash": executable file not found in $PATH: unknown.
    ERRO[0000] error waiting for container: context canceled
    and I don't know how to fix it; can you help me see how to deal with it?
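
    As a hedged aside, the "bash": executable file not found error is because the busybox image only ships sh, not bash. A corrected sketch, assuming the archive sits in the current directory that gets mounted at /backup (the strip option is carried over from the original command as the long form):

    # busybox provides sh, not bash; reference the archive via the /backup mount.
    docker run --rm --volumes-from bitwarden -v "$(pwd)":/backup busybox \
      sh -c "cd /data && tar xvf /backup/backup.tar --strip-components=1"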

  • @AfroJewelz
    @AfroJewelz 7 months ago

    Have you considered running the mariadb data volume on an Optane drive or in memory, then routinely running online incremental backups that dump to non-volatile storage and also check whether the backup is valid? Because when I was playing around with my PhotoPrism instance, I found that when the db runs in memory it's quite fast and rarely falls into their shitty SQL-lock situation. They really should upgrade their ORM to gorm v2.

  • @chetanchetan-dn4pr
    @chetanchetan-dn4pr 9 months ago +1

    This is amazing, but I only understand some of it. Basically, I want to back up a MySQL docker container and run it on another server, so how do I do that on Ubuntu using commands? Please explain it to me 🥺

  • @dragon3602010
    @dragon3602010 2 years ago +1

    Awesome, do you know how to not expose the ports of the containers when we are using Nginx Proxy Manager, like with Traefik?

    • @christianlempa
      @christianlempa  2 years ago

      You can simply redeploy your container without the port mapping.
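
      A minimal sketch of that idea, with image and container names as placeholders: put the app on the same user-defined network as Nginx Proxy Manager and drop the port mapping; NPM then reaches the app by its container name.

      docker network create proxy 2>/dev/null || true
      docker network connect proxy nginx-proxy-manager   # assumption: the NPM container's name
      docker run -d --name myapp --network proxy myapp-image:latest   # note: no -p flag
      # In the NPM UI, set "Forward Hostname / IP" to "myapp" and use the app's internal port.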

  • @gennaroc3500
    @gennaroc3500 2 years ago

    Hi,
    I have been following you for a long time and I really appreciate your content. You are very clear and professional.
    I'm doing my master's thesis about Docker containers and I'm dealing with redundancy in Docker containers. Can you point me to more information, books, readings, videos and so on about this topic? Thank you in advance for the help.
    Bye!

    • @christianlempa
      @christianlempa  2 years ago

      Hey mate, sorry for being late. I can't really recommend anything, because I'm always learning by doing. I've not gone through any training course or video about containers.

  • @wchorski
    @wchorski 2 years ago

    Would you recommend this if I were to back up the whole system, Linux and/or Windows based, where I can pick up right where I left off on a new machine?

    • @christianlempa
      @christianlempa  2 years ago

      You can back up the whole system, Duplicati is also available on Windows

  • @DjLundbladh
    @DjLundbladh a year ago

    Is it possible to back up the whole system, so I can restore it if something bad happens?

  • @wildflowers465
    @wildflowers465 a year ago +1

    Duuude, Duplicati just saved me about 4-6 hours of lost work! Somehow I lost a large amount of untracked files that I had stashed in git, while I was doing a partial revert and interactively staging hunks and then popping said stash after committing (I was distracted, I still don't know how they were lost). How could this happen to *me*? I "never" lose files :) I simply restored last night's backup and copied them back into the repo. Done.

  • @sahibvirk
    @sahibvirk a year ago

    You are my God!

  • @impact0r
    @impact0r 3 years ago +2

    Can you create a video on creating a WireGuard network between servers at different locations so that they can communicate directly with each other while not disturbing their individual internet and container connectivity?

    • @christianlempa
      @christianlempa  3 years ago +3

      Sure, I did a video on it: czcams.com/video/bVKNSf1p1d0/video.html

  • @heinzbroehl4598
    @heinzbroehl4598 2 years ago +1

    Maybe you can think about doing a video on setting up an external drive, maybe on a friend's Fritzbox, as an encrypted external backup target?

    • @christianlempa
      @christianlempa  2 years ago

      Hmm, I might actually do more videos about backup solutions, but I would probably do something about TrueNAS, Proxmox, something like this, because I don't have a Fritzbox myself :D

    • @heinzbroehl4598
      @heinzbroehl4598 2 years ago

      @@christianlempa Proxmox is in use in my homelab; TrueNAS is in my case a Nextcloud, and the fileserver thing is done by Samba.
      Last week a little fire happened in my basement. That restarted my planning for an external backup. But I am finding no "simple & free" solution, and it's independent of a Fritzbox :)

    • @christianlempa
      @christianlempa  2 years ago +1

      I'm still at the very beginning of my Home Lab journey, but there is more stuff to come in the future! :)

  • @midteknologi
    @midteknologi 2 years ago

    You must try Acronis cloud backup, very easy to use :)

  • @ws_stelzi79
    @ws_stelzi79 a year ago

    Hey, this anachronistic (very old) .NET thing is, even almost 2 years later, still in beta, and the devs really struggle to migrate to ANY newer .NET version. Coding-wise this thing is a blast from the olden pre-Nadella evil-Microsoft era.

  • @FredrikRambris
    @FredrikRambris a year ago

    Take a look into restic-backup

  • @sybertron1591
    @sybertron1591 a year ago

    Hey, I get the error "mysqldump: Got error: 1045: "Access denied for user 'root'@'localhost' (using password: YES)" when trying to connect" twice when I execute the Nextcloud script for the database. Does someone have an idea how to fix this?

  • @razoreddy9225
    @razoreddy9225 2 years ago

    Right before it happened I thought to myself "Man that cap looks like it's about to pop.", then a second later it did. Stuff of nightmares right there.

  • @asmp7
    @asmp7 2 years ago

    Any video about web-based office suites, like OpenOffice or some alternative to Google Sheets or 365???

    • @christianlempa
      @christianlempa  2 years ago

      Probably not, I've done a few tests with OnlyOffice and LibreOffice and I have to say, O365 is just better (for me)

  • @mistakek
    @mistakek 2 years ago +1

    Great video, it just would have been extra awesome if you could have finished by showing how to restore the databases. I figured out MySQL, but I can't figure out SQLite for Vaultwarden yet.
    I made the appropriate changes to your backup script, from bitwardenrs to vaultwarden, and it appears to back up, but I need to restore to test this.
    My current backup system for Vaultwarden is: I'm just copying the entire folder from my Synology NAS with Syncthing to my Ubuntu docker server, and running a script to copy over the Vaultwarden folder every 30 mins. It appears to work perfectly fine with both source and destination containers running, but I know it's not best practice.
    I do this in case my Synology goes down for some reason; then I can just modify the URL in my app to access the backup Vaultwarden on the Ubuntu docker system and still have access to my passwords.
    If you could just reply to this with the correct command to restore the Vaultwarden database, that would be great.
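
    A hedged sketch of an SQLite restore, under the assumptions that the backup produced a plain SQL dump (e.g. via sqlite3 .dump), that the Vaultwarden data volume is bind-mounted at ./vw-data on the host, and that the database file is db.sqlite3 - none of which is confirmed by the video:

    # Stop the container so nothing writes to the database during the restore.
    docker stop vaultwarden
    # Move the old database aside and rebuild it from the dump.
    gunzip -c vaultwarden-db.sql.gz > vaultwarden-db.sql
    mv ./vw-data/db.sqlite3 ./vw-data/db.sqlite3.bak
    sqlite3 ./vw-data/db.sqlite3 < vaultwarden-db.sql
    docker start vaultwarden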

    • @christianlempa
      @christianlempa  2 years ago +1

      Yeah you're right that should be part of it. I hope I'll update my GitHub docs to include that. Thanks for giving me a heads up.

  • @esit2082
    @esit2082 a year ago +1

    This is great stuff, but the volumes section is confusing me. A more detailed look at what is going on here would be helpful; it appears you have created a volume called 'duplicati-config' before setting all this up?
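
    A minimal sketch of that setup, assuming the linuxserver.io image and treating the volume and path names as placeholders: the named volume holds Duplicati's own settings and local job database, and the data to back up is mounted read-only as a source.

    docker volume create duplicati-config

    docker run -d --name duplicati \
      -p 8200:8200 \
      -v duplicati-config:/config \
      -v /var/lib/docker/volumes:/source:ro \
      lscr.io/linuxserver/duplicati:latest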

  • @francoisduprez6188
    @francoisduprez6188 2 years ago +1

    Great video, learned a lot from it. But sadly I can not use your db backup script, because I use one mariadb container with 5 databases from different containers, and I really don't understand how I can adapt your script to make a backup of every database.. :( Duplicati is great, it will be my daily driver for backups.. Keep going with your videos, they are great..

    • @christianlempa
      @christianlempa  2 years ago +1

      Thanks mate, yeah, bash scripting can be tough. But it's worth learning!
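
      A minimal sketch of how one might adapt it for a single mariadb container hosting several databases - container name, credentials and backup directory are assumptions, not the author's script:

      #!/usr/bin/env bash
      set -euo pipefail
      CONTAINER=mariadb
      BACKUP_DIR=/home/user/backups
      # Read the root password from the container's environment.
      ROOT_PW=$(docker exec "$CONTAINER" env | grep '^MYSQL_ROOT_PASSWORD=' | cut -d'=' -f2)

      # List all databases except the system schemas.
      DATABASES=$(docker exec -e MYSQL_PWD="$ROOT_PW" "$CONTAINER" \
        mysql -u root -N -e 'SHOW DATABASES' \
        | grep -Ev '^(information_schema|performance_schema|mysql|sys)$')

      # Dump each database into its own gzipped file.
      for db in $DATABASES; do
        docker exec -e MYSQL_PWD="$ROOT_PW" "$CONTAINER" \
          mysqldump -u root "$db" | gzip > "$BACKUP_DIR/${db}-$(date +%Y%m%d-%H%M%S).sql.gz"
      done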

  • @ashishyadav9613
    @ashishyadav9613 2 years ago +1

    Hey dude, I have been following a lot of your videos. You are doing an amazing job helping millions of IT professionals. Ah, I just need some advice if you have some time.
    I have a cluster of 3 Proxmox servers which are on hardware RAID. I do not have ZFS storage, and hence I am unable to configure HA in Proxmox.
    I have a few VMs (Linux, Docker) which have their disks on local storage on one of the Proxmox nodes. I wish to understand whether there is a way I can set up HA (using some clone/rsync?), where the data from the VMs of node 1 of Proxmox is cloned to another node. I may set it up to clone once a day, that should be fine, but do you think any tool can help me with this? I do know about keeping the disk on shared storage (NAS), but then again that is a single point of failure. I am looking for a solution where, if one of the nodes goes down, I can use the copy on another node to continue working.

    • @christianlempa
      @christianlempa  2 years ago +2

      Thank you so much :) I personally haven't looked into this too far tbh. I'm running a NAS server and try to use it as a central storage for Proxmox, but I'm still having some issues with it. As for now I can't really recommend what to do, but hey.. I'm working on it ;)

    • @ashishyadav9613
      @ashishyadav9613 2 years ago

      Thanks for the reply. I did some research and found people using a GlusterFS cluster. I am giving it a try and it appears to be a good option to have persistent storage synced across multiple nodes. I am setting it up as Docker's persistent storage.

  • @barkingbandicoot
    @barkingbandicoot 5 months ago

    Fantastic! The script works well!
    The command: sudo systemctl enable db-container-backup --now does not do anything for me!
    That is a bit concerning!
    I seem to have the directory entered correctly and the user. Has anything changed?

    • @barkingbandicoot
      @barkingbandicoot 5 months ago

      Well, the timer SHOULD have gone off - but I have no extra backups!
      I just need to change ExecStart=sh and User in the service file, no?
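
      A hedged sketch of the systemd wiring, with unit names, paths, user and schedule as assumptions - note that with a timer you enable the .timer unit, not the .service, which may be why enabling the service alone appears to do nothing:

      # /etc/systemd/system/db-container-backup.service
      [Unit]
      Description=Dump databases from Docker containers

      [Service]
      Type=oneshot
      # Assumption: a user that is in the docker group.
      User=youruser
      ExecStart=/bin/bash /home/youruser/scripts/db-container-backup.sh

      # /etc/systemd/system/db-container-backup.timer
      [Unit]
      Description=Run the database dump every night

      [Timer]
      OnCalendar=*-*-* 02:00:00
      Persistent=true

      [Install]
      WantedBy=timers.target

      # Reload and enable the timer (not the service):
      sudo systemctl daemon-reload
      sudo systemctl enable --now db-container-backup.timer
      systemctl list-timers db-container-backup.timer   # shows the next scheduled run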

    • @christianlempa
      @christianlempa  5 months ago

      Let's not discuss the tech topics here, why not join our Discord? You can share your findings and we'll update the docs :)

    • @barkingbandicoot
      @barkingbandicoot 5 months ago

      Ok, thanks Christian! I may do!
      PS: I ended up creating a cron job instead. @@christianlempa

    • @barkingbandicoot
      @barkingbandicoot 5 months ago

      @@christianlempa "You must verify your identity via phone before you can post .." 😯 Hell no!

  • @DanielGoepp
    @DanielGoepp a year ago

    What about a comparison of other backup solutions? Have you looked at restic, duplicacy, borg, arq, kopia? You seem to show the solution you picked, but not why you picked it. I'd love to see a breakdown of the pros / cons of the most popular solutions out there these days.

    • @christianlempa
      @christianlempa  a year ago +1

      Good idea, but I would need to do more research on it. I might start with another tutorial about backups; however, I currently don't have any planned.

  • @wstrater
    @wstrater 2 years ago

    You are creating database dumps with date and time in the file name. How are you doing incremental backups? How does your smart retention keep yearly, monthly and weekly versions of the files?

    • @m0rthaus
      @m0rthaus 2 years ago

      The database backup script has a variable 'DAYS' that keeps 2 days by default, and it includes a section that deletes any old backups automatically.
      The Duplicati backup is set to 'smart backup retention', which is an automatic function of Duplicati, described as follows: "Over time backups will be deleted automatically. There will remain one backup for each of the last 7 days, each of the last 4 weeks, each of the last 12 months. There will always be at least one remaining backup."
      This is all explained in the video...
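
      A minimal sketch of what such a cleanup section typically looks like, assuming the dumps land in a backup directory as *.sql.gz files (variable names are placeholders):

      DAYS=2                          # keep this many days of local dumps
      BACKUP_DIR=/home/user/backups   # assumption: where the dumps are written
      # Delete gzipped dumps older than $DAYS days.
      find "$BACKUP_DIR" -name '*.sql.gz' -type f -mtime +"$DAYS" -delete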

    • @hamhumtube
      @hamhumtube a year ago

      @@m0rthaus For db files the host did not mention anything about incremental backups; in fact the host did not recommend backing up db files with Duplicati.

  • @jonashess2842
    @jonashess2842 2 years ago

    Nice video. I hope the SQL export you showed us at 19:30 is not from your real Bitwarden database but just a dummy installation! Otherwise you might want to change some of your passwords.

    • @christianlempa
      @christianlempa  2 years ago

      Thanks mate! Yeah that's just a test instance ;)

    • @doradeutsch2340
      @doradeutsch2340 2 years ago

      As most of the fields in the Bitwarden file are encrypted, what do you think you can do with the information?

  • @raul230285
    @raul230285 3 years ago

    Hi Christian,
    I've been following you for a long time; I like your videos and what you do. May I suggest something: could you make a video talking about rsnapshot? Thank you very much for the time you dedicate. Greetings from Peru.

    • @christianlempa
      @christianlempa  3 years ago +1

      Hey thank you so much! :) Great suggestion, I've never checked it out before, so it will need to wait a little bit on my list. Maybe let's check this out in a livestream or so :)

    • @raul230285
      @raul230285 3 years ago

      @@christianlempa Ok, thanks for taking my suggestion. I will be watching your publications, as always. XD

  • @johnchristian7788
    @johnchristian7788 7 months ago

    Why does your website redirect to your GitHub page?

  • @triks911
    @triks911 a year ago

    Thank you for the video, but the issue with Duplicati is that it is file-based and not block-based. Proxmox Backup Server in my environment does a complete block-based backup.

  • @DarrylGibbs
    @DarrylGibbs 2 years ago

    I've come to this video as a person frustrated with Duplicati, as my upload speed will NOT go above 1.5 MB/s (12 Mbit), even though I have 500 Mbit available. It's running on a Raspberry Pi 4 and SSD combo, and I've faced this same limitation whether uploading to Backblaze B2 or OneDrive, with or without encryption, and the Pi shows hardly any CPU activity, all via Docker. Any ideas what I'm doing wrong? I never noticed your upload speeds.

    • @christianlempa
      @christianlempa  2 years ago +1

      I'm not sure, but it might be limited by your ISP upload bandwidth. 500 Mbit is likely only for download, not for upload.

    • @DarrylGibbs
      @DarrylGibbs 2 years ago

      @@christianlempa My connection is definitely 500 Mbit upload (Brazil gives us decent upload speeds ;) Googling the issue, it seems that I'm certainly not the only one with it, but no solution seems to be apparent either.

    • @christianlempa
      @christianlempa  2 years ago

      Wow, I'm living in a third-world internet country called Germany, so I can just dream of those speeds :D Well, there are 2 possible reasons for this: 1. the RPi is simply not powerful enough to ensure high upload speeds, or 2. you have some misconfiguration in your network, packet fragmentation, packet loss, etc. I would do a packet capture with Wireshark and see if there are errors, retransmissions or fragmentation happening.

  • @snooops01
    @snooops01 10 months ago

    The problem I have with Duplicati is that the server which I want to back up needs access to the backup storage; for security reasons it should be the other way around. The backup server should be able to reach the server, but the server should not be able to reach the backup storage.

  • @gonsasba
    @gonsasba 2 years ago +2

    I believe it is pronounced "Duplikati"

    • @telosxian
      @telosxian 2 years ago

      Yes. Duplicity is a competing backup program. And then there is Duplicacy as well.

  • @nilava
    @nilava 2 years ago

    Couldn't we in theory just back up the data folder that was used for the database setup in the docker container?

    • @christianlempa
      @christianlempa  2 years ago +1

      You could do that, but it doesn't work well for databases.

  • @pubdigitalix
    @pubdigitalix a year ago

    Can anyone tell me why he talks about Duplicity but shows Duplicati on the screen? They are two different products.

  • @williamschnl
    @williamschnl a year ago

    But doesn't this give the volume and the database some time discrepancy, as the database is not backed up at the same time the volume backup runs?

    • @williamschnl
      @williamschnl a year ago

      Such as a web server that manages files and stores the information about those files at the same time; I'm talking about Nextcloud. Both volume and database must be backed up at the exact same time, or you will end up with orphaned database entries / orphaned files.

  • @ckeilah
    @ckeilah 3 years ago

    Did I miss the part where he explains how to do incremental backups of a 100 TB volume to a volume-spanning set of 10 TB USB HDDs?

    • @christianlempa
      @christianlempa  3 years ago +1

      I think you missed the whole point of this video bro :D

  • @ppsasi8928
    @ppsasi8928 3 years ago +1

    Please make a comparison video

    • @christianlempa
      @christianlempa  3 years ago

      Do you have any specific tools in mind?

    • @xOzZy89x
      @xOzZy89x 3 years ago

      @@christianlempa What about restic? It's also one of the latest tools for backup

    • @ppsasi8928
      @ppsasi8928 3 years ago +1

      @@christianlempa A backup tools comparison; also compare the backup solutions that AWS and Azure provide.

    • @qiuyue4082
      @qiuyue4082 3 years ago

      Definitely Backblaze!

  • @rumplin
    @rumplin 5 months ago +1

    What is it now, "Duplicity" or "Duplicati"? These are not the same, but they do the same thing...

  • @AllahomAnsorGaza
    @AllahomAnsorGaza 3 years ago

    It's hard for me;
    I need to take automatic backups of the Docker database volumes only :D

  • @krystianroza
    @krystianroza a year ago

    Nice script for DB backups, but if a container doesn't have the name of the database in its Docker env, the script will not work :( I know that the best option is one DB container per app, but...
    Probably you can add the check [[ -z "$MYSQL_DATABASE" ]] && MYSQL_DATABASE="--all-databases" after getting the MYSQL_DATABASE variable from Docker, but for sure there is a better way to handle it :)
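
    A minimal sketch of that fallback in context - this is a hypothetical excerpt, not the author's actual script, and the variable handling is an assumption:

    # Read the database name and root password from the container's environment.
    MYSQL_DATABASE=$(docker exec "$CONTAINER" env | grep '^MYSQL_DATABASE=' | cut -d'=' -f2)
    ROOT_PW=$(docker exec "$CONTAINER" env | grep '^MYSQL_ROOT_PASSWORD=' | cut -d'=' -f2)

    # Fallback suggested above: if no database name is set, dump everything.
    [[ -z "$MYSQL_DATABASE" ]] && MYSQL_DATABASE="--all-databases"

    # Left unquoted on purpose so "--all-databases" is passed as a flag.
    docker exec -e MYSQL_PWD="$ROOT_PW" "$CONTAINER" \
      mysqldump -u root $MYSQL_DATABASE | gzip > "$BACKUP_DIR/${CONTAINER}-$(date +%Y%m%d-%H%M%S).sql.gz"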

    • @christianlempa
      @christianlempa  a year ago

      Thanks! This script definitely needs some improvement. Maybe I'll have time to update it in the next weeks.

  • @sevensolutions77
    @sevensolutions77 a year ago

    Has anyone tried Bareos? 🤔

  • @wesleybaird2752
    @wesleybaird2752 2 years ago

    Timeshift

  • @m0norsk
    @m0norsk 2 years ago +3

    "doop - lic - ott ee" not "doop - liss - it - eee"

  • @Reiner030
    @Reiner030 2 years ago

    So, third try - I hate the YouTube spam recognition, which seems to trigger on words that are part of this software...

    • @Reiner030
      @Reiner030 2 years ago

      2nd: The Duplicati shown here is using the Mono (.NET) framework, which is not a very server-friendly / resource-friendly setup ^^... even though it's been running in its beta phase for about 8 years.

    • @Reiner030
      @Reiner030 2 years ago

      So... I used Duplicati 1.x but switched to duply because of its "not supported" status.
      And there are several other tools which are also based on the same backend, Duplicity, which is used by both programs.

    • @Reiner030
      @Reiner030 2 years ago

      In my last project I used Borg Backup (also recommended by Hetzner) because we only needed SSH access to the backup server.
      Hmm, interesting. The single parts all stayed online so far...

    • @christianlempa
      @christianlempa  2 years ago

      Yeah YT spam recognition is soooo bad, I hate it :/ To be honest, Duplicati isn't my favorite software. I'm still looking for a better alternative, but haven't found one

  • @thaidude
    @thaidude a year ago

    Duplicity is very easy to use but very slow at restoring your backup. So beware.

  • @toaveiro2074
    @toaveiro2074 2 months ago

    Your video is about Duplicati, but you keep calling it "Duplicity", which is a different backup software. It's confusing, and a common mistake I've found in other similar videos by different creators.

  • @FunctionGermany
    @FunctionGermany a year ago +1

    Duplicati seems good, but restoring on a different computer ("disaster recovery") is insanely slow - like 20 hours to restore a quarter of a 160 GB backup.
    Any ideas on how to enable faster restores on a different machine?
