I am trying to download backup files from my websites. I have structured my playbook as follows:
site_vars.yml holds my variables:
website_backup_download:
  - name: ftp://username:[email protected]/backups/mysite1backup.tgz
    path: mysites/backups/www
  - name: ftp://username:[email protected]/backups/mysite2backup.tgz
    path: mysites/backups/www
  - name: ftp://username:[email protected]/backups/mysite3backup.tgz
    path: mysites/backups/www
The actual downloader playbook:
# Downloader
tasks:
  - name: Download backups from FTPs
    get_url:
      url: "{{ item.name }}"
      dest: "{{ item.path }}"
      mode: '0750'
    no_log: false
    ignore_errors: true
    with_items: "{{ website_backup_download }}"
This actually works very well, but the problem begins with large backup files: the task needs to keep running until the backup file has been downloaded completely.
I can't repeat the task to finish an incomplete file or files. :)
I have tried another solution, which also works well, but only for a single site; I can't use it for multiple downloads :(
- name: Download backups
  command: wget -c ftp://username:[email protected]/backups/mysite1backup.tgz
  args:
    chdir: "{{ down_path }}"
    warn: false
  register: task_result
  retries: 10
  delay: 1
  until: task_result.rc == 0
  ignore_errors: true
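The wget approach can be reused for all sites by driving the same retry loop with the list from site_vars.yml; when `until` is combined with a loop, the retries apply to each item individually. A sketch, assuming the list items keep the `name`/`path` keys defined above:

```yaml
- name: Download backups (resumable, retried per file)
  command: wget -c "{{ item.name }}"   # -c resumes a partial download
  args:
    chdir: "{{ item.path }}"
    warn: false
  register: task_result
  retries: 10
  delay: 1
  until: task_result.rc == 0
  ignore_errors: true
  with_items: "{{ website_backup_download }}"
```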
Thanks for your help.
The get_url module has a timeout parameter. You could perform a short test with an increased value.
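For example, a sketch of the task from the question with an increased timeout (the value 300 is arbitrary; timeout covers the socket connection, not the whole download):

```yaml
- name: Download backups from FTPs
  get_url:
    url: "{{ item.name }}"
    dest: "{{ item.path }}"
    mode: '0750'
    timeout: 300   # seconds; the module default is much lower
  with_items: "{{ website_backup_download }}"
```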