
All Questions

Tagged with [wget]
0 votes · 2 answers · 84 views

download all pdf files from website doesn't support wildcard

I want to download all the PDF files from "https://journals.ametsoc.org/view/journals/mwre/131/5/mwre.131.issue-5.xml". I tried many things with wget, such as: wget --wait 10 --random-wait ...
Zeinab · 21
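
Not from the thread itself: a minimal sketch of the usual workaround, since wget has no wildcard support over HTTP. It crawls the issue page one level deep and keeps only the PDFs it links to; overriding robots.txt here is an assumption, not something the question confirms.

    wget --wait 10 --random-wait -r -l 1 -e robots=off -A '*.pdf' \
        "https://journals.ametsoc.org/view/journals/mwre/131/5/mwre.131.issue-5.xml"

-r -l 1 follows links one level deep; -A '*.pdf' keeps only files ending in .pdf (index pages are fetched for link discovery and then deleted).
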
0 votes · 1 answer · 324 views

How to (re)download a file with wget only when the file is newer or the size changed?

I am downloading an archive with wget. How can I make wget redownload that file only when the file is newer on the server or the size has changed? I'm aware of the -N flag, but it doesn't work.
user5994461 · 6,413
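
A hedged sketch of what -N is meant to do, assuming the server sends a Last-Modified header and the local file keeps its original name; the archive URL below is a placeholder.

    # -N re-downloads only if the remote copy is newer or the sizes differ.
    wget -N "https://example.com/archive.tar.gz"

If the server never sends Last-Modified, -N cannot work; comparing sizes against a HEAD request (wget --spider --server-response) is one fallback.
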
-1 votes · 1 answer · 51 views

I want to download the wiki pages at 'Oblivion:Oblivion' on the UESP wiki without getting the sidebar links to other wiki content

I tried to use wget to download the 'Oblivion:Oblivion' UESP wiki page, https://en.uesp.net/wiki/Oblivion:Oblivion, which is about the CRPG game 'Oblivion'. The game is old and I worry that someday the ...
Robert Goodwin
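
A minimal sketch, not from the thread: crawl one level deep and reject the wiki's own navigation pages with a regular expression. The pattern below (Special:, UESPWiki:, Main_Page) is an assumption about which sidebar targets to skip.

    wget -r -l 1 -np -p -k -E \
        --reject-regex '(Special:|UESPWiki:|Main_Page)' \
        "https://en.uesp.net/wiki/Oblivion:Oblivion"

-p pulls in images and stylesheets, -k rewrites links for offline browsing, -E adds .html extensions.
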
1 vote · 1 answer · 478 views

Dockerfile: how to download a file using wget with a URL from a variable

I have a Dockerfile with something like: RUN url=$(curl -sL https://... | jq) # simply put, url=https://example.com/tar.tgz. RUN wget $url # will fail here. The problem IMO is that RUN means /bin/sh -...
Pawel Cioch · 3,094
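
Each RUN line starts a fresh shell, so a variable set in one RUN no longer exists in the next. A minimal sketch of the usual fix, setting and using the variable in the same RUN; the URL and jq filter are placeholders, not taken from the question.

    # Fetch the download URL and use it within the same layer.
    RUN url=$(curl -sL https://example.com/release.json | jq -r '.url') && \
        wget "$url"
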
1 vote · 1 answer · 169 views

How can I use wget to download a list of links from NOAA NCEI?

I need to download public bathymetry data from NOAA NCEI. Often, this means I need to download hundreds of small files to later be stitched together. NOAA NCEI has a tool for this -- "request ...
Evan Lahr
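
Once the NCEI request tool has produced a plain list of URLs, wget can read it directly; a minimal sketch, with urls.txt and the output directory as placeholder names.

    # -i reads one URL per line; -P puts everything in one directory.
    wget -i urls.txt -P bathymetry/ --wait 1 --random-wait
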
1 vote · 1 answer · 544 views

Downloading multiple files in a directory using wget

I would like to download multiple images in a directory using wget. Here's an image: https://pkmnbinder.com/images/PAL/001.png. I can download it using wget "https://pkmnbinder.com/images/PAL/...
Huichelaar
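
When the file names are simply sequential, a small shell loop can generate the URLs; a hedged sketch that assumes the set runs from 001 upward (the real upper bound is not in the excerpt, 250 is a placeholder).

    # seq -w pads the numbers to equal width (001, 002, ...).
    for i in $(seq -w 1 250); do
        wget "https://pkmnbinder.com/images/PAL/${i}.png"
    done
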
1 vote · 2 answers · 225 views

stream download very large file over bad connection

I want to process a very large file (a few terabytes) as a stream. This file is accessible over HTTP from a URL such as http://example.com/some-file. This command can do that: wget -q -O - ...
zxsimba · 11
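
A pipe cannot be resumed, so -c does not help once the data goes straight to stdout. One hedged workaround, assuming the server honors Range requests and not necessarily what the answers suggest: resume onto disk in a retry loop and stream the growing file to the consumer. The URL, file name, and ./process are placeholders.

    url="http://example.com/some-file"
    touch some-file
    # -c continues from the last byte after every dropped connection.
    until wget -c -q --tries=1 "$url"; do sleep 10; done &
    # Feed the growing file to the consumer; note that tail never exits on its own.
    tail -c +1 -f some-file | ./process
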
1 vote · 1 answer · 598 views

Connection Error with wget in shell script

This is what my script looks like: #!/bin/bash WGETREF='https://ftp.1000genomes.ebi.ac.uk/vol1/ftp/technical/reference/GRCh38_reference_genome/GRCh38_full_analysis_set_plus_decoy_hla.fa' wget $...
Rachel · 21
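
The excerpt cuts off before the actual error message, so this is only a hedged sketch: quote the variable and let wget retry and resume, which covers the most common flaky-connection failures for a large file.

    #!/bin/bash
    WGETREF='https://ftp.1000genomes.ebi.ac.uk/vol1/ftp/technical/reference/GRCh38_reference_genome/GRCh38_full_analysis_set_plus_decoy_hla.fa'
    # Quote the URL; -c resumes and --tries/--waitretry ride out brief outages.
    wget -c --tries=5 --waitretry=30 "$WGETREF"
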
0 votes · 2 answers · 439 views

Is there a way to rename files downloaded with wget when using -i to download from a list of links?

I have a large text file of links that I want to mass-download using wget, but the file names are getting extra junk added to them from the URL and I want to prevent that. The URLs are broadly ...
o-dawgie
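
wget -i cannot rename per URL, so one common approach, assumed here rather than taken from the answers, is a two-column list read in a loop; links.txt and its "url name" format are placeholders.

    # links.txt: one "<url> <desired-filename>" pair per line.
    while read -r url name; do
        wget -O "$name" "$url"
    done < links.txt
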
1 vote · 0 answers · 104 views

wget blurry image and dropdown

Hi, I'm trying to download a single page and pages one level deep, so I tried this command: wget --user-agent="Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0....
ardabb · 11
0 votes · 1 answer · 475 views

Resume an aborted recursive download with wget without checking the dates for already downloaded files

The following command was aborted: wget -w 10 -m -H "<URL>" I would like to resume this download without checking the dates on the server for every file that I've already downloaded. ...
Terje Oseberg
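
-m implies -N, and -N is exactly what triggers the per-file date checks. A hedged alternative, not confirmed by the thread: spell out the recursive options and use -nc instead, so files that already exist locally are skipped without asking the server about them.

    # -m is roughly -r -N -l inf; dropping -N and adding -nc skips existing files.
    wget -w 10 -r -l inf -H -nc "<URL>"
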
1 vote · 1 answer · 370 views

wget: using wildcards in the middle of the path

I am trying to recursively download .nc files from: https://satdat.ngdc.noaa.gov/sem/goes/data/full/*/*/*/netcdf/*.nc A target link looks like this one: https://satdat.ngdc.noaa.gov/sem/goes/data/full/...
Roland · 449
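
wget cannot expand wildcards in an HTTP path; the glob has to become a recursive crawl plus a filter. A minimal sketch, assuming everything of interest lives under data/full/ and ends in .nc:

    # Index pages are fetched only to discover links and then deleted;
    # -A '*.nc' keeps just the NetCDF files.
    wget -r -np -nH -e robots=off -A '*.nc' \
        "https://satdat.ngdc.noaa.gov/sem/goes/data/full/"
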
0 votes · 1 answer · 586 views

Using wget to download specific subfolders only downloads the robots.txt and the index.html files

I am trying to download all the .nc files from here: https://satdat.ngdc.noaa.gov/sem/goes/data/full/ The link includes a tree structure of varying subfolders, e.g.: https://satdat.ngdc.noaa.gov/sem/...
Roland · 449
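
Getting only robots.txt and index.html usually means the site's robots.txt forbids recursion and wget obeys it by default; -e robots=off is the usual override (whether that is acceptable for this server is an assumption).

    # The crawl itself is standard; the robots.txt check is what is being disabled.
    wget -r -np -e robots=off -A '*.nc' "https://satdat.ngdc.noaa.gov/sem/goes/data/full/"
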
0 votes · 0 answers · 645 views

Download .tar file from Google Drive into Google Colab in Python on Windows

I am trying to download only one .tar file (Task03_Liver.tar) from this Google Drive link: https://drive.google.com/drive/folders/1HqEgzS8BV2c7xYNrZdEAnrHk7osJJ--2 into Google Colab with Python on ...
George · 29
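
A hedged sketch for Colab, where gdown is the usual tool for Google Drive downloads. FILE_ID is a placeholder and must be the ID of Task03_Liver.tar itself (taken from the file's own share link), not the ID of the containing folder.

    pip install gdown
    # In a Colab cell, prefix each line with '!'.
    gdown "https://drive.google.com/uc?id=FILE_ID" -O Task03_Liver.tar
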
-4 votes · 1 answer · 192 views

How to download part of a website with wget?

I want to download part of this site: coinmarketcap.com. It has a table of coins with pagination, about 95 pages. I want to download a single page for each coin, like coinmarketcap.com/currencies/bitcoin/ ...
Mihail · 25
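
One hedged approach, assuming a list of coin URLs has already been collected (for example by scraping the 95 listing pages first): feed that list to wget instead of crawling the whole site. coins.txt is a placeholder name.

    # coins.txt: one URL per line, e.g. https://coinmarketcap.com/currencies/bitcoin/
    wget -i coins.txt -E -p -k --wait 2 --random-wait
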
