
Select all image URLs and download them

Ivan @ 17 / 07 / 2021 @ Blog / Programming / Scripts

Reading time: ~3 min.

Recently, I was looking for a way to find all image URLs in a text and download them into a local folder. The regex for an image URL goes like this (found it on Stack Overflow):

(http)?s?:?(\/\/[^"']*\.(?:png|jpg|jpeg|gif|svg))
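
Just as a quick sketch of how to apply the pattern (the IMG_URL_RE name and the sample text are made up here for illustration), you can pull the full matches out of a string with re.finditer:

import re

# the regex from above
IMG_URL_RE = r"""(http)?s?:?(\/\/[^"']*\.(?:png|jpg|jpeg|gif|svg))"""

# hypothetical input text containing two image urls
text = '<img src="https://www.1234.ru/1.jpg"> and <img src="https://www.1234.ru/2.jpg">'

# finditer + group(0) gives the whole match; findall would return the capture groups instead
urls_to_load = [m.group(0) for m in re.finditer(IMG_URL_RE, text)]
print(urls_to_load)  # ['https://www.1234.ru/1.jpg', 'https://www.1234.ru/2.jpg']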

And the script to download the images goes like this (you can also find a working example on Google Colab; note the first cell there, which installs the wget package in Colab):

import os  # importing all we need, it's not much
import wget

path = 'download_folder'  # the path where we will download those files

# preparing to download
if not os.path.exists(path):
    try:
        os.mkdir(path)
    except OSError:
        print("Creation of the directory %s failed" % path)
    else:
        print("Successfully created the directory %s " % path)

urls_to_load = ['https://www.1234.ru/1.jpg', 'https://www.1234.ru/2.jpg']  # the image urls to download

# starting to download
print("Starting downloading")
for url in urls_to_load:
    print(url)
    file_name = path + '/' + os.path.basename(url)  # full path the file will be saved to
    if os.path.exists(file_name):
        os.remove(file_name)  # if it already exists, remove it so wget does not save a renamed duplicate
    file_name = wget.download(url, out=path)
    print(file_name)

print("ok")

 

