A specialized image download utility, useful for grabbing massive amounts of random images.
Creep can be used to generate gobs of random image data quickly given a single URL. It has no dependencies or requirements and is cross-platform.
Install a prebuilt binary from the releases page, or install from source using go get:
go get github.com/splode/creep/cmd/creep
Simply pass creep a URL that returns an image, and it will download it. Pass in a count, and creep will download that many images concurrently.
Usage:
creep [FLAGS] [OPTIONS] [URL]
URL:
The URL of the resource to access (required)
Options:
-c, --count int      The number of times to access the resource (defaults to 1)
-n, --name string    The base filename to use as output (defaults to "creep")
-o, --out string     The output directory path (defaults to current directory)
-t, --throttle int   Number of seconds to wait between downloads (defaults to 0)
Flags:
-h, --help Prints help information
-v, --version Prints version information
URL
Specifies the HTTP URL of the image resource to access. This is the only required argument.
--count
The number of times to access and download a resource. Defaults to 1.
--name
The base filename of the downloaded resource. For example, given a count of 3, a name of cat, and a URL that returns a jpg, creep will generate the following files:
cat-1.jpg
cat-2.jpg
cat-3.jpg
Defaults to "creep".
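The naming scheme above can be sketched in Go. Note this is an illustration, not creep's actual source; buildName is a hypothetical helper:

```go
package main

import "fmt"

// buildName is a hypothetical helper showing how creep's output names are
// composed: base name, 1-based index, and the extension inferred from the
// downloaded resource.
func buildName(base string, i int, ext string) string {
	return fmt.Sprintf("%s-%d%s", base, i, ext)
}

func main() {
	// creep --count=3 --name=cat <url> would produce:
	for i := 1; i <= 3; i++ {
		fmt.Println(buildName("cat", i, ".jpg"))
	}
}
```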
--out
The directory in which to save output. If no directory is given, the current directory is used. If the given directory does not exist, it will be created.
--throttle
Throttle downloads by the given number of seconds. Some URLs serve an image based on the current time, so performing requests in very quick succession can yield duplicate images. If you're receiving duplicates, it may help to throttle the download rate. Throttling is disabled by default.
Download 32 random images to the current directory.
creep -c 32 https://thispersondoesnotexist.com/image
Download 64 random images with the base filename random to the downloads folder, waiting 3 seconds between downloads.
creep --name=random --out=downloads --count=64 --throttle=3 https://source.unsplash.com/random
Download a single random image to the current directory.
creep https://source.unsplash.com/random
The following URLs will serve a random image upon request:
- Unsplash https://source.unsplash.com/random
- This Person Does Not Exist https://thispersondoesnotexist.com/image
- Picsum https://picsum.photos/400
- Lorem Pixel http://lorempixel.com/400
- This Cat Does Not Exist https://thiscatdoesnotexist.com/
- PlaceGOAT http://placegoat.com/200
- PlaceIMG https://placeimg.com/640/480/any
- LoremFlickr https://loremflickr.com/320/240
- This Artwork Does Not Exist https://thisartworkdoesnotexist.com
- This Horse Does Not Exist https://thishorsedoesnotexist.com/
I frequently need to seed application data sets with lots of images for testing or demos. After a few minutes of searching for a tool, I couldn't find one that suited my requirements, so I built one.
Why Go, and not simply a curl or Python script? Go's concurrency model makes issuing multiple HTTP requests fast, and compiling to a single, cross-platform binary is handy. Besides, Go's cool.
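The concurrency pattern can be sketched with goroutines and a WaitGroup. This is a minimal illustration, not creep's actual source; fetchAll is a hypothetical function and the fetch step is a stand-in for the real HTTP GET so the sketch runs without network access:

```go
package main

import (
	"fmt"
	"sync"
)

// fetchAll launches one goroutine per request, mirroring how a downloader
// like creep can issue all of its requests concurrently.
func fetchAll(url string, count int) []string {
	results := make(chan string, count)
	var wg sync.WaitGroup
	for i := 1; i <= count; i++ {
		wg.Add(1)
		go func(i int) {
			defer wg.Done()
			// In the real tool, this would GET the URL and write the
			// response body to a numbered file on disk.
			results <- fmt.Sprintf("%s -> image %d", url, i)
		}(i)
	}
	wg.Wait()
	close(results)

	var out []string
	for r := range results {
		out = append(out, r)
	}
	return out
}

func main() {
	downloaded := fetchAll("https://example.com/image", 4)
	fmt.Println(len(downloaded), "downloads completed")
}
```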
Contributions are welcome! See CONTRIBUTING for details.