This repository has been archived by the owner on Oct 14, 2024. It is now read-only.

edit README.md and add #21
KJHJason committed Mar 13, 2023
1 parent fbd78d6 commit e3bc467
Showing 5 changed files with 141 additions and 67 deletions.
52 changes: 35 additions & 17 deletions README.md
@@ -38,17 +38,20 @@ This program has only been tested on Windows 10. Hence, if you encounter any iss

- What the flag does:
- Regardless of this flag, the program will overwrite any incompletely downloaded file by verifying its size against the Content-Length response header.
- GDrive downloads are not affected by this flag.
- If the size of the locally downloaded file does not match the header value, the program will re-download the file.
- Otherwise, if the Content-Length header is not present in the response, the program will only skip the file if it already exists.
- However, if you need to overwrite the skipped files, you can use this flag to do so, e.g. for Pixiv Fanbox, which does not return a Content-Length header in its responses.
- That said, as of version 1.1.1, the program will try to clean up incomplete downloads by deleting them when you forcefully terminate it by pressing Ctrl+C, so you should not need this flag anymore.
- Causes:
- This is caused by the antivirus program on your PC flagging the program as malware and stopping it from executing normally.
- This is due to how the program works: it checks the file size with a HEAD request before sending the GET request to download the file.
- If the Content-Length header is not in the response, it will re-download the file when the overwrite flag is set to true.
- Usually, this is not a problem for Fantia and Pixiv, but Pixiv Fanbox does not return the Content-Length header, so the program will re-download and overwrite any existing files. Hence, the antivirus program may flag it as ransomware and stop it from executing normally.
- Solutions:
- Avoid using the `--overwrite=true` flag and let the program do its clean-up when you forcefully terminate it by pressing Ctrl+C.
- Files that failed to be deleted will be logged so you can delete them manually.
- If you still need to use this flag, please exclude `cultured-downloader-cli.exe` (or your compiled binary) from your antivirus software, as it can be flagged as ransomware as explained above under Causes.
- `go run .` will also NOT work as it will still be blocked by the antivirus program. Hence, you will need to build the program first and then run the executable.
- By running `go build -o cultured-downloader-cli.exe .` in the src directory of the project, you can build the program into an executable file.
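The overwrite decision described above boils down to a size comparison against the HEAD response. Below is a minimal Go sketch of that logic, under stated assumptions: `shouldDownload` is a hypothetical helper, not the project's actual code, and the URL is a placeholder (`resp.ContentLength` is -1 when the server omits the header, as Pixiv Fanbox does).

```go
package main

import (
	"fmt"
	"net/http"
	"os"
)

// shouldDownload sketches the size check: compare the local file's size
// against the Content-Length reported by a HEAD request.
// contentLength < 0 means the server did not send the header.
func shouldDownload(localSize, contentLength int64, exists, overwrite bool) bool {
	if !exists {
		return true // nothing on disk yet
	}
	if contentLength < 0 {
		// no Content-Length to verify against: skip unless overwriting
		return overwrite
	}
	// re-download incomplete files regardless of the flag
	return localSize != contentLength
}

func main() {
	// placeholder URL and file name for illustration only
	resp, err := http.Head("https://example.com/")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		return
	}
	defer resp.Body.Close()

	var localSize int64
	info, err := os.Stat("example.html")
	exists := err == nil
	if exists {
		localSize = info.Size()
	}
	fmt.Println(shouldDownload(localSize, resp.ContentLength, exists, false))
}
```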

@@ -64,43 +67,55 @@ This program has only been tested on Windows 10. Hence, if you encounter any iss

The example below assumes you are using [Go](https://go.dev/dl/) to run the program.

Otherwise, instead of `go run .`, you can run the executable file by typing `./cultured_downloader.exe` for Windows.

There are also compiled binaries for Linux and macOS on the [releases](https://github.com/KJHJason/Cultured-Downloader-CLI/releases) page.

Note:
- For flags that require a value, you can use either the `--flag_name value` format or the `--flag_name=value` format.
- For flags that allow multiple values, you can use either the `--flag_name value1,value2` format or the `--flag_name=value1,value2` format.
- Quotes, as in `--flag_name="value1,value2"`, are not required but recommended.
- For `--text_file="<path>"`, each entry in the text file should be on its own line, as seen below:
```
https://fantia.jp/posts/123123
https://fantia.jp/fanclubs/1234
https://fantia.jp/fanclubs/1234; 1-12
```
- You can add the `; <pageNum>` after the URL as well!
- Only for:
- Fantia Fanclub URLs
- Pixiv Fanbox Creator URLs
- Pixiv Illustrator URLs
- Pixiv Tag URLs
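The optional `; <pageNum>` suffix can be split off with a simple string cut. The sketch below is illustrative only: `splitLine` is a hypothetical helper, not the CLI's actual parser.

```go
package main

import (
	"fmt"
	"strings"
)

// splitLine separates a text-file line into a URL and an optional
// page-number range given after a semicolon, e.g. "URL; 1-12".
func splitLine(line string) (url, pageNum string) {
	before, after, found := strings.Cut(line, ";")
	if !found {
		return strings.TrimSpace(before), ""
	}
	return strings.TrimSpace(before), strings.TrimSpace(after)
}

func main() {
	for _, line := range []string{
		"https://fantia.jp/posts/123123",
		"https://fantia.jp/fanclubs/1234; 1-12",
	} {
		url, pages := splitLine(line)
		fmt.Printf("url=%s pages=%q\n", url, pages)
	}
}
```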

Help:
```
go run . -h
```

Downloading from multiple Fantia Fanclub IDs:
```
go run . fantia --cookie_file="C:\Users\KJHJason\Desktop\fantia.jp_cookies.txt" --fanclub_id 123456,789123 --page_num 1,1-10 --dl_thumbnails=false
```

Downloading from a Pixiv Fanbox Post ID:
```
go run . pixiv_fanbox --session="<add yours here>" --post_id 123456,789123 --gdrive_api_key="<add your api key>"
```

Downloading from a Pixiv Artwork ID (that is a Ugoira):
```
go run . pixiv --session "<add yours here>" --artwork_id 12345678 --ugoira_output_format ".gif" --delete_ugoira_zip=false
```

Downloading from multiple Pixiv Artwork IDs:
```
go run . pixiv --refresh_token="<add yours here>" --artwork_id 12345678,87654321
```

Downloading from Pixiv using a tag name:
```
go run . pixiv --refresh_token="<add yours here>" --tag_name "tag1,tag2,tag3" --tag_page_num 1,4,2 --rating_mode safe --search_mode s_tag
```

## Base Flags
@@ -139,7 +154,7 @@ Usage:
cultured-downloader-cli fantia [flags]
Flags:
-c, --cookie_file string Pass in a file path to your saved Netscape/Mozilla generated cookie file to use when downloading.
You can generate a cookie file by using the "Get cookies.txt LOCALLY" extension for your browser.
Chrome Extension URL: https://chrome.google.com/webstore/detail/get-cookiestxt-locally/cclelndahbckbenkjhflpdbgdldlbecc
--dl_attachments Whether to download the attachments of a Fantia post. (default true)
@@ -157,7 +172,8 @@ Flags:
--post_id strings Fantia post ID(s) to download.
For multiple IDs, separate them with a comma.
Example: "12345,67891" (without the quotes)
-s, --session string Your _session_id cookie value to use for the requests to Fantia.
--text_file string Path to a text file containing Fanclub and/or post URL(s) to download from Fantia.
```

## Pixiv Fanbox Flags
@@ -169,7 +185,7 @@ Usage:
cultured-downloader-cli pixiv_fanbox [flags]
Flags:
-c, --cookie_file string Pass in a file path to your saved Netscape/Mozilla generated cookie file to use when downloading.
You can generate a cookie file by using the "Get cookies.txt LOCALLY" extension for your browser.
Chrome Extension URL: https://chrome.google.com/webstore/detail/get-cookiestxt-locally/cclelndahbckbenkjhflpdbgdldlbecc
--creator_id strings Pixiv Fanbox Creator ID(s) to download from.
@@ -190,7 +206,8 @@ Flags:
--post_id strings Pixiv Fanbox post ID(s) to download.
For multiple IDs, separate them with a comma.
Example: "12345,67891" (without the quotes)
-s, --session string Your FANBOXSESSID cookie value to use for the requests to Pixiv Fanbox.
--text_file string Path to a text file containing creator and/or post URL(s) to download from Pixiv Fanbox.
```


@@ -212,9 +229,9 @@ Flags:
- all: Include both illustrations, ugoira, and manga artworks
Notes:
- If you're using the "-pixiv_refresh_token" flag and are downloading by tag names, only "all" is supported. (default "all")
-c, --cookie_file string Pass in a file path to your saved Netscape/Mozilla generated cookie file to use when downloading.
You can generate a cookie file by using the "Get cookies.txt LOCALLY" extension for your browser.
Chrome Extension URL: https://chrome.google.com/webstore/detail/get-cookiestxt-locally/cclelndahbckbenkjhflpdbgdldlbecc
--delete_ugoira_zip Whether to delete the downloaded ugoira zip file after conversion. (default true)
--ffmpeg_path string Configure the path to the FFmpeg executable.
Download Link: https://ffmpeg.org/download.html (default "ffmpeg")
@@ -243,7 +260,7 @@ Flags:
- s_tag: Match any post with SIMILAR tag name
- s_tag_full: Match any post with the SAME tag name
- s_tc: Match any post related by its title or caption (default "s_tag_full")
-s, --session string Your PHPSESSID cookie value to use for the requests to Pixiv.
--sort_order string Download Order Options: date, popular, popular_male, popular_female
Additionally, you can add the "_d" suffix for a descending order.
Example: "popular_d"
@@ -257,6 +274,7 @@ Flags:
--tag_page_num strings Min and max page numbers to search for corresponding to the order of the supplied tag name(s).
Format: "num", "minNum-maxNum", or "" to download all pages
Leave blank to search all pages for each tag name.
--text_file string Path to a text file containing artwork, illustrator, and tag name URL(s) to download from Pixiv.
--ugoira_output_format string Output format for the ugoira conversion using FFmpeg.
Accepted Extensions: .gif, .apng, .webp, .webm, .mp4
(default ".gif")
1 change: 1 addition & 0 deletions src/cmds/fantia.go
@@ -77,6 +77,7 @@ var (
)
}

utils.PrintWarningMsg()
fantia.FantiaDownloadProcess(
&fantiaConfig,
&fantiaDl,
1 change: 1 addition & 0 deletions src/cmds/pixiv.go
@@ -111,6 +111,7 @@ var (
}
pixivDlOptions.ValidateArgs()

utils.PrintWarningMsg()
pixiv.PixivDownloadProcess(
&pixivConfig,
&pixivDl,
1 change: 1 addition & 0 deletions src/cmds/pixiv_fanbox.go
@@ -79,6 +79,7 @@ var (
}
pixivFanboxDlOptions.ValidateArgs()

utils.PrintWarningMsg()
pixivfanbox.PixivFanboxDownloadProcess(
&pixivFanboxConfig,
&pixivFanboxDl,
153 changes: 103 additions & 50 deletions src/utils/useful.go
@@ -22,12 +22,24 @@ import (
// Prints out a warning message to the user to not stop the program while it is downloading
func PrintWarningMsg() {
color.Yellow("CAUTION:")
color.Yellow("Please do NOT terminate the program while it is downloading unless you really have to!")
color.Yellow("Doing so MAY result in incomplete downloads and corrupted files.")
fmt.Println()
}

// For the exported cookies in JSON instead of Netscape format
type ExportedCookies []struct {
Domain string `json:"domain"`
Expire float64 `json:"expirationDate"`
HttpOnly bool `json:"httpOnly"`
Name string `json:"name"`
Path string `json:"path"`
Secure bool `json:"secure"`
Value string `json:"value"`
Session bool `json:"session"`
}

// parse the Netscape cookie file generated by extensions like Get cookies.txt LOCALLY
func ParseNetscapeCookieFile(filePath, sessionId, website string) ([]*http.Cookie, error) {
if filePath != "" && sessionId != "" {
return nil, fmt.Errorf(
@@ -60,66 +72,107 @@ func ParseNetscapeCookieFile(filePath, sessionId, website string) ([]*http.Cooki
}

    var cookies []*http.Cookie
    if ext := filepath.Ext(filePath); ext == ".txt" {
        scanner := bufio.NewScanner(f)
        for scanner.Scan() {
            line := strings.TrimSpace(scanner.Text())

            // skip empty lines and comments
            if line == "" || strings.HasPrefix(line, "#") {
                continue
            }

            // split the line
            cookieInfos := strings.Split(line, "\t")
            if len(cookieInfos) < 7 {
                // too few values will be ignored
                continue
            }

            cookieName := cookieInfos[5]
            if cookieName != sessionCookieName {
                // not the session cookie
                continue
            }

            // parse the values
            cookie := http.Cookie{
                Name:     cookieName,
                Value:    cookieInfos[6],
                Domain:   cookieInfos[0],
                Path:     cookieInfos[2],
                Secure:   cookieInfos[3] == "TRUE",
                HttpOnly: true,
                SameSite: sessionCookieSameSite,
            }

            expiresUnixStr := cookieInfos[4]
            if expiresUnixStr != "" {
                expiresUnixInt, err := strconv.Atoi(expiresUnixStr)
                if err != nil {
                    // should never happen but just in case
                    errMsg := fmt.Sprintf(
                        "error %d: parsing cookie expiration time, \"%s\", more info => %v",
                        UNEXPECTED_ERROR,
                        expiresUnixStr,
                        err,
                    )
                    color.Red(errMsg)
                    continue
                }
                if expiresUnixInt > 0 {
                    cookie.Expires = time.Unix(int64(expiresUnixInt), 0)
                }
            }
            cookies = append(cookies, &cookie)
        }

        if err := scanner.Err(); err != nil {
            return nil, fmt.Errorf(
                "error %d: reading cookie file at %s, more info => %v",
                OS_ERROR,
                filePath,
                err,
            )
        }
    } else if ext == ".json" {
        var exportedCookies ExportedCookies
        if err := json.NewDecoder(f).Decode(&exportedCookies); err != nil {
            return nil, fmt.Errorf(
                "error %d: failed to decode cookie JSON file at %s, more info => %v",
                JSON_ERROR,
                filePath,
                err,
            )
        }

        for _, cookie := range exportedCookies {
            if cookie.Name != sessionCookieName {
                // not the session cookie
                continue
            }

            parsedCookie := &http.Cookie{
                Name:     cookie.Name,
                Value:    cookie.Value,
                Domain:   cookie.Domain,
                Path:     cookie.Path,
                Secure:   cookie.Secure,
                HttpOnly: cookie.HttpOnly,
                SameSite: sessionCookieSameSite,
            }
            if !cookie.Session {
                parsedCookie.Expires = time.Unix(int64(cookie.Expire), 0)
            }

            cookies = append(cookies, parsedCookie)
        }
    } else {
        return nil, fmt.Errorf(
            "error %d: invalid cookie file extension, \"%s\", at %s...\nOnly .txt and .json files are supported",
            INPUT_ERROR,
            ext,
            filePath,
        )
    }

