furaffinity-dl's People

Contributors

c-nielson, kattus, mewtwo0641, shnatsel, thdaemon, xerbo

furaffinity-dl's Issues

Can't use cookie.txt without error

I used "EditThisCookie" to export the cookies in Netscape format and I get this error after trying to download from a gallery which is restricted to only logged in users:

ERROR: You have provided a cookies file, but it does not contain valid cookies.

I thought I had used the wrong path, but that couldn't be it. Any help would be appreciated.
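
For what it's worth, one quick sanity check is to see whether the exported file parses as Netscape-format cookies at all, since that is what the script expects. A minimal Python sketch (the filename is just an example); note that MozillaCookieJar insists on a "# HTTP Cookie File" / "# Netscape HTTP Cookie File" header line, which some exporters omit:

    import http.cookiejar

    # MozillaCookieJar reads the Netscape/Mozilla cookies.txt format.
    jar = http.cookiejar.MozillaCookieJar("cookies.txt")
    try:
        jar.load(ignore_discard=True, ignore_expires=True)
        print(f"Parsed {len(jar)} cookies")
    except (http.cookiejar.LoadError, OSError) as e:
        print(f"Could not read this as a Netscape cookies file: {e}")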

Incomplete descriptions in metadata

Hi, thanks for this great script; I found it a few days ago and it's really useful :) This isn't really an issue per se, but something I thought might be nice to amend when using the separate metadata files.

I had a look at the page source and it looks like the descriptions were being taken from the <meta property="og:description" content="..."> tag; unfortunately that gives incomplete descriptions, limited to roughly 121 characters (not counting spaces).

I'd like to propose the following edit to copy the full description. There was also an &quot; issue in some of the titles, which is hopefully sorted as well. It might not be the cleanest way of doing it, but it seems to work okay. It copies the description verbatim, so if the artist has put a lot of line breaks in it, they'll appear in the metadata file. I'm interested to hear what you think :)

Starting from line 147 ('Get metadata'):

            description="$(cat "$tempfile" | tr '\n' ' ' | sed 's/\(<div class="submission-description">\)/\n\1/gI' | sed 's/\(<\/div>\)/\1\n/gI' | grep -o '<div class="submission-description".*</div>' | sed 's/<div class="submission-description">                     //g' | sed 's@<br />@\n@g' | sed 's/<a href="//g' | sed 's@" class=".*</a>@@g' | sed 's@                </div>@@g' | sed 's/&quot;/"/g')"
            if [ $classic = true ]; then
                    title="$(grep -Eo '<h2>.*</h2>' "$tempfile" | awk -F "<h2>" '{print $2}' | awk -F "</h2>" '{print $1}' | sed 's/&quot;/"/g')"
            else
                    title="$(grep -Eo '<h2><p>.*</p></h2>' "$tempfile" | awk -F "<p>" '{print $2}' | awk -F "</p>" '{print $1}' | sed 's/&quot;/"/g')"
            fi
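
As a side note on the &quot; handling: rather than substituting entities one at a time with sed, the Python rewrite could decode all HTML entities in one go with the standard library (a tiny sketch, separate from the bash fix above):

    import html

    # html.unescape decodes every HTML entity, not just &quot;.
    title = html.unescape("&quot;Example &amp; Title&quot;")
    print(title)  # prints: "Example & Title"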

Authentication failure

FA's new "ReCaptcha" authentication is preventing the program from logging in properly to download files.

BSD mktemp unsupported argument

The following piece of code does not work on the BSD version of mktemp:

mktemp --tmpdir=$runtime_dir

Fixed locally like this:

mktemp $runtime_dir/fdl.XXXXXX

Man page can be viewed here.

Problem with using cookies

I'm not sure what the problem is, but when I try to use my cookies.txt file, the script tries to create a bunch of blank files. They all overwrite each other when using -w, and none of them are actual files from the user's gallery or favs. It might be something to do with the formatting of the cookies.txt, but I'm using the exact extension that was recommended for Chrome. Any thoughts?

Fetch Error for Flash Files

Whenever a flash file is encountered, an error is thrown and a .https file is downloaded without any data. I think it's because the directory these files are linked from was changed. Every time this happens, the batch stops and has to be restarted.

ftp://https/
           => ‘.listing’
Resolving https (https)... failed: Name or service not known.
wget: unable to resolve host address ‘https’
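
The ftp://https/ line suggests wget ended up with only a scheme fragment rather than a real URL, presumably because the link extraction no longer matches the markup used for flash submissions. If whatever gets extracted is protocol-relative (starts with //) or otherwise partial, normalising it against the site root before calling wget would at least avoid this failure mode; a Python sketch of the general idea (the example path is made up):

    from urllib.parse import urljoin

    BASE = "https://www.furaffinity.net/"

    def absolutize(href: str) -> str:
        # urljoin turns "//d.facdn.net/art/..." into "https://d.facdn.net/art/..."
        # and leaves fully qualified URLs untouched.
        return urljoin(BASE, href)

    print(absolutize("//d.facdn.net/art/example/123/123.example_animation.swf"))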

Only pulling the first page of (48) images

I looked through the closed issues and saw that this would happen with the beta skin, but I don't currently use it, and I have already cleared the site data to try to solve it.

Skip files already downloaded?

Hey!

I'm not finding any functionality in the documentation for skipping files that have already been downloaded (on the Python branch). Could this be added as a feature?
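
In the meantime, a pre-download existence check is simple to wrap around whatever does the fetching; a minimal sketch (fetch_if_missing and the urlretrieve call are illustrative, not the script's actual functions):

    import os
    import urllib.request

    def fetch_if_missing(url: str, filename: str) -> None:
        # Skip anything that already exists on disk with a non-zero size.
        if os.path.exists(filename) and os.path.getsize(filename) > 0:
            print(f"Skipping {filename}: already downloaded")
            return
        urllib.request.urlretrieve(url, filename)
        print(f"Saved {filename}")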

no longer works

Not logged in, NSFW content is inaccessible
/data/data/com.termux/files/home/furaffinity-dl.py:209: DeprecationWarning: The 'text' argument to find()-type methods is deprecated. Use 'string' instead.
  next_button = s.find('button', class_='button standard', text="Next")
Unable to find next button
Finished downloading

It happens even if the user has SFW content.

command not found when executing

Okay, so I'm new to this whole running-scripts thing. I've installed wget and coreutils on Ubuntu and updated my binaries, but I can't seem to download anything; all it says is "command not found".

Script broken since update

While I appreciate someone putting in effort to try and make this tool better, this seems to have simply broken everything.

~/furaffinity-dl [master] » ./furaffinity-dl -p -c ~/facookies.txt gallery/yuurikin
INFO: eyed3 is not installed, no metadata will be injected into music files.
Loaded CA certificate '/etc/ssl/certs/ca-certificates.crt'
2019-07-23 17:40:22 URL:https://www.furaffinity.net/gallery/yuurikin [59304] -> "/home/katt/.cache/furaffinity-dl/fa-dl.gNmqXvwS7r" [1]
Loaded CA certificate '/etc/ssl/certs/ca-certificates.crt'
2019-07-23 17:40:23 URL:https://www.furaffinity.net/view/31925136/ [34063] -> "/home/katt/.cache/furaffinity-dl/fa-dl.gNmqXvwS7r" [1]
--2019-07-23 17:40:23--  https://d.facdn.net/art/yuurikin/1560634746/1560634746.yuurikin_gsg.png
Loaded CA certificate '/etc/ssl/certs/ca-certificates.crt'
Resolving d.facdn.net (d.facdn.net)... 104.25.211.99, 104.25.212.99
Connecting to d.facdn.net (d.facdn.net)|104.25.211.99|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 98477 (96K) [image/png]
Saving to: ‘./SD: PastelGenet.png’

./SD: PastelGenet.png                                                           100%[====================================================================================================================================================================================================>]  96.17K  --.-KB/s    in 0.005s  

2019-07-23 17:40:23 (17.5 MB/s) - ‘./SD: PastelGenet.png’ saved [98477/98477]

Error: Not a valid PNG (looks more like a JPEG) - ./SD: PastelGenet.png
    0 image files updated
    1 files weren't updated due to errors
~/furaffinity-dl [master●] »

This is all I get since updating. Why does it care about the PNG being valid? I even told it to get a "(P)lain file without any additional metadata", yet it renames the file and crashes?

Please revert these silly changes, or at least provide a way to get good ol' plain files, not renamed and not modified.

Way to download folders

When downloading a folder, you have to make sure to include everything after /folder/ in the URL, for example 406765/May-the-Best-Man-Win instead of just 406765; then the downloader will get the correct URL for the second page.

Someone should probably add an example of this to the README.md and to the help text that is shown when you run the program.

(Working) [screenshot]

(Broken) [screenshot]

copying a descriptor

Is it possible to save the description from class="submission-description user-submitted-links" as metadata?
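
Presumably yes, at least in the Python version, since BeautifulSoup is already in use there; a sketch of the idea (assuming the submission page HTML is available as a string and that the class quoted above actually appears in the markup):

    from bs4 import BeautifulSoup

    def extract_description(page_html: str) -> str:
        s = BeautifulSoup(page_html, "html.parser")
        # Matching on a single class is enough; BeautifulSoup matches elements
        # that carry it even when other classes are present.
        desc = s.find(class_="submission-description")
        if desc is None:
            return ""
        # get_text() drops the markup; the separator keeps line breaks readable.
        return desc.get_text(separator="\n", strip=True)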

Unable to use cookies

I am unsure why I can't use my cookies. If I don't use them, I can download a gallery without issue, but when I include them I seem to get this:

--2018-06-24 19:02:35-- ftp://http/
=> ‘.listing’
Resolving http (http)... failed: Name or service not known.
wget: unable to resolve host address ‘http’

just a thanks

Hey, thanks for continuing to keep this working, I really like having this around to keep the never-ending rotation of my FA favorites on my desktop current with whatever goofy smut I'm currently into. <3

I just moved from the bash version to the Python version and I figured I'd say thanks.

Wrong filename for setting metadata when -r is used

Pretty much what it says on the tin. It didn't crop up until many files into an update I was running, but eventually it got to a file where it tried to update the EXIF data; the EXIF section always uses $file, so it causes an error when the -r option is used.

I like the idea of names based on titles, by the way; it's a nice touch! I just don't want to redownload everything, because it's a lot. On that note, when I first ran it with the option, it made duplicates ending in numbers instead of skipping existing files that are up to date. I manually added the -N flag to the wget lines and that seemed to fix it, though.

Thank you for the work you've done on this script; it's been very useful!

Download stops on files with the incorrect extension

What the title says: I keep getting the error "Error: Not a valid PNG (looks more like a JPEG)" and the download stops. Even if I use the -p flag, the download stops in the same place, just without printing the error message.

Just a note that because of the dumb way that it handles files, FA has tons of images with the wrong extension.
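
Since extensions on FA are unreliable, sniffing the first few bytes before handing a file to exiftool (or before renaming it) would sidestep this; a minimal sketch covering only the common cases:

    from typing import Optional

    def sniff_image_type(path: str) -> Optional[str]:
        # Magic numbers for the formats FA commonly serves.
        signatures = {
            b"\x89PNG\r\n\x1a\n": "png",
            b"\xff\xd8\xff": "jpg",
            b"GIF87a": "gif",
            b"GIF89a": "gif",
        }
        with open(path, "rb") as f:
            head = f.read(8)
        for magic, ext in signatures.items():
            if head.startswith(magic):
                return ext
        return None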

doesn't do anything

Running it in a terminal window on Ubuntu, or in Bash on Windows, just returns me to the command line without doing anything.
If I don't enter any parameters it shows some help text, so I know I got the path right.

macOS mktemp unsupported argument

When I run it in macOS Monterey, I get the following:

mktemp: illegal option -- -
usage: mktemp [-d] [-q] [-t prefix] [-u] template ...
       mktemp [-d] [-q] [-u] -t prefix 

It appears there is no --suffix argument for mktemp in macOS.

The man page for macOS' version of mktemp is here: https://manpagez.com/man/1/mktemp/

Certificate not trusted

So I tried to use it to download a gallery. The cookie file doesn't return anything bad, but it stops when it says:
ERROR: The certificate of ‘www.furaffinity.net’ is not trusted.
ERROR: The certificate of ‘www.furaffinity.net’ hasn't got a known issuer.
Which stops the download. How can I whitelist the certificate? Thanks!

Beta skin not supported

I mostly noticed this with audio posts, but it seems to throw an "ftp" before the "http" in the URL, and then the entire process comes to a stop.

I also had trouble saving stuff from fav galleries; it just seems to go in a loop without downloading anything other than the first page.

Great script, by the way; it's been really helpful outside of these couple of issues!

It only works on some galleries but not on others; also it only downloads SFW

Hey, I use the tool on Windows.
I installed a Linux version from the Microsoft Store.
I typed this into the command line:

cd /mnt/c/Users/Ich/Desktop/fu
chmod +x furaffinity-dl
./furaffinity-dl -o user -c /cookies/cookies.txt -p gallery/user

But it only works on some users. :(
Also, it doesn't download adult content.

Getting full size image URL is broken

Should be using //d2.facdn.net/art/ now:

image_url="$prefix$(grep --only-matching --max-count=1 ' href="//d2.facdn.net/art/.+">Download' "$tempfile" | cut -d '"' -f 2)"
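
If this area gets reworked for the Python version, looking for the "Download" link itself rather than hard-coding the CDN host would keep working the next time the host changes; a sketch (it assumes the link text is exactly "Download" and that the href is protocol-relative):

    from typing import Optional
    from bs4 import BeautifulSoup

    def full_size_url(page_html: str) -> Optional[str]:
        s = BeautifulSoup(page_html, "html.parser")
        link = s.find("a", string="Download")
        if link is None or not link.get("href"):
            return None
        href = link["href"]
        # FA serves protocol-relative links ("//d.facdn.net/art/..."), so pin them to https.
        return "https:" + href if href.startswith("//") else href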

Persistently errors out on one particular image.

If I have https://www.furaffinity.net/view/45842186/ (NSFW, cw: hyper titties, clothed) in my favorites, the Python downloader will crash with the following error when it comes to this image:

Traceback (most recent call last):
  File "/Users/egypt/Documents/scripts/fa/peggy faves/../furaffinity-dl.py", line 202, in <module>
    download(img.find('a').attrs.get('href'))
  File "/Users/egypt/Documents/scripts/fa/peggy faves/../furaffinity-dl.py", line 135, in download
    for tag in s.find(class_='tags-row').findAll(class_='tags'):
AttributeError: 'NoneType' object has no attribute 'findAll'

If I unfav this one particular image, the downloader will happily go past the point it occupied in my favorites; favorite it again and the downloader once again crashes when it comes to it. The downloader also has no problems grabbing other images by this artist.
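
A hedged guess at the cause: that particular submission page has no tags block at all, so s.find(class_='tags-row') returns None and the .findAll() call blows up. Guarding the lookup would let the crawl continue past it; a sketch of the guard, not a patch against the actual file:

    from bs4 import BeautifulSoup

    def submission_tags(s: BeautifulSoup) -> list:
        # If the page has no tags block, return an empty list instead of crashing.
        tags_row = s.find(class_="tags-row")
        if tags_row is None:
            return []
        return [tag.get_text(strip=True) for tag in tags_row.find_all(class_="tags")]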

No address associated with hostname.

Ubuntu 19.10

I used this tool once about a month ago. I came back to use it today and it's not working. Am I doing something wrong, or is the tool broken? Here's the output:

~/furaffinity-dl-master$ ./furaffinity-dl -c ./cookie.txt -o /path/to/download/to gallery/artist
INFO: eyed3 is not installed, no metadata will be injected into music files.
INFO: exiftool is not installed, no metadata will be injected into pictures.
2020-03-09 16:57:01 URL:https://www.furaffinity.net/gallery/artist [123456] -> "/home/user/.cache/furaffinity-dl/fa-dl.etcetcetc" [1]
2020-03-09 16:57:02 URL:https://www.furaffinity.net/view/12345678/ [123456] -> "/home/user/.cache/furaffinity-dl/fa-dl.etcetcetc" [1]
File already exists, skipping. Use -w to skip this check
2020-03-09 16:57:03 URL:https://www.furaffinity.net/view/12345678/ [12345] -> "/home/user/.cache/furaffinity-dl/fa-dl.etcetcetc" [1]
--2020-03-09 16:57:03--  ftp://https/
           => ‘.listing’
Resolving https (https)... failed: No address associated with hostname.
wget: unable to resolve host address ‘https’

Did something change on FA's server, so that the script is now pointing at the wrong address?

Can't fetch anything.

I've been using the bash script for quite some time already, and yesterday when I tried to download a gallery it began throwing this "https:: Invalid argument" error at me. I tried executing the script with -i and had no luck either; it just returns "http:: Invalid argument" instead.

I'm running the script on Windows 7 32-bit through MinGW (the one that comes with Git by default). I have all the dependencies listed in the readme except for the eyed3 and exif ones.

Please put feedback for the Python version in this thread

It's finally happening: when finished, all current code will be moved to a new branch and the Python version will occupy the master branch.

All existing functionality will be migrated; this will also allow native Windows support and easier development.

Text files not downloading properly

The script appears to have an issue with downloading text files, such as .pdf, .txt, and .docx. While images download with their exact filename (i.e. <file_id>.filename.png), text files appear to download as just the page id with no extension.
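
One possible angle for a fix: derive the on-disk name from the last path component of the download URL rather than from the page id; a sketch (the sample URL is invented):

    import os
    from urllib.parse import urlparse, unquote

    def filename_from_url(url: str) -> str:
        # ".../1234567890.author_chapter1.pdf" -> "1234567890.author_chapter1.pdf"
        return unquote(os.path.basename(urlparse(url).path))

    print(filename_from_url("https://d.facdn.net/art/author/stories/1234567890.author_chapter1.pdf"))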

Accounts using "beta" layout only download first page

If the -c option is used with a valid cookie but the associated account uses the "beta" layout (as opposed to the "classic" one), attempting to download content will stop after the first page.

This is because next_page_url is generated by grepping for the class "button-link right"; in the "beta" layout, the Next Buttons are class "button mobile-button right" instead. Because the grep fails to find a valid URL, the download stops after the first page.

Going into account settings and switching back to the "classic" layout fixes the problem, but it's rather inconvenient.
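
For reference, handling both layouts just means accepting either class when looking for the next-page link; a Python sketch of that idea, rather than the script's own grep (the class strings are taken from the report above and treated as assumptions about the two layouts):

    from typing import Optional
    from bs4 import BeautifulSoup

    NEXT_LINK_CLASSES = ("button-link right", "button mobile-button right")

    def next_page_url(page_html: str) -> Optional[str]:
        s = BeautifulSoup(page_html, "html.parser")
        for css_class in NEXT_LINK_CLASSES:
            link = s.find("a", class_=css_class)
            if link is not None and link.get("href"):
                return link["href"]
        return None  # no next link under either layout, i.e. the last page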

Cookies file doesn't work

When specifying a cookie file, this occurs:
--2017-03-30 20:32:14--  ftp://http/button%20download-logged-in
           => ‘.listing’
Resolving http (http)... failed: Name or service not known.
wget: unable to resolve host address ‘http’

Script is unable to find "Next" button and throws AttributeError when done with gallery

Most recently receiving the following error:

Traceback (most recent call last):
  File "furaffinity-dl.py", line 200, in <module>
    next_button = s.find('button', class_='button standard', text="Next").parent
AttributeError: 'NoneType' object has no attribute 'parent'

All gallery pages were downloaded; I assume there's supposed to be a check to see whether this is the last page of the gallery or not?
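
Presumably, yes: the last page of a gallery has no "Next" button, so find() returns None and .parent raises. Checking for None before taking .parent (and switching to string=, since text= is deprecated in newer BeautifulSoup) would end the loop cleanly; a sketch of the check, not the script's exact code:

    from bs4 import BeautifulSoup

    def find_next_button_form(s: BeautifulSoup):
        # 'string=' replaces the deprecated 'text=' argument.
        next_button = s.find("button", class_="button standard", string="Next")
        if next_button is None:
            return None  # last page reached; stop paginating
        return next_button.parent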
