mcrapet / plowshare-modules-legacy
Plowshare legacy & unmaintained modules
License: GNU General Public License v3.0
29 Sep 2015
IMPORTANT & URGENT NEWS!
Dear User,
We are upgrading security features on our website and will therefore deploy "https" on all applicable pages. This upgrade will take place on the 30th of September.
Kindly note that whatever program you use for uploading or downloading may be affected. We recommend downloading via Internet Download Manager, which we have tested.
We hope that this update gives you more confidence in choosing Uploadable.
Regards,
Uploadable Team
Solidfiles changed its interface to Angular, so the links have changed:
http://www.solidfiles.com/d/80b943e040/[ARG]Love_Hina_23_[660EDF3E].ogm
This patch fixes anonymous download for me, but could use testing by somebody with an account/folders.
keep2share seems to have changed, and now detects plowshare as a bot.
plowdown http://www83.zippyshare.com/v/bgqOo6u0/file.html
Starting download (zippyshare): http://www83.zippyshare.com/v/bgqOo6u0/file.html
parse_attr failed (sed): "/id="omg"/ class="
Failed inside zippyshare_download(), line 149, zippyshare.sh
/tmp/plowdown.17237.26971.js:15: TypeError: elts.fimage is undefined
Output URL is not valid: http://www83.zippyshare.com
downloading files from zippyshare fails.
The current module is broken and I've not had any luck deciphering the multiple levels of JavaScript obfuscation. They do have a simple API for downloading but it always seems to be disabled.
Hi there,
I noticed a problem (again) with the zippyshare module:
me@debian:~$ plowup zippyshare testpic.jpg
zippyshare: take --auth option from configuration file
Starting upload (zippyshare): testpic.jpg
Destination file: testpic.jpg
Starting login process: JohnD/***************
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 1135k 0 4038 100 1131k 1944 545k 0:00:02 0:00:02 --:--:-- 545k
http://www38.zippyshare.com/v/gcfiRUKa/file.html
me@debian:~$ plowdown -v4 http://www38.zippyshare.com/v/gcfiRUKa/file.html
dbg: Content Type: 'image'
parse_attr failed (sed): "/id="omg"/ class="
Failed inside zippyshare_download(), line 149, zippyshare.sh
js: uncaught JavaScript runtime exception: TypeError: Cannot read property "href" of undefined
Output URL is not valid: http://www38.zippyshare.com
hi,
please can you make an upload module for openload.co? It has an API, see below.
regards
When I try to upload to depositfiles with plowup, it always asks for captcha solving. When I solve the captcha from imgur, it doesn't recognize it for some reason. However, I am able to log in with my browser, and with cURL too:
curl -s 'https://dfiles.eu/api/user/login' -c ~/dfiles-cookie.txt -H 'Origin: http://dfiles.eu' -H 'Accept-Encoding: gzip, deflate' -H 'Accept-Language: en-US,en;q=0.8' -H 'User-Agent: Mozilla/5.0 (X11; Linux i686) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/42.0.2311.152 Safari/537.36' -H 'Content-Type: application/x-www-form-urlencoded; charset=UTF-8' -H 'Accept: application/json, text/javascript, */*; q=0.01' -H 'Referer: http://dfiles.eu/login.php?return=%2F' -H 'X-Requested-With: XMLHttpRequest' -H 'Connection: keep-alive' --data 'login=mylogin&password=mypassword&recaptcha_challenge_field=&recaptcha_response_field=' --compressed -L
I tried to modify the module depositfiles.sh and hardcode my cookie file. When I made my cookie file with cURL using the above command line, it worked; however, when I extracted the cookie data from the browser with the extract_cookie Ruby script found on GitHub, it didn't work.
It would be nice if I could attach my own cookie file with an option for uploading as well, not only for downloading.
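For what it's worth, curl's cookie engine only reads the tab-separated Netscape cookie-jar format (plus raw Set-Cookie headers), so a browser export that deviates from it is silently ignored; that would explain why the cURL-made jar works and the extracted one doesn't. A minimal sketch of a well-formed jar (the cookie name and value below are made up):

```shell
# curl -b/-c expects the 7-field, tab-separated Netscape cookie format.
# A file with spaces instead of tabs, or with missing fields, is silently
# ignored. The 'autologin=DEADBEEF' cookie is a made-up example value.
COOKIE_JAR=dfiles-cookie.txt
{
    printf '# Netscape HTTP Cookie File\n'
    # domain      incl.subdoms  path  secure  expiry      name       value
    printf '.dfiles.eu\tTRUE\t/\tFALSE\t2147483647\tautologin\tDEADBEEF\n'
} > "$COOKIE_JAR"

# Sanity check: every non-comment line must have exactly 7 tab-separated fields
awk -F'\t' '!/^#/ && NF != 7 { bad = 1 } END { exit bad }' "$COOKIE_JAR" &&
    echo 'cookie jar looks well-formed'
```

If the browser-extracted file fails this check (tabs turned into spaces are the usual culprit), that is likely why the module ignores it.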
Hi,
I am working on a module, but I need to do a chunked upload and unfortunately I did not manage to get it to work.
Could you show me how I can do a chunked upload using plowshare?
Thanks in advance
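As far as I can tell there is no generic chunk-upload helper in plowshare's core, so modules slice the file themselves and POST each piece with curl. A standalone sketch under that assumption; the endpoint URL and the chunk/offset/total field names are hypothetical and must be adapted to the target host's protocol:

```shell
# Slice $1 into chunks named $2.000, $2.001, ... of $3 bytes (default 1 MiB)
split_chunks() {
    local -r FILE=$1 PREFIX=$2 CHUNK_SIZE=${3:-$((1024 * 1024))}
    split -b "$CHUNK_SIZE" -d -a 3 "$FILE" "$PREFIX."
}

# POST each chunk with its byte offset. 'chunk', 'offset' and 'total' are
# made-up field names; every host defines its own chunk protocol.
upload_chunked() {
    local -r FILE=$1 URL=$2
    local -r SIZE=$(stat -c%s "$FILE")
    local PART OFFSET=0
    split_chunks "$FILE" "/tmp/part.$$"
    for PART in "/tmp/part.$$."*; do
        curl -F "chunk=@$PART" -F "offset=$OFFSET" -F "total=$SIZE" "$URL" || return
        OFFSET=$((OFFSET + $(stat -c%s "$PART")))
    done
    rm -f "/tmp/part.$$."*
}
```

In a real module you would use plowshare's curl wrapper instead of bare curl and parse the per-chunk JSON reply, but the slicing loop stays the same.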
Since all modules are in the unmaintained section, there is a pressing question: is plowshare abandoned, and are these modules no longer maintained?
When using plowdown or plowup, other sessions connected to 1fichier.com are automatically logged out.
This is very annoying because I always have to reconnect.
Is it possible to disable the following options when plowshare performs the connection?
Long session
Restrict the session to my IP address
Purge old sessions
view Advanced Options on https://1fichier.com/login.pl?lg=en
thank you
Hey,
I have an error on launch:
parse_attr failed (sed): "//d// href="
Failed inside uptobox_download(), line 166, uptobox.sh
Cordially,
Norman FELTZ
Is it possible to fix the uptobox login? Not working anymore.. I tried several accounts still the same
Hello,
I have a problem with downloads from openload.co; I tried different files.
jDownloader works fine for openload.
plowdown https://openload.co/f/...
Starting download (openload): https://openload.co/f/...
Waiting 5 seconds... done
/tmp/plowdown.532.11645.js:15:0 SyntaxError: syntax error:
/tmp/plowdown.532.11645.js:15:0 var _0x3a92=["\xC9\x28\xA1\x28\x70\x2C\x61\x2C\x63\x2C\x6B\x
/tmp/plowdown.532.11645.js:15:0 ^
Output URL expected
plowdown --version
v2.1.4-2-g0290dc1 (2016-05-15)
Thanks for any advice.
Hello guys, I am trying to add nitroflare support to plowshare.
As I am new to Bash, I have no idea what I am doing wrong with the following code.
The error I am getting is:
Starting download (nitroflare): http://nitroflare.com/view/8F10990E72D5461/0521699762_States.rar
Failed inside nitroflare_download() [1]
And here is my download function's code, most of which has been copied from the uploaded.net module:
```bash
nitroflare_download() {
    local -r COOKIE_FILE=$1
    local -r BASE_URL='http://nitroflare.com'
    local URL ACCOUNT PAGE JSON WAIT ERR FILE_ID FILE_NAME FILE_URL

    # Resolve redirections to the canonical url of the file
    # Note: There can be multiple redirections before the final one
    # Fix: this assignment was commented out, leaving $URL empty below
    URL=$(curl -I -L "$2" | grep_http_header_location_quiet | last_line) || return
    [ -n "$URL" ] || URL=$2

    # Recognize folders
    if match "$BASE_URL/folder/" "$URL"; then
        log_error 'This is a directory list'
        return $ERR_FATAL
    fi

    # Page not found
    # The requested file isn't available anymore!
    if match "$BASE_URL/\(404\|410\)" "$URL"; then
        return $ERR_LINK_DEAD
    fi

    #nitroflare_switch_lang "$COOKIE_FILE" "$BASE_URL" || return

    # Note: File owner never needs password and only owner may access private
    # files, so login comes first.
    if [ -n "$AUTH" ]; then
        ACCOUNT=$(nitroflare_login "$AUTH" "$COOKIE_FILE" "$BASE_URL") || return
    fi

    # Note: Save HTTP headers to catch premium users' "direct downloads"
    PAGE=$(curl -i -b "$COOKIE_FILE" "$URL") || return

    # Check for files that need a password
    if match '<h2>Authentification</h2>' "$PAGE"; then
        log_debug 'File is password protected'
        if [ -z "$LINK_PASSWORD" ]; then
            LINK_PASSWORD=$(prompt_for_password) || return
        fi
        # Note: Again, consider "direct downloads"
        PAGE=$(curl -i -b "$COOKIE_FILE" -F "pw=$LINK_PASSWORD" "$URL") || return
        if match '<h2>Authentification</h2>' "$PAGE"; then
            return $ERR_LINK_PASSWORD_REQUIRED
        fi
    fi

    FILE_ID=$(nitroflare_extract_file_id "$URL" "$BASE_URL") || return
    FILE_NAME=$(curl "$BASE_URL/file/$FILE_ID/status" | first_line) || return

    if [ "$ACCOUNT" = 'premium' ]; then
        # Premium users can resume downloads
        MODULE_NITROFLARE_DOWNLOAD_RESUME=yes
        # Seems that download rate is lowered..
        MODULE_NITROFLARE_DOWNLOAD_SUCCESSIVE_INTERVAL=30

        # Get download link, if this was a direct download
        FILE_URL=$(grep_http_header_location_quiet <<< "$PAGE")

        if match 'your Hybrid-Traffic is completely exhausted' "$PAGE"; then
            WAIT=$(parse 'Hybrid-Traffic.*exhausted' \
                'will be released in \([[:digit:]]\+\) minutes' <<< "$PAGE")
            echo $(( ${WAIT:-60} * 60 ))
            return $ERR_LINK_TEMP_UNAVAILABLE
        fi

        if [ -z "$FILE_URL" ]; then
            FILE_URL=$(parse_attr 'stor[[:digit:]]\+\.' 'action' <<< "$PAGE") || return
        fi

        echo "$FILE_URL"
        echo "$FILE_NAME"
        return 0
    fi

    if match '^[[:space:]]*var free_enabled = false;' "$PAGE"; then
        log_error 'No free download slots available'
        echo 300 # wait some arbitrary time
        return $ERR_LINK_TEMP_UNAVAILABLE
    fi

    # Request download (use dummy "-d" to force a POST request)
    JSON=$(curl -b "$COOKIE_FILE" --referer "$URL" \
        -H 'X-Requested-With: XMLHttpRequest' -d '' \
        "$BASE_URL/io/ticket/slot/$FILE_ID") || return

    if [ "$JSON" != '{succ:true}' ]; then
        ERR=$(parse_json_quiet 'err' <<< "$JSON")

        # from 'http://nitroflare.com/js/download.js' - 'function(limit)'
        if [ "$ERR" = 'limit-dl' ]; then
            log_error 'Free download limit reached'
            echo 600 # wait some arbitrary time
            return $ERR_LINK_TEMP_UNAVAILABLE
        elif [ "$ERR" = 'limit-parallel' ]; then
            log_error 'No parallel download allowed.'
            echo 600 # wait some arbitrary time
            return $ERR_LINK_TEMP_UNAVAILABLE
        elif [ "$ERR" = 'limit-size' ]; then
            return $ERR_SIZE_LIMIT_EXCEEDED
        elif [ "$ERR" = 'limit-slot' ]; then
            log_error 'No free download slots available'
            echo 300 # wait some arbitrary time
            return $ERR_LINK_TEMP_UNAVAILABLE
        fi

        log_error "Unexpected remote error: $ERR"
        return $ERR_FATAL
    fi

    # <span>Current waiting period: <span>30</span> seconds</span>
    WAIT=$(parse '<span>Current waiting period' \
        'period: <span>\([[:digit:]]\+\)</span>' <<< "$PAGE") || return
    wait $((WAIT + 1)) || return

    # from 'http://nitroflare.com/js/download.js' - 'Recaptcha.create'
    local PUBKEY WCI CHALLENGE WORD ID
    PUBKEY='6Lcqz78SAAAAAPgsTYF3UlGf2QFQCNuPMenuyHF3'
    WCI=$(recaptcha_process $PUBKEY) || return
    { read WORD; read CHALLENGE; read ID; } <<< "$WCI"

    JSON=$(curl -b "$COOKIE_FILE" --referer "$URL" \
        -H 'X-Requested-With: XMLHttpRequest' \
        -d "recaptcha_challenge_field=$CHALLENGE" \
        -d "recaptcha_response_field=$WORD" \
        "$BASE_URL/io/ticket/captcha/$FILE_ID") || return

    ERR=$(parse_json_quiet 'err' <<< "$JSON")
    if [ -n "$ERR" ]; then
        if [ "$ERR" = 'captcha' ]; then
            log_error 'Wrong captcha'
            captcha_nack "$ID"
            return $ERR_CAPTCHA
        fi
        captcha_ack "$ID"

        if [ "$ERR" = 'limit-dl' ]; then
            log_error 'Free download limit reached'
            echo 600 # wait some arbitrary time
            return $ERR_LINK_TEMP_UNAVAILABLE
        # You have reached the max. number of possible free downloads for this hour
        elif match 'possible free downloads for this hour' "$ERR"; then
            log_error 'Hourly limit reached.'
            echo 3600
            return $ERR_LINK_TEMP_UNAVAILABLE
        # This file exceeds the max. filesize which can be downloaded by free users.
        elif match 'exceeds the max. filesize' "$ERR"; then
            return $ERR_SIZE_LIMIT_EXCEEDED
        # We're sorry but all of our available download slots are busy currently
        elif match 'all of our available download slots are busy' "$ERR"; then
            log_error 'No free download slots available'
            echo 300 # wait some arbitrary time
            return $ERR_LINK_TEMP_UNAVAILABLE
        fi

        log_error "Unexpected remote error: $ERR"
        return $ERR_FATAL
    fi
    captcha_ack "$ID"

    # {type:'download',url:'http://storXXXX.nitroflare.com/dl/...'}
    # Note: This is no valid JSON due to the unquoted/single quoted strings
    FILE_URL=$(nitroflare_parse_json_alt 'url' <<< "$JSON") || return

    echo "$FILE_URL"
    echo "$FILE_NAME"
}
```
And my login function:
```bash
nitroflare_login() {
    local -r AUTH=$1
    local -r COOKIE_FILE=$2
    local -r BASE_URL=$3
    local LOGIN_DATA PAGE ERR TYPE ID NAME TOKEN_PRE TOKEN TRAFFIC_LIMIT

    # Get the CSRF token from the login page
    TOKEN_PRE=$(curl -b "$COOKIE_FILE" "$BASE_URL/login") || return
    TOKEN=$(parse_form_input_by_name 'token' <<< "$TOKEN_PRE") || return
    log_debug "Login token: $TOKEN"

    # Note: keep $USER and $PASSWORD literal (single quotes); post_login
    # substitutes them from $AUTH itself
    LOGIN_DATA='email=$USER&password=$PASSWORD&login=&token='$TOKEN
    PAGE=$(post_login "$AUTH" "$COOKIE_FILE" "$LOGIN_DATA" \
        "$BASE_URL/login") || return

    # Note: Cookies "login" + "auth" get set on successful login
    ERR=$(parse_json_quiet 'err' <<< "$PAGE")
    if [ -n "$ERR" ]; then
        log_error "Remote error: $ERR"
        return $ERR_LOGIN_FAILED
    fi

    # Note: Login changes site's language according to account's preference
    #nitroflare_switch_lang "$COOKIE_FILE" "$BASE_URL" || return

    # Determine account type
    PAGE=$(curl -b "$COOKIE_FILE" "$BASE_URL/me") || return
    ID=$(parse 'ID:' '<em.*>\(.*\)</em>' 1 <<< "$PAGE") || return
    TYPE=$(parse 'Status:' '<em>\(.*\)</em>' 1 <<< "$PAGE") || return
    NAME=$(parse_quiet 'Alias:' '<b><b>\(.*\)</b></b>' 1 <<< "$PAGE")
    TRAFFIC_LIMIT=$(parse '500,00 GB' '<em>\(.*\)</em>' 1 <<< "$PAGE")

    if [ "$TYPE" = 'Free' ]; then
        TYPE='free'
    elif [ "$TYPE" = 'Premium' ]; then
        TYPE='premium'
    else
        log_error 'Could not determine account type. Site updated?'
        return $ERR_FATAL
    fi

    log_debug "Successfully logged in as $TYPE member '$ID' (${NAME:-n/a})"
    echo "$TYPE"
    echo "$TRAFFIC_LIMIT"
}
```
Any help is really appreciated.
So, I installed plowshare on Debian via the git method, added the module name to the config file, and placed my module file inside the modules.d folder.
But now I am getting error '8', with no info about what exactly is causing the module to crash plowdown.
The exact output is:
: command not found
called with arguments:
Even if I just replicate one of the working modules, replacing all instances of the old hoster's name, constants and functions with the new hoster's, I am getting the same error.
Do I need to have a GitHub repo to add new modules?
I am at a loss here @mcrapet, any help is really appreciated.
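A ': command not found' at module load time is very often CRLF (Windows) line endings, which make bash read 'local<CR>' and friends as unknown commands. A self-contained check sketch; the demo paths are assumptions:

```shell
# List files that contain carriage returns (CRLF endings). bash cannot
# source such a module cleanly.
find_crlf() {
    grep -l $'\r' "$@" 2>/dev/null || true
}

# Demo: a module saved with Windows line endings is flagged
printf 'MODULE_DEMO_REGEXP_URL="x"\r\n' > /tmp/demo_module.sh
find_crlf /tmp/demo_module.sh          # prints /tmp/demo_module.sh

# Fix in place (GNU sed), then re-run plowdown:
sed -i 's/\r$//' /tmp/demo_module.sh
find_crlf /tmp/demo_module.sh          # prints nothing

# Also remember: plowshare only loads a module whose name is listed in the
# 'config' file inside modules.d, and the file must define the expected
# MODULE_<NAME>_* variables (REGEXP_URL, *_OPTIONS, ...) for its functions.
```

No GitHub repo is needed; a correctly named, LF-ended file in modules.d plus a config entry is enough.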
Hello,
I need to extract something from this response that I made with curl:
FORM UPLOAD: {"status":"success","code":200,"form_action":"http:\/\/81.211.2.116:8000\/upload","file_field":"file","form_data":{"ajax":true,"params":"{\"project\":\"trix\",\"expire\":1455768667,\"extras\":{\"user_id\":11,\"parent_id\":\"\"}}","signature":"04fa32d662324238dfd5fbf3f60087b36d693caf7b1dc8aaf46749a65f63c37fddcd6fb7a1360417d0586566ad635054f002a6f638665d3f77d2dee86ee57"}}
I need to extract form_data
I've tried with:
formdata=$(parse_json 'form_data' <<< "$FORMUPLOAD") || return
log_debug "FORM DATA RESPONSE: $formdata"
where $FORMUPLOAD is the response of:
FORMUPLOAD=$(curl -d '{"auth_token":"'$URL'"}' "$BASE_URL"upload ) || return
and I get this error:
parse_json failed (json): "form_data"
Can anyone help me?
Thanks
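plowshare's parse_json is a sed-based helper for scalar values, so a key whose value is a nested object, like form_data here, won't match, which is exactly the error shown. Two workarounds, sketched with plain sed on a trimmed-down stand-in for the response (sample values shortened and made up):

```shell
# Trimmed-down stand-in for the real upload-form response above
FORM_UPLOAD='{"status":"success","code":200,"form_action":"http:\/\/example\/upload","form_data":{"ajax":true,"signature":"abc123"}}'

# Workaround 1: cut the nested object out with sed. Crude: assumes
# form_data is the last key of the outer object.
FORM_DATA=$(sed -n 's/.*"form_data":\({.*}\)}$/\1/p' <<< "$FORM_UPLOAD")
echo "$FORM_DATA"    # -> {"ajax":true,"signature":"abc123"}

# Workaround 2 (usually simpler): skip the wrapper and extract only the
# scalar fields you need, e.g. parse_json 'signature' in a real module.
SIGNATURE=$(sed -n 's/.*"signature":"\([^"]*\)".*/\1/p' <<< "$FORM_UPLOAD")
echo "$SIGNATURE"    # -> abc123
```

If the form_data blob must be re-posted verbatim, workaround 1 preserves it as one string; otherwise workaround 2 avoids the nesting problem entirely.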
hi,
since yesterday I am unable to upload to uploaded.net or download from uploaded.net:
uploaded_net: take --auth option from configuration file
Starting upload (uploaded_net): 123.part1.rar
Destination file: 123.part1.rar
parse failed (sed): "/uploadServer =/ s/^.[[:space:]]'([^']).*$/\1/p" (skip 0)
Failed inside uploaded_net_upload(), line 479, uploaded_net.sh
Failed inside uploaded_net_upload() [1]
plowdown: force captcha method (online)
uploaded_net: take --auth option from configuration file
Starting download (uploaded_net): http://ul.to/123456
curl: failed with exit code 56
mediafire was working for me just fine today; I was entering captchas when prompted and kept going, until something happened that gives me "Unexpected content/captcha type. Site updated?"
I opened the URL in my browser to see the captcha and noticed that it was different from the ones I was getting. Could that be the reason?
Here's a -v4 run: http://sprunge.us/SfRR
I tried dozens of mediafire URLs and they gave the same results; when I try with a different computer/IP it will download until I hit the captchas again.
error in premium download:
my workaround: https://github.com/matchius/plowshare-modules-legacy/blob/master/catshare.sh
First I did a git pull on my plowshare installation and a new make install.
After that I did a plowmod -u to update legacy.git.
When I try to upload a file, I get this error:
$ plowup multiup_org file.ext
Starting upload (multiup_org): file.ext
Destination file: file.ext
parse_json failed (json): "hosts"
Failed inside multiup_org_upload(), line 90, multiup_org.sh
Failed inside multiup_org_upload() [1]
Thanks for your help.
Bogus links are also marked as live. When I open the link in a browser and click on "regular download", I get the usual 30 sec counter with the gray background of deleted links.
When I append "?redirect" to the end of the links, plowprobe always gives the proper results.
It looks like the 4shared module is not working,
plowdown http://www.4shared.com/office/JrEKXGh7ba/wordlist.htm --auth-free $NAME
asks for a password, but later it returns
Output URL is not valid: + window.location,
Thanks for Plowshare!
Hello
fileshark.pl is not working; I have a premium account:
fileshark: unused command line switches: -a SHARKY408:fnp6wfrw
Starting download (fileshark): http://fileshark.pl/pobierz/12944028/hgrme/echo-effect-2015-pl-bdrip-xvid-kit-avi
parse failed (sed): "/btn-upload-free/ s/^.normal/([[:digit:]]+/[[:alnum:]]+).$/\1/p" (skip 0)
Failed inside fileshark_download(), line 64, fileshark.sh
Failed inside fileshark_download() [1]
I can give a premium account for testing.
Could someone make an upload plugin for videowood, please?
It seems a good site; I can donate for someone's time if needed.
Thanks
plowdown http://www88.zippyshare.com/d/H31JUJJ6/10010/Artifuckt-Wild_Girl_EP-WNPR0089-WEB-2015-PITY.rar
Starting download (zippyshare): http://www88.zippyshare.com/d/H31JUJJ6/10010/Artifuckt-Wild_Girl_EP-WNPR0089-WEB-2015-PITY.rar
parse_attr failed (sed): "/id="omg"/ class="
Failed inside zippyshare_download(), line 149, zippyshare.sh
/tmp/plowdown.14068.15579.js:15
print(elts['fimage'].href);
^
TypeError: Cannot read property 'href' of undefined
at Object. (/tmp/plowdown.14068.15579.js:15:29)
at Module._compile (module.js:456:26)
at Object.Module._extensions..js (module.js:474:10)
at Module.load (module.js:356:32)
at Function.Module._load (module.js:312:12)
at Function.Module.runMain (module.js:497:10)
at startup (node.js:119:16)
at node.js:902:3
Output URL is not valid: http://www88.zippyshare.com
I tried with http://www88.zippyshare.com/v/H31JUJJ6/file.html format as well
rep: === SYSTEM INFO BEGIN ===
rep: [mach] raspberrypi arm linux-gnueabihf arm-unknown-linux-gnueabihf
rep: [bash] 4.3.30(1)-release
rep: [curl] curl 7.38.0 (arm-unknown-linux-gnueabihf) libcurl/7.38.0 OpenSSL/1.0.1k zlib/1.2.8 libidn/1.29 libssh2/1.4.3 librtmp/2.3
rep: [sed ] License GPLv3+: GNU GPL version 3 or later http://gnu.org/licenses/gpl.html.
rep: [lib ] '/usr/share/plowshare'
rep: === SYSTEM INFO END ===
rep: plowdown version v2.1.2-5-g9c672cb (2015-11-21)
Starting download (1fichier): https://1fichier.com/?kvs8aox113
rep: --insecure --compressed --speed-time 600 --connect-timeout 240 --user-agent Mozilla/5.0 (X11; Linux x86_64; rv:6.0) Gecko/20100101 Firefox/6.0 --silent --head -b /tmp/plowdown.18825.1343 https://1fichier.com/?kvs8aox113
rep: Received 247 bytes. DRETVAL=0
rep: === CURL BEGIN ===
rep:HTTP/1.1 200 OK
rep:Server: nginx
rep:Date: Tue, 22 Dec 2015 22:36:55 GMT
rep:Content-Type: text/html; charset=utf-8
rep:Connection: keep-alive
rep:Vary: Accept-Encoding
rep:Cache-Control: no-cache
rep:Expires: Fri, 30 Oct 1998 14:19:41 GMT
rep:Content-Encoding: gzip
rep:
rep: === CURL END ===
rep: --insecure --compressed --speed-time 600 --connect-timeout 240 --user-agent Mozilla/5.0 (X11; Linux x86_64; rv:6.0) Gecko/20100101 Firefox/6.0 --silent --form-string links[]=https://1fichier.com/?kvs8aox113 https://1fichier.com/check_links.pl
rep: Received 135 bytes. DRETVAL=0
rep: === CURL BEGIN ===
rep:https://1fichier.com/?kvs8aox113;Ant-Man.2015.REAL.RERIP.MULTI.1080p.BluRay.x264.AC3.DTS-ShowFr.zone-telechargement.com.mkv;9785685538
rep: === CURL END ===
rep: --insecure --compressed --speed-time 600 --connect-timeout 240 --user-agent Mozilla/5.0 (X11; Linux x86_64; rv:6.0) Gecko/20100101 Firefox/6.0 --silent -b LG=en https://1fichier.com/?kvs8aox113
rep: Received 13211 bytes. DRETVAL=0
rep: === CURL BEGIN ===
rep:
rep:
rep:
rep: <title>1fichier.com: Cloud Storage</title>
rep:
rep:
rep:
rep:
rep:
rep:
rep:
rep:
rep:
rep: <script type="text/javascript" src="https://img.1fichier.com/js/jquery.js"></script>
rep:
rep: <script type="text/javascript">
rep: </script>
rep:
rep:
rep:
You are looking for a reliable and secure hosting ? |
Unlimited Secure Storage |
No advertizing |
Unlimited speed |
Go to the Premium Offer ! |
24 h | 1 month | 3 months | 6 months | 1 year | LifeTime |
1 € | 3 € | 9 € | 15 € | 30 € | 99 € |
File Name : | Ant-Man.2015.REAL.RERIP.MULTI.1080p.BluRay.x264.AC3.DTS-ShowFr.zone-telechargement.com.mkv |
Date : | 19/11/2015 |
Size : | 9.11 GB |
Hello again,
A friend found another problem with 1fichier upload.
If we use a '+' we get a match/parse problem, but we can deal with it if we replace '+' with '%2B'.
The second problem is that 1fichier uses the hex representation to create the directory, for example '+' is '%2B'. We could replace each symbol manually, but I don't know if there is a better approach.
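One cleaner approach than replacing symbols one by one is to percent-encode the whole directory name in a single pass. A pure-bash sketch (ASCII input only; multibyte characters would need extra care):

```shell
# Percent-encode every byte outside the URL-safe set, so '+' -> %2B,
# ' ' -> %20, etc. No external tools needed.
urlencode() {
    local s=$1 out='' c i
    for (( i = 0; i < ${#s}; i++ )); do
        c=${s:i:1}
        case $c in
            [a-zA-Z0-9.~_-]) out+=$c ;;
            *) printf -v c '%%%02X' "'$c"; out+=$c ;;
        esac
    done
    printf '%s\n' "$out"
}

urlencode 'my+dir name'    # -> my%2Bdir%20name
```

curl's --data-urlencode does the same job when the value goes straight into a request body.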
Cheers.
This is the main error:
"curl: couldn't resolve host
Failed inside go4up_upload() [3]"
And the full log is this:
rep: === SYSTEM INFO BEGIN ===
rep: [mach] janjua x86_64 linux-gnu x86_64-pc-linux-gnu
rep: [bash] 4.3.11(1)-release
rep: [curl] curl 7.35.0 (x86_64-pc-linux-gnu) libcurl/7.35.0 OpenSSL/1.0.1f zlib/1.2.8 libidn/1.28 librtmp/2.3
rep: [sed ] License GPLv3+: GNU GPL version 3 or later http://gnu.org/licenses/gpl.html.
rep: [lib ] '/usr/share/plowshare'
rep: === SYSTEM INFO END ===
rep: plowup version v2.1.1-1-gff312a2 (2015-05-09)
Starting upload (go4up): Hawaizaada.(2015).Hindi.720p.DVDRip.x264.-.[KIKS].rar
Destination file: Hawaizaada.(2015).Hindi.720p.DVDRip.x264.-.[KIKS].rar
rep: --insecure --compressed --speed-time 600 --connect-timeout 240 --user-agent Mozilla/5.0 (X11; Linux x86_64; rv:6.0) Gecko/20100101 Firefox/6.0 --silent -b /tmp/plowup.17639.25771 -c /tmp/plowup.17639.25771 http://go4up.com
rep: Received 25118 bytes. DRETVAL=0
rep: === CURL BEGIN ===
rep:
rep:
rep:
rep:
rep:
rep:
rep:
rep:
rep: <title>Go4up - Share your files (Multiupload Service)</title>
rep:
rep:
rep:
rep:
rep:
rep:
rep:
rep:
rep:
rep:
rep:
rep:
rep:
rep:
rep:
rep: <script type="text/javascript" src="/min/?f=assets/theme/scripts/jquery-1.8.2.min.js,assets/theme/scripts/modernizr.custom.76094.js,assets/theme/scripts/less-1.3.3.min.js"></script>
rep:
rep:
rep:
rep:
rep:
rep:
rep:
rep:
rep: <script>
rep: (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
rep: (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
rep: m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
rep: })(window,document,'script','//www.google-analytics.com/analytics.js','ga');
rep:
rep: ga('create', 'UA-45320435-1', 'go4up.com');
rep: ga('send', 'pageview');
rep:
rep: </script>
rep:
rep:
rep:
rep:
rep:
Upload Files
Hello, I saw the note in 1fichier.sh saying that you will look at it later.
I can maybe help, but first I will ask: how did you find this option?
That way I can keep the same code idea.
Cheers
$ plowdown --no-plowsharerc --no-curlrc --no-color -v 3 http://www27.zippyshare.com/v/x2UrFgdq/file.html
Starting download (zippyshare): http://www27.zippyshare.com/v/x2UrFgdq/file.html
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 27765 0 27765 0 0 23894 0 --:--:-- 0:00:01 --:--:-- 23914
dbg: Content Type: ''
parse_attr failed (sed): "/id="omg"/ class="
Failed inside zippyshare_download(), line 149, zippyshare.sh
/tmp/plowdown.27941.323.js:18: TypeError: somffunction is not a function
Output URL is not valid: http://www27.zippyshare.com
Hey,
today I tried to download something from uploaded.net but it gives me an error:
parse failed (sed): "/Status:/ s/^.(.).*$/\1/p" (skip 1)
Failed inside uploaded_net_login(), line 73, uploaded_net.sh
Failed inside uploaded_net_download() [1]
The lines in the request look like that:
rep: <td style="width:11%">Status:</td>
rep: <th style="width:36%">
rep: <a href="register"><em>Premium</em></a> </th>
rep: </tr>
line 73 of the uploaded module:
TYPE=$(parse 'Status:' '<em>\(.*\)</em>' 1 <<< "$PAGE") || return
I am trying to find out what's wrong but maybe you're faster than me 🐙
hi,
please can you write uploadrocket.net Upload Module.
regards
Hi there,
I am testing getting some files with plowdown from FileJoker.net. I have a premium account there and am using the command as follows:
plowdown -a '[email protected]:password' https://filejoker.net/29ir...
The result is that it tries to download as a free user:
filejoker: unused command line switches: -a [email protected]:password
Starting download (filejoker): https://filejoker.net/29irc...
Waiting 121 seconds... 1m57s left^C
Any ideas what's wrong here?
The website is quite simple and I suppose it shouldn't be too much hassle. Thank you!
Hi,
I love your script so much. But with dl.free.fr I get this error message:
parse failed (sed): "/Fichier:/ s/^.*">\([^<]*\).*$/\1/p" (skip 1)
Failed inside dl_free_fr_download(), line 171, dl_free_fr.sh
Failed inside dl_free_fr_download() [1]
Do you have an idea for a fix?
Hello
rapidu.net is not working; I have a premium account
and it downloads as a free user.
I can give a premium account for testing.
Is it possible to get streamable.ch added to the list? I tried adding one myself, but it doesn't seem to be loaded when I try to install or update, unless there's a CLI command that allows reloading all present modules. They seem to be the same people who run uploadable.ch, and the site is exactly the same, bar cosmetic changes and the ability to stream files. So I assume that simply taking the uploadable_ch script and replacing all instances of:
uploadable.ch => streamable.ch
uploadable => streamable
uploadable_ch => streamable_ch
should suffice to make it work.
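That blanket rename can be scripted; a sketch, assuming the streamable.ch pages really do mirror uploadable.ch. Order matters so the generic rule runs last, and the uppercase MODULE_* constants need their own rule:

```shell
# Derive a streamable.ch module from the uploadable.ch one by renaming
# every identifier. Most specific patterns first so the generic
# 'uploadable' rule doesn't clobber them.
rename_module() {
    sed -e 's/uploadable\.ch/streamable.ch/g' \
        -e 's/UPLOADABLE_CH/STREAMABLE_CH/g' \
        -e 's/uploadable_ch/streamable_ch/g' \
        -e 's/uploadable/streamable/g' "$@"
}

# Typical use (paths are an assumption; adjust to your modules.d layout):
#   rename_module uploadable_ch.sh > streamable_ch.sh
#   echo 'streamable_ch' >> config   # plowshare only loads listed modules
rename_module <<< 'uploadable.ch module: uploadable_ch'   # -> streamable.ch module: streamable_ch
```

The new module is only picked up after its name is added to the config file next to it, which may be why the hand-made copy never loaded.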
I uploaded a test file to fboom.me; the URL is http://fboom.me/file/0d5afdea13eb7/Test_file_50MB.zip. I verified it is downloadable in my browser with no captcha when I have an fboom login cookie. When I try to download via plowdown using authentication, I appear to log in OK but am still asked to solve the captcha. When I do so, there is a 30 second wait. This is not the expected behaviour. In the attached pastebin log I canceled a few seconds after the captcha was solved/the 30 second timer started. I ran it as per:
$ plowdown -a [email protected]:foobar -v4 -r0 --no-plowsharerc --no-color http://fboom.me/file/0d5afdea13eb7/Test_file_50MB.zip
The log is here: http://pastebin.com/vcbDtXMt. Let me know if there's anything I can do.
thank you!
-RMT
keep2share doesn't seem to work when using authentication. I verified that your test URL was downloadable in my browser, and then did the following:
$ plowdown --version
v2.1.4-4-g00fac3b (2016-05-16)
$ plowdown -a [email protected]:foobar -v4 -r0 --no-plowsharerc --no-color http://k2s.cc/file/445d948518bd1/Test_file_50MB.zip
rep: === SYSTEM INFO BEGIN ===
<output redacted for ticket brevity>
rep: === CURL END ===
parse_all failed (sed): "/^[Ll]ocation:/ s/^.*n:[[:space:]]\+\(.*\)[[:cntrl:]]$/\1/p" (skip 0)
Failed inside grep_http_header_location(), line 687, core.sh
Failed inside keep2share_download(), line 125, keep2share.sh
Failed inside keep2share_download() [1]
$ echo $?
1
The full log is here: http://pastebin.com/0Ue2b1Sf.
Let me know if I can provide any more information!
thank you!
-RMT
An error is thrown when attempting to perform an upload using the vidzi_tv module:
grep_form_by_name failed (sed): "name=file"
I have a problem with uptobox on two VMs with two IPs:
http://0bin.net/paste/T6UYvXDRbdA6IDLR#hOZnGfh6PmeyqfWnEZ31suNmu8OSqJWIVd2SSX+nGHH
Perhaps it is the DDoS protection of Cloudflare.
Here is the error reported by the depositfiles module.
parse_attr failed (sed): "/<[Ff][Oo][Rr][Mm]/ action="
Failed inside parse_form_action(), line 920, core.sh
Failed inside depositfiles_download(), line 203, depositfiles.sh
Again Zippyshare may have changed its code.
root@host [~]# plowdown http://www65.zippyshare.com/v/45917134/file.html
Starting download (zippyshare): http://www65.zippyshare.com/v/45917134/file.html
/tmp/plowdown.30462.6337.js:17: SyntaxError: syntax error:
/tmp/plowdown.30462.6337.js:17: </div>
/tmp/plowdown.30462.6337.js:17: ............^
Output URL is not valid: http://www65.zippyshare.com
root@host [~]#
zippyshare seems to have changed its system:
plowdown -v4 http://www41.zippyshare.com/v/uZjTVyA6/file.html
rep: === SYSTEM INFO BEGIN ===
rep: [mach] debian x86_64 linux-gnu x86_64-pc-linux-gnu
rep: [bash] 4.2.37(1)-release
rep: [curl] curl 7.26.0 (x86_64-pc-linux-gnu) libcurl/7.26.0 OpenSSL/1.0.1e zlib/1.2.7 libidn/1.25 libssh2/1.4.2 librtmp/2.3
rep: [sed ] GNU sed version 4.2.1
rep: [lib ] '/usr/local/share/plowshare4'
rep: === SYSTEM INFO END ===
rep: plowdown version v1.1.0-29-g0df22d0 (2015-01-06)
rep: use /etc/plowshare.conf
Starting download (zippyshare): http://www41.zippyshare.com/v/uZjTVyA6/file.html
rep: --insecure --compressed --speed-time 600 --connect-timeout 240 --user-agent Mozilla/5.0 (X11; Linux x86_64; rv:6.0) Gecko/20100101 Firefox/6.0 --max-redirs 5 --silent -L -c /tmp/plowdown.29575.7219 -b ziplocale=en http://www41.zippyshare.com/v/uZjTVyA6/file.html
rep: Received 28333 bytes. DRETVAL=0
rep: === CURL BEGIN ===
rep:
rep:
rep:
rep: <title>Zippyshare.com - testpic.jpg</title>
rep:
rep:
rep:
rep:
rep:
rep:
rep:
(...)
rep: === CURL END ===
dbg: Content Type: 'image'
rep: interpreter: /usr/bin/js
rep: === JAVASCRIPT BEGIN ===
rep:var elts = new Array();
rep: var document = {
rep: getElementById: function(id) {
rep: if (! elts[id]) { elts[id] = {}; }
rep: return elts[id];
rep: }
rep: };
rep:
rep: var curobj;
rep: evt = {
rep: originalEvent: 1
rep: }
rep:
rep: mouseevent = function(fnt) {
rep: fnt.apply(this,[evt]);
rep: }
rep: attr = function(attributeName, value) {
rep: if (attributeName === 'href') {
rep: document.getElementById(curobj.substr(1)).href = value;
rep: }
rep: }
rep: $ = function(obj) {
rep: curobj=obj;
rep: return {
rep: ready: mouseevent,
rep: mouseenter: mouseevent,
rep: mouseover: mouseevent,
rep: attr: attr
rep: };
rep: }
rep:
rep: delete(java);
rep: var EnvJs = true;
rep:
rep: var a = function() {return 1};
rep: var b = function() {return a() + 1};
rep: var c = function() {return b() + 1};
rep: var d = document.getElementById('omg').getAttribute('class');
rep: if (true) { d = d*2;}
rep: document.getElementById('dlbutton').href = "/d/uZjTVyA6/"+(901565%1000 + a() + b() + c() + d + 5/5)+"/testpic.jpg";
rep: if (document.getElementById('fimage')) {
rep: document.getElementById('fimage').href = "/i/uZjTVyA6/"+(901565%1000 + a() + b() + c() + d + 5/5)+"/testpic.jpg";
rep: }
rep: print(elts['fimage'].href);
rep: === JAVASCRIPT END ===
js: uncaught JavaScript runtime exception: TypeError: Cannot find function getAttribute in object [object Object].
Output URL is not valid: http://www41.zippyshare.com
Something changed on turbobit site, so now plowdown reports error:
parse failed (sed): "/<title>/ s/^[[:blank:]]Download (.+). Free.$/\1/p" (skip 1)
It was working fine; suddenly, since yesterday, I am getting this error.
root@host [/]# plowdown http://www65.zippyshare.com/v/45917134/file.html
Starting download (zippyshare): http://www65.zippyshare.com/v/45917134/file.html
/tmp/plowdown.2879.9203.js:17: SyntaxError: syntax error:
/tmp/plowdown.2879.9203.js:17: </div>
/tmp/plowdown.2879.9203.js:17: ............^
Output URL is not valid: http://www65.zippyshare.com
root@host [/]#
What could be the issue?
Hi developers and associated members,
thanks for this nice tool. I would appreciate a new script for the hoster ulozto.net.
They use captchas and they don't have an API. Download (and maybe upload) functionality would be fine.