AutoRclone: rclone copy/move/sync (automatically) with thousands of service accounts
Home Page: https://www.gfan.loan/?p=235
sudo python3 add_to_google_group.py -g [email protected]
<googleapiclient.discovery.Resource object at 0x7fc9ef262208>
Readying accounts |########################## | 1000/1200
Traceback (most recent call last):
File "add_to_google_group.py", line 68, in <module>
batch.add(group.members().insert(groupKey=gaddr, body=body))
File "/usr/local/lib/python3.6/dist-packages/googleapiclient/_helpers.py", line 134, in positional_wrapper
return wrapped(*args, **kwargs)
File "/usr/local/lib/python3.6/dist-packages/googleapiclient/http.py", line 1400, in add
% MAX_BATCH_LIMIT
googleapiclient.errors.BatchError: <BatchError "Exceeded the maximum calls(1000) in a single batch request.">
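For reference, googleapiclient enforces a hard cap (MAX_BATCH_LIMIT = 1000) on a single BatchHttpRequest, so with 1200 accounts the insert calls have to be flushed in chunks. A rough sketch of the idea; `add_members_in_chunks` and its arguments are hypothetical, modeled on the variables visible in the traceback above:

```python
def add_members_in_chunks(service, group, gaddr, emails, chunk=100):
    """Insert group members in batches of `chunk` requests, flushing the
    batch well before googleapiclient's 1000-call limit is hit.
    `service` is the Admin SDK Directory client, `group` the groups
    resource wrapper, `gaddr` the group address (all hypothetical names)."""
    for start in range(0, len(emails), chunk):
        batch = service.new_batch_http_request()
        for email in emails[start:start + chunk]:
            body = {"email": email, "role": "MEMBER"}
            batch.add(group.members().insert(groupKey=gaddr, body=body))
        batch.execute()
```

Google's current guidance is to keep batches much smaller than 1000 anyway, which is why the sketch defaults to 100.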
Hi,
Please add this important feature: syncing Team Drives without the 750 GB/day limit.
Thanks in advance.
Note: please comment here if you are interested in this feature.
Hi
Thanks for the project
When the task is finished, how do I delete all the service accounts from the Team Drive?
Thanks
With this commit, rclone/rclone@a86196a, we got a new option to change service accounts directly.
I think this would be a better way to handle rotation than killing the task, editing the config to change the service account, and starting the transfer again.
I use the following command: python3 /root/AutoRclone/rclone_sa_magic.py -sp "%F" -d 0AHe-XXXXX-XXXXX -dp %N% -b 1 -e 600
It never seems to succeed: it reports that the ./accounts folder does not exist, and it only works after I cd into the AutoRclone folder first.
rclone is detected: /usr/bin/rclone
generating rclone config file.
No json files found in ./accounts
I'm having trouble getting AutoRclone to use all of my service accounts. I ran the following command:
python3 rclone_sa_magic.py -s {SourceID} -d {DestID} -b 1 -e 600
It proceeded to do 10 copy operations:
>> Let us go dst001:
through
>> Let us go dst010:
But then it stopped and gave an elapsed time and "All Done" message. Since I have 100 service accounts, why did it stop after 10?
When running python3 gen_sa_accounts.py --quick-setup -1,
it also tries to create service accounts on deleted projects. It should only create them on active projects.
gen_sa_accounts.py --list-projects
likewise returns all projects, including deleted ones.
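Deleted projects linger in the Cloud Resource Manager listing with lifecycleState set to DELETE_REQUESTED for roughly 30 days, so filtering on that field would fix both cases. A minimal sketch; `active_projects` is a hypothetical helper, assuming the project dicts returned by the Resource Manager's projects().list():

```python
def active_projects(projects):
    """Keep only projects whose lifecycleState is ACTIVE, dropping
    DELETE_REQUESTED entries that linger after deletion."""
    return [p["projectId"] for p in projects
            if p.get("lifecycleState") == "ACTIVE"]
```

The API also accepts a server-side filter (filter="lifecycleState:ACTIVE" on projects.list), which would avoid fetching deleted projects at all.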
Copied 736 GB successfully, but it got stuck on dst002: reading source/destination | checks: 0 files
Total size is 4 TB, Team Drive to Team Drive!
Command used: sudo python3 rclone_sa_magic.py -s source -d destination -dp backup -b 1 -e 600
Hi, I have a Google Drive with 500 TB of data. I used AutoRclone to copy it to another drive.
Now I have added an additional 50 TB to my original drive, but when I use AutoRclone to copy to the other drive, it takes forever to check the existing data.
How can I speed up the check process? Do I need to add an extra flag when running it?
Please help.
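Not an official answer, but rclone itself has flags that can speed up the checking phase. A sketch of appending them to the command the script builds; this is a local modification, since rclone_sa_magic.py does not expose these flags:

```python
# Real rclone flags; the helper wrapping them is hypothetical.
EXTRA_CHECK_FLAGS = [
    "--fast-list",       # batch directory listings into fewer API calls
    "--checkers", "16",  # check more files in parallel (default is 8)
    "--size-only",       # compare by size only, skipping time/hash checks
                         # (caution: misses modified files of equal size)
]

def with_check_flags(base_cmd):
    """Return a copy of an rclone command with the extra flags appended."""
    return list(base_cmd) + EXTRA_CHECK_FLAGS
```

--size-only is the big win for huge Drive-to-Drive checks, but only use it if equal size is an acceptable proxy for "already copied" in your data.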
root@instance-3:~/AutoRclone# python3 rclone_sa_magic.py -sp /### -d ### -dp ccc -b 1 -e 100
rclone is detected: /usr/bin/rclone
generating rclone config file.
rclone config file generated.
Start: 10:47:05
screen -d -m -S wrc rclone --config ./rclone.conf copy --drive-server-side-across-configs --rc -vv --ignore-existing --tpslimit 3 --transfers 3 --drive-chunk-size 32M --drive-acknowledge-abuse --log-file=log_rclone.txt "###" "dst001:ccc"
>> Let us go dst001: 10:47:05
2019/10/27 11:07:07 Failed to rc: connection failed: Post http://localhost:5572/core/stats: dial tcp 127.0.0.1:5572: connect: connection refused
2019/10/27 11:07:07 Failed to rc: connection failed: Post http://localhost:5572/core/stats: dial tcp 127.0.0.1:5572: connect: connection refused
2019/10/27 11:07:07 Failed to rc: connection failed: Post http://localhost:5572/core/stats: dial tcp 127.0.0.1:5572: connect: connection refused
No rclone task detected (possibly done for this account). (1/3)
screen -d -m -S wrc rclone --config ./rclone.conf copy --drive-server-side-across-configs --rc -vv --ignore-existing --tpslimit 3 --transfers 3 --drive-chunk-size 32M --drive-acknowledge-abuse --log-file=log_rclone.txt "###" "dst002:ccc"
>> Let us go dst002: 11:07:07
2019/10/27 11:07:17 Failed to rc: connection failed: Post http://localhost:5572/core/stats: dial tcp 127.0.0.1:5572: connect: connection refused
2019/10/27 11:07:17 Failed to rc: connection failed: Post http://localhost:5572/core/stats: dial tcp 127.0.0.1:5572: connect: connection refused
2019/10/27 11:07:17 Failed to rc: connection failed: Post http://localhost:5572/core/stats: dial tcp 127.0.0.1:5572: connect: connection refused
No rclone task detected (possibly done for this account). (2/3)
screen -d -m -S wrc rclone --config ./rclone.conf copy --drive-server-side-across-configs --rc -vv --ignore-existing --tpslimit 3 --transfers 3 --drive-chunk-size 32M --drive-acknowledge-abuse --log-file=log_rclone.txt "###" "dst003:ccc"
>> Let us go dst003: 11:07:17
2019/10/27 11:07:27 Failed to rc: connection failed: Post http://localhost:5572/core/stats: dial tcp 127.0.0.1:5572: connect: connection refused
2019/10/27 11:07:27 Failed to rc: connection failed: Post http://localhost:5572/core/stats: dial tcp 127.0.0.1:5572: connect: connection refused
2019/10/27 11:07:27 Failed to rc: connection failed: Post http://localhost:5572/core/stats: dial tcp 127.0.0.1:5572: connect: connection refused
No rclone task detected (possibly done for this account). (3/3)
All done (3/3).
log_rclone.txt
2019/10/27 11:07:06 DEBUG : ###.mp4: MD5 = 4830ef8fcf6322291a99fe3$
2019/10/27 11:07:06 INFO : ###.mp4: Copied (new)
2019/10/27 11:07:06 INFO :
Transferred: 81.953G / 81.953 GBytes, 100%, 70.016 MBytes/s, ETA 0s
Errors: 0
Checks: 0 / 0, -
Transferred: 2 / 2, 100%
Elapsed time: 19m58.5s
2019/10/27 11:07:06 DEBUG : 6 go routines active
2019/10/27 11:07:06 DEBUG : rclone: Version "v1.50.0" finishing with parameters ["rclone" "--config" "./rclone.c$
2019/10/27 11:07:07 DEBUG : rclone: Version "v1.50.0" starting with parameters ["rclone" "--config" "./rclone.co$
2019/10/27 11:07:07 NOTICE: Serving remote control on http://127.0.0.1:5572/
2019/10/27 11:07:07 DEBUG : Using config file from "###/AutoRclone/rclone.conf"
2019/10/27 11:07:07 INFO : Starting HTTP transaction limiter: max 3 transactions/s with burst 1
2019/10/27 11:07:08 DEBUG : ###.mkv: Dest$
2019/10/27 11:07:08 INFO : Google drive root 'ccc': Waiting for checks to finish
2019/10/27 11:07:08 DEBUG : ###.mp4: Destination exists, skipping
2019/10/27 11:07:08 INFO : Google drive root 'ccc': Waiting for transfers to finish
2019/10/27 11:07:08 INFO :
Transferred: 0 / 0 Bytes, -, 0 Bytes/s, ETA -
Errors: 0
Checks: 2 / 2, 100%
Transferred: 0 / 0, -
Elapsed time: 0s
2019/10/27 11:07:08 DEBUG : 8 go routines active
2019/10/27 11:07:08 DEBUG : rclone: Version "v1.50.0" finishing with parameters ["rclone" "--config" "./rclone.c$
2019/10/27 11:07:17 DEBUG : rclone: Version "v1.50.0" starting with parameters ["rclone" "--config" "./rclone.co$
2019/10/27 11:07:17 NOTICE: Serving remote control on http://127.0.0.1:5572/
2019/10/27 11:07:17 DEBUG : Using config file from "###/AutoRclone/rclone.conf"
2019/10/27 11:07:17 INFO : Starting HTTP transaction limiter: max 3 transactions/s with burst 1
2019/10/27 11:07:18 DEBUG : ###.mp4: Destination exists, skipping
2019/10/27 11:07:18 DEBUG : ###.mkv: Dest$
2019/10/27 11:07:18 INFO : Google drive root 'ccc': Waiting for checks to finish
2019/10/27 11:07:18 INFO : Google drive root 'ccc': Waiting for transfers to finish
2019/10/27 11:07:18 INFO :
Transferred: 0 / 0 Bytes, -, 0 Bytes/s, ETA -
Errors: 0
Checks: 2 / 2, 100%
Transferred: 0 / 0, -
Elapsed time: 0s
2019/10/27 11:07:18 DEBUG : 9 go routines active
2019/10/27 11:07:18 DEBUG : rclone: Version "v1.50.0" finishing with parameters ["rclone" "--config" "./rclone.c$
It would be really great if we were allowed to pass extra rclone options (like --rc), or to use rclone sync instead of rclone copy.
e.g.
python3 rclone_sa_magic.py -s SourceID1 -d DestinationID1 -dp DestinationPathName -b 1 -e 100
python3 rclone_sa_magic.py -s SourceID2 -d DestinationID2 -dp DestinationPathName -b 101 -e 200
In practice both commands read the same rclone.conf; please improve this.
I generated 500 SA accounts in total, but recently the script only ever adds between 200 and 300 of them. Re-running it raises the count by only 1 or 2 each time, and each run finishes very quickly. Is it being rate-limited by Google because it adds accounts too fast?
root@:~/autorclone# python3 add_to_team_drive.py -d ******
Found credentials.
Make sure the Google account that has generated credentials.json
is added into your Team Drive (shared drive) as Manager
(Press any key to continue)
Readying accounts |################################| 500/500
Adding...
Complete.
Elapsed Time:
00:00:08.53
The option makes rclone treat the API's transfer-limit messages as errors. The messages are formatted differently for the transfer limit vs. general API usage limits, so this is reliable. It cuts the account off at 750 GB.
EDIT: On second check I realized this option was only added recently, so if it is adopted here it should be offered as a non-default option.
Is it possible to transfer all of my Drive (root)? What FolderID must I use? Isn't it "root"?
Hi,
I use rclone, but AutoRclone could be very useful for working around the Gdrive limits.
I have a problem when I try to install it on Windows 10.
I have installed everything, but when I launch
python3 gen_sa_accounts.py --quick-setup 5
it creates nothing in the accounts folder.
I'm stuck there; please can you help me?
rclone can use --exclude-from exclude-file.txt to skip files and directories that don't need to be copied, which is very practical.
I strongly suggest adding this feature!
Thanks!
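Seconding this. A sketch of how the passthrough might look if the script grew an option for it; `build_rclone_cmd` and its parameters are hypothetical, only the rclone flags themselves are real:

```python
def build_rclone_cmd(src, dst, config="./rclone.conf", exclude_file=None):
    """Assemble an rclone copy command, optionally forwarding an
    --exclude-from filter file (hypothetical helper, not in the repo)."""
    cmd = ["rclone", "--config", config, "copy",
           "--drive-server-side-across-configs", "--ignore-existing",
           src, dst]
    if exclude_file:
        # rclone reads one filter pattern per line from this file
        cmd += ["--exclude-from", exclude_file]
    return cmd
```

Until then, the same effect can be had by hand-editing the command string rclone_sa_magic.py launches.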
When running it, nothing happens: it shows an elapsed time of 8 seconds and no group members.
It would be great if you could properly license the project.
Hello:
I've started working with your great tool, but when I start adding my service accounts to a Google Group using the following command:
python3 add_to_google_group.py -g [email protected]
I get the following error:
Traceback (most recent call last):
  File "add_to_google_group.py", line 43, in <module>
    flow = InstalledAppFlow.from_client_secrets_file(credentials[0], scopes=[
IndexError: list index out of range
My test is running on Ubuntu, and I have 700 service accounts.
Also, if I wanted to add those service accounts by hand, how could I do that?
Should I grab all the emails and send them invitations to the group?
Thanks.
I switched to a new VPS, copied the whole folder over with rclone, and installed the dependencies, but when copying from the shared folder to the Team Drive it keeps reporting:
Failed to copy: failed to make directory: googleapi: Error 404: File not found: 0AD7f-V*****KFUk9PVA., notFound
I'm sure the TD ID is correct, because I ran it the same way on the old machine. I'm not sure whether I missed something while copying, or whether SA accounts can only be used by one machine at a time?
The command used is as follows:
python3 rclone_sa_magic.py -s "1N*******cMM-xkc2Y39jYRAHG253uk" -d 0AD7f-V*****KFUk9PVA -dp "/1127" -b 1 -e 600
wrong github
After I get
SUCCESS: The process with PID 5232 has been terminated.
the next service account starts, but the speed drops from about 2000 MB/s to 1-40 MB/s.
Is it possible to get higher speeds once the next account starts?
It does seem to speed up slowly over time, but it still caps at about 100 MB/s.
Any advice is appreciated.
Hello, I get:
2020/04/30 23:22:00 Failed to rc: connection failed: Post http://localhost:5572/core/stats: dial tcp [::1]:5572: connectex: No connection could be made because the target machine actively refused it
How do I fix that?
Source: gdriveA, with download permission only.
Destination: gdriveB, where SA accounts can be added.
How can this project copy gdriveA ⇨ gdriveB?
~/AutoRclone# python3 add_to_team_drive.py -d
Found credentials.
Make sure the Google account that has generated credentials.json
is added into your Team Drive (shared drive) as Manager
(Press any key to continue)
Readying accounts |#################### | 460/701
Traceback (most recent call last):
File "add_to_team_drive.py", line 63, in
ce = json.loads(open(i, 'r').read())['client_email']
KeyError: 'client_email'
Hi,
I'm stuck with this message while uploading a file to a Shared Drive.
python gen_sa_accounts.py --quick-setup 1 --new-only
creat projects: 1
Creating 1 projects
Enabling services
Traceback (most recent call last):
File "gen_sa_accounts.py", line 323, in <module>
download_keys=args.download_keys
File "gen_sa_accounts.py", line 224, in serviceaccountfactory
_enable_services(serviceusage,ste,services)
File "gen_sa_accounts.py", line 89, in _enable_services
batch.execute()
File "G:\software\code\Anaconda\envs\test\lib\site-packages\googleapiclient\_helpers.py", line 134, in positional_wrapper
return wrapped(*args, **kwargs)
File "G:\software\code\Anaconda\envs\test\lib\site-packages\googleapiclient\http.py", line 1524, in execute
self._execute(http, self._order, self._requests)
File "G:\software\code\Anaconda\envs\test\lib\site-packages\googleapiclient\http.py", line 1454, in _execute
self._batch_uri, method="POST", body=body, headers=headers
File "G:\software\code\Anaconda\envs\test\lib\site-packages\google_auth_httplib2.py", line 198, in request
uri, method, body=body, headers=request_headers, **kwargs)
File "G:\software\code\Anaconda\envs\test\lib\site-packages\httplib2\__init__.py", line 1994, in request
cachekey,
File "G:\software\code\Anaconda\envs\test\lib\site-packages\httplib2\__init__.py", line 1651, in _request
conn, request_uri, method, body, headers
File "G:\software\code\Anaconda\envs\test\lib\site-packages\httplib2\__init__.py", line 1557, in _conn_request
conn.connect()
File "G:\software\code\Anaconda\envs\test\lib\site-packages\httplib2\__init__.py", line 1391, in connect
raise socket_err
File "G:\software\code\Anaconda\envs\test\lib\site-packages\httplib2\__init__.py", line 1326, in connect
self.sock = self._context.wrap_socket(sock, server_hostname=self.host)
File "G:\software\code\Anaconda\envs\test\lib\ssl.py", line 423, in wrap_socket
session=session
File "G:\software\code\Anaconda\envs\test\lib\ssl.py", line 870, in _create
self.do_handshake()
File "G:\software\code\Anaconda\envs\test\lib\ssl.py", line 1139, in do_handshake
self._sslobj.do_handshake()
OSError: [Errno 0] Error
For some reason my service accounts have 3-4 keys each, which means that when I run python3 gen_sa_accounts.py --download-keys [project-id]
I get 300-400 keys. Is there a way to delete all the excess keys from the service accounts without deleting the service accounts themselves?
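One hedged approach is the IAM API's serviceAccounts.keys methods: list the USER_MANAGED keys on each account and delete the surplus (system-managed keys cannot be deleted). A sketch, assuming `iam` is a googleapiclient client built for the iam v1 API; note that deleting a key invalidates any JSON file previously downloaded for it:

```python
def prune_user_keys(iam, sa_email, keep=1):
    """Delete USER_MANAGED keys on one service account, keeping the
    first `keep` of them. Returns the number of keys deleted.
    `iam` is assumed to be a googleapiclient iam v1 client."""
    name = f"projects/-/serviceAccounts/{sa_email}"
    keys = iam.projects().serviceAccounts().keys().list(
        name=name, keyTypes="USER_MANAGED").execute().get("keys", [])
    for key in keys[keep:]:
        # WARNING: invalidates the downloaded JSON for this key
        iam.projects().serviceAccounts().keys().delete(
            name=key["name"]).execute()
    return len(keys[keep:])
```

Run it over every service-account email in a project and the next --download-keys run should produce one key per account.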
Hi, how do I use --crypt? Where do I set the password and salt?
I managed to get AutoRclone working and transferred files from Team Drive #1 to Team Drive #2 in Google Account A.
Now I would like to do the same for Google Account B, but the script keeps listing the projects from Google Account A.
How do I re-authenticate the script so that I can authenticate Google Account B?
I'm running the following command:
python3 rclone_sa_magic.py -sp "/home/user/media/" -d "<drive-id>"
and every once in a while it can't start a new remote because of the following errors:
rclone --config ./rclone.conf copy --drive-stop-on-upload-limit --drive-server-side-across-configs --rc --rc-addr="localhost:5572" -v --ignore-existing --tpslimit 3 --transfers 3 --drive-chunk-size 32M --drive-acknowledge-abuse --log-file=log_rclone.txt "/home/user/media/" "dst010:" &
>> Let us go dst010: 19:47:00
2020/06/10 19:47:10 Failed to rc: connection failed: Post "http://localhost:5572/core/pid": dial tcp 127.0.0.1:5572: connect: connection refused
2020/06/10 19:47:10 Failed to rc: connection failed: Post "http://localhost:5572/core/stats": dial tcp 127.0.0.1:5572: connect: connection refused
2020/06/10 19:47:10 Failed to rc: connection failed: Post "http://localhost:5572/core/stats": dial tcp 127.0.0.1:5572: connect: connection refused
2020/06/10 19:47:10 Failed to rc: connection failed: Post "http://localhost:5572/core/stats": dial tcp 127.0.0.1:5572: connect: connection refused
No rclone task detected (possibly done for this account). (1/3)
rclone --config ./rclone.conf copy --drive-stop-on-upload-limit --drive-server-side-across-configs --rc --rc-addr="localhost:5572" -v --ignore-existing --tpslimit 3 --transfers 3 --drive-chunk-size 32M --drive-acknowledge-abuse --log-file=log_rclone.txt "/home/user/media/" "dst011:" &
>> Let us go dst011: 19:47:10
dst011: 510GB Done @ 245.282449MB/s | checks: 2125 files
In this case the downtime is low, but sometimes it takes much longer (I've seen up to an hour). log_rclone.txt shows the following error:
2020/06/10 19:47:00 Failed to start remote control: start server failed: listen tcp 127.0.0.1:5572: bind: address already in use
I'm running Ubuntu Server 20.04 LTS with the latest rclone:
user@server:~$ rclone --version
rclone v1.52.0
- os/arch: linux/amd64
- go version: go1.14.3
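The "address already in use" error suggests a previous rclone --rc instance is still holding port 5572 when the next one starts. A hedged workaround sketch, not part of rclone_sa_magic.py (which hard-codes localhost:5572 in these logs): probe for a free localhost port first and pass it to rclone via --rc-addr.

```python
import socket

def free_rc_port(start=5572, tries=50):
    """Find a localhost TCP port that is not already bound, so each
    rclone --rc instance can get its own --rc-addr."""
    for port in range(start, start + tries):
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            try:
                s.bind(("127.0.0.1", port))  # fails if a stale rclone holds it
            except OSError:
                continue
            return port
    raise RuntimeError("no free port found")
```

The chosen port would then go into both the launch command (--rc-addr="localhost:PORT") and the stats polling URL.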
When I tried to add the SA emails generated by AutoRclone to my Google Group, the error message said:
"Your organization or group is configured to allow only organization members to join"
I checked: the emails AutoRclone creates look like this:
[email protected]
which is very different from my .edu Gmail domain.
What am I doing wrong?
Thanks!
[email protected]
[email protected]
[email protected]
[email protected]
They are named like that: batch-created with an incrementing number, while the leading "aaa" part stays the same.
Could you add a function that opens a public shared link for all the service accounts?
I'm trying to download from a public shared drive to my own drive, but the public shared drive has to be opened for each service account before this works.
Thanks for making this great code.
But I would like to know: can this tool bypass the daily 750 GB copy limit?
Also, does it bypass the daily 8 TB download limit?
If yes to both, how do we do it?
Thanks.
$ python3 gen_sa_accounts.py --quick-setup 1
Please visit this URL to authorize this application: https://accounts.google.com/o/oauth2/auth?response_type=code&client_id=95576504074-3o3c4a8au0m9fplnc88blv37psg35ghu.apps.googleusercontent.com&redirect_uri=urn%3Aietf%3Awg%3Aoauth%3A2.0%3Aoob&scope=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fdrive+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fiam&state=PjP63ZjIxkrLDfbyRPtsyOmKh0HHGE&prompt=consent&access_type=offline
Enter the authorization code: *********************************
Traceback (most recent call last):
File "gen_sa_accounts.py", line 311, in
resp = serviceaccountfactory(
File "gen_sa_accounts.py", line 175, in serviceaccountfactory
with open(token, 'wb') as t:
PermissionError: [Errno 13] Permission denied: 'token.pickle'
Heya!
When doing Step 2, i.e. running python3 gen_sa_accounts.py --quick-setup 1
(in my case just python, since it's already v3), I get the following error:
C:\Users\epicl\Downloads\AutoRclone-master>python gen_sa_accounts.py --quick-setup 1
Traceback (most recent call last):
File "gen_sa_accounts.py", line 311, in <module>
resp = serviceaccountfactory(
File "gen_sa_accounts.py", line 161, in serviceaccountfactory
proj_id = loads(open(credentials,'r').read())['installed']['project_id']
KeyError: 'installed'
Any way to fix this? The same happens on Ubuntu, btw.
I already use AutoRclone to copy to my shared drives, but this command:
py rclone_sa_magic.py -s "a1b2c3d4e5f6_g7h8i9j0k1l2m3n" -d "usual_shared_drive_id" -dp "name" -b X -e Y
gives me this error:
rclone is detected: ~\rclone.exe
generating rclone config file.
Wrong length of team_drive_id or publicly shared root_folder_id
Also, there is no log output!