
Comments (35)

JerryChenn07 commented on July 22, 2024

This project is pretty good. If you could add timed tasks, it would be a more complete project.

Keep it up!

wymen2018 commented on July 22, 2024

  1. When setting a timed task, let us select multiple crawlers at once, so that after choosing the time points, all the selected crawlers start in each time period. Imagine a project with 100 crawlers; this feature would really help with convenience and management.
  2. Likewise, since there are a hundred crawlers, could you add custom labels? That way I could filter by label to view the running status and timed tasks of a specified category of crawlers.

Thanks!

my8100 commented on July 22, 2024

Or modify the code below to vm.versions = ['default: the latest version'].concat(obj.versions);

vm.versions = obj.versions;


lidonghe commented on July 22, 2024

Or modify the code below to vm.versions = ['default: the latest version'].concat(obj.versions);
scrapydweb/scrapydweb/templates/scrapydweb/schedule.html, line 826 in 560e998:

vm.versions = obj.versions;

It works, thanks


my8100 commented on July 22, 2024

Please add a Docker image.

@seozed
Check out the Docker image created by @luzihang123:

my8100/logparser#15 (comment)


my8100 commented on July 22, 2024

@tuchief

2. Online packaging and deployment

Have you tried the Projects > Deploy page?


tuchief commented on July 22, 2024

I know what you mean, but that method requires manually packaging the project into an egg and then uploading it. Can't it package the source project into an egg and upload it automatically?


my8100 commented on July 22, 2024

I know what you mean, but that method requires manually packaging the project into an egg and then uploading it. Can't it package the source project into an egg and upload it automatically?

OK, I will try to figure out a better way to eggify local projects.
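
For reference, the manual workflow under discussion is roughly the following. A minimal sketch, assuming a setup.py in the project root (e.g. the one scrapyd-deploy generates); all paths and names are hypothetical:

# Eggify a Scrapy project and upload it to Scrapyd by hand.
import glob
import subprocess

import requests

# 1. Build the egg (requires a setup.py in the project root).
subprocess.run(["python", "setup.py", "bdist_egg"], cwd="/path/to/myproject", check=True)
egg_path = glob.glob("/path/to/myproject/dist/*.egg")[0]

# 2. Upload it via Scrapyd's addversion.json endpoint.
with open(egg_path, "rb") as f:
    resp = requests.post(
        "http://127.0.0.1:6800/addversion.json",
        data={"project": "myproject", "version": "r1"},
        files={"egg": f},
    )
print(resp.json())  # expect {'status': 'ok', ...}

Auto eggifying (v0.9.9 below) removes exactly these two manual steps.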


tuchief commented on July 22, 2024

You can refer to https://github.com/Gerapy/Gerapy


my8100 commented on July 22, 2024

v0.9.9: Add auto eggifying


tuchief commented on July 22, 2024

Wow, that was fast! Looking forward to timed tasks as well.


my8100 commented on July 22, 2024

When setting a timed task, let us select multiple crawlers at once, so that after choosing the time points, all the selected crawlers start in each time period. Imagine a project with 100 crawlers; this feature would really help with convenience and management.

You mean there are 100 spiders in a project and you want to schedule some of them to run periodically?

Likewise, since there are a hundred crawlers, could you add custom labels? That way I could filter by label to view the running status and timed tasks of a specified category of crawlers.

What about labeling some related jobs with a specific jobid?
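
To illustrate the jobid idea: Scrapyd's schedule.json endpoint accepts a custom jobid, so related runs can share a recognizable prefix that shows up on the Jobs page. A minimal sketch, with the label scheme and all names hypothetical:

# Group related runs by embedding a label in the Scrapyd jobid.
import requests

for spider in ["spider_news_a", "spider_news_b"]:
    requests.post(
        "http://127.0.0.1:6800/schedule.json",
        data={
            "project": "myproject",
            "spider": spider,
            "jobid": "news_" + spider,  # the shared 'news_' prefix acts as the label
        },
    )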


wymen2018 commented on July 22, 2024

You mean there are 100 spiders in a project and you want to schedule some of them to run periodically?

No, I mean selecting multiple spiders at once for a single timed task.

What about labeling some related jobs with a specific jobid?

Labels would be better, because they make it easier to view crawlers grouped by my own classification.


my8100 commented on July 22, 2024

OK, I will take it into account when implementing this feature. Thanks for your advice!


my8100 commented on July 22, 2024

v1.0.0rc1: Add Email Notice, with multiple triggers provided, including:

  • ON_JOB_RUNNING_INTERVAL

  • ON_JOB_FINISHED

  • Reaching the threshold of a specific kind of log message ['CRITICAL', 'ERROR', 'WARNING', 'REDIRECT', 'RETRY', 'IGNORE']; at the same time, you can ask ScrapydWeb to stop/forcestop the current job automatically.

Get it via the pip install scrapydweb==1.0.0rc1 command.
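
For illustration, these triggers correspond to options in scrapydweb_settings.py. Only the two names in the list above are confirmed by this note; the LOG_* names below merely follow the same pattern and are assumptions, so verify them against your version:

# Hypothetical scrapydweb_settings.py snippet.
ON_JOB_RUNNING_INTERVAL = 3600    # remind every hour while a job is running; 0 disables
ON_JOB_FINISHED = True            # send an email when a job finishes
LOG_CRITICAL_THRESHOLD = 3        # assumed name: alert once 3 CRITICAL lines appear
LOG_CRITICAL_TRIGGER_STOP = True  # assumed name: also stop the job automatically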

Email sample:

[screenshot of a sample notification email]

LWsmile commented on July 22, 2024

ERROR in utils: !!!!! ConnectionError HTTPConnectionPool(host='127.0.0.1', port=6800): Max retries exceeded with url: /jobs (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7f6111cae9b0>: Failed to establish a new connection: [Errno 111] Connection refused',))


my8100 commented on July 22, 2024

ERROR in utils: !!!!! ConnectionError HTTPConnectionPool(host='127.0.0.1', port=6800): Max retries exceeded with url: /jobs (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7f6111cae9b0>: Failed to establish a new connection: [Errno 111] Connection refused',))

Please open a new issue with details.
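
The error above simply means that nothing is listening on 127.0.0.1:6800. A quick way to check whether Scrapyd is reachable, using its daemonstatus.json endpoint:

# Probe the Scrapyd node that ScrapydWeb is trying to reach.
import requests

try:
    print(requests.get("http://127.0.0.1:6800/daemonstatus.json", timeout=3).json())
except requests.ConnectionError:
    print("Scrapyd is not running on 127.0.0.1:6800 - start it with the scrapyd command")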


lidonghe commented on July 22, 2024

timed task +1


my8100 commented on July 22, 2024

v1.2.0: Support Timer Tasks to schedule a spider run periodically


lidonghe commented on July 22, 2024

When adding a timer task, I have to choose a version, but actually I only want to use the latest version, so that if the project is updated I don't need to update my task accordingly.


my8100 commented on July 22, 2024

OK, it will be fixed in a future release.
For the time being, go to the Jobs page and click either the multinode or the Start button as a workaround.


my8100 commented on July 22, 2024

The modification has been committed.


heave-Rother commented on July 22, 2024

For log categorization, can runs of the same spider distributed across different Scrapyd servers be aggregated? @my8100


my8100 commented on July 22, 2024

@heave-Rother
For the time being, you can switch to a specific page of the neighboring node
with the help of Node Scroller and Node Skipping.
[screenshot of the Node Scroller and Node Skipping controls]

If that cannot satisfy your need, could you draw a picture to show me your idea?


heave-Rother commented on July 22, 2024

@my8100 OK:

[screenshot]

my8100 commented on July 22, 2024

@heave-Rother

  1. What would you use the checkboxes in the drop-down list for?
    Did you notice the checkboxes for nodes on the Servers page?
  2. Stats aggregation will be followed up in PR #72.


heave-Rother commented on July 22, 2024

@my8100
Yes, I want to choose which servers to include in the statistics.


my8100 commented on July 22, 2024

@heave-Rother
I see. Thanks for your suggestion.


seozed commented on July 22, 2024

Please add a Docker image.


devxiaosong commented on July 22, 2024

Thanks to the author; this is the best crawler cluster management platform I have found. A few requests:
1. Add a description to each node, to make nodes easier to tell apart.
2. Send alert messages via SMS.
3. How can distributed crawlers based on scrapy-redis be configured and started?

my8100 commented on July 22, 2024

@devxiaosong
Replied in #107.


Tobeyforce commented on July 22, 2024

Please add a short tutorial on how to switch from the flask development server to a production server with https enabled using letsencrypt. It would be much appreciated.


my8100 commented on July 22, 2024

Please add a short tutorial on how to switch from the flask development server to a production server with https enabled using letsencrypt. It would be much appreciated.

Please try it out and share your result.

logger.info("For running Flask in production, check out http://flask.pocoo.org/docs/1.0/deploying/")

############################## ScrapydWeb #####################################
# The default is False, set it to True and add both CERTIFICATE_FILEPATH and PRIVATEKEY_FILEPATH
# to run ScrapydWeb in HTTPS mode.
# Note that this feature is not fully tested, please leave your comment here if ScrapydWeb
# raises any exception at startup: https://github.com/my8100/scrapydweb/issues/18
ENABLE_HTTPS = False
# e.g. '/home/username/cert.pem'
CERTIFICATE_FILEPATH = ''
# e.g. '/home/username/cert.key'
PRIVATEKEY_FILEPATH = ''
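
For the letsencrypt case in particular, a hedged example of those three settings filled in; the domain and paths are hypothetical, though certbot does place certificates under /etc/letsencrypt/live/<domain>/ by default:

# scrapydweb_settings.py (values hypothetical)
ENABLE_HTTPS = True
CERTIFICATE_FILEPATH = '/etc/letsencrypt/live/example.com/fullchain.pem'
PRIVATEKEY_FILEPATH = '/etc/letsencrypt/live/example.com/privkey.pem'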


Jasonjk3 commented on July 22, 2024

Can Scrapyd nodes be dynamically added/removed on the page?

my8100 commented on July 22, 2024

Can Scrapyd nodes be dynamically added/removed on the page?

Editing Scrapyd servers via the GUI is not supported.
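
Nodes are instead declared statically in the config file and picked up at startup. A minimal sketch, assuming the SCRAPYD_SERVERS setting from scrapydweb_settings.py; all values are hypothetical:

# scrapydweb_settings.py - restart ScrapydWeb after editing this list.
SCRAPYD_SERVERS = [
    '127.0.0.1:6800',
    # a string like 'username:password@host:port#group' is also accepted:
    # 'admin:admin@192.168.0.2:6800#backend',
]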

