celery / django-celery-beat

Celery Periodic Tasks backed by the Django ORM

License: Other


django-celery-beat's Introduction



Version: 5.4.0 (opalescent)
Web: https://docs.celeryq.dev/en/stable/index.html
Download: https://pypi.org/project/celery/
Source: https://github.com/celery/celery/
Keywords: task, queue, job, async, rabbitmq, amqp, redis, python, distributed, actors

Donations

This project relies on your generous donations.

If you are using Celery to create a commercial product, please consider becoming our backer or our sponsor to ensure Celery's future.

For enterprise

Available as part of the Tidelift Subscription.

The maintainers of celery and thousands of other packages are working with Tidelift to deliver commercial support and maintenance for the open source dependencies you use to build your applications. Save time, reduce risk, and improve code health, while paying the maintainers of the exact dependencies you use. Learn more.

What's a Task Queue?

Task queues are used as a mechanism to distribute work across threads or machines.

A task queue's input is a unit of work called a task. Dedicated worker processes then constantly monitor the queue for new work to perform.

Celery communicates via messages, usually using a broker to mediate between clients and workers. To initiate a task, a client puts a message on the queue; the broker then delivers the message to a worker.

A Celery system can consist of multiple workers and brokers, giving way to high availability and horizontal scaling.

Celery is written in Python, but the protocol can be implemented in any language. In addition to Python there's node-celery for Node.js, a PHP client, gocelery, gopher-celery for Go, and rusty-celery for Rust.

Language interoperability can also be achieved by using webhooks, in such a way that the client enqueues a URL to be requested by a worker.

What do I need?

Celery version 5.3.5 runs on:

  • Python (3.8, 3.9, 3.10, 3.11, 3.12)
  • PyPy3.9+ (v7.3.12+)

This is the version of Celery that supports Python 3.8 or newer.

If you're running an older version of Python, you need to be running an older version of Celery:

  • Python 3.7: Celery 5.2 or earlier.
  • Python 3.6: Celery 5.1 or earlier.
  • Python 2.7: Celery 4.x series.
  • Python 2.6: Celery series 3.1 or earlier.
  • Python 2.5: Celery series 3.0 or earlier.
  • Python 2.4: Celery series 2.2 or earlier.

Celery is a project with minimal funding, so we don't support Microsoft Windows, although it should still work. Please don't open any issues related to that platform.

Celery is usually used with a message broker to send and receive messages. The RabbitMQ and Redis transports are feature-complete, but there's also experimental support for a myriad of other solutions, including using SQLite for local development.

Celery can run on a single machine, on multiple machines, or even across datacenters.

Get Started

If this is the first time you're trying to use Celery, or if you're new to Celery v5.4.x coming from previous versions, then you should read our getting started tutorials:

You can also get started with Celery by using CloudAMQP, a hosted broker transport. The largest hosting provider of RabbitMQ, CloudAMQP is a proud sponsor of Celery.

Celery is...

  • Simple

    Celery is easy to use and maintain, and does not need configuration files.

    It has an active, friendly community you can talk to for support, such as our mailing list or the IRC channel.

    Here's one of the simplest applications you can make:

    from celery import Celery
    
    app = Celery('hello', broker='amqp://guest@localhost//')
    
    @app.task
    def hello():
        return 'hello world'
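
    A task defined this way is sent to the broker with delay(); a minimal usage sketch, assuming a worker is running (and noting that reading the return value additionally requires a result backend, which the app above does not configure):

    result = hello.delay()   # enqueue the task on the broker
    result.get(timeout=10)   # -> 'hello world' (needs a result backend)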
  • Highly Available

    Workers and clients will automatically retry in the event of connection loss or failure, and some brokers support HA by way of Primary/Primary or Primary/Replica replication.

  • Fast

    A single Celery process can process millions of tasks a minute, with sub-millisecond round-trip latency (using RabbitMQ, py-librabbitmq, and optimized settings).

  • Flexible

    Almost every part of Celery can be extended or used on its own: custom pool implementations, serializers, compression schemes, logging, schedulers, consumers, producers, broker transports, and much more.

It supports...

  • Message Transports

  • Concurrency

  • Result Stores

    • AMQP, Redis
    • memcached
    • SQLAlchemy, Django ORM
    • Apache Cassandra, IronCache, Elasticsearch
  • Serialization

    • pickle, json, yaml, msgpack.
    • zlib, bzip2 compression.
    • Cryptographic message signing.

Framework Integration

Celery is easy to integrate with web frameworks, some of which even have integration packages:

Django: not needed
Pyramid: pyramid_celery
Pylons: celery-pylons
Flask: not needed
web2py: web2py-celery
Tornado: tornado-celery
FastAPI: not needed

The integration packages aren't strictly necessary, but they can make development easier, and sometimes they add important hooks like closing database connections at fork.

Documentation

The latest documentation is hosted at Read The Docs, containing user guides, tutorials, and an API reference.

The latest Chinese documentation is hosted at https://www.celerycn.io/ and includes user guides, tutorials, and an API reference.

Installation

You can install Celery either via the Python Package Index (PyPI) or from source.

To install using pip:

$ pip install -U Celery

Bundles

Celery also defines a group of bundles that can be used to install Celery and the dependencies for a given feature.

You can specify these in your requirements or on the pip command-line by using brackets. Multiple bundles can be specified by separating them by commas.

$ pip install "celery[redis]"

$ pip install "celery[redis,auth,msgpack]"

The following bundles are available:

Serializers

celery[auth]: for using the auth security serializer.
celery[msgpack]: for using the msgpack serializer.
celery[yaml]: for using the yaml serializer.

Concurrency

celery[eventlet]: for using the eventlet pool.
celery[gevent]: for using the gevent pool.

Transports and Backends

celery[amqp]: for using the RabbitMQ amqp python library.
celery[redis]: for using Redis as a message transport or as a result backend.
celery[sqs]: for using Amazon SQS as a message transport.
celery[tblib]: for using the task_remote_tracebacks feature.
celery[memcache]: for using Memcached as a result backend (using pylibmc).
celery[pymemcache]: for using Memcached as a result backend (pure-Python implementation).
celery[cassandra]: for using Apache Cassandra/Astra DB as a result backend with the DataStax driver.
celery[azureblockblob]: for using Azure Storage as a result backend (using azure-storage).
celery[s3]: for using S3 Storage as a result backend.
celery[gcs]: for using Google Cloud Storage as a result backend.
celery[couchbase]: for using Couchbase as a result backend.
celery[arangodb]: for using ArangoDB as a result backend.
celery[elasticsearch]: for using Elasticsearch as a result backend.
celery[riak]: for using Riak as a result backend.
celery[cosmosdbsql]: for using Azure Cosmos DB as a result backend (using pydocumentdb).
celery[zookeeper]: for using Zookeeper as a message transport.
celery[sqlalchemy]: for using SQLAlchemy as a result backend (supported).
celery[pyro]: for using the Pyro4 message transport (experimental).
celery[slmq]: for using the SoftLayer Message Queue transport (experimental).
celery[consul]: for using the Consul.io Key/Value store as a message transport or result backend (experimental).
celery[django]: specifies the lowest version possible for Django support. You should probably not use this in your requirements; it's here for informational purposes only.

Downloading and installing from source

Download the latest version of Celery from PyPI:

https://pypi.org/project/celery/

You can install it by doing the following:

$ tar xvfz celery-0.0.0.tar.gz
$ cd celery-0.0.0
$ python setup.py build
# python setup.py install

The last command must be executed as a privileged user if you aren't currently using a virtualenv.

Using the development version

With pip

The Celery development version also requires the development versions of kombu, amqp, billiard, and vine.

You can install the latest snapshot of these using the following pip commands:

$ pip install https://github.com/celery/celery/zipball/main#egg=celery
$ pip install https://github.com/celery/billiard/zipball/main#egg=billiard
$ pip install https://github.com/celery/py-amqp/zipball/main#egg=amqp
$ pip install https://github.com/celery/kombu/zipball/main#egg=kombu
$ pip install https://github.com/celery/vine/zipball/main#egg=vine

With git

Please see the Contributing section.

Getting Help

Mailing list

For discussions about the usage, development, and future of Celery, please join the celery-users mailing list.

IRC

Come chat with us on IRC. The #celery channel is located at the Libera Chat network.

Bug tracker

If you have any suggestions, bug reports, or annoyances please report them to our issue tracker at https://github.com/celery/celery/issues/

Wiki

https://github.com/celery/celery/wiki

Credits

Contributors

This project exists thanks to all the people who contribute. Development of celery happens at GitHub: https://github.com/celery/celery

You're highly encouraged to participate in the development of celery. If you don't like GitHub (for some reason) you're welcome to send regular patches.

Be sure to also read the Contributing to Celery section in the documentation.


Backers

Thank you to all our backers! 🙏 [Become a backer]


Sponsors

Support this project by becoming a sponsor. Your logo will show up here with a link to your website. [Become a sponsor]

Upstash

License

This software is licensed under the New BSD License. See the LICENSE file in the top distribution directory for the full license text.

django-celery-beat's People

Contributors

arnau126, ask, auvipy, bradshjg, cclauss, contmp, d3x, ddevine, dependabot[bot], foarsitter, goatwu1993, gopackgo90, hartwork, kirade, liquidpele, lsaavedr, mbaechtold, mheppner, milind-shakya-sp, mondeja, moreaba, nijel, o3o3o, philipp-zettl, pre-commit-ci[bot], rapto, rsichnyi, thedrow, tinylambda, wardal


django-celery-beat's Issues

Python 3 has no __unicode__() method

ERRORS:
<class 'django_celery_beat.admin.PeriodicTaskAdmin'>: (admin.E108) The value of 'list_display[0]' refers to 'unicode', which is not a callable, an attribute of 'PeriodicTaskAdmin', or an attribute or method on 'django_celery_beat.PeriodicTask'.

Interval works, crontab does not work

I can't get the crontab to work, no matter what I do, including restarting celery beat and restarting the server. I checked that the timezone is Europe/London. Interval schedules always work.

django==1.10
celery==4.0.2
django_celery_beat==1.0.1
django_celery_results==1.0.1
kombu==4.0.2
amqp==2.1.4
anyjson==0.3.3
billiard==3.5.0.2

celery.py:

app = Celery('me')

app.config_from_object('django.conf:settings', namespace='CELERY')

app.conf.timezone = 'Europe/London'
app.conf.beat_scheduler = 'DatabaseScheduler'
app.conf.result_backend = 'django-db'

app.conf.beat_schedule = {
    # Calls celery_add(4, 5) every 30 seconds
    'celery_add_test_30_secs': {
        'task': 'lendengine.tasks.celery_add',
        'schedule': 30.0,
        'options': {'queue': 'test2Q'},
        'args': (4, 5),
        'enabled': True,
    },
    # test a scheduled run:
    'cron_test': {
        'task': 'lendengine.tasks.testcrontask',
        'schedule': crontab(),
        'options': {'queue': 'testcronQ'},
        'enabled': True,
    },
}

tasks.py:

from web.celery import app as app2

@app2.task(queue='testcronQ', time_limit=7200)
def testcrontask(on_day=15, on_month=5):
    return on_day + on_month

@app2.task(queue='test2Q')
def celery_add(x, y):
    # logger.info('running tasks')
    return x + y + 5

Commands:

celery -A web beat -l debug -S django
celery -A web worker -l debug -Q testcronQ -n worker1@%%h -Ofair
celery -A web worker -l debug -Q test2Q -n worker1@%%h -Ofair
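
One detail worth checking in the config above (an observation, not a confirmed fix): 'DatabaseScheduler' is not a full import path. When the -S django shortcut isn't used, the documented form of the setting is the dotted path:

app.conf.beat_scheduler = 'django_celery_beat.schedulers:DatabaseScheduler'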

crontab schedule with specific time not running

When I set a specific time, like 1:04 or 23:11, as the hour and minute and all other fields are *, the scheduler gets updated but doesn't send the task to the celery worker.
But if I set the hour and minute as */{any number}, it runs well. The day_of_week parameter also works. Just the specific-time cron periodic tasks are not getting scheduled.
Using Django 1.11, celery master and django-celery-beat master.
Any solution would be very much appreciated.
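
A hedged diagnostic, based only on the symptom above: */N fields fire at fixed intervals regardless of clock offset, while specific hour/minute fields are sensitive to a mismatch between Django's and Celery's timezones, which would produce exactly this behavior. Something to verify in settings.py (values illustrative):

TIME_ZONE = 'Asia/Kolkata'    # Django's clock
CELERY_TIMEZONE = TIME_ZONE   # keep celery beat's clock aligned with it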

Cannot import name python_2_unicode_compatible

python 2.7, celery 3.1.23.
I have an error during execution of this command:

python manage.py migrate

from celery.five import python_2_unicode_compatible
ImportError: cannot import name python_2_unicode_compatible

The name field of PeriodicTask does not support Chinese characters

celery beat v4.0.0 (latentcall) is starting.
LocalTime -> 2016-11-18 20:30:01
Configuration ->
. broker -> redis://localhost:6379/0
. loader -> celery.loaders.app.AppLoader
. scheduler -> django_celery_beat.schedulers.DatabaseScheduler
. logfile -> [stderr]@%DEBUG
. maxinterval -> 5.00 seconds (5s)

[2016-11-18 20:30:01,155: DEBUG/MainProcess] DatabaseScheduler: initial read
[2016-11-18 20:30:01,155: INFO/MainProcess] Writing entries...
[2016-11-18 20:30:01,156: DEBUG/MainProcess] DatabaseScheduler: Fetching database schedule
[2016-11-18 20:30:01,163: WARNING/MainProcess] Traceback (most recent call last):
[2016-11-18 20:30:01,163: WARNING/MainProcess] File "/home/david/PycharmProjects/david_blog/env/bin/celery", line 11, in
[2016-11-18 20:30:01,164: WARNING/MainProcess] sys.exit(main())
[2016-11-18 20:30:01,164: WARNING/MainProcess] File "/home/david/PycharmProjects/david_blog/env/lib/python2.7/site-packages/celery/__main__.py", line 14, in main
[2016-11-18 20:30:01,165: WARNING/MainProcess] _main()
[2016-11-18 20:30:01,165: WARNING/MainProcess] File "/home/david/PycharmProjects/david_blog/env/lib/python2.7/site-packages/celery/bin/celery.py", line 326, in main
[2016-11-18 20:30:01,166: WARNING/MainProcess] cmd.execute_from_commandline(argv)
[2016-11-18 20:30:01,166: WARNING/MainProcess] File "/home/david/PycharmProjects/david_blog/env/lib/python2.7/site-packages/celery/bin/celery.py", line 488, in execute_from_commandline
[2016-11-18 20:30:01,166: WARNING/MainProcess] super(CeleryCommand, self).execute_from_commandline(argv)))
[2016-11-18 20:30:01,167: WARNING/MainProcess] File "/home/david/PycharmProjects/david_blog/env/lib/python2.7/site-packages/celery/bin/base.py", line 278, in execute_from_commandline
[2016-11-18 20:30:01,171: WARNING/MainProcess] return self.handle_argv(self.prog_name, argv[1:])
[2016-11-18 20:30:01,171: WARNING/MainProcess] File "/home/david/PycharmProjects/david_blog/env/lib/python2.7/site-packages/celery/bin/celery.py", line 480, in handle_argv
[2016-11-18 20:30:01,176: WARNING/MainProcess] return self.execute(command, argv)
[2016-11-18 20:30:01,176: WARNING/MainProcess] File "/home/david/PycharmProjects/david_blog/env/lib/python2.7/site-packages/celery/bin/celery.py", line 412, in execute
[2016-11-18 20:30:01,177: WARNING/MainProcess] ).run_from_argv(self.prog_name, argv[1:], command=argv[0])
[2016-11-18 20:30:01,177: WARNING/MainProcess] File "/home/david/PycharmProjects/david_blog/env/lib/python2.7/site-packages/celery/bin/base.py", line 282, in run_from_argv
[2016-11-18 20:30:01,177: WARNING/MainProcess] sys.argv if argv is None else argv, command)
[2016-11-18 20:30:01,177: WARNING/MainProcess] File "/home/david/PycharmProjects/david_blog/env/lib/python2.7/site-packages/celery/bin/base.py", line 365, in handle_argv
[2016-11-18 20:30:01,177: WARNING/MainProcess] return self(*args, **options)
[2016-11-18 20:30:01,178: WARNING/MainProcess] File "/home/david/PycharmProjects/david_blog/env/lib/python2.7/site-packages/celery/bin/base.py", line 241, in __call__
[2016-11-18 20:30:01,178: WARNING/MainProcess] ret = self.run(*args, **kwargs)
[2016-11-18 20:30:01,178: WARNING/MainProcess] File "/home/david/PycharmProjects/david_blog/env/lib/python2.7/site-packages/celery/bin/beat.py", line 107, in run
[2016-11-18 20:30:01,179: WARNING/MainProcess] return beat().run()
[2016-11-18 20:30:01,179: WARNING/MainProcess] File "/home/david/PycharmProjects/david_blog/env/lib/python2.7/site-packages/celery/apps/beat.py", line 79, in run
[2016-11-18 20:30:01,180: WARNING/MainProcess] self.start_scheduler()
[2016-11-18 20:30:01,181: WARNING/MainProcess] File "/home/david/PycharmProjects/david_blog/env/lib/python2.7/site-packages/celery/apps/beat.py", line 107, in start_scheduler
[2016-11-18 20:30:01,182: WARNING/MainProcess] service.start()
[2016-11-18 20:30:01,182: WARNING/MainProcess] File "/home/david/PycharmProjects/david_blog/env/lib/python2.7/site-packages/celery/beat.py", line 528, in start
[2016-11-18 20:30:01,186: WARNING/MainProcess] humanize_seconds(self.scheduler.max_interval))
[2016-11-18 20:30:01,186: WARNING/MainProcess] File "/home/david/PycharmProjects/david_blog/env/lib/python2.7/site-packages/kombu/utils/objects.py", line 44, in __get__
[2016-11-18 20:30:01,187: WARNING/MainProcess] value = obj.__dict__[self.__name__] = self.__get(obj)
[2016-11-18 20:30:01,187: WARNING/MainProcess] File "/home/david/PycharmProjects/david_blog/env/lib/python2.7/site-packages/celery/beat.py", line 572, in scheduler
[2016-11-18 20:30:01,187: WARNING/MainProcess] return self.get_scheduler()
[2016-11-18 20:30:01,187: WARNING/MainProcess] File "/home/david/PycharmProjects/david_blog/env/lib/python2.7/site-packages/celery/beat.py", line 567, in get_scheduler
[2016-11-18 20:30:01,188: WARNING/MainProcess] lazy=lazy,
[2016-11-18 20:30:01,188: WARNING/MainProcess] File "/home/david/PycharmProjects/david_blog/env/lib/python2.7/site-packages/django_celery_beat/schedulers.py", line 179, in __init__
[2016-11-18 20:30:01,188: WARNING/MainProcess] Scheduler.__init__(self, *args, **kwargs)
[2016-11-18 20:30:01,189: WARNING/MainProcess] File "/home/david/PycharmProjects/david_blog/env/lib/python2.7/site-packages/celery/beat.py", line 204, in __init__
[2016-11-18 20:30:01,189: WARNING/MainProcess] self.setup_schedule()
[2016-11-18 20:30:01,189: WARNING/MainProcess] File "/home/david/PycharmProjects/david_blog/env/lib/python2.7/site-packages/django_celery_beat/schedulers.py", line 187, in setup_schedule
[2016-11-18 20:30:01,189: WARNING/MainProcess] self.install_default_entries(self.schedule)
[2016-11-18 20:30:01,189: WARNING/MainProcess] File "/home/david/PycharmProjects/david_blog/env/lib/python2.7/site-packages/django_celery_beat/schedulers.py", line 283, in schedule
[2016-11-18 20:30:01,189: WARNING/MainProcess] repr(entry) for entry in values(self._schedule)),
[2016-11-18 20:30:01,189: WARNING/MainProcess] File "/home/david/PycharmProjects/david_blog/env/lib/python2.7/site-packages/django_celery_beat/schedulers.py", line 283, in <genexpr>
[2016-11-18 20:30:01,189: WARNING/MainProcess] repr(entry) for entry in values(self._schedule)),
[2016-11-18 20:30:01,189: WARNING/MainProcess] File "/home/david/PycharmProjects/david_blog/env/lib/python2.7/site-packages/django_celery_beat/schedulers.py", line 162, in __repr__
[2016-11-18 20:30:01,190: WARNING/MainProcess] safe_repr(self.kwargs), self.schedule,
[2016-11-18 20:30:01,190: WARNING/MainProcess] UnicodeDecodeError
[2016-11-18 20:30:01,190: WARNING/MainProcess] :
[2016-11-18 20:30:01,190: WARNING/MainProcess] 'ascii' codec can't decode byte 0xe6 in position 0: ordinal not in range(128)
[2016-11-18 20:30:01,190: INFO/MainProcess] Writing entries...

Crontab string representation does not match UNIX crontab expression

The CrontabSchedule.__str__ representation has two values flipped from how they appear in the conventional UNIX crontab expression format.

    def __str__(self):
        return '{0} {1} {2} {3} {4} (m/h/d/dM/MY)'.format(
            cronexp(self.minute),
            cronexp(self.hour),
            cronexp(self.day_of_week),
            cronexp(self.day_of_month),
            cronexp(self.month_of_year),
        )

Beat currently outputs values like minute hour day_of_week day_of_month month_of_year, whereas the UNIX convention is minute hour day_of_month month_of_year day_of_week.

https://en.wikipedia.org/wiki/Cron#CRON_expression


In my own use case, using the https://github.com/Salamek/cron-descriptor library to get a friendly output like "At 04:11 AM", I had to flip the elements like so:

    @property
    def crontab_pretty(self):
        if self.schedule:  # instance of CrontabSchedule class
            # celery cron is like 'm h dw dM MY'
            celery_cron_expression = str(self.schedule).split('(')[0].strip()
            elements = celery_cron_expression.split(' ')
            elements[2], elements[3], elements[4] = elements[3], elements[4], elements[2]
            # unix cron is like 'm h dM MY dw'
            unix_cron_expression = ' '.join(elements)
            return cron_descriptor.get_description(unix_cron_expression)
        else:
            return None

We should at least match the UNIX crontab expression if we are going to use the */15 * * * * syntax -- although personally I would prefer just using the cron_descriptor.get_description function to provide a more friendly string. (Or both?)

ScheduledTask new feature request

I think it would be a nice new feature to have a class that can be scheduled like a PeriodicTask, but with an automatic "delete" after that run.

Currently I use periodic tasks with crontabs to schedule events, and I must "manually" clean up the created tasks to ensure that next year the same task won't run again.

Imagine the use case of a one-time future reminder event to send an email: we want to remind at that time only, and not again the next year.

If there is a better way to do this please teach me :)
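
For reference, later releases of django-celery-beat added exactly this: a one_off flag on PeriodicTask, plus a ClockedSchedule that fires at a single moment. A minimal sketch, assuming a version that ships both (the task name and path are hypothetical):

    from datetime import timedelta

    from django.utils import timezone
    from django_celery_beat.models import ClockedSchedule, PeriodicTask

    # Fire once, 30 days from now; one_off disables the task after its
    # first run instead of repeating it next year.
    clocked = ClockedSchedule.objects.create(
        clocked_time=timezone.now() + timedelta(days=30),
    )
    PeriodicTask.objects.create(
        clocked=clocked,
        one_off=True,
        name='send-reminder-email-once',       # hypothetical
        task='app.tasks.send_reminder_email',  # hypothetical
    )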

PyPy solar support

I notice that now that ephem has been added to the tests to support solar schedules, the PyPy test environments are failing, because PyPy does not support the C backend for PyEphem and so the dependency cannot be installed.

How should this be handled? I think we need to exclude solar from the PyPy tests and make a note in the documentation about PyPy support. However, this sort of decision is above my pay grade.

UnicodeDecodeError: 'utf8' codec can't decode byte 0xd3 in position 28

celery==4.0.0
Django==1.10.3
django-celery-beat==1.0.1
django-celery-results==1.0.1

===========================================

CELERY TASK

CELERY_BROKER_URL = 'redis://:[email protected]:6379/0'
CELERY_RESULT_BACKEND = 'redis://:[email protected]:6379/0'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_BEAT_SCHEDULER = 'django_celery_beat.schedulers.DatabaseScheduler'

"celery -A app worker --loglevel=INFO -P gevent"

 -------------- celery@PC0018 v4.0.2 (latentcall)
---- **** -----
--- * ***  * -- Windows-8-6.2.9200 2017-02-13 11:39:14
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app:         richtrading-system.trading:0x3e0e0f0
- ** ---------- .> transport:   amqp://guest:**@localhost:5672//
- ** ---------- .> results:     redis://:**@xxxx/0
- *** --- * --- .> concurrency: 4 (gevent)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery

[tasks]

the transport errors and reports:

[2017-02-13 11:31:43,868: WARNING/MainProcess] Traceback (most recent call last):
[2017-02-13 11:31:43,869: WARNING/MainProcess] File "c:\python27\lib\logging\__init__.py", line 859, in emit
[2017-02-13 11:31:43,875: WARNING/MainProcess] msg = self.format(record)
[2017-02-13 11:31:43,875: WARNING/MainProcess] File "c:\python27\lib\logging\__init__.py", line 732, in format
[2017-02-13 11:31:43,878: WARNING/MainProcess] return fmt.format(record)
[2017-02-13 11:31:43,878: WARNING/MainProcess] File "c:\python27\lib\site-packages\celery\utils\log.py", line 153, in format
[2017-02-13 11:31:43,881: WARNING/MainProcess] msg = logging.Formatter.format(self, record)
[2017-02-13 11:31:43,885: WARNING/MainProcess] File "c:\python27\lib\logging\__init__.py", line 471, in format
[2017-02-13 11:31:43,892: WARNING/MainProcess] record.message = record.getMessage()
[2017-02-13 11:31:43,894: WARNING/MainProcess] File "c:\python27\lib\logging\__init__.py", line 335, in getMessage
[2017-02-13 11:31:43,894: WARNING/MainProcess] msg = msg % self.args
[2017-02-13 11:31:43,897: WARNING/MainProcess] UnicodeDecodeError: 'utf8' codec can't decode byte 0xd3 in position 28: invalid continuation byte
[2017-02-13 11:31:43,898: WARNING/MainProcess] Logged from file consumer.py, line 425
[2017-02-13 11:31:46,917: WARNING/MainProcess] Traceback (most recent call last):
[2017-02-13 11:31:46,918: WARNING/MainProcess] File "c:\python27\lib\logging\__init__.py", line 859, in emit
[2017-02-13 11:31:46,921: WARNING/MainProcess] msg = self.format(record)
[2017-02-13 11:31:46,921: WARNING/MainProcess] File "c:\python27\lib\logging\__init__.py", line 732, in format
[2017-02-13 11:31:46,924: WARNING/MainProcess] return fmt.format(record)
[2017-02-13 11:31:46,924: WARNING/MainProcess] File "c:\python27\lib\site-packages\celery\utils\log.py", line 153, in format
[2017-02-13 11:31:46,927: WARNING/MainProcess] msg = logging.Formatter.format(self, record)
[2017-02-13 11:31:46,927: WARNING/MainProcess] File "c:\python27\lib\logging\__init__.py", line 471, in format
[2017-02-13 11:31:46,928: WARNING/MainProcess] record.message = record.getMessage()
[2017-02-13 11:31:46,930: WARNING/MainProcess] File "c:\python27\lib\logging\__init__.py", line 335, in getMessage
[2017-02-13 11:31:46,931: WARNING/MainProcess] msg = msg % self.args
[2017-02-13 11:31:46,933: WARNING/MainProcess] UnicodeDecodeError: 'utf8' codec can't decode byte 0xd3 in position 28: invalid continuation byte
[2017-02-13 11:31:46,934: WARNING/MainProcess] Logged from file consumer.py, line 425

PeriodicTask doesn't work after crontab changed in admin?

I followed the instructions after changing my task's schedule:

# in `django-extensions`'s shell
from django_celery_beat.models import PeriodicTask, PeriodicTasks
PeriodicTask.objects.all().update(last_run_at=None)
for task in PeriodicTask.objects.all():
    PeriodicTasks.changed(task)

Since PeriodicTasks.changed() requires one argument (it raises an error otherwise!), I executed the code above.

And I waited for the task to run, but it didn't...

So I just turned django-celery-beat off and on again, and then it worked.

What's the problem?

crontab does not use the CELERY_TIME_ZONE correctly

Although the CELERY_TIME_ZONE is used to display last_run_time, the actual run time is the time stored in the database converted to UTC, rather than the time in the local time zone.

The machine's time zone is already set to the local time zone, if that matters.

Cannot use local time for cron schedules and UTC for solar schedules

To make Solar scheduling work I had to set CELERY_TIMEZONE = 'UTC' - however this causes Crontab schedules to be offset by my timezone .

That means a crontab configured trigger in 1 hour's time does not end up triggering for 11 hours (because I am in +10 Australia/Brisbane).

Should django-celery-beat create crontab schedules with CrontabSchedule.tz = settings.TIME_ZONE instead to override CELERY_TIMEZONE?
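
Later versions of django-celery-beat grew a per-schedule timezone field on CrontabSchedule, which sidesteps this conflict: CELERY_TIMEZONE can stay 'UTC' for solar schedules while each cron schedule carries its own zone. A sketch, assuming a version with that field:

    from django_celery_beat.models import CrontabSchedule

    # The cron fields are interpreted in this schedule's own timezone,
    # independent of the global CELERY_TIMEZONE.
    schedule, _ = CrontabSchedule.objects.get_or_create(
        minute='0',
        hour='3',
        day_of_week='*',
        day_of_month='*',
        month_of_year='*',
        timezone='Australia/Brisbane',  # illustrative value from this report
    )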

celery.backend_cleanup never executing

Any schedule defined via the admin page of django-celery-beat must have the queue and exchange fields set explicitly; otherwise it won't be executed. So we need to change the built-in celery.backend_cleanup schedule to have these options. That's ok.
Because of bug #7 we restart celery-beat to enable changed schedules. This restart calls install_default_entries(), which overwrites the changes made before, so the task won't execute.
Adding something like

CELERY_BEAT_SCHEDULE = {
    'celery.backend_cleanup': {
        'options': {'expires': 12 * 3600, 'queue': 'default', 'exchange': 'default'},
    },
}

doesn't seem to work: the task is overwritten with hardcoded defaults on each celery-beat restart.

I'm using celery 4.0.2 and the master branch of django-celery-beat with the django-db result backend, if that matters. Other tasks work as expected, too. The only problem is backend_cleanup execution.

unicode characters in periodic task name raise exception

[2017-06-20 11:49:47,155: CRITICAL/MainProcess] beat raised exception <type 'exceptions.UnicodeDecodeError'>: UnicodeDecodeError('ascii', 'Sprawd\xc5\xba logi', 6, 7, 'ordinal not in range(128)')
Traceback (most recent call last):
  File "/.virtualenvs/im/local/lib/python2.7/site-packages/celery/apps/beat.py", line 107, in start_scheduler
    service.start()
  File "/.virtualenvs/im/local/lib/python2.7/site-packages/celery/beat.py", line 528, in start
    humanize_seconds(self.scheduler.max_interval))
  File "/.virtualenvs/im/local/lib/python2.7/site-packages/kombu/utils/objects.py", line 44, in __get__
    value = obj.__dict__[self.__name__] = self.__get(obj)
  File "/.virtualenvs/im/local/lib/python2.7/site-packages/celery/beat.py", line 572, in scheduler
    return self.get_scheduler()
  File "/.virtualenvs/im/local/lib/python2.7/site-packages/celery/beat.py", line 567, in get_scheduler
    lazy=lazy,
  File "/.virtualenvs/im/local/lib/python2.7/site-packages/django_celery_beat/schedulers.py", line 179, in __init__
    Scheduler.__init__(self, *args, **kwargs)
  File "/.virtualenvs/im/local/lib/python2.7/site-packages/celery/beat.py", line 204, in __init__
    self.setup_schedule()
  File "/.virtualenvs/im/local/lib/python2.7/site-packages/django_celery_beat/schedulers.py", line 187, in setup_schedule
    self.install_default_entries(self.schedule)
  File "/.virtualenvs/im/local/lib/python2.7/site-packages/django_celery_beat/schedulers.py", line 283, in schedule
    repr(entry) for entry in values(self._schedule)),
  File "/.virtualenvs/im/local/lib/python2.7/site-packages/django_celery_beat/schedulers.py", line 283, in <genexpr>
    repr(entry) for entry in values(self._schedule)),
  File "/.virtualenvs/im/local/lib/python2.7/site-packages/django_celery_beat/schedulers.py", line 162, in __repr__
    safe_repr(self.kwargs), self.schedule,
UnicodeDecodeError: 'ascii' codec can't decode byte 0xc5 in position 6: ordinal not in range(128)

beat_schedule_filename is created even if Django is selected.

Hi there,

It appears that the BEAT_SCHEDULE_FILENAME is still created even if django-celery-beat is used. It doesn't appear to be used (based on timestamp changing) but it is still created.

Low priority, but it makes it confusing to tell whether it's using the django database or not.

FYI - I'm launching this in my test env as follows.

celery -A celery_app worker --events --beat -S django -l DEBUG

This command is in alignment with the docs, and it makes sense that it creates the django db file, as the -S option defines the filename.

To verify that it's working I needed to set BEAT_SYNC_EVERY = 1 and then watch the Django database.

Code for resetting "last run time" not working

This code in the README:

from django_celery_beat import PeriodicTask, PeriodicTasks
PeriodicTask.objects.all().update(last_run_at=None)
PeriodicTasks.changed()

I get an import error for the first line, and if I fix that I get an error for the third line saying the changed() method requires an instance argument.

Should the code be like this?

from django_celery_beat.models import PeriodicTask, PeriodicTasks
PeriodicTask.objects.all().update(last_run_at=None)
[PeriodicTasks.changed(task) for task in PeriodicTask.objects.all()]

Total run Count is not being updated after a task run

Hello guys, I need your help with an issue (this could be an issue with django_celery_beat).
I'm going to use celery and django-celery-beat in a project with many async tasks.
In my personal tests I noticed that PeriodicTask.total_run_count is not being updated after every run. Is there a reason for that? Maybe I'm missing some concept.
I'm sending the periodic task ID to my task in order to verify whether it has already executed 100 times (for example), and I get several executions with the same total_run_count (not sure if this is good practice).

Task Code Example:
@celery_app.task
def print_run_count(*args, **kwargs):
    tks = PeriodicTask.objects.get(pk=kwargs['periodic_task'])
    print(tks.total_run_count)
Thanks

Question: Get reference to parent PeriodicTask within Celery task

Hi - a question more than an issue, which I couldn't see documented anywhere.

Is there any known way to get a reference to the PeriodicTask within each scheduled task, e.g. using Celery's current_task? My use case is that I have Celery tasks running both on a schedule and ad hoc, so I would like to be able to log the source schedule / originator of each task as it runs.

I've looked through the beat codebase to see whether I could fork and modify the code at the point where the scheduled task is sent, but I couldn't clearly identify where the tasks are actually added to the queue.

Any advice gratefully received, or just pointers to how I can continue to self-help.
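
One low-tech workaround, assuming you create the PeriodicTask entries yourself: store the schedule name in the task's own kwargs, so the worker can log its originator. All names below are hypothetical:

    import json

    from django_celery_beat.models import IntervalSchedule, PeriodicTask

    schedule, _ = IntervalSchedule.objects.get_or_create(
        every=10, period=IntervalSchedule.MINUTES,
    )
    PeriodicTask.objects.create(
        interval=schedule,
        name='sync-accounts',            # hypothetical
        task='app.tasks.sync_accounts',  # hypothetical
        # The task body can read kwargs['source_schedule'] and log it;
        # ad-hoc invocations simply omit the key.
        kwargs=json.dumps({'source_schedule': 'sync-accounts'}),
    )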

exited with code 73

hi, my dear friend, i have a question.

docker + django + django-celery-beat + django-celery worker

When I use django-admin to edit periodic tasks, django-celery-beat must restart to pick up the updated periodic tasks.

So I use shipyard (https://github.com/shipyard/shipyard) to restart the docker container.

Every time I restart the docker container, it exits with code 73.

After a long time of digging, I deleted celerybeat.pid and then it was ok.

When I use docker and I stop docker, why isn't celerybeat.pid deleted automatically?

my docker-compose.yml


version: '2'
services:
  nginx:
    build: dockerfile/nginx1
    restart: always
    volumes:
      - ./www/:/www/:ro
    ports:
      - "80:80"
      - "443:443"
    links:
      - "api1"

  rabbit:
    image: rabbitmq:3.6
    ports:
      - "5672:5672"
      - "15672:15672"

  api1:
    build: dockerfile/django_api1
    restart: always
    ports:
        - "8000:8000"
    volumes:
        - ./api1/:/api1
    command: sh -c 'cd /api1 && gunicorn --workers=5 --bind 0.0.0.0:8000 api1.wsgi'

  celery_worker:
    build: dockerfile/django_api1
    command: sh -c 'cd /api1 && celery -A api1 worker -l info'
    volumes_from:
      - api1
    depends_on:
      - api1
      - rabbit
    links:
      - rabbit

  celery_beat:
    build: dockerfile/django_api1
    command: sh -c 'cd /api1 && celery -A api1 beat -l info -S django'
    volumes_from:
      - api1
    depends_on:
      - api1
      - rabbit
    links:
      - rabbit

Power-off-period cronjob runs right after the first power-on-period cronjob ends, instead of right after celerybeat starts

anacron quotes:
When your computer is shut down (or the cron daemon is otherwise not running), cron jobs will not be started. If you have jobs that you would like to run after the fact during those times when the computer is shut down, use anacron.
I need to uncover today's product at 3 o'clock at night in production. Celerybeat ran well, like anacron, before 2016-12-04, but not after, on my computer. Here are the details:

I use supervisor for the daemonizing. Celery: v4.0.0, django_celery_beat: v1.0.1.
Cronjob 3'disclose is set as '0 3 * * *'.
Day 12-01 worked as expected; as soon as the computer started up and supervisor ran, task 3'disclose was scheduled like anacron:

[12-01 09:53:19,316] beat: Starting... (startup)
[12-01 09:53:19,507] Scheduler: Sending due task 3'disclose (homepage.tasks.disclose_today)
[12-02 01:23:30,357] poweroff

12-02 01:23:30 ~ 12-02 10:40:24 computer was powered off.
Day 12-02 worked as expected.

[12-02 10:40:24,685] beat: Starting... (startup)
[12-02 10:40:24,867] Scheduler: Sending due task 3'disclose (homepage.tasks.disclose_today)
[12-03 00:39:08,070] poweroff

12-03 00:39:08 ~ 12-03 11:00:29 computer was powered off.
Day 12-03 worked as expected.

[12-03 11:00:29,712] beat: Starting... (startup)
[12-03 11:00:29,892] Scheduler: Sending due task 3'disclose (homepage.tasks.disclose_today)
[12-04 02:02:48,290] poweroff

---------------------------- Here comes the weird thing ----------------------------
12-04 02:02:48 ~ 12-04 09:46:25 computer was powered off.
Day 12-04

[12-04 09:46:25,760] beat: Starting... (startup)
[12-04 12:00:00,022] Scheduler: Sending due task 12'greeting (homepage.tasks.greeting)
[12-04 12:01:00,005] Scheduler: Sending due task 12'greeting (homepage.tasks.greeting)
[12-04 12:...
[12-04 12:59:00,002] Scheduler: Sending due task 12'greeting (homepage.tasks.greeting)
[12-04 12:59:00,008] Scheduler: Sending due task 3'disclose (homepage.tasks.disclose_today)
[12-05 01:52:28,364] poweroff
Cronjob 12'greeting set as '* 12 * * *'. 
Note the time 09:46:25 and 12:59:00, as the title describes.

12-05 01:52:28 ~ 12-05 10:45:32 computer was powered off.
Day 12-05

[12-05 10:45:32,857] beat: Starting... (startup)
[12-05 12:00:00,025] Scheduler: Sending due task 12'greeting (homepage.tasks.greeting)
[12-05 12:00:00,043] Scheduler: Sending due task 3'disclose (homepage.tasks.disclose_today)
[12-06 03:00:00,004] Scheduler: Sending due task 3'disclose (homepage.tasks.disclose_today)
[12-06 03:00:00,012] Scheduler: Sending due task celery.backend_cleanup (celery.backend_cleanup)
[12-06 04:00:00,004] Scheduler: Sending due task celery.backend_cleanup (celery.backend_cleanup)
[12-06 04:03:00,922] poweroff
Cronjob 12'greeting set as '0 12 * * *'. 
Note the behavior of backend_cleanup at 03:00:00.  It behaves just like my task 3'disclose.

As far as I can remember, between day 12-03 and day 12-04 I changed the timezone for the celery app and added the cronjob 12'greeting for testing.
Why? And how can I get 3'disclose scheduled as soon as celery beat starts, like before 12-03?

django-celery-beat does not support nowfun in schedules

See http://docs.celeryproject.org/en/latest/reference/celery.schedules.html

Even when using CrontabSchedule.from_schedule() with a schedule that uses nowfun, django-celery-beat does not store nowfun or make use of it.

from datetime import datetime

import pytz
from django_celery_beat.models import CrontabSchedule
from celery.schedules import crontab

nowfun = lambda: datetime.now(pytz.timezone("US/Eastern"))
crontab_schedule = crontab(minute=minute,
    hour=1,
    day_of_month="*",
    month_of_year="*",
    day_of_week="*",
    nowfun=nowfun)
db_crontab_schedule = CrontabSchedule.from_schedule(crontab_schedule)
db_crontab_schedule.save()

The above does not end up saving or using nowfun.

Solar Schedules break Celery Beat.

I have the latest django-celery-beat (f014edc not from PyPI) and Celery 4.0.2 and Solar Schedules crash celery beat with:

...
[2017-06-22 14:41:17,762: WARNING/MainProcess] File "/opt/myapp-env/lib/python3.5/site-packages/django_celery_beat/schedulers.py", line 102, in is_due
[2017-06-22 14:41:17,762: WARNING/MainProcess] return self.schedule.is_due(self.last_run_at)
[2017-06-22 14:41:17,762: WARNING/MainProcess] AttributeError
[2017-06-22 14:41:17,762: WARNING/MainProcess] :
[2017-06-22 14:41:17,762: WARNING/MainProcess] 'NoneType' object has no attribute 'is_due'

Using PDB shows that self.schedule is None even though it is certainly making it out of the database, because self.model.solar is <SolarSchedule: solar_noon (-20.0, 150.0)>.

Digging through, I found that django_celery_beat.models.PeriodicTask.schedule does not return the solar schedule. So I added it like this:

@property
def schedule(self):
    if self.interval:
        return self.interval.schedule
    if self.crontab:
        return self.crontab.schedule
    if self.solar:
        return self.solar.schedule

After hacking this in, I found that ephem was not installed as a dependency of celery or django-celery-beat. Then I ran into a bug in celery.schedules.solar.remaining_estimate(); after hacking around this, celery beat no longer crashed.

Todo: Add warnings about django-celery-beat instability/lack of production readiness to celery documentation

As much as I support, understand, and use all things open source... I cannot endorse pre-alpha code not being called out as such.

With Celery 4.0.2 out as a stable release, the least the documentation could do is call out the fact that django-celery-beat is nowhere near ready for production use. So I'm creating this ticket so others may be warned, and to get support for updating the documentation accordingly.

I have already been bitten by over half a dozen issues here. It has come to the point where I am reading the issues to find out if I am about to be bitten by more 👎 👎 👎 !

I have loved celery and djcelery otherwise, but this is such a poor replacement, and that too as an official contrib package.

Admin filters

I think the Django celery beat app could do with a few admin improvements; the admin pages are quite lacking, to be honest:

  • Filters would be nice for a start: how can I quickly filter on SUCCESS/FAILED task results? (At the moment I have to do this through sorting.)
  • A date filter might be useful for people that keep results around for a long time, but maybe not hugely important.
  • Maybe searching.

'bytes' object has no attribute 'encode' when running with python 3.4

when starting celery like this: celery worker -E -B -Q celery -A myapp -S django
I get:

Traceback (most recent call last):
  File "/usr/local/bin/celery", line 11, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.4/dist-packages/celery/__main__.py", line 14, in main
    _main()
  File "/usr/local/lib/python3.4/dist-packages/celery/bin/celery.py", line 326, in main
    cmd.execute_from_commandline(argv)
  File "/usr/local/lib/python3.4/dist-packages/celery/bin/celery.py", line 488, in execute_from_commandline
    super(CeleryCommand, self).execute_from_commandline(argv)))
  File "/usr/local/lib/python3.4/dist-packages/celery/bin/base.py", line 278, in execute_from_commandline
    return self.handle_argv(self.prog_name, argv[1:])
  File "/usr/local/lib/python3.4/dist-packages/celery/bin/celery.py", line 480, in handle_argv
    return self.execute(command, argv)
  File "/usr/local/lib/python3.4/dist-packages/celery/bin/celery.py", line 412, in execute
    ).run_from_argv(self.prog_name, argv[1:], command=argv[0])
  File "/usr/local/lib/python3.4/dist-packages/celery/bin/worker.py", line 221, in run_from_argv
    return self(*args, **options)
  File "/usr/local/lib/python3.4/dist-packages/celery/bin/base.py", line 241, in __call__
    ret = self.run(*args, **kwargs)
  File "/usr/local/lib/python3.4/dist-packages/celery/bin/worker.py", line 255, in run
    **kwargs)
  File "/usr/local/lib/python3.4/dist-packages/celery/worker/worker.py", line 99, in __init__
    self.setup_instance(**self.prepare_args(**kwargs))
  File "/usr/local/lib/python3.4/dist-packages/celery/worker/worker.py", line 139, in setup_instance
    self.blueprint.apply(self, **kwargs)
  File "/usr/local/lib/python3.4/dist-packages/celery/bootsteps.py", line 214, in apply
    step.include(parent)
  File "/usr/local/lib/python3.4/dist-packages/celery/bootsteps.py", line 343, in include
    return self._should_include(parent)[0]
  File "/usr/local/lib/python3.4/dist-packages/celery/bootsteps.py", line 339, in _should_include
    return True, self.create(parent)
  File "/usr/local/lib/python3.4/dist-packages/celery/worker/components.py", line 214, in create
    w._persistence = w.state.Persistent(w.state, w.statedb, w.app.clock)
  File "/usr/local/lib/python3.4/dist-packages/celery/worker/state.py", line 189, in __init__
    self.merge()
  File "/usr/local/lib/python3.4/dist-packages/celery/worker/state.py", line 197, in merge
    self._merge_with(self.db)
  File "/usr/local/lib/python3.4/dist-packages/celery/worker/state.py", line 213, in _merge_with
    self._merge_revoked(d)
  File "/usr/local/lib/python3.4/dist-packages/celery/worker/state.py", line 232, in _merge_revoked
    self._merge_revoked_v3(d[b'zrevoked'])
  File "/usr/lib/python3.4/shelve.py", line 113, in __getitem__
    f = BytesIO(self.dict[key.encode(self.keyencoding)])
AttributeError: 'bytes' object has no attribute 'encode'

Without the -S django option it works fine.

Use CELERY_ROUTES setting when routing fields are empty

Hi,

I have been struggling with configuring custom routing within my django project. I have tried several different configurations involving the CELERY_ROUTES and CELERY_QUEUES settings, before realizing that django-celery-beat is probably ignoring these settings when finally scheduling tasks to celery.

I was assuming that leaving the routing information blank when creating periodic tasks in the admin would fall back to CELERY_ROUTES for properly routing the tasks.

Is there any plan to consider this?

Thanks

Scheduled tasks don't get executed

So I have been trying to set up celery for quite a few hours, and after some time I sort of managed to get it working. The part where I am currently stuck is that the scheduled tasks from beat aren't getting executed by celery workers. It seems that they aren't actually being sent by beat? The PeriodicTask objects are getting created, but not executed.

Celery starts up fine by using:

celery -A proj worker --loglevel=DEBUG --concurrency=12 -B -S django -E -n worker1@pw --without-gossip --without-mingle --without-heartbeat
2017-02-21T23:00:32.730193+00:00 app[worker1.1]: [2017-02-21 23:00:32,728: INFO/MainProcess] Connected to amqp://**:**@antelope.rmq.cloudamqp.com:5672/
2017-02-21T23:00:32.740966+00:00 app[worker1.1]: [2017-02-21 23:00:32,740: INFO/Beat] beat: Starting...
2017-02-21T23:00:32.765330+00:00 app[worker1.1]: [2017-02-21 23:00:32,765: INFO/MainProcess] worker1@pw ready.

Tasks get created fine

    def start_collecting(self, interval=1):
        if PeriodicTask.objects.filter(task='twitch_stats.tasks.get_all_stats').count() == 0:
            schedule, created = IntervalSchedule.objects.get_or_create(
                every=interval,
                period=IntervalSchedule.MINUTES,
            )
            PeriodicTask.objects.create(
                interval=schedule,
                name='Collecting statistics',
                task='twitch_stats.tasks.get_all_stats',
                enabled=True,
            )
            print(PeriodicTask.objects.count())
2017-02-21T23:25:08.867896+00:00 app[web.1]: 1

But beat doesn't recognize it:

2017-02-21T23:25:36.554762+00:00 app[worker1.1]: [2017-02-21 23:25:36,554: DEBUG/Beat] beat: Waking up in 5.00 seconds.
2017-02-21T23:25:41.557634+00:00 app[worker1.1]: [2017-02-21 23:25:41,557: DEBUG/Beat] beat: Waking up in 5.00 seconds.

TypeError: issubclass() arg 1 must be a class

Hello, I get the following error while opening the "Periodic tasks" changelist view with Django 1.8.16.

Internal Server Error: /admin/django_celery_beat/periodictask/
Traceback (most recent call last):
  File "/var/local/pyweb/virtualenv/dnsalloc/lib/python2.7/site-packages/django/core/handlers/base.py", line 132, in get_response
    response = wrapped_callback(request, *callback_args, **callback_kwargs)
  File "/var/local/pyweb/virtualenv/dnsalloc/lib/python2.7/site-packages/django/contrib/admin/options.py", line 618, in wrapper
    return self.admin_site.admin_view(view)(*args, **kwargs)
  File "/var/local/pyweb/virtualenv/dnsalloc/lib/python2.7/site-packages/django/utils/decorators.py", line 110, in _wrapped_view
    response = view_func(request, *args, **kwargs)
  File "/var/local/pyweb/virtualenv/dnsalloc/lib/python2.7/site-packages/django/views/decorators/cache.py", line 57, in _wrapped_view_func
    response = view_func(request, *args, **kwargs)
  File "/var/local/pyweb/virtualenv/dnsalloc/lib/python2.7/site-packages/django/contrib/admin/sites.py", line 233, in inner
    return view(request, *args, **kwargs)
  File "/var/local/pyweb/virtualenv/dnsalloc/lib/python2.7/site-packages/django_celery_beat/admin.py", line 136, in changelist_view
    extra_context['wrong_scheduler'] = not is_database_scheduler(scheduler)
  File "/var/local/pyweb/virtualenv/dnsalloc/lib/python2.7/site-packages/django_celery_beat/utils.py", line 41, in is_database_scheduler
    return issubclass(symbol_by_name(scheduler), DatabaseScheduler)
TypeError: issubclass() arg 1 must be a class

Please take a look, thanks.

django_celery_beat executes crontabscheduled task only once

I am trying to integrate django_celery_beat into a project. When I set up a task using the django admin panel, the entries in the database are reflected correctly, and when I start the project the task gets executed successfully. But on every successive attempt, sending the task either fails or the message is not sent to the broker.

My setup :

django_celery_beat : 1.0.1
celery: 4.0.1

Setting in celeryconfig:

CELERY_ENABLE_UTC = True
BROKER_URL = 'amqp://guest:guest@localhost:5672//'
CELERYBEAT_SCHEDULER = 'django_celery_beat.schedulers.DatabaseScheduler'

When I start celery I get this exception:

Traceback (most recent call last):
 File "/usr/local/lib/python3.4/dist-packages/celery/worker/pidbox.py", line 42, in on_message
   self.node.handle_message(body, message)
 File "/usr/local/lib/python3.4/dist-packages/kombu/pidbox.py", line 129, in handle_message
   return self.dispatch(**body)
 File "/usr/local/lib/python3.4/dist-packages/kombu/pidbox.py", line 112, in dispatch
   ticket=ticket)
 File "/usr/local/lib/python3.4/dist-packages/kombu/pidbox.py", line 135, in reply
   serializer=self.mailbox.serializer)
 File "/usr/local/lib/python3.4/dist-packages/kombu/pidbox.py", line 265, in _publish_reply
   **opts
 File "/usr/local/lib/python3.4/dist-packages/kombu/messaging.py", line 181, in publish
   exchange_name, declare,
 File "/usr/local/lib/python3.4/dist-packages/kombu/messaging.py", line 203, in _publish
   mandatory=mandatory, immediate=immediate,
 File "/usr/local/lib/python3.4/dist-packages/amqp/channel.py", line 1748, in _basic_publish
   (0, exchange, routing_key, mandatory, immediate), msg
 File "/usr/local/lib/python3.4/dist-packages/amqp/abstract_channel.py", line 64, in send_method
   conn.frame_writer(1, self.channel_id, sig, args, content)
 File "/usr/local/lib/python3.4/dist-packages/amqp/method_framing.py", line 174, in write_frame
   write(view[:offset])
 File "/usr/local/lib/python3.4/dist-packages/amqp/transport.py", line 269, in write
   self._write(s)
BrokenPipeError: [Errno 32] Broken pipe

Logs in beat.stderr.log:

[2017-06-12 05:40:00,002: INFO/MainProcess] Scheduler: Sending due task pooja_task (app.tasks.send_test_mail)
[2017-06-12 05:41:00,002: INFO/MainProcess] Scheduler: Sending due task pooja_task (app.tasks.send_test_mail)
[2017-06-12 05:42:00,002: INFO/MainProcess] Scheduler: Sending due task pooja_task (app.tasks.send_test_mail)

However, of all the tasks sent above, celery picks up and executes only the first one, whenever I start my server. All other messages are ignored.

CrontabSchedule ignore TIME_ZONE setting

I just upgraded my project from 'django-celery' to 'django-celery-beat'.

In my django project TIME_ZONE = 'Europe/Moscow'.
I configured a CrontabSchedule as "30 9 * * * (m/h/d/dM/MY)", but it was executed only at 12:30.
Why is the timezone ignored? last_run_at was None; the task was launched for the first time.

Release new version

Hi, there have been a lot of new commits since November 2016.
Can you please release a new version?

Thanks!

Solar schedules

Are there any plans for adding the solar schedules? Can I work on a pull request for this feature?

scheduler won't launch tasks if DatabaseScheduler: Schedule changed.

I have a Model that generates a PeriodicTask with a CrontabSchedule, and sometimes there are 2 or 3 of those tasks at the same time. Every time a new Model instance is saved or edited, the schedule changes, so the DatabaseScheduler resyncs everything. If this happens near the full-minute mark, when some new tasks are due to be launched, they never are, as the process is busy rescheduling and won't launch those scheduled tasks.

you can recreate this by:

setting up a bunch of tasks every minute

from django_celery_beat.models import CrontabSchedule, PeriodicTask
crontab = CrontabSchedule.objects.create(minute='*')
tasks = ['task1', 'task2', 'task3', 'task4', 'task5', 'task6', 'task7', 'task8', 'task9', ]
for task in tasks:
    PeriodicTask.objects.create(name=task, crontab=crontab)

generating a new task in second 59 of any minute:
PeriodicTask.objects.create(name='interruptor1', crontab=crontab)

you will see the beat log report a schedule change, and it won't launch the scheduled tasks.

how to update the last run time

I did as your demo code shows to update last_run_at, but there is something wrong in it.
from django_celery_beat import PeriodicTask, PeriodicTasks is missing 'models'; after I changed it to 'from django_celery_beat.models import PeriodicTask, PeriodicTasks' it worked.
And PeriodicTasks.changed() raises this error:

  File "<console>", line 1, in <module>
TypeError: changed() takes exactly 2 arguments (1 given)

Now I have changed to the default timezone and it runs successfully, but how can I update the last run time using the python shell?
Thanks.

DatabaseScheduler sync problem

In DatabaseScheduler in sync function:

    def sync(self):
        info('Writing entries...')
        _tried = set()
        try:
            with transaction.atomic():
                while self._dirty:
                    try:
                        name = self._dirty.pop()
                        _tried.add(name)
                        self.schedule[name].save()
                    except (KeyError, ObjectDoesNotExist):
                        pass
        except DatabaseError as exc:
            # retry later
            self._dirty |= _tried
            logger.exception('Database error while sync: %r', exc)

    @property
    def schedule(self):
        update = False
        if not self._initial_read:
            debug('DatabaseScheduler: initial read')
            update = True
            self._initial_read = True
        elif self.schedule_changed():
            info('DatabaseScheduler: Schedule changed.')
            update = True

        if update:
            self.sync()
            self._schedule = self.all_as_schedule()
            # the schedule changed, invalidate the heap in Scheduler.tick
            self._heap = None
            if logger.isEnabledFor(logging.DEBUG):
                debug('Current schedule:\n%s', '\n'.join(
                    repr(entry) for entry in values(self._schedule)),
                )
        return self._schedule

The self.schedule property is never accessed if self._dirty is empty.
That means celery beat doesn't check whether there are any changes in the database.

I solved it by accessing self.schedule at the end of the sync function.
I don't think it's the best solution; at the very least it's quite confusing.
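
A sketch of that workaround as a scheduler subclass, rather than patching the library in place (built on the internals quoted above; the class name is hypothetical):

    from django_celery_beat.schedulers import DatabaseScheduler

    class EagerSyncScheduler(DatabaseScheduler):
        """Check for database changes even when no entries are dirty."""

        def sync(self):
            super(EagerSyncScheduler, self).sync()
            # Accessing the property triggers schedule_changed() and,
            # if anything changed, re-reads the schedule from the database.
            self.schedule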

shared_task defined in custom apps not showing up in registered tasks in periodic tasks admin page

The tasks defined in the proj/proj/celery.py file directly are registering successfully.

However, shared tasks defined in the custom django apps are not registering to the database and not showing up in the admin at /admin/django_celery_beat/periodictask/.

proj/proj/celery.py

# .. imports.. #
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

celery_app = Celery('proj')
celery_app.config_from_object('django.conf:settings', namespace='CELERY')
celery_app.autodiscover_tasks()

@celery_app.task(bind=True)
def hello(self):
    print "This task is registering successfully"

proj/app/tasks.py

from celery import shared_task

@shared_task
def app_hello():
    print "This task is not registering"

Add `on_delete` to ForeignKeys

When attempting to use celery-beat with the current Django master branch, I get the following error messages:

TypeError: __init__() missing 1 required positional argument: 'on_delete'
django_celery_beat/models.py:176: RemovedInDjango20Warning: on_delete will be a required arg for ForeignKey in Django 2.0. Set it to models.CASCADE on models and in existing migrations if you want to maintain the current default behavior. See https://docs.djangoproject.com/en/1.11/ref/models/fields/#django.db.models.ForeignKey.on_delete
  null=True, blank=True, verbose_name=_('interval'),

This is because of the deprecation mentioned here: https://docs.djangoproject.com/en/1.11/ref/models/fields/#django.db.models.ForeignKey.on_delete

Deprecated since version 1.9:
on_delete will become a required argument in Django 2.0. In older versions it defaults to CASCADE.

This should be safe to set explicitly to CASCADE on all ForeignKeys, as this will maintain legacy behavior, and on_delete has been an available option since before 1.8, which is the oldest version currently supported by this package.
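
A minimal sketch of the proposed change, using the interval field from the warning above (surrounding fields elided):

    from django.db import models

    class PeriodicTask(models.Model):
        # Passing on_delete explicitly preserves the pre-Django-2.0
        # default (CASCADE) and silences the RemovedInDjango20Warning.
        interval = models.ForeignKey(
            'IntervalSchedule',
            on_delete=models.CASCADE,
            null=True, blank=True, verbose_name='interval',
        )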
