
nextdoor / ndscheduler


A flexible Python library for building your own cron-like system, with REST APIs and a Web UI.

License: BSD 2-Clause "Simplified" License

Python 55.58% Makefile 0.78% HTML 12.61% CSS 1.40% JavaScript 29.15% Shell 0.13% Dockerfile 0.36%

ndscheduler's Issues

Add chaining job

How can I add chained jobs, so that job2 starts after job1 finishes?
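
A hedged workaround while real chaining is missing: wrap both steps in a single job class whose run() performs them in sequence, so step two only runs if step one did not raise. This is just a sketch built on the JobBase pattern shown in the ImportError issue further down; the class and helper names are made up.

from ndscheduler.corescheduler import job


class ChainedJob(job.JobBase):
    """Sketch: run job1's work, then job2's work, inside a single scheduled job."""
    # A meta_info() classmethod like the one in the ShellJobOnce example further
    # down would be needed for this job to be addable from the web UI.

    def run(self, *args, **kwargs):
        first = self.do_job1_work()    # hypothetical helper holding job1's logic
        second = self.do_job2_work()   # only reached if job1's work did not raise
        return {'job1': first, 'job2': second}

    def do_job1_work(self):
        return 'ok'

    def do_job2_work(self):
        return 'ok'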

Implementation

Hello, I'm building an aquarium controller that is open source, and I was looking to build a scheduler into it. It is a Flask app, and I was wondering if you have an example of this being used in a Flask app. Does this need to run separately from the app, with the app then using the API to communicate with it back and forth? Appreciate it.

Cannot add a job that doesn't take arguments through the web-ui.

Basically, I have a bunch of custom jobs that don't take arguments. I've therefore omitted the 'arguments' section from the meta_info() function.

This seems to allow the jobs to be executed fine, but the web-ui doesn't allow you to add the job due to an error:

(screenshot of the error omitted)

Basically, it seems the interface is blindly assuming that these fields are present.

It seems like either the value of meta_info should be validated before shipping it to the web client, or the client should handle missing parameters in a way that doesn't render the web interface unusable.

Installation fails

Collecting ndscheduler from git+https://github.com/Nextdoor/ndscheduler.git#egg=ndscheduler
Cloning https://github.com/Nextdoor/ndscheduler.git to /tmp/pip-build-mm359amx/ndscheduler
Complete output from command python setup.py egg_info:
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/tmp/pip-build-mm359amx/ndscheduler/setup.py", line 65, in <module>
    long_description=open('README.md').read(),
  File "/home/thomas/production/env/lib/python3.5/encodings/ascii.py", line 26, in decode
    return codecs.ascii_decode(input, self.errors)[0]
UnicodeDecodeError: 'ascii' codec can't decode byte 0xe2 in position 495: ordinal not in range(128)

----------------------------------------

Command "python setup.py egg_info" failed with error code 1 in /tmp/pip-build-mm359amx/ndscheduler/

possible error in ndscheduler/server/server.py

I tried to build the app with the Dockerfile and then run ndscheduler; after that I got the following error:

Starting scheduler_1 ... done
Attaching to bankruptcy_api_scheduler_1
scheduler_1    | Traceback (most recent call last):
scheduler_1    |   File "/mnt/scheduler/src/ndscheduler/simple_scheduler/scheduler.py", line 3, in <module>
scheduler_1    |     from ndscheduler.server import server
scheduler_1    |   File "/mnt/scheduler/src/ndscheduler/ndscheduler/server/server.py", line 16, in <module>
scheduler_1    |     from ndscheduler.server.handlers import audit_logs
scheduler_1    |   File "/mnt/scheduler/src/ndscheduler/ndscheduler/server/handlers/audit_logs.py", line 6, in <module>
scheduler_1    |     import tornado.web
scheduler_1    |   File "/mnt/scheduler/local/lib/python2.7/site-packages/tornado/web.py", line 205
scheduler_1    |     application: "Application",
scheduler_1    |                ^
scheduler_1    | SyntaxError: invalid syntax
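
The flagged line in tornado/web.py is Python 3 type-annotation syntax, so a likely cause (an assumption, not confirmed here) is that a tornado release which dropped Python 2 support was installed into the Python 2.7 image. If staying on Python 2.7, pinning an older tornado in requirements.txt should avoid it:

tornado<6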

Run job every 15 seconds

Hi,

I wrote a small script which queries a SOAP web server and writes the returned JSON data into InfluxDB. I will use it to collect machine data, which is generated at 15-second intervals on my machines. I have two questions accordingly:

  1. How can I add this script to the Nextdoor scheduler? I added it as /mnt/scheduler/src/ndscheduler/simple_scheduler/jobs/xxx.py but I could not see it in the job class combobox. (A sketch of a minimal job class follows this issue.)

  2. How can I make this job run at 15-second intervals?

Thank you,
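
For question 1, a minimal hedged sketch: the web UI appears to list only classes derived from the job base class found under the jobs package, so the script needs to be wrapped in a job class. The module and class names below are placeholders; the JobBase/meta_info pattern is taken from the ShellJobOnce example in the ImportError issue further down.

"""simple_scheduler/jobs/soap_to_influx_job.py (hypothetical file name)."""
from ndscheduler.corescheduler import job


class SoapToInfluxJob(job.JobBase):

    @classmethod
    def meta_info(cls):
        return {
            'job_class_string': '%s.%s' % (cls.__module__, cls.__name__),
            'notes': 'Queries the SOAP service and writes the result to InfluxDB.',
            'arguments': [],
            'example_arguments': '[]',
        }

    def run(self, *args, **kwargs):
        # Call the existing script's entry point here.
        return {'status': 'ok'}

For question 2, see the seconds-interval issue further below; the cron fields ndscheduler exposes appear to bottom out at minutes.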

Wrong time

When I create a new job in ndscheduler with a 10-minute interval, it's not scheduled to run after 10 minutes.

Ex:

  1. Consider current system time is 19:21 IST
  2. Create a new job with 30 mins interval
  3. The next time to run should be 19:51 IST. But ndscheduler shows 20:00 IST

Ex2:

  1. Consider current system time is 19:32 IST
  2. create a new job with 9 mins interval
  3. The next time to run should be 19:41 IST. But ndscheduler shows 19:39 IST

My parameters are (minute, hour (UTC), day, month, day of week):
*/30 * * * *

Please help me to run the scheduler at the desired time.
Thanks in advance
Bala
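
This largely comes down to cron-field semantics rather than a scheduling bug: */30 in the minute field means "minutes 0 and 30 of every hour", not "every 30 minutes counted from when the job was created", and the hour column is interpreted in UTC per the form label. A small sketch against apscheduler's CronTrigger (which ndscheduler uses; apscheduler 3.x assumed) shows the behaviour:

from datetime import datetime, timezone

from apscheduler.triggers.cron import CronTrigger

# */30 matches minutes 0 and 30 of every hour.
trigger = CronTrigger(minute='*/30', timezone='UTC')
now = datetime(2019, 1, 1, 19, 21, tzinfo=timezone.utc)
print(trigger.get_next_fire_time(None, now))   # 2019-01-01 19:30:00+00:00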

Non-CronTrigger based jobs in scheduler break the web-ui.

Basically, I have a number of use cases that really require more flexible scheduling than can be achieved with the apscheduler CronTrigger.

Since the project is using apscheduler internally, it's fairly easy to add the ability to add jobs with custom triggers, but going to the web-ui causes the server to throw an exception.

It'd be nice to support other trigger types for display, if not necessarily adding via the web interface.

Adding year so you could run the task once a year

Great work, thanks.
I am just wondering how I could use ndscheduler to run a task only once.
I think adding a year option could be a way to say "run the task on that particular date and that's it."

OK, maybe providing a mechanism to delete the job right after it has been triggered would also work.
Is that possible? How would I do that?
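
A hedged sketch of the delete-after-run idea, reusing the scheduler_manager calls that appear in the ShellJobOnce example further down in these issues (whether removing a job from inside its own run() is safe is an assumption, not something the project documents):

from ndscheduler.corescheduler import job


class RunOnceJob(job.JobBase):
    """Sketch: do the work once, then remove this job so it never fires again."""

    def run(self, *args, **kwargs):
        result = {'status': 'done'}    # placeholder for the real work
        self.scheduler_manager.remove_job(self.job_id)
        return result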

Not able to override default settings

We're trying to alter the setting JOB_MAX_INSTANCES, but not having any luck.

Dockerfile

FROM python:2-alpine3.7

RUN apk update && apk add git postgresql-dev gcc python-dev musl-dev

COPY requirements.txt requirements.txt
RUN pip install -r requirements.txt

ENV NDSCHEDULER_SETTINGS_MODULE=settings

ADD . .

CMD ["python", "scheduler.py"]

settings.py

JOB_MAX_INSTANCES = 10

scheduler.py

"""Run the scheduler process."""
from ndscheduler.server import server

class Tassadar(server.SchedulerServer):
  pass

if __name__ == "__main__":
    Tassadar.run()

And we boot up the container and try to run more than 3 instances:

ndscheduler.core.scheduler_manager - WARNING - Execution of job "REDACTED (trigger: cron[month='*', day='*', day_of_week='*', hour='*', minute='0'], next run at: 2018-07-13 04:00:00 UTC)" skipped: maximum number of running instances reached (3)

Any ideas? I would expect the limit to now be 10 rather than 3.

Thanks!

the JSON object must be str, not 'bytes'

I am trying to define some simple tasks...

2017-11-21 17:08:57,060 - tornado.application - ERROR - Uncaught exception PUT /api/v1/jobs/098bb546ced011e7b27f10c37b505584 (127.0.0.1)
HTTPServerRequest(protocol='http', host='localhost:7778', method='PUT', uri='/api/v1/jobs/098bb546ced011e7b27f10c37b505584', version='HTTP/1.1', remote_ip='127.0.0.1', headers={'Content-Type': 'application/json; charset=UTF-8', 'Origin': 'http://localhost:7778', 'Host': 'localhost:7778', 'User-Agent': 'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/62.0.3202.94 Safari/537.36', 'Connection': 'keep-alive', 'Accept-Encoding': 'gzip, deflate, br', 'Accept-Language': 'de-DE,de;q=0.9,en-US;q=0.8,en;q=0.7', 'Cookie': 'sidebar_pinned=false; PGADMIN_LANGUAGE=en; csrftoken=BgZcnLvOC9ptQSyc990Ith4oEqtx7RtL9DKUVWkYcyhxcKvR89vYrsC1DeUiPFm3; Pycharm-379b065f=6f9c7e1a-a633-4337-b1b1-a73535c20137; session=eyJfY3NyZl90b2tlbiI6ImM2NDJiNzU2MDQ2MDM2OGZkMmYxZDg5MzVhZjJmMGUyMTk0Nzk4NDAifQ.DL0HWg.fhwDX-MWDAlOssKqbZVDSupE7PM; COVERAGE_INDEX_SORT=', 'Content-Length': '178', 'Accept': '*/*', 'X-Requested-With': 'XMLHttpRequest', 'Referer': 'http://localhost:7778/'})
Traceback (most recent call last):
  File "/home/thomas/github/production/env/lib/python3.5/site-packages/tornado/web.py", line 1422, in _execute
    result = self.prepare()
  File "/home/thomas/github/production/env/lib/python3.5/site-packages/ndscheduler/server/handlers/base.py", line 25, in prepare
    self.json_args = json.loads(self.request.body)
  File "/home/thomas/github/production/env/lib/python3.5/json/__init__.py", line 312, in loads
    s.__class__.__name__))
TypeError: the JSON object must be str, not 'bytes'
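
On Python 3.5, json.loads() only accepts str, while Tornado's self.request.body is bytes (json.loads accepts bytes from Python 3.6 onward). A minimal sketch of the workaround is to decode the body before parsing:

import json

body = b'{"minute": "*/5"}'              # tornado's request.body arrives as bytes
print(json.loads(body.decode('utf-8')))  # decode first on Python 3.5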

Validate return status for shell scripts

Hi team, thanks for sharing this tool.

I would like to know where I should modify the Python code to evaluate the status code from shell scripts, so the job fails if it is different from zero.

Thanks!
Nicolas.
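
One hedged option: a variant of the shell job whose run() raises when the command exits non-zero, assuming (as other issues here suggest) that a raised exception is recorded as a failed execution rather than "Success".

from subprocess import call

from ndscheduler.corescheduler import job


class StrictShellJob(job.JobBase):
    """Sketch: like the shell job, but a non-zero exit code fails the execution."""

    def run(self, *args, **kwargs):
        code = call(args)
        if code != 0:
            raise RuntimeError('command %s exited with status %d' % (list(args), code))
        return {'returncode': code}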

Status of this project

Hello,

I see that no commits have been made recently and am wondering about the status of this project. It looks like the tool I need, but I'm wondering whether using it is wise given the level of activity. Is it no longer in use at Nextdoor?
What are the alternatives, if any?
Thanks!

Stronger sql model needed

Your schema is weak. Can you please make sure job_id is a proper foreign key, and also include a definition of the job? Also, it all falls over once a job is deleted that still has related rows in the Executions table... Btw: I would recommend using Pony ORM for the job.

Adding new job to ndscheduler

Hi,

I've created my own job, based on the sample curl_job, which should "ping" an API.
Here is its source.
In order to make it run, I added the following line to the original Dockerfile from the simple_scheduler folder, right after the pem file is copied:
ADD patrowl_scan_job.py /mnt/scheduler/src/ndscheduler/simple_scheduler/
but the Docker container fails to start with the following error:
bash: /mnt/scheduler/bin/run_scheduler: /bin/bash^M: bad interpreter: No such file or directory
Any suggestions?

Job Name - Again

I have multiple jobs set up in my scheduler, and I am having some trouble figuring out how to fetch the job name from within the job. I am doing some logging for my own use and would like to log every job along with its name and timestamps.

Jobs stuck at "running"

This can happen when a job is running and the ndscheduler process dies.

To reproduce, create a shell job that sleeps for a while, e.g.:
["bash","-c","sleep 3600"]

While the job is running, send a kill signal; the next time ndscheduler starts, the job will be stuck at "running".

Error report

I tested an error while running a batch/shell file,
but the job execution status is "Success" and the Result is:
{ "returncode": 1 }
Can I surface the error from the shell script in the Result through a job argument?

And what are best-practice arguments for a shell-script job?

Missing argument "db_tablenames" with APScheduler

I can create a job successfully and it appears in my postgresql database. But when I try and modify it, I get the following error:

HTTPServerRequest(protocol='http', host='192.168.1.120:5678', method='PUT', uri='/api/v1/jobs/65a6eabfe79d4e46b44762cf5c5b6804', version='HTTP/1.1', remote_ip='192.168.1.119')
Traceback (most recent call last):
  File "/home/pj/src/ndscheduler/.venv/lib/python3.6/site-packages/tornado-5.1.1-py3.6-linux-x86_64.egg/tornado/web.py", line 1548, in _stack_context_handle_exception
    raise_exc_info((type, value, traceback))
  File "<string>", line 4, in raise_exc_info
  File "/home/pj/src/ndscheduler/.venv/lib/python3.6/site-packages/tornado-5.1.1-py3.6-linux-x86_64.egg/tornado/stack_context.py", line 339, in wrapped
    ret = fn(*args, **kwargs)
  File "/home/pj/src/ndscheduler/.venv/lib/python3.6/site-packages/tornado-5.1.1-py3.6-linux-x86_64.egg/tornado/gen.py", line 232, in final_callback
    if future.result() is not None:
  File "/home/pj/src/ndscheduler/.venv/lib/python3.6/site-packages/tornado-5.1.1-py3.6-linux-x86_64.egg/tornado/gen.py", line 1141, in run
    yielded = self.gen.throw(*exc_info)
  File "/home/pj/src/ndscheduler/ndscheduler/server/handlers/jobs.py", line 256, in modify_job_yield
    yield self.modify_job(job_id)
  File "/home/pj/src/ndscheduler/.venv/lib/python3.6/site-packages/tornado-5.1.1-py3.6-linux-x86_64.egg/tornado/gen.py", line 1133, in run
    value = future.result()
  File "/usr/lib/python3.6/concurrent/futures/thread.py", line 56, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/home/pj/src/ndscheduler/ndscheduler/server/handlers/jobs.py", line 248, in modify_job
    self._modify_job(job_id)
  File "/home/pj/src/ndscheduler/ndscheduler/server/handlers/jobs.py", line 233, in _modify_job
    self.scheduler_manager.modify_job(job_id, **self.json_args)
  File "/home/pj/src/ndscheduler/ndscheduler/corescheduler/scheduler_manager.py", line 169, in modify_job
    self.sched.modify_scheduler_job(job_id, **kwargs)
  File "/home/pj/src/ndscheduler/ndscheduler/corescheduler/core/base.py", line 196, in modify_scheduler_job
    job.modify(**kwargs)
  File "/home/pj/src/ndscheduler/.venv/lib/python3.6/site-packages/APScheduler-3.6.3-py3.6.egg/apscheduler/job.py", line 62, in modify
    self._scheduler.modify_job(self.id, self._jobstore_alias, **changes)
  File "/home/pj/src/ndscheduler/.venv/lib/python3.6/site-packages/APScheduler-3.6.3-py3.6.egg/apscheduler/schedulers/base.py", line 484, in modify_job
    job._modify(**changes)
  File "/home/pj/src/ndscheduler/.venv/lib/python3.6/site-packages/APScheduler-3.6.3-py3.6.egg/apscheduler/job.py", line 180, in _modify
    check_callable_args(func, args, kwargs)
  File "/home/pj/src/ndscheduler/.venv/lib/python3.6/site-packages/APScheduler-3.6.3-py3.6.egg/apscheduler/util.py", line 402, in check_callable_args
    ', '.join(unsatisfied_args))
ValueError: The following arguments have not been supplied: db_tablenames

Web authentication for jobs

The current version has no authentication at all, so anyone who knows the IP and port can change and update the jobs on the machine. Some method of authentication is needed.

Make registration of SIGINT handler optional

There are environments in which it is undesirable to register a handler for the SIGINT signal (for example, when running under a service manager framework which runs the scheduler in a thread other than the primary thread). It is possible to bypass this registration by overriding SchedulerServer.run() but that approach has its own drawbacks. Better would be to either accept an optional keyword argument to run() to suppress the call to signal.signal or (possibly more elegant) support a flag in the settings dictionary. What do you think?
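
A standalone sketch of the settings-flag variant proposed above; the flag name and surrounding structure are hypothetical, not current ndscheduler behaviour:

import signal

REGISTER_SIGNAL_HANDLERS = False   # hypothetical settings flag


def start_server(run_forever):
    # signal.signal() may only be called from the main thread, which is why an
    # opt-out matters when the scheduler runs inside a worker thread.
    if REGISTER_SIGNAL_HANDLERS:
        signal.signal(signal.SIGINT, lambda signum, frame: None)
    run_forever()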

UnicodeDecodeError: 'ascii' codec can't decode byte 0xd0 in position 74: ordinal not in range(128)

ok ( "name":"Hello world" )

$ curl -d '{"job_class_string":"simple_scheduler.jobs.sample_job.AwesomeJob","name":"Hello world","month":"1"}' -H "Content-Type: application/json" -X POST http://localhost:8888/api/v1/jobs
{"job_id": "a1d9c1861c6b11ea885b0242c0a82002"}

not ok ( "name":"Россия", utf8 symbols ) :

$ curl -d '{"job_class_string":"simple_scheduler.jobs.sample_job.AwesomeJob","name":"Россия","month":"1"}' -H "Content-Type: application/json; charset=utf-8" -X POST http://localhost:8888/api/v1/jobs
Traceback (most recent call last):
  File "/mnt/scheduler/local/lib/python2.7/site-packages/tornado/web.py", line 1422, in _execute
    result = self.prepare()
  File "/mnt/scheduler/src/ndscheduler/ndscheduler/server/handlers/base.py", line 25, in prepare
    self.json_args = json.loads(self.request.body.decode())
UnicodeDecodeError: 'ascii' codec can't decode byte 0xd0 in position 74: ordinal not in range(128)
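
On Python 2, .decode() with no argument uses the ASCII codec, which is why the non-ASCII job name fails at the same handlers/base.py line. A hedged sketch of the change is simply to name the encoding:

# handlers/base.py, line 25 (sketch): decode explicitly instead of relying on
# Python 2's ASCII default.
self.json_args = json.loads(self.request.body.decode('utf-8'))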

Server-side pagination of executions / audit logs

This would enable us to go back further in time and display a longer history of executions and audit logs. Currently the limit of a day is basically a sanity check on not letting the UI easily submit requests that could overwhelm the DB.

See #19

Getting Job Name?

Is there an easy way to get the job name? I can't seem to figure out how. Any help would be great.

Question about RC or snapshot releases?

Hi,
It has been a while since the last release and there are significant changes since then. Is there a plan for a release soon?
Can RC or snapshot releases be published instead?

Thanks!

Ability to run a task every few seconds

Is it possible to run a task every few seconds with ndscheduler? I see that apscheduler allows running tasks at second granularity. Inspecting the ndscheduler code, it looks like seconds are not passed to add_job.

Also, the scheduler wakes up every 60 seconds; my guess is this can be customized via settings.
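
For reference, a hedged sketch of what plain apscheduler (which ndscheduler wraps) supports on its own; per the observation above, ndscheduler's add_job does not currently pass seconds through:

import time

from apscheduler.schedulers.background import BackgroundScheduler


def poll():
    print('polling...')


scheduler = BackgroundScheduler()
scheduler.add_job(poll, 'interval', seconds=15)   # second-level interval trigger
scheduler.start()

try:
    while True:
        time.sleep(1)
except KeyboardInterrupt:
    scheduler.shutdown()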

vulnerability CVE-2018-9206 ?

Hello,

It's just a question; I'm currently using version 0.2.1 of ndscheduler.
Does this version of ndscheduler use jQuery?
If yes, is ndscheduler affected by CVE-2018-9206, the jQuery vulnerability, and which version should I choose to not be affected by it?
Thanks in advance,

ImportError: No module named corescheduler

I'd like to implement a custom job class (the same as ShellJob, but which runs only once).

I made a class:

"""A job to run executable programs ONLY ONCE."""

from subprocess import call

from ndscheduler.corescheduler import job

class ShellJobOnce(job.JobBase):

    @classmethod
    def meta_info(cls):
        return {
            'job_class_string': '%s.%s' % (cls.__module__, cls.__name__),
            'notes': ('This will run an executable program ONLY ONCE. You can specify as many '
                      'arguments as you want. This job will pass these arguments to the '
                      'program in order.'),
            'arguments': [
                {'type': 'string', 'description': 'Executable path'}
            ],
            'example_arguments': '["/usr/local/telegram_notify.pl", "user_id", "arg2"]'
        }

    def run(self, *args, **kwargs):
        code = call(args)
        self.scheduler_manager.pause_job(self.job_id)
        # self.scheduler_manager.remove_job(self.job_id)
        return {'returncode': code}


if __name__ == "__main__":
    # You can easily test this job here
    job = ShellJobOnce.create_test_instance()
    job.run('at', '-h')

and I run ndscheduler using docker-compose like this:

version: '2'

services:
    scheduler:
        image: wenbinf/ndscheduler
        volumes:
        - "${PWD}/python/shell_job.py:/mnt/scheduler/src/ndscheduler/simple_scheduler/jobs/shell_job.py"
        ports:
        - "8888:8888"

After that, I get the following at http://localhost:8888/:

Traceback (most recent call last):
  File "/mnt/scheduler/local/lib/python2.7/site-packages/tornado/web.py", line 1443, in _execute
    result = method(*self.path_args, **self.path_kwargs)
  File "/mnt/scheduler/src/ndscheduler/ndscheduler/server/handlers/index.py", line 16, in get
    meta_info = utils.get_all_available_jobs()
  File "/mnt/scheduler/src/ndscheduler/ndscheduler/utils.py", line 155, in get_all_available_jobs
    job_module = importlib.import_module('%s.%s' % (job_class_package, module_name))
  File "/usr/lib/python2.7/importlib/__init__.py", line 37, in import_module
    __import__(name)
  File "/mnt/scheduler/src/ndscheduler/simple_scheduler/jobs/shell_job_once.py", line 5, in <module>
    from ndscheduler.corescheduler import job
ImportError: No module named corescheduler

Could you please explain why this happens?

My module has the same import (from ndscheduler.corescheduler import job) string as the original ShellJob.

P.S. Even if I mount the same ShellJob module with the same content inside the wenbinf/ndscheduler container, I get the same error.
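
A possible explanation (an assumption, not confirmed): the prebuilt wenbinf/ndscheduler image was built from an older ndscheduler release whose job base class lived at the package root rather than under corescheduler, so inside that image the import would have to use the older form:

# Older layout (assumption): JobBase imported from the package root.
from ndscheduler import job


class ShellJobOnce(job.JobBase):
    pass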

Make script fails on Windows

Running make simple in a Windows environment using Cygwin yields an error when it attempts to execute the .venv/bin/activate command in the init target.

. .venv/bin/activate && .venv/bin/pip install flake8
/bin/bash: .venv/bin/activate: No such file or directory
Makefile:10: recipe for target 'init' failed
make[2]: *** [init] Error 1
make[2]: Leaving directory '/cygdrive/c/Nextdoor/ndscheduler'
Makefile:24: recipe for target 'install' failed
make[1]: *** [install] Error 2
make[1]: Leaving directory '/cygdrive/c/Nextdoor/ndscheduler'
Makefile:36: recipe for target 'simple' failed

The initial difficulty is that in a Windows environment, the virtualenv command doesn't create a .venv/bin directory. Instead, the executables are placed in: .venv/Scripts. (As of Python 2.7.2.)

The PYTHON, PIP and SOURCE_VENV variables must be changed to reflect the .venv/Scripts location. e.g.

PYTHON=.venv/Scripts/python
PIP=.venv/Scripts/pip
SOURCE_VENV=.venv/Scripts/activate

This change is not sufficient to resolve the problem. The virtualenv command installs activate as a .bat file instead of a .exe. The bash shell doesn't know to look for .bat files, and the command fails. Changing the SOURCE_VENV variable to reference activate.bat (with the extension) allows activate to be found, however bash doesn't know how to interpret the file, and the script still fails.

[Question] Scalability

I am interested in using ndscheduler in a solution that I am developing but I have the following requirement:

  1. My app will use the ndscheduler API to generate about 30,000 schedules per day, within a 2-hour time frame.

  2. These schedules will run only once and their purpose will be to send an attendance confirmation email to a customer.

  3. Schedules will have their execution start date and time distributed over a time period of 6 hours during the day.

  4. After the schedule is successfully executed it can be deleted from the ndscheduler.

Does the ndscheduler support this workload?

Thank you.

Viewing execution result doesn't work past page 1

Currently, the render function in table-view.js binds the click events used to view execution results, but it only works for the buttons that are currently in view, since it uses jQuery to get a list of buttons. The render function only executes once, so any buttons past page 1 won't have their click events bound.

Adding Custom Fields to Job in Web UI

Has anyone had success with adding custom fields to the web UI? Example below:

(screenshot of the proposed fields omitted)

I have fields that I use often, such as an email address to send to on successful execution, and I'm also considering having fields to trigger another job. I can pass these as arguments or kwargs, but my JSON string starts to get extremely long, and it would be nice to have the commonly used ones broken out.

The HTML was simple to edit, and I was able to pass the custom fields as a kwarg to the datastore; however, I'm having difficulty retrieving the kwarg data so that when the modify modal window opens, the data is present in the correct fields. I have only very basic JS experience, so I'm not sure how to populate the window with the data, or where exactly the web UI gets the data for each job.

Any help would be appreciated.

Job Dependency Supported?

You guys did an amazing job on this. Thanks for making it open source.

One question: I know that you are using APScheduler, and it doesn't have the capability to say "Job B depends on Job A... it will not run until Job A is successful".
How do you address this in your production environment?
You must have jobs that start only after another job finishes...

Thanks

Cannot run a job with the rest api

I try to run a job with the REST API via Postman
(screenshot of the request omitted)
but get the following error message:
Traceback (most recent call last):
  File "C:\python\python37-32\lib\site-packages\tornado\web.py", line 1401, in _stack_context_handle_exception
    raise_exc_info((type, value, traceback))
  File "<string>", line 3, in raise_exc_info
  File "C:\python\python37-32\lib\site-packages\tornado\stack_context.py", line 314, in wrapped
    ret = fn(*args, **kwargs)
  File "C:\python\python37-32\lib\site-packages\tornado\gen.py", line 196, in final_callback
    if future.result() is not None:
  File "C:\python\python37-32\lib\site-packages\tornado\concurrent.py", line 232, in result
    raise_exc_info(self._exc_info)
  File "<string>", line 3, in raise_exc_info
  File "C:\python\python37-32\lib\site-packages\tornado\gen.py", line 267, in wrapper
    result = func(*args, **kwargs)
  File "C:\python\python37-32\lib\types.py", line 277, in wrapped
    coro = func(*args, **kwargs)
TypeError: post() missing 1 required positional argument: 'job_id'

Support customized tornado_settings

It is currently not possible to override ndscheduler's default settings for tornado without completely replacing SchedulerServer.__init__(), which is undesirable for obvious reasons (losing access to bug fixes and enhancements added to the method by Nextdoor, and -- worse -- risking future incompatibilities if ND introduces changes elsewhere in ndscheduler for which modifications must be made to the constructor). If I come up with a pull request, would you consider an enhancement which adds an optional keyword argument which could be used to pass in a dictionary of custom settings to be applied to self.tornado_settings?
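
A standalone sketch of the dict merge that such a keyword argument could perform; the names are hypothetical and this is not the current SchedulerServer API:

# Hypothetical defaults merged with caller-supplied overrides before the
# tornado Application would be constructed.
DEFAULT_TORNADO_SETTINGS = {'debug': False, 'compress_response': True}


def build_tornado_settings(overrides=None):
    settings = dict(DEFAULT_TORNADO_SETTINGS)
    if overrides:
        settings.update(overrides)
    return settings


print(build_tornado_settings({'debug': True, 'cookie_secret': 'change-me'}))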

Not compatible with Python 3.5

Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/tornado/web.py", line 1443, in _execute
    result = method(*self.path_args, **self.path_kwargs)
  File "/src/ndscheduler/ndscheduler/server/handlers/index.py", line 16, in get
    meta_info = utils.get_all_available_jobs()
  File "/src/ndscheduler/ndscheduler/utils.py", line 155, in get_all_available_jobs
    job_module = importlib.import_module('%s.%s' % (job_class_package, module_name))
  File "/usr/local/lib/python3.5/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 986, in _gcd_import
  File "<frozen importlib._bootstrap>", line 969, in _find_and_load
  File "<frozen importlib._bootstrap>", line 958, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 673, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 673, in exec_module
  File "<frozen importlib._bootstrap>", line 222, in _call_with_frames_removed
  File "/src/ndscheduler/simple_scheduler/jobs/apns_job.py", line 7, in <module>
    from apns import APNs, Payload
  File "/usr/local/lib/python3.5/site-packages/apns.py", line 218
    except ssl.SSLError, err:
                        ^
SyntaxError: invalid syntax
2017-05-04 21:05:03,639 - tornado.access - ERROR - 500 GET / (172.17.0.1) 21.06ms

Functionality to disable running more than one execution with the same Job ID

I would like to thank you for the great job you did!
Is there functionality to disable running multiple executions of the same job (with the same ID) at the same time? For example, if a job is running and someone starts the same job again, the second run should wait for the first one to finish. If there is no such functionality, could you please advise on the best way to implement it?
Thanks in advance!
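
A hedged pointer based on the "Not able to override default settings" issue above: the skipped-execution warning there mentions a maximum of 3 running instances per job, so setting that value to 1 in the settings module looks like the intended switch (whether the override takes effect is exactly what that other issue is about):

# settings.py (sketch): allow at most one concurrent execution per job.
JOB_MAX_INSTANCES = 1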
