---> conda create --prefix ./sch python=3.7 -y
---> activate the environment: conda activate ./sch
---> pip install django
---> django-admin startproject scrape_scheduler
---> pip install celery
---> make modifications to settings.py
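The settings.py changes mentioned above can be sketched as below. This is an assumption-laden example: the broker/backend URLs assume a local Redis on its default port, and the exact variable values depend on your project.

```python
# scrape_scheduler/settings.py -- Celery configuration (a sketch;
# assumes Redis running locally on the default port 6379)
CELERY_BROKER_URL = "redis://localhost:6379/0"
CELERY_RESULT_BACKEND = "redis://localhost:6379/0"
CELERY_ACCEPT_CONTENT = ["json"]
CELERY_TASK_SERIALIZER = "json"
CELERY_TIMEZONE = "UTC"
```

The `CELERY_` prefix matters: the celery.py bootstrap typically loads every Django setting that starts with this namespace.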
---> django-admin startapp main
---> install redis on linux
---> think of Celery as a worker that runs tasks outside Django, so Django does not have to do them itself
---> once settings.py in the project folder has been updated with the Celery configuration
---> create a celery.py in project folder
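A minimal celery.py for the project folder could look like this. It follows the standard Celery-with-Django bootstrap pattern; the module path `scrape_scheduler.settings` matches the project name created above.

```python
# scrape_scheduler/celery.py -- Celery application bootstrap (a sketch)
import os

from celery import Celery

# Make sure the Django settings are importable before Celery starts
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "scrape_scheduler.settings")

app = Celery("scrape_scheduler")
# Read every Django setting prefixed with CELERY_ as Celery config
app.config_from_object("django.conf:settings", namespace="CELERY")
# Find tasks.py modules in all installed apps automatically
app.autodiscover_tasks()
```

Conventionally the project's `__init__.py` also imports this `app` object so Celery loads whenever Django starts.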
---> create a tasks.py in the main folder (Celery's autodiscovery looks for the name tasks.py, not task.py)
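The tasks.py file holds the functions the worker will run. A sketch is below; `scrape_site` is a hypothetical task name, and the body is a placeholder for real scraping logic.

```python
# main/tasks.py -- a sketch; scrape_site is a hypothetical example task
from celery import shared_task


@shared_task
def scrape_site(url):
    # Placeholder for the actual scraping logic; runs inside the
    # Celery worker process, not inside the Django request cycle.
    return f"scraped {url}"
```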
---> pip install redis
----> the project folder's urls.py redirects you to the application
----> each application has its own urls.py
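The two-level URL routing above can be sketched as follows. The view name `index` is a hypothetical example; only the `include("main.urls")` hand-off is the point here.

```python
# scrape_scheduler/urls.py -- project-level routing (a sketch)
from django.contrib import admin
from django.urls import include, path

urlpatterns = [
    path("admin/", admin.site.urls),
    # Hand everything else off to the app's own urls.py
    path("", include("main.urls")),
]

# main/urls.py -- app-level routing (a sketch; views.index is hypothetical)
# from django.urls import path
# from . import views
#
# urlpatterns = [
#     path("", views.index, name="index"),
# ]
```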
''' start the Django server, then in a new terminal run: celery -A project_name.celery worker --pool=solo -l info ''' (--pool=solo is for Windows; on Linux remove it)
''' pip install django-celery ''' (note: django-celery is a legacy package; for database-backed periodic scheduling the maintained option is django-celery-beat)
''' every time you add apps in settings.py, run: python manage.py makemigrations and then python manage.py migrate '''
######### run all of the above on Linux before getting started
even if the task has not completed, Django returns its response immediately while Celery runs the task in parallel
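A view demonstrating that behavior might look like this. It assumes a hypothetical task `scrape_site` defined in main/tasks.py; `.delay()` only enqueues the job on the broker, so the HTTP response goes out immediately while the worker does the scraping in the background.

```python
# main/views.py -- a sketch; assumes a hypothetical main.tasks.scrape_site
from django.http import JsonResponse

from .tasks import scrape_site


def start_scrape(request):
    # .delay() pushes the task onto the Redis broker and returns at once;
    # the Celery worker picks it up and runs it in parallel.
    result = scrape_site.delay("https://example.com")
    return JsonResponse({"task_id": result.id, "status": "queued"})
```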