
dye's Introduction


DYE

UNSTABLE SOFTWARE

Please be aware that DYE is not stable yet - it is a rework of our internal deployment scripts to make them easier to reuse, and the rework is a work in progress. Please have a play, test against your dev server and give us feedback. But don't download and immediately run against your production server.

DYE is a backronym for "Deploy Your Environment" - a set of scripts and functions to deploy your web app, along with the required Python libraries in a virtualenv, either locally or on a remote server. It is built on fabric. It is most fully developed for Django web apps, but we have used it for PHP projects as well.

A Cookiecutter Project

You can create a compatible project using cookiecutter.

See readme-cookiecutter.rst for details on how to do this.

Expected Project Structure

A bare bones project structure would be:

/apache                    <- contains config files for apache, for different servers
    staging.conf
    production.conf
/deploy                    <- contains settings and scripts for deployment
    bootstrap.py
    fab.py
    localfab.py            <- optional
    localtasks.py          <- optional
    pip_packages.txt       <- list of python packages to install
    project_settings.py
    tasks.py
    ve_mgr.py
/django
    /website               <- top level for Django project
        manage.py          <- a modified version of manage.py - see examples/
        .ve/               <- contains the virtualenv
        local_settings.py  <- a link to the real local_settings.py.<env>
        local_settings.py.dev
        local_settings.py.staging
        local_settings.py.production
        private_settings.py   <- generated by these scripts
        settings.py        <- this will import local_settings.py
        urls.py
/wsgi                      <- holds WSGI handler
    wsgi_handler.py

A certain amount of the directory structure can be overridden in project_settings.py but that is not well tested currently.

tasks.py

tasks.py is designed to get your development environment up and running with a minimum of fuss. Once the project is set up, getting going should only require:

cd deploy
./bootstrap.py
./tasks.py deploy:dev
cd ../django/website
./manage.py runserver

bootstrap.py will create the virtualenv and install the required Python packages.

tasks.py deploy:dev will:

  • generate private_settings.py (random database password and Django secret key)
  • link to one of your local_settings files (selects database etc)
  • init and update git submodules (if any)
  • create the database (if using MySQL at least) and run syncdb and migrations

Your Django application will then be good to go.

tasks.py includes a number of other tasks that should make your life easier. It also makes it easy to add your own tasks and integrate them into the deploy task by using:

localtasks.py

This is a file where you can define your own functions for whatever your project needs. You can also override functions from tasklib.py simply by defining a function with the same name.

You can override the main deploy() function, but you might lose out if the deploy function gains new behaviour later. Generally a better strategy is to define a post_deploy() function: when you run ./tasks.py deploy, then after the other tasks have completed it will check whether localtasks.py defines a function called post_deploy() and, if so, call it.
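For example, a minimal localtasks.py might look like this (the post_deploy() hook is the one described above, but its exact signature and the collectstatic step are illustrative assumptions, not part of DYE's documented API):

# deploy/localtasks.py - illustrative sketch only
import subprocess

def post_deploy(environment=None):
    # called by ./tasks.py deploy after the standard tasks have finished;
    # the 'environment' argument is an assumption about the signature
    # (path is relative to the deploy/ directory, where tasks.py is run)
    subprocess.check_call(
        ['../django/website/manage.py', 'collectstatic', '--noinput'])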

manage.py and bootstrap.py

We use a modified version of manage.py that knows about the virtualenv in the .ve/ directory. When you run manage.py it automatically relaunches itself inside the virtualenv, so you don't have to worry about activating or deactivating it.

It also knows about the list of packages in pip_packages.txt - if that is updated without the virtualenv being updated (or if the virtualenv doesn't exist) then manage.py will complain. You can then create/update the virtualenv with:

../../deploy/bootstrap.py

Note that this will only update the virtualenv if required, though you can use --force to do it anyway. Also note that by default it will only update the packages that need updating; use --full-rebuild to delete the virtualenv and rebuild it from scratch. To see all options use --help.
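For example, run from django/website/ (these are the options mentioned above):

../../deploy/bootstrap.py                  # update the virtualenv only if needed
../../deploy/bootstrap.py --force          # update it even if it looks up to date
../../deploy/bootstrap.py --full-rebuild   # delete the virtualenv and rebuild it
../../deploy/bootstrap.py --help           # list all options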

Fabric and DYE

We have developed a set of fabric functions for our standard project layout. In order to avoid violating the DRY principle, we delegate the work to tasks.py where possible. Our standard fab deploy will:

  • check whether you have made any local changes on the server. If it finds any, it will alert you to them and give you the choice of whether to continue or not.
  • if using git it will check the branch set in project_settings.py, the branch currently on the server and the branch you are currently on locally. If there is a mismatch it will ask you to confirm what branch you want to use.
  • create a copy of the current deploy in a directory called next/ - or create the directory on the server (if this is the first deploy)
  • checkout or update the project from your repository (git, svn and CVS currently supported). It uses the next/ directory so that your running website is unaffected.
  • remove all *.pyc files. (If x.py is removed by your VCS but x.pyc remains, then as far as Python is concerned, x.py is still present.)
  • ensure the virtualenv is created and packages installed (as bootstrap.py does)
  • unlink the webserver config and reload the web server (effectively turning off the site)
  • do a database dump in the current directory
  • move the current directory to a subdirectory of previous so that a rollback can occur, and then move next/ to where the current directory is.
  • call tasks.py deploy - which ensures settings are ready and does all the database related stuff.
  • relink the webserver config and reload the web server (effectively turning on the site again)
  • delete excess copies in previous/ (by default 5 copies are retained).

As with tasks.py you can add extra functions and override the default behaviour by putting functions in:

localfab.py

Server Directory Structure

Dye will create /var/django/project_name and in that directory will be:

dev/           <- contains the checked out code
previous/      <- copies for rollback, with directories named by timestamp

During a deploy there will also be:

next/         <- only temporary

You can override the project root with the server_home variable in project_settings.py.
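As a rough sketch, the relevant part of project_settings.py might look like this (server_home is the variable mentioned above; the other name shown is an assumption for illustration, so check your own project_settings.py for the exact spelling):

# deploy/project_settings.py - illustrative sketch only
project_name = 'myproject'    # assumption: used to build /var/django/<project_name>
server_home = '/var/django'   # override this to change the project root on the server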

dye's People

Contributors

birdsarah, bitterjug, daniell, decentral1se, foobacca, jurecuhalev, lwm, martinburchell, qris, samastur, skiold, tomd-aptivate


dye's Issues

We should be clearing out expired sessions from Django apps

We're allowing large numbers of sessions to accumulate. One server has a 270 MB sessions table, by far the largest part of the database.

We should be running manage.py clearsessions automatically for each installed Django project, probably daily.

If you’re using the database backend, note that session data can accumulate in the django_session database table and Django does not provide automatic purging. Therefore, it’s your job to purge expired sessions on a regular basis.

To understand this problem, consider what happens when a user uses a session. When a user logs in, Django adds a row to the django_session database table. Django updates this row each time the session data changes. If the user logs out manually, Django deletes the row. But if the user does not log out, the row never gets deleted.

Django provides a sample clean-up script: django-admin.py cleanup. That script deletes any session in the session table whose expire_date is in the past – but your application may have different requirements.

https://docs.djangoproject.com/en/dev/ref/django-admin/#django-admin-clearsessions
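One possible approach, assuming the server layout described above (/var/django/project_name/dev/ holding the checked-out code) and an arbitrary daily schedule, is a cron entry along these lines (the user and time are placeholders):

# /etc/cron.d/clearsessions - illustrative sketch only
0 3 * * * apache /var/django/project_name/dev/django/website/manage.py clearsessions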

Show ssh username on password prompt

When pairing or when working on a machine which isn't on LDAP, the local username is often different to the remote one. The DYE password prompt doesn't make that clear.

[fen-vz-360giving-stage.fen.aptivate.org] Executing task 'deploy'
[fen-vz-360giving-stage.fen.aptivate.org] Login password:

Which "login" is the login password for? It could also be useful to prompt people to login with a different username if necessary. For example this might work:

[fen-vz-360giving-stage.fen.aptivate.org] Logging in as "andybaker". Add "-l <username>" to the fab.py command to change it.

`get_remote_dump_and_load` is slow

It takes about 10 minutes to load a reasonably-sized remote dump (1.1 MB from washwatch staging server).

Disabling the rsyncable code makes it about 10x faster (about 30 seconds). I'm not sure what the impact on the rsyncability of the file is, but this can equate to a big time saving.

This is the code in question, in dye/tasklib/database.py:

# this option will mean that there will be one line per insert
# thus making the dump file better for rsync, but slightly bigger
if for_rsync:
    dump_cmd.append('--skip-extended-insert')

It might be useful to have a command-line option to enable this manually; for example such dumps are easier to manipulate for loading into a different database engine. But I don't really see a need for it in get_remote_dump_and_load.

bootstrap.py *still* blows away changes to editable git checkouts

Add an editable git checkout to your deploy/pip_packages.txt:

-e git+https://github.com/aptivate/cmsbootstrap.git@2ccf433e101ec9e7eef7ccc4a6c16cc8c5eb59f6#egg=cmsbootstrap

Run deploy/bootstrap.py.

Make changes to the editable code, for example:

echo 'test {}' >> django/website/.ve/src/cmsbootstrap/cmsbootstrap/static/sass/cmsbootstrap.scss

Touch pip_packages.txt and run bootstrap again:

touch deploy/pip_packages.txt
deploy/bootstrap.py

Check what happened to your change:

grep test django/website/.ve/src/cmsbootstrap/cmsbootstrap/static/sass/cmsbootstrap.scss

And it's gone!

Silently blowing away changes to an editable repo is not cool.

Update to new recommended Apache/mod_wsgi config

We are currently using this mod_wsgi configuration as the default for new projects:

WSGIDaemonProcess {{ cookiecutter.project_name }} processes=2 threads=10

I think we should be using this instead:

WSGIDaemonProcess {{ cookiecutter.project_name }} processes=1 threads=10 maximum-requests=400 display-name='%{GROUP}' deadlock-timeout=30
WSGIApplicationGroup %{GLOBAL}

Rationale for the changes:

  • processes=1: I can't see any reason to use multiple processes when we have threads enabled. Using two processes eats twice as much memory and ensures that some state is not shared between processes (e.g. cached Site objects) which requires a manual restart of Apache.
  • maximum-requests=400 ensures that the WSGI daemon process is recycled regularly, freeing up any memory that it may have leaked.
  • display-name='%{GROUP}' allows us to identify the WSGI process(es) in the process list, differentiating them from normal Apache processes. They will appear as (wsgi:<project_name>) instead of /usr/sbin/httpd.
  • deadlock-timeout=30 enables the detection and breaking of deadlocks caused by extensions starting a long-running operation in native C code without releasing the Python GIL. Otherwise these will hang the whole WSGI process for a long time, perhaps forever.
  • WSGIApplicationGroup %{GLOBAL} causes all Python code to run in the first Python interpreter instead of a sub-interpreter, avoiding a deadlock with native extensions.

References for these recommendations:

Can we upgrade django_extensions to latest

1.0.2, which is currently in the cookiecutter, doesn't work with Django 1.6, so I was wondering if we could upgrade to the latest, which is 1.3.3. If yes, I'll do the pull request, no issues.

How about enabling CACHE by default?

MIDDLEWARE_CLASSES = (
    'django.middleware.cache.UpdateCacheMiddleware',
    ...
    'django.middleware.cache.FetchFromCacheMiddleware',
)

# turn on general caching, but not for logged in users
CACHE_MIDDLEWARE_ANONYMOUS_ONLY = True
CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.db.DatabaseCache',
        'LOCATION': 'cache',
    }
}

# test stuff
import sys
if 'test' in sys.argv:
    CACHES = {
        'default': {
            'BACKEND': 'django.core.cache.backends.dummy.DummyCache',
        }
    }

CMS_CACHE_DURATIONS = {
    'content': 60,
    'menus': 600,
}

Unify fab.py and tasks.py into a single interface

It is not clear, or sufficiently easy to remember from the names (fab, tasks), what each of these files is supposed to provide and what kind of commands can be run with it. They should be one file, or one interface, with syntax to differentiate the remote/local nature of the task (so I'm told, this is the difference). I'm creating this issue to track efforts to experiment with this.

what is this mysterious repo_name?

Dye cookiecutter now asks me for some parameter called repo_name. What is it?

project_name (default is "project_name")? pypicache
repo_name (default is "repo_name")? 

deploy fails on an empty project because no assets files

On an empty project built from cookiecutter, when you are doing a first deploy, deploy:dev fails with the following error:

### Collecting static files and building webassets
Failed to execute command: ['/home/bird/Dev/opencontracting/validator/validator/django/website/.ve/bin/python', '/home/bird/Dev/opencontracting/validator/validator/django/website/manage.py', 'assets', 'clean']: returned 1
CommandError: No asset bundles were found. If you are defining assets directly within your templates, you want to use the --parse-templates option.

Should be possible to select a Python in project_settings.py

We will soon have python 2.7 available (as /usr/bin/python2.7) on CentOS 6 servers, where /usr/bin/python is python 2.6.

Applications needing Python 2.7 will need to bootstrap their virtualenv using it. We manually modified bootstrap.py in Washwatch, changing the shebang line to run Python 2.7. However this requires /usr/bin/python2.7 to be available on all machines where Washwatch will ever be deployed (by fabric), which is not ideal.

I think it would be good to be able to specify a minimum Python version in project_settings.py, and then bootstrap tries to find one that meets the requirements by looking in various paths (starting with the system python in /usr/bin/python), and fails if it can't find one.

What do you think?
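As a rough sketch of how this might work (none of these names exist in DYE yet; they are proposals only):

# deploy/project_settings.py (proposed)
python_min_version = (2, 7)

# deploy/bootstrap.py (proposed lookup, simplified; note check_output itself
# needs Python 2.7, so a real version would fall back to subprocess.Popen)
import os
import subprocess

def find_python(min_version=(2, 7)):
    """Return the first interpreter found that meets min_version, or raise."""
    candidates = ['/usr/bin/python', '/usr/bin/python2.7', '/usr/local/bin/python2.7']
    for path in candidates:
        if not os.path.exists(path):
            continue
        version = subprocess.check_output(
            [path, '-c', 'import sys; print("%d.%d" % sys.version_info[:2])'])
        if tuple(int(x) for x in version.decode().strip().split('.')) >= min_version:
            return path
    raise RuntimeError('no suitable python found (need >= %d.%d)' % min_version)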

celery not restarted on deploy

Deploying the inasp site recently, I'd added a task in main/tasks.py but it wasn't active after the deploy. Restarting celerybeat and celeryd fixed that, but it wasn't done by dye.
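A possible workaround until dye handles this, assuming a localfab.py that wraps fablib.deploy() (the same pattern visible in a traceback further down this page) and init-script service names as above:

# deploy/localfab.py - illustrative sketch only
from fabric.api import sudo
from dye import fablib   # assumption: import path matching the traceback's dye/fablib.py

def deploy(revision=None, keep=None):
    fablib.deploy(revision=revision, keep=keep)
    # restart celery so newly added/changed tasks are picked up
    sudo('service celeryd restart')
    sudo('service celerybeat restart')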

get_remote_dump_and_load gives AttributeError: valid_envs

What does this mean? Am I missing something in my configuration?

./fab.py get_remote_dump_and_load
Traceback (most recent call last):
File "/home/martinb/workspace/cbmdb/django/website/.ve/local/lib/python2.7/site-packages/fabric/main.py", line 710, in main
_args, *_kwargs
File "/home/martinb/workspace/cbmdb/django/website/.ve/local/lib/python2.7/site-packages/fabric/tasks.py", line 321, in execute
results[''] = task.run(_args, *_new_kwargs)
File "/home/martinb/workspace/cbmdb/django/website/.ve/local/lib/python2.7/site-packages/fabric/tasks.py", line 113, in run
return self.wrapped(_args, *_kwargs)
File "/home/martinb/workspace/cbmdb/django/website/.ve/src/dye/dye/fablib.py", line 805, in get_remote_dump_and_load
filename=filename, local_filename=local_filename, rsync=rsync)
File "/home/martinb/workspace/cbmdb/django/website/.ve/src/dye/dye/fablib.py", line 773, in get_remote_dump
require('user', 'host', provided_by=env.valid_envs)
File "/home/martinb/workspace/cbmdb/django/website/.ve/local/lib/python2.7/site-packages/fabric/utils.py", line 168, in getattr
raise AttributeError(key)
AttributeError: valid_envs

Remove image stuff from normal django_type cookiecutter.

This has niggled me in the past, but then the Pillow security upgrade made me think of it.

Can the following just be {% if cookiecutter.django_type == 'cms' %}? I don't understand why the image stuff is included for normal Django.

{% if cookiecutter.django_type == 'cms' or cookiecutter.django_type == 'normal' %}
easy-thumbnails==1.4
pillow==2.3.1
image_diet==0.7.1
{% endif %}

cookiecutter --commit is wrong?

I tried to run:

cookiecutter --commit develop git://github.com/aptivate/dye.git

And it failed with this output:

usage: cookiecutter [-h] [--no-input] [-c CHECKOUT] [-v] input_dir

cookiecutter: error: unrecognized arguments: --commit git://github.com/aptivate/dye.git

I tried again with -c instead of --commit, and it failed differently:

$ cookiecutter -c develop git://github.com/aptivate/dye.git
IOError: [Errno 2] No such file or directory: u'git://github.com/aptivate/dye.git/cookiecutter.json'

I had to switch to the HTTP URL to make it download the cookiecutter.json file properly:

$ cookiecutter -c develop https://github.com/aptivate/dye
You've cloned /home/installuser/.cookiecutters/dye before. Is it okay to delete and re-clone it? [Y/n] 
Cloning into 'dye'...
remote: Counting objects: 1498, done...

Use ssh variables in non fabric commands e.g. rsync

I am using get_remote_dump_and_load; fab is able to ssh in as it knows about my keys (set at the bottom of project settings), but rsync doesn't, so it fails. There may be other places where ssh is called outside of fabric.
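A sketch of what this could look like around the rsync call (env.user, env.host, env.port and env.key_filename are standard Fabric env keys; where exactly this would live in dye's fablib is an assumption):

# illustrative sketch only
from fabric.api import env, local

def rsync_from_server(remote_path, local_path):
    ssh_cmd = 'ssh -p %s' % env.port
    key = env.key_filename
    if key:
        # fabric allows a list of keys; just take the first for the sketch
        ssh_cmd += ' -i %s' % (key[0] if isinstance(key, list) else key)
    local("rsync -vz -e '%s' %s@%s:%s %s" % (
        ssh_cmd, env.user, env.host, remote_path, local_path))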

south.patch

south.patch is referenced in tasklib, but doesn't yet exist in dye - fix this

How should we add new environments?

I'm deploying a project where "production" doesn't make sense as there are potentially many "production" servers and no "staging".

So I changed the keys in project_settings.py, but can't deploy because fab doesn't know anything about the new environments.

I suppose I should create a localfab.py file, or something like that, but I'm not sure. It shouldn't be necessary anyway, as all the information it requires should be in project_settings.py.

Upgrade to South 1.0?

Improves backwards compatibility by searching for south migrations before django migrations in new projects. The latest django-reversion for example requires this.

"South has had very few changes for a couple of years now, and the only reason it's getting a release at all is one new feature to aid in the upgrade to Django 1.7. This feature is that South will now look first for a south_migrations directory, before falling back to migrations. This one small change means that third-party libraries and apps can support both South and the new-style 1.7 migrations, by moving their legacy South migrations into the south_migrations directory, and having the Django migrations in the migrations directory."

Autotruncate db user in cookie cutter?

Was just setting up a new project with a long project name. Using everything default out of the box with cookiecutter, the new db user couldn't be created because:

_mysql_exceptions.OperationalError: (1470, "String 'opendatacomparison' is too long for user name (should be no longer than 16)")

Should we just auto-truncate this in the cookiecutter?
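A minimal sketch of the truncation (the 16-character limit comes from the MySQL error above; where the db user actually gets derived in the cookiecutter templates is an assumption):

# illustrative sketch only
MYSQL_MAX_USER_LENGTH = 16

def db_user_for(project_name):
    # e.g. 'opendatacomparison' -> 'opendatacomparis'
    return project_name[:MYSQL_MAX_USER_LENGTH]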

How to avoid copying uploaded files every deploy

Currently our standard way of storing uploaded files is within the django tree, so they get copied around on deploy. If we stored the files somewhere else then the copying would not be required.
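One possible direction, assuming the server layout described above (the exact path is only an example), is to point MEDIA_ROOT outside the checked-out tree in settings:

# in settings.py / local_settings.py - illustrative sketch only
MEDIA_ROOT = '/var/django/project_name/uploads'   # lives outside dev/, so deploys don't copy it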

dye setup.py uses distutils.core.setup instead of setuptools.setup

We used dye's setup.py as an example for cmsbootstrap. I got stuck trying to make it run tests, with the following error:

chris@lin-vnc2:~/cmsbootstrap$ /tmp/ve/bin/python ./setup.py test
usage: setup.py [global_opts] cmd1 [cmd1_opts] [cmd2 [cmd2_opts] ...]
   or: setup.py --help [cmd1 cmd2 ...]
   or: setup.py --help-commands
   or: setup.py cmd --help

error: invalid command 'test'

The problem turned out to be that we (and DYE) started setup.py with this line:

from distutils.core import setup

instead of:

from setuptools import setup

as described on Stack Overflow.

This might not be affecting DYE right now, if there are no tests, but is there a reason for using setup from distutils instead of setuptools that we should be aware of? Or does it make sense to change DYE to use setuptools instead?
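A minimal sketch of the suggested change (the metadata values are placeholders and test_suite is an assumption about how the tests would be wired up):

# setup.py - illustrative sketch only
from setuptools import setup, find_packages

setup(
    name='dye',
    version='0.0.1',           # placeholder
    packages=find_packages(),
    test_suite='tests',        # assumption: lets `setup.py test` find the tests
)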

Add task to retrieve latest deploy time

Is this already here? Whilst this should probably be viewable on some UI, it'd be better than nothing to have the ./tasks.py deploy ... command create a small file which stamps when it was last run on the server side. This would be helpful for other members of the team to come along and check when the last release was (outside of the developer).
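A minimal sketch of what the stamp could look like (the file name and location are only suggestions):

# illustrative sketch only - could run at the end of ./tasks.py deploy
import datetime

def write_deploy_stamp(path='last_deploy.txt'):
    with open(path, 'w') as f:
        f.write(datetime.datetime.now().isoformat() + '\n')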

Flatten django/website structure to top level single folder

It is extremely tedious to drive the project from django/website and then back up to ../../deploy and so forth. I propose that we just put the Django application folder at a single depth from the root, named {{cookiecutter.project_name}}.

Ignore some tables in dump

We could speed things up a little by ignoring some tables during the db dump. Tables we could ignore include:

  • cache table
  • sessions table
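A sketch of how this might look next to the existing for_rsync code in dye/tasklib/database.py (--ignore-table is a standard mysqldump option; the db_name variable and the exact table names are assumptions):

# illustrative sketch only
tables_to_skip = ['cache', 'django_session']
for table in tables_to_skip:
    # mysqldump requires the table name to be qualified with the database name
    dump_cmd.append('--ignore-table=%s.%s' % (db_name, table))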

total_seconds not available in python 2.6

When I run a deploy from a host running Ubuntu 10.04 (e.g. fen-desktop7) I get this failure right at the end:

  File "/home/daniell/git/ruforum/django/website/.ve/src/dye/dye/fablib.py", line 210, in _report_downtime
    utils.puts("Downtime lasted for %.1f seconds" % downtime.total_seconds())
AttributeError: 'datetime.timedelta' object has no attribute 'total_seconds'

I think this is because timedelta.total_seconds was only added in Python 2.7, and this machine has 2.6.
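A Python 2.6 compatible replacement for that line could compute the seconds by hand (a standard workaround; the downtime variable and utils.puts call come from the traceback above):

# works on Python 2.6, where timedelta.total_seconds() does not exist
seconds = (downtime.days * 86400 + downtime.seconds
           + downtime.microseconds / 1e6)
utils.puts("Downtime lasted for %.1f seconds" % seconds)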

Deploy fails for first ssh to server

If you've never sshed into a server before, you can get the one-off warning about all of the things you can break with sudo privilege:


We trust you have received the usual lecture from the local System
Administrator. It usually boils down to these three things:

    #1) Respect the privacy of others.
    #2) Think before you type.
    #3) With great power comes great responsibility.

This means a deployment will fail:


/var/django/360giving/2014-05-22_12-39-17 /var/django/360giving/2014-05-22_16-55-15
[fen-vz-360giving-stage.fen.aptivate.org] out: sudo password:
[fen-vz-360giving-stage.fen.aptivate.org] out: cp: target `System\r' is not a directory
[fen-vz-360giving-stage.fen.aptivate.org] out: /bin/bash: line 1: Administrator.: command not found
[fen-vz-360giving-stage.fen.aptivate.org] out: : command not found
[fen-vz-360giving-stage.fen.aptivate.org] out: : command not found
[fen-vz-360giving-stage.fen.aptivate.org] out: : command not found
[fen-vz-360giving-stage.fen.aptivate.org] out: /bin/bash: line 8: /var/django/360giving/2014-05-22_12-39-17: is a directory

Fatal error: sudo() received nonzero return code 126 while executing!

Issue with media location on collect static

This may be occurring because I did a cookiecutter of develop and then pegged dye to a specific earlier commit that I knew was stable, so my bad if that's the case.

Got this error:

Failed to execute command: ['/my_project_dir/django/website/.ve/bin/python', '/home/my_project_dir/django/website/manage.py', 'collectstatic', '--noinput']: returned 1
OSError: [Errno 2] No such file or directory: '/my_project_dir/django/media'

Couldn't immediately see why it wasn't looking in '/my_project_dir/django/website/media'

I tend to get really confused about static collection, so sorry if this is a non-issue.

Can we do deploys without downtime

Think about what the potential downsides would be:

  • if no migrations, then there may well be no problem
  • if there are migrations there might be a few server errors for some pages between updating the code and running the migrations.
  • what else?

At the least we could provide an option. Maybe we start by having the default be the current behaviour, with an option to have a no-downtime-deploy, get confident with that, then move to the default being no-downtime but leave an option for when we're a bit scared ...

Think about using rsync for back up copies

rather than making a recent copy and then deleting an old copy, we could:

  • check there are enough old versions, if not just do a copy as usual, but if there are:
  • rename the old copy to rsync_target
  • rsync the current version to rsync_target
  • rename rsync_target to the current date/time and then use that as the new copy

The rsync should be a lot faster than the copy.
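As a sketch, using fabric's sudo() and the directory names described above (picking the oldest copy and the exact paths are assumptions):

# illustrative sketch only
from fabric.api import sudo

def refresh_backup_copy(project_root, oldest_copy, timestamp):
    previous = project_root + '/previous'
    # reuse the oldest rollback copy as the rsync target ...
    sudo('mv %s/%s %s/rsync_target' % (previous, oldest_copy, previous))
    # ... bring it up to date with the current deploy (much faster than a fresh copy) ...
    sudo('rsync -a --delete %s/dev/ %s/rsync_target/' % (project_root, previous))
    # ... then rename it to the new timestamped copy
    sudo('mv %s/rsync_target %s/%s' % (previous, previous, timestamp))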

Cookiecutter initial run fails on delimiter error

With cookiecutter 0.7.0 I get the following error:
markos@afrodita:~/aptivate$ cookiecutter --checkout develop https://github.com/aptivate/dye.git
Cloning into 'dye'...
remote: Reusing existing pack: 1551, done.
remote: Counting objects: 5, done.
remote: Compressing objects: 100% (5/5), done.
remote: Total 1556 (delta 0), reused 5 (delta 0)
Receiving objects: 100% (1556/1556), 359.24 KiB | 420.00 KiB/s, done.
Resolving deltas: 100% (916/916), done.
Checking connectivity... done
Already on 'develop'
Traceback (most recent call last):
File "/usr/local/bin/cookiecutter", line 9, in
load_entry_point('cookiecutter==0.7.0', 'console_scripts', 'cookiecutter')()
File "/usr/local/lib/python2.7/dist-packages/cookiecutter/main.py", line 115, in main
cookiecutter(args.input_dir, args.checkout, args.no_input)
File "/usr/local/lib/python2.7/dist-packages/cookiecutter/main.py", line 58, in cookiecutter
default_context=config_dict['default_context']
File "/usr/local/lib/python2.7/dist-packages/cookiecutter/generate.py", line 48, in generate_context
obj = json.load(file_handle, encoding='utf-8', object_pairs_hook=OrderedDict)
File "/usr/lib/python2.7/json/init.py", line 290, in load
*_kw)
File "/usr/lib/python2.7/json/init.py", line 351, in loads
return cls(encoding=encoding, *_kw).decode(s)
File "/usr/lib/python2.7/json/decoder.py", line 365, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/lib/python2.7/json/decoder.py", line 381, in raw_decode
obj, end = self.scan_once(s, idx)
ValueError: Expecting , delimiter: line 4 column 5 (char 75)

Checks wrong user for an SSH passphrase

I'm in Daniel's session and trying to deploy. Daniel has an SSH key with a passphrase, so I try to tell fab.py to use my own username instead:

daniell@fen-desktop7:~/git/ruforum/django/website$ ../../deploy/fab.py -u chris staging deploy 

But it still asks me for a passphrase:

[fen-vz-ruforum-stage.fen.aptivate.org] Executing task 'deploy'
[fen-vz-ruforum-stage.fen.aptivate.org] Passphrase for private key: 

And I have to press Ctrl+C at this prompt to skip it, then it asks for my login password as it should:

[fen-vz-ruforum-stage.fen.aptivate.org] Login password: 

This may be a Fabric bug? It happens here:

Sorry, you can't enter an empty password. Please try again.
[fen-vz-ruforum-stage.fen.aptivate.org] Passphrase for private key: Traceback (most recent call last):
  File "/home/daniell/git/ruforum/django/website/.ve/lib/python2.6/site-packages/fabric/main.py", line 710, in main
    *args, **kwargs
  File "/home/daniell/git/ruforum/django/website/.ve/lib/python2.6/site-packages/fabric/tasks.py", line 292, in execute
    multiprocessing
  File "/home/daniell/git/ruforum/django/website/.ve/lib/python2.6/site-packages/fabric/tasks.py", line 191, in _execute
    return task.run(*args, **kwargs)
  File "/home/daniell/git/ruforum/django/website/.ve/lib/python2.6/site-packages/fabric/tasks.py", line 113, in run
    return self.wrapped(*args, **kwargs)
  File "../../deploy/localfab.py", line 41, in deploy
    fablib.deploy(revision=revision, keep=keep)
  File "/home/daniell/git/ruforum/django/website/.ve/src/dye/dye/fablib.py", line 152, in deploy
    _create_dir_if_not_exists(env.server_project_home)
  File "/home/daniell/git/ruforum/django/website/.ve/src/dye/dye/fablib.py", line 134, in _create_dir_if_not_exists
    if not files.exists(path):
  File "/home/daniell/git/ruforum/django/website/.ve/lib/python2.6/site-packages/fabric/contrib/files.py", line 35, in exists
    return not func(cmd).failed
  File "/home/daniell/git/ruforum/django/website/.ve/lib/python2.6/site-packages/fabric/network.py", line 457, in host_prompting_wrap
per
    return func(*args, **kwargs)
  File "/home/daniell/git/ruforum/django/website/.ve/lib/python2.6/site-packages/fabric/operations.py", line 904, in run
    return _run_command(command, shell, pty, combine_stderr)
  File "/home/daniell/git/ruforum/django/website/.ve/lib/python2.6/site-packages/fabric/operations.py", line 814, in _run_command
    stdout, stderr, status = _execute(default_channel(), wrapped_command, pty,
  File "/home/daniell/git/ruforum/django/website/.ve/lib/python2.6/site-packages/fabric/state.py", line 340, in default_channel
    chan = connections[env.host_string].get_transport().open_session()
  File "/home/daniell/git/ruforum/django/website/.ve/lib/python2.6/site-packages/fabric/network.py", line 84, in __getitem__
    self.connect(key)
  File "/home/daniell/git/ruforum/django/website/.ve/lib/python2.6/site-packages/fabric/network.py", line 76, in connect
    self[key] = connect(user, host, port)
  File "/home/daniell/git/ruforum/django/website/.ve/lib/python2.6/site-packages/fabric/network.py", line 351, in connect
    password = prompt_for_password(text)
  File "/home/daniell/git/ruforum/django/website/.ve/lib/python2.6/site-packages/fabric/network.py", line 427, in prompt_for_password
    new_password = getpass.getpass(password_prompt, stream)
  File "/usr/lib/python2.6/getpass.py", line 71, in unix_getpass
    passwd = _raw_input(prompt, stream, input=input)
  File "/usr/lib/python2.6/getpass.py", line 135, in _raw_input
    raise EOFError

Unable to switch to newly added git branch

On first deployment after adding a new branch, the new branch cannot be selected.

E.g. the server has 'master', and we reconfigure project_settings.py to use the 'staging' branch at the same time as creating the staging branch in git. The new 'staging' branch does not yet exist on the server, so it isn't offered as a choice to switch to.

Minor breakages in Fabric deployment

/var/django/projectname/previous is not created automatically:

[gcc.flexdns.net:48002] sudo: mv /var/django/mailer/dev /var/django/mailer/previous/2013-09-11_21-34-11
[gcc.flexdns.net:48002] out: mv: cannot move `/var/django/mailer/dev' to `/var/django/mailer/previous/2013-09-11_21-34-11': No such file or directory

And tasks.py isn't executable by default when installed in /deploy by cookiecutter, so fabric can't run it.

Prompt to merge your branch before deploy?

Let's say I'm working on the develop branch and I request a deploy to the staging server, which uses the staging branch.

Most likely I want to merge the develop branch into staging before deploy. I often forget to do that (my fault), so I deploy a no-op. I know that dye prompts me if the branches are different, but we're already logged into the remote server by that time, and all I can do is cancel the deploy, merge, push and deploy again.

The steps that I very often need are:

  • git checkout staging
  • git pull
  • git merge develop
  • git push
  • fab.py deploy staging
  • git checkout develop

Dye could actually prompt me before logging into the remote server: if my current branch is different from the one which will be deployed, it could ask whether I want to merge into the target branch beforehand. This would save some manual work before each deployment.

Of course, one of these steps might fail, for example I might have local changes that would be lost if I checked out a different branch, or merge conflicts, and I'm fine with Dye aborting in that case and leaving me to sort out the mess.

Does this seem like a reasonable feature? Let me know if so, and I'll try to work up a patch.

Need to be in a user-writable directory to run `fab.py get_remote_dump_and_load`

If I run this from a project directory that's owned by root, it fails:

[chris@blah website]$ ../../deploy/fab.py production get_remote_dump_and_load
[blah] Executing task 'get_remote_dump_and_load'
[blah] sudo: /var/django/inaspauthoraid/current/deploy/tasks.py dump_db:/tmp/db_dump.sql,for_rsync=true
[blah] Login password: 
[blah] out: sudo password:
[localhost] local: rsync -vz -e 'ssh -p 48001' blah:/tmp/db_dump.sql ./db_dump.sql
RSA host key for IP address '176.58.122.49' not in list of known hosts.
blah-inaspauthoraid.blah's password: 
rsync: mkstemp "/var/django/inaspauthoraid/2013-12-11_12-22-13/django/website/.db_dump.sql.kXC48X" failed: Permission denied (13)

sent 30 bytes  received 3083305 bytes  198924.84 bytes/sec
total size is 14349845  speedup is 4.65
rsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1505) [generator=3.0.6]

Fatal error: local() encountered an error (return code 23) while executing 'rsync -vz -e 'ssh -p 48001' chris@blah:/tmp/db_dump.sql ./db_dump.sql'

Aborting.

That's because I'm running it from a directory where my normal user account doesn't have permission to write. And if I sudo fab.py then it fails to log into the remote server, because it tries to log in as root instead of my normal user.

I can work around it by cd'ing to /tmp or working from my home directory, and running fab.py using its full path.

I think the fix is to copy the database dump file to /tmp instead of the current directory.
