EzDev.org

fabric

Simple, Pythonic remote execution and deployment.


fabric password

Every time Fabric runs, it asks for the root password. Can the password be supplied automatically, for automation purposes?

fab staging test
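
One common approach, assuming Fabric 1.x, is to pre-set env.password so Fabric never prompts. A minimal sketch (the password value is a placeholder; in real use, load it from getpass or a secrets store):

from fabric.api import env, sudo

env.password = 'sekrit'  # placeholder only; do not hardcode real credentials

def restart():
    sudo('service apache2 restart')

The fab command line also accepts -p/--password for the same purpose.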

Source: (StackOverflow)

Python Fabric: How to answer keyboard input?

I would like to automate the responses to questions prompted by some programs, like MySQL prompting for a password, or apt asking for a 'yes' or ... when I want to rebuild my haystack index with ./manage.py rebuild_index.

For MySQL, I can use the --password= switch, and I'm sure apt has a 'quiet'-like option. But how can I pass responses to other programs?
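
A hedged sketch, assuming Fabric 1.9 or later (which added the env.prompts mapping for auto-responding to prompts); the prompt string below is an illustrative guess, not the real rebuild_index prompt:

from fabric.api import run, settings

def rebuild_index():
    # Auto-answer when the remote output ends with the given prompt text.
    with settings(prompts={'Are you sure you want to do this? [y/N] ': 'y'}):
        run('./manage.py rebuild_index')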


Source: (StackOverflow)

How do I copy a directory to a remote machine using Fabric?

I have a directory on my local machine that I would like to copy to a remote machine (and rename) using Fabric. I know I can copy a file using put(), but what about a directory? I know it's easy enough using scp, but I would prefer to do it from within my fabfile.py if possible.
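
One option, assuming Fabric 1.x, is the contrib helper upload_project; the paths below are placeholders, and since upload_project keeps the local basename, the rename is done with a plain mv afterwards:

from fabric.api import run
from fabric.contrib.project import upload_project

def push():
    # Recursively uploads the local ./myapp directory into /srv/ on the remote host,
    upload_project(local_dir='myapp', remote_dir='/srv')
    # then renames it, since upload_project keeps the local directory name.
    run('mv /srv/myapp /srv/myapp-renamed')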


Source: (StackOverflow)

How do I create a PostgreSQL user with Fabric?

I want to create a database user in my setup Fabric script, but createuser prompts interactively for a password and doesn't seem to play well with Fabric.
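
A minimal sketch that sidesteps createuser's prompt by issuing the SQL through psql instead; the role name and password are placeholders:

from fabric.api import sudo

def create_db_user():
    # Run psql as the postgres system user, passing the password in SQL
    # instead of answering createuser's interactive prompt.
    sudo('psql -c "CREATE USER myuser WITH PASSWORD \'secret\';"', user='postgres')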


Source: (StackOverflow)

How to get Fabric to automatically (instead of user-interactively) interact with shell commands? Combine with pexpect?

I'm seeking a means to get Fabric to automatically (instead of user-interactively) interact with shell commands: not just requests for passwords, but also prompts for user input when no "stdin/interactive override" like apt-get install -y is available.

This question, along with these Fabric docs, suggests that Fabric can only "push the interactivity" back to the human user running the Fabric program. I'm seeking instead to fully automate, without any human present. I don't yet have a "real," current problem to solve; I'm just preparing for a possible future obstacle.

Would it be useful to combine Fabric with pexpect (or a similar, alternative mechanism) if Fabric can't handle all stdin/prompts automatically by itself? I'm hoping it doesn't need to be an "either/or" kind of thing: why not leverage both pexpect and Fabric where appropriate, in the same program/automation?
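
For what it's worth, nothing prevents mixing the two: a Fabric task can hand the genuinely interactive steps to pexpect. A sketch under those assumptions (the command and prompt pattern are illustrative, not from any real program):

import pexpect

def answer_prompt():
    # pexpect drives the interactive command directly; a Fabric task can
    # call this helper for the steps Fabric can't automate itself.
    child = pexpect.spawn('./manage.py rebuild_index')
    child.expect(r'\[y/N\]')  # the prompt pattern is a guess
    child.sendline('y')
    child.expect(pexpect.EOF)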


Source: (StackOverflow)

How to discover current role in Python Fabric

This is a very Fabric-specific question, but more experienced Python hackers might be able to answer it, even if they don't know Fabric.

I am trying to specify different behaviour in a command depending on which role it is running for, for example:

def restart():
    if SERVERTYPE == "APACHE":
        sudo("apache2ctl graceful", pty=True)
    elif SERVERTYPE == "APE":
        sudo("supervisorctl reload", pty=True)

I was hacking around this with functions like this one:

def apache():
    global SERVERTYPE
    SERVERTYPE = "APACHE"
    env.hosts = ['xxx.xxx.com']

But that is obviously not very elegant, and I just discovered roles, so my question is:

How do I figure out which role a current instance belongs to?

env.roledefs = {
    'apache': ['xxx.xxx.com'],
    'APE': ['yyy.xxx.com'],
}

Thanks!
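
One way to answer this, assuming Fabric 1.x and roledef entries that match env.host_string exactly: since env.host_string holds the host the task is currently running against, you can look it up in env.roledefs. A sketch:

from fabric.api import env, sudo

def current_roles():
    # Roles whose host list contains the host this task is currently running on.
    return [role for role, hosts in env.roledefs.items()
            if env.host_string in hosts]

def restart():
    roles = current_roles()
    if 'apache' in roles:
        sudo("apache2ctl graceful", pty=True)
    elif 'APE' in roles:
        sudo("supervisorctl reload", pty=True)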


Source: (StackOverflow)

Get the current value of env.hosts list with Python Fabric Library

I've got this code (foo and bar are local servers):

from fabric.api import env

env.hosts = ['foo', 'bar']

def mytask():
    print(env.hosts[0])

Which, of course, prints foo on every iteration.

As you probably know, Fabric iterates through the env.hosts list and executes mytask() on each of them this way:

fab mytask

does

task is executed on foo
task is executed on bar

I'm looking for a way to get the current host in every iteration.

Thanks,
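
Assuming Fabric 1.x, env.host_string is updated on each iteration to the host currently being executed against. A sketch:

from fabric.api import env

env.hosts = ['foo', 'bar']

def mytask():
    # env.host_string reflects the current iteration's host.
    print(env.host_string)  # 'foo' on the first run, 'bar' on the second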


Source: (StackOverflow)

fabric appears to start apache2 but doesn't

I'm using Fabric to remotely start a micro AWS server, install git and a git repository, adjust the Apache config, and then restart the server.

If at any point, from the fabfile I issue either

sudo('service apache2 restart') or run('sudo service apache2 restart'), or a stop and then a start, the command apparently runs and I get a response indicating Apache has started, for example:

[ec2-184-73-1-113.compute-1.amazonaws.com] sudo: service apache2 start
[ec2-184-73-1-113.compute-1.amazonaws.com] out:  * Starting web server apache2
[ec2-184-73-1-113.compute-1.amazonaws.com] out:    ...done.
[ec2-184-73-1-113.compute-1.amazonaws.com] out: 

However, if I try to connect, the connection is refused, and if I ssh into the server and run sudo service apache2 status, it reports that "Apache is NOT running".

While sshed in, if I run sudo service apache2 start, the server starts and I can connect. Has anyone else experienced this? Does anyone have tips as to where I could look (log files, etc.) to work out what has happened? There is nothing in apache2/error.log, syslog, or auth.log.

It's not that big a deal, and I can work around it. I just don't like such silent failures.
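
One commonly cited culprit (see Fabric's FAQ entry on init scripts) is the pseudo-terminal: when Fabric's SSH session closes, the pty can deliver SIGHUP to the daemon the init script just spawned, killing it silently. A sketch of the usual workaround:

from fabric.api import sudo

def restart_apache():
    # Without a pty, the spawned daemon isn't hung up when the session closes.
    sudo('service apache2 restart', pty=False)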


Source: (StackOverflow)

How to ForwardAgent yes using fabric?

I am successfully run()ning commands on a remote server with my private key pair.

However, I'd like to do git clone ssh://private/repo on the remote server using my local key (or the local ssh agent I'm running in).

How can I do this using Fabric?
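
Assuming Fabric 1.4 or later, which added env.forward_agent, a sketch:

from fabric.api import env, run

env.forward_agent = True  # the equivalent of ssh's ForwardAgent yes

def clone():
    run('git clone ssh://private/repo')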


Source: (StackOverflow)

Install Python Fabric on Windows [closed]

How to get a working Python Fabric installation on Windows?


Source: (StackOverflow)

Fabric's cd context manager does not work

I have set up my development environment on a new PC, and I seem to be getting a strange error with Fabric. Its cd context manager does not seem to change the current directory, and thus a lot of my commands don't work. I wrote a test, and it gave results I did not expect:

from __future__ import with_statement
from fabric.api import local, run, cd

def xxx():
    with cd("src"):
        local("pwd")

Here are the results after running fab xxx:

[localhost] local: pwd
/home/pioneer/workspace/myproject

But instead of /home/pioneer/workspace/myproject, it should print /home/pioneer/workspace/myproject/src, I think.
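
This matches Fabric's documented behavior: cd() only affects remote commands (run/sudo), while local() needs the lcd() context manager. A sketch:

from __future__ import with_statement
from fabric.api import local, lcd

def xxx():
    # lcd() changes directory for local(); cd() would only affect run()/sudo().
    with lcd("src"):
        local("pwd")  # now prints .../myproject/src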


Source: (StackOverflow)

Python 3 support for fabric

Does Fabric (http://docs.fabfile.org/en/1.7/) support Python 3 yet? According to the Python 3 Wall of Superpowers, it does not. If not, what is the best alternative when using Django 1.5 with Python 3?


Source: (StackOverflow)

Best way to add an environment variable in fabric?

I would like to pass a few values from fabric into the remote environment, and I'm not seeing a great way to do it. The best I've come up with so far is:

from fabric.api import prefix, run

with prefix('export FOO=BAR'):
    run('env | grep BAR')

This does work, but it feels like a bit of a hack.

I looked in the Git repository, and it looks like this is issue #263.
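
Later Fabric 1.x releases grew a dedicated context manager for exactly this, shell_env; a sketch assuming a version that ships it:

from fabric.api import run, shell_env

def show_env():
    # Exports FOO=BAR for every command run inside the block.
    with shell_env(FOO='BAR'):
        run('env | grep FOO')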


Source: (StackOverflow)

Is there a deployment tool similar to Fabric written in JavaScript?

I put together a mobile development stack that runs almost entirely on JavaScript and Node.js, the only exceptions being Sass (I prefer it to LESS) and Fabric. I prefer not to pollute my development directory, and since I have to combine and minify JS and CSS anyway, I thought I could also use Node.js to serve my code.

I would like to reduce my dependence on Ruby and/or Python. I don't really use all the features of Fabric, so I'm hopeful I can replace it, but I couldn't find any similar tool written in JavaScript.

All I need is to:

  • Pull from git repository.
  • Install dependencies locally.
  • Minify and combine JS/CSS invoking require.js/SASS.
  • Run testsuite.
  • Serve the code via node.js for hands-on or testing with Ripple.

Fabric might already be overkill; I only use it for my Python web projects because the necessary servers don't run on my machine, but that wouldn't be the case here. What would be the best way to handle this without Fabric?


Source: (StackOverflow)

Can a Python Fabric task invoke other tasks and respect their hosts lists?

I have a fabfile like the following:

from fabric.api import hosts

@hosts('host1')
def host1_deploy():
    """Some logic that is specific to deploying to host1"""

@hosts('host2')
def host2_deploy():
    """Some logic that is specific to deploying to host2"""

def deploy():
    """Deploy to both hosts, each using its own logic"""
    host1_deploy()
    host2_deploy()

I would like to do

fab deploy

and have it be equivalent to

fab host1_deploy host2_deploy

In other words, run each of the subtasks, and for each one use the list of hosts that it specifies. However, this does not work. Instead, the deploy() task wants its own list of hosts, which it propagates to all of its subtasks.

Is there a way to update the deploy() task here so it will do what I want while leaving the subtasks alone so they can be run individually?
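
Assuming Fabric 1.3 or later, execute() does exactly this: it runs another task while honoring that task's own @hosts/@roles decorators. A sketch:

from fabric.api import execute, hosts

@hosts('host1')
def host1_deploy():
    """Some logic that is specific to deploying to host1"""

@hosts('host2')
def host2_deploy():
    """Some logic that is specific to deploying to host2"""

def deploy():
    # Each subtask runs against its own decorated host list,
    # and each remains callable individually via fab.
    execute(host1_deploy)
    execute(host2_deploy)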


Source: (StackOverflow)