fugue / credstash
A little utility for managing credentials in the cloud
License: Apache License 2.0
I can successfully create a key with a non-default kms alias:
credstash put -k alias/test-credstash test-key test
test-key has been stored
But retrieving the key does not work:
credstash get --key='alias/test-credstash' test-key
usage: credstash [-h] [-r REGION] [-t TABLE] [-p PROFILE | -n ARN]
{delete,get,getall,list,put,setup} ...
credstash: error: unrecognized arguments: --key=test-credstash
Am I doing something wrong here?
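For what it's worth, the key flag is an option of put only; get looks up the wrapped data key stored alongside the item in DynamoDB, so it takes no key argument. A sketch of the working round-trip, reusing the alias from the report above:

```shell
# -k/--key is a "put" option; "get" needs no key flag because the wrapped
# data key is stored with the item and KMS resolves it during decryption.
credstash put -k alias/test-credstash test-key test
credstash get test-key
```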
I'm wondering how I would use this from AWS. As an example, I have a Lambda function that needs to pull in credentials to connect to an RDS instance.
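One possible pattern (a sketch, not an official recipe; the secret name and region here are made up): grant the Lambda's execution role kms:Decrypt on the credstash key plus read access to the credential-store table, then call getSecret from the handler:

```python
import credstash  # bundled into the Lambda deployment package

def handler(event, context):
    # Fetched at invocation time; the execution role's permissions
    # (kms:Decrypt + dynamodb:GetItem/Query) handle the authorization.
    rds_password = credstash.getSecret("rds.password", region="us-east-1")
    # ... open the RDS connection using rds_password ...
    return {"ok": True}
```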
Hi,
Sorry in advance if this is the wrong place to ask but maybe one of you has encountered that issue before. I try to use Credstash with Ansible:
{{ lookup('credstash', 'test.sample', 'region=eu-central-1') }}
but it always returns:
The credstash lookup plugin requires credstash to be installed
Everything is fine on the remote server: credstash is installed and can retrieve values from KMS.
I'm trying to store a binary file, but I get this error when I try to put:
credstash put: error: argument value: invalid value_or_filename value: '@/Users/Carson/.keystore'
This works fine with a PHP port I've been working on.
So if I put with PHP and try to get with Python I get this error:
File "/usr/local/bin/credstash.py", line 299, in getSecret
plaintext = plaintext.decode("utf-8")
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xfe in position 0: invalid start byte
Thoughts?
It would be nice to have the put operation support an automatic version increment. For example, -v auto would use a version of getHighestVersion(credname, region, table) + 1.
This would allow new credentials to be stored without needing to look up the current version number when updating.
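A minimal sketch of what -v auto could compute, assuming versions are kept zero-padded to 19 digits the way credstash stores them (next_version is a hypothetical helper name):

```python
def next_version(versions):
    # versions: the existing (zero-padded) version strings for a credential.
    # Returns getHighestVersion(...) + 1, padded so DynamoDB's lexicographic
    # ordering still matches numeric ordering.
    highest = max((int(v) for v in versions), default=0)
    return str(highest + 1).zfill(19)
```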
I am a little confused about where the key/value gets decrypted.
I currently set up a KMS key and created the DynamoDB table with credstash. I can run credstash get my-github-password
to get the decrypted value directly on my laptop.
Now I need to run an Ansible playbook from my computer. So when the Ansible code does the lookup
- debug: msg="Credstash lookup! {{ lookup('credstash', 'my-github-password') }}"
is the credstash command running on the remote server, or on my computer, to get the decrypted value?
I need to think about which policy to assign to my local account or to the IAM role for that instance.
The version field can be any value. The logic to get the latest version may cause confusing results when non-numeric versions are used.
We could constrain this at the argument level, or the dynamodb field level, or both (probably better), or leave it like it is. Thoughts?
It appears that data['contents'] is set to None instead of "" when an empty string is passed for storage, which causes the decode steps to fail.
Example:
credstash put empty_string_key ""
credstash get empty_string_key
Traceback (most recent call last):
File "/Virtualenvs/default/bin/credstash", line 11, in <module>
sys.exit(main())
File "/Virtualenvs/default/bin/credstash.py", line 474, in main
context=args.context))
File "/Virtualenvs/default/bin/credstash.py", line 251, in getSecret
hmac = HMAC(hmac_key, msg=b64decode(material['contents']),
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/base64.py", line 73, in b64decode
return binascii.a2b_base64(s)
TypeError: must be string or buffer, not None
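A possible guard (a sketch; safe_contents is a made-up helper, not credstash API): normalize None to the empty string before base64-decoding.

```python
from base64 import b64decode

def safe_contents(material):
    # DynamoDB can hand back None where an empty string was stored;
    # "or ''" normalizes that before b64decode, so get returns b"".
    return b64decode(material.get("contents") or "")
```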
It looks like the pycrypto project is dead and also has a security vulnerability. Is this a problem for credstash?
pycrypto also makes it quite tricky to install credstash on Windows; it looks like there are alternatives which would make installation simpler.
The KMS decrypt calls had to be ported to boto3 (see #36). There's now a mix of boto 2 and boto 3. It might make sense to move everything to boto 3.
I had an idea to provide STS AssumeRole support so you can pass the STS credentials (access key, secret, session token) along as parameters to credstash. This would provide a way of storing credentials in the STS account.
[user@machine ~]$ pip list | grep credstash
credstash (1.9.0)
[user@machine ~]$ credstash -r ap-southeast-2 list
Traceback (most recent call last):
File "/usr/local/bin/credstash", line 9, in <module>
load_entry_point('credstash==1.9.0', 'console_scripts', 'credstash')()
File "/usr/local/bin/credstash.py", line 485, in main
credential_list = listSecrets(region=region, table=args.table)
File "/usr/local/bin/credstash.py", line 171, in listSecrets
ExpressionAttributeNames={"#N": "name"})
File "/usr/local/lib/python2.7/site-packages/boto3/resources/factory.py", line 455, in do_action
response = action(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/boto3/resources/action.py", line 79, in __call__
response = getattr(parent.meta.client, operation_name)(**params)
File "/usr/local/lib/python2.7/site-packages/botocore/client.py", line 310, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/usr/local/lib/python2.7/site-packages/botocore/client.py", line 407, in _make_api_call
raise ClientError(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (AccessDeniedException) when calling the Scan operation: User: arn:aws:sts::<AWS account id>:assumed-role/<role>/<instance id> is not authorized to perform: dynamodb:Scan on resource: arn:aws:dynamodb:us-east-1:<AWS account id>:table/credential-store
# And now using the environment variable:
[user@machine ~]$ AWS_DEFAULT_REGION=ap-southeast-2 credstash list
key1 -- version 1
key2 -- version 1
key3 -- version 1
The put_item call to DDB no longer does a conditional put. That means that it's possible to overwrite items.
If Table.scan returns more than 1 MB of data, the result is paginated: LastEvaluatedKey is set and scan() should be called again.
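A sketch of the pagination loop, assuming a boto3 DynamoDB Table resource (scan_all is a hypothetical helper):

```python
def scan_all(table, **kwargs):
    # Collect every page: keep calling scan() with ExclusiveStartKey until
    # the response no longer carries a LastEvaluatedKey.
    response = table.scan(**kwargs)
    items = list(response.get("Items", []))
    while "LastEvaluatedKey" in response:
        response = table.scan(ExclusiveStartKey=response["LastEvaluatedKey"],
                              **kwargs)
        items.extend(response.get("Items", []))
    return items
```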
Hi,
When trying to create the dynamodb table in eu-west-1 (credstash -r eu-west-1 setup), I see that it still gets created in N. Virginia.
It would be nice if it could use the AWS_DEFAULT_REGION env var by default
First off, as an author of a similar utility congrats on your impeccable taste. :)
My one piece of feedback is that AES-CTR is trivially malleable, which is to say an attacker can modify the ciphertext and change the resulting plaintext without anyone noticing. I would strongly suggest using an authenticated mode like GCM, CCM, OCB, etc. or an Encrypt-Then-MAC construction with HMAC-SHA256 and an independent key.
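For reference, the verification side of an Encrypt-then-MAC construction looks roughly like this (a sketch with made-up names; the MAC key must be independent of the encryption key, and the comparison constant-time):

```python
import hashlib
import hmac

def ciphertext_authentic(mac_key, ciphertext, expected_hex):
    # Recompute HMAC-SHA256 over the ciphertext and compare in constant
    # time; only decrypt when this returns True.
    digest = hmac.new(mac_key, ciphertext, hashlib.sha256).hexdigest()
    return hmac.compare_digest(digest, expected_hex)
```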
I know @ can be used to input a file as the value, but it looks weird.
And how do you deal with a value that starts with @?
credstash put my-github-password @secure123
usage: credstash put [-h] [-k KEY] [-v VERSION] [-a]
credential value [context [context ...]]
credstash put: error: argument value: Unable to read file secure123
Can we have an option such as -f, so we can input the file as:
credstash put my-github-password -f password.file
It'd be great if we could pass in profile names to credstash method parameters, rather than only being able to specify the profile via the environment variable. This would allow users to take advantage of multiple AWS profiles when they are using credstash as a Python module.
Boto3 supports the generation of sessions based on profile names, so the enhancement is very easy to implement and doesn't affect base functionality in any way. I've actually already implemented it in my own project here (although my implementation doesn't respect the environment variable).
If the idea is alright with you, I'd be willing to put my own changes (modified to respect the environment variable) in as a pull request. Thanks for your time!
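A sketch of the idea (session_kwargs is a made-up helper): collect the optional profile/region into keyword arguments so that boto3.session.Session(**session_kwargs(...)) honours an explicit profile but still falls back to the default credential chain.

```python
def session_kwargs(profile_name=None, region=None):
    # Only pass what the caller actually set; an empty dict lets boto3
    # fall back to AWS_PROFILE / instance-profile credentials.
    kwargs = {}
    if profile_name:
        kwargs["profile_name"] = profile_name
    if region:
        kwargs["region_name"] = region
    return kwargs
```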
Has anyone thought about how this could work with DDB cross region replication?
I guess as long as there is a wrapped data key per region which unwraps to the same data key value, this should work?
When you try to store a value that begins with a -, the program does not execute as expected.
python --version: Python 2.7.11
uname -a: Darwin ip-192-168-128-62.ec2.internal 15.4.0 Darwin Kernel Version 15.4.0: Fri Feb 26 22:08:05 PST 2016; root:xnu-3248.40.184~3/RELEASE_X86_64 x86_64
credstash put a.b.c "-Putanythinginherethatstartswitha-"
usage: credstash put [-h] [-k KEY] [-v VERSION] [-a] credential value [context [context ...]]
credstash put: error: too few arguments
Please advise.
It would be nice to have the ability to use custom backends to store and retrieve data such as etcd, redis and mysql.
Hi, I would like to see this tool to have a changeable schema.
For instance, I would like to store the keys based on the environment and project.
I can do that by providing context, but if the key name is the same for a different project then I currently don't have any other option.
For instance:
credstash put keySL value1 environment=stg project=SL
credstash put keySL 1value environment=stg project=DD
ERROR: authKeyIdTRDSL version 0000000000000000001 is already in the credential store. Use the -v flag to specify a new version
It would be good to see some sort of ACL capability, for example based on IAM groups, where Test applications would not have access to Production secrets, etc.
As an example "myapp.db.prod" secret should be available only to IAM Groups "Admins" and "Production Environment" but not to "Test Environment" and "Developers".
Admin group user should be able to list all secrets in the store.
Maybe somehow doable via transparently managing multiple KMS master keys?
Right now credstash uses the environment variable AWS_DEFAULT_REGION or the -r CLI parameter to specify in which region credstash should operate. If none of those are available it defaults forcibly to us-east-1 (this is hardcoded).
Boto3 will use one of the configured AWS profiles, for which a default region can be specified. Boto3 will use that region as the default when creating any client. Plus, Boto3 will use as a default the EC2 instance region when using Instance Profiles to fetch credentials.
Is there any reason to have us-east-1 hardcoded instead of relying on the default behaviour of boto3? I've already fallen into this trap a couple of times by forgetting to specify a region and not realising why it was not working.
With the 1.7 release, 'credstash put foo bar -a' only supports up to 10 versions.
After that, 'credstash put foo bar -a' always updates the value of the 10th version,
and 'credstash get foo' always returns the value of the 9th version.
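The symptom is consistent with versions being stored as unpadded strings: DynamoDB orders them lexicographically, and "9" sorts after "10". Zero-padding restores numeric ordering; a sketch of such a helper (padded_int mirrors what later credstash versions do, as I understand it):

```python
def padded_int(i):
    # Unpadded, "9" > "10" lexicographically, so version 10 is never seen
    # as the highest version. Fixed-width padding restores numeric order.
    return str(i).zfill(19)
```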
I was wondering if it would be possible to put in a feature request to add an additional getall format.
I would love to support the key/value format supported by this library. There are quite a few frameworks that support this format, would definitely be a huge win.
Example format:
DB_HOST=localhost
DB_PORT=3306
DB_USERNAME=root
DB_PASSWORD=root
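Rendering getall output in that dotenv shape is nearly a one-liner; a sketch (as_env is a made-up name):

```python
def as_env(secrets):
    # secrets: the dict a getall would return. Sorted for deterministic
    # output; values are emitted verbatim, one KEY=value pair per line.
    return "\n".join("{}={}".format(k, v) for k, v in sorted(secrets.items()))
```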
In addition to storing passwords I'd also like to store certificates (their contents). Will that be possible? Asking as I believe there is a limitation on input size?
Because InvalidCiphertextException isn't imported properly, or rather caught properly, credstash blows up with:
except boto.kms.exceptions.InvalidCiphertextException:
AttributeError: 'module' object has no attribute 'exceptions'
when kms.decrypt() raises an exception.
Decrypting a secret on python 3 gives an error message:
KMS ERROR: Decryption error b'CiDwY9zNRGfL4YKvl0S2YvKyCmmpRmCFJhlq6ZK2J1vAahLLAQEBAQB48GPczURny+GCr5dEtmLysgppqUZghSYZaumStidbwGoAAACiMIGfBgkqhkiG9w0BBwaggZEwgY4CAQAwgYgGCSqGSIb3DQEHATAeBglghkgBZQMEAS4wEQQMnMrQ0WcbAIolzHLPAgEQgFuHSyoaYRL+bZZQ4QEW/2HnO8uq5qIZYTAzmKesRuBE3emK+FTe85gT6uwHc4icKVKyMgW7Yf2rlxmDmoB5jh3a+rSPSss3spLfKE26uv5p4o7J+nXXuaCZHEPF' is not JSON serializable
this is the same whether the secret was put by python 2 or python 3.
Decrypt works just fine on python 2
Hi,
I was wondering if it's possible to get the latest version of master as a new release available through pip?
I was trying to get the --autoversion working, but I realized the version I have is 1.5.2 which does not contain that feature.
Thanks
I struggled with setting up cross-account access for a little while. I thought it would be useful if it were documented somewhere, but I'm not sure where it would go.
This tutorial was super helpful.
In short:
{
"Version": "2012-10-17",
"Statement": [
{
"Action": [
"kms:Decrypt"
],
"Effect": "Allow",
"Resource": "arn:aws:kms:us-east-1:AccountA:key/b535d0f6-299f-4499-bf69-c19c63d822b1"
},
{
"Action": [
"dynamodb:GetItem",
"dynamodb:Query",
"dynamodb:Scan"
],
"Effect": "Allow",
"Resource": "arn:aws:dynamodb:us-east-1:AccountA:table/credential-store"
}
]
}
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Principal": {
"AWS": "arn:aws:iam::AccountB:root"
},
"Action": "sts:AssumeRole"
}
]
}
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": "sts:AssumeRole",
"Resource": "arn:aws:iam::AccountA:role/name-of-role-from-accounta"
}
]
}
credstash --arn arn:aws:iam::AccountA:role/name-of-role-from-accounta get some_key_that_should_be_there
Awesome tool!
Question: your tool will utilise the AWS secret & access keys. How do we encrypt those? We wouldn't want to leave them in a config file or even as an environment variable.
Please advise. Thanks.
Here is a Go implementation of something similar to credstash: sneaker.
It may make sense to take a look at it and see if there are any ideas that we should use from it.
Running credstash.py using python 3 produces this stacktrace:
[alex:~/code/credstash] master ± python --version
Python 3.4.3
[alex:~/code/credstash] master ± python ./credstash.py
Traceback (most recent call last):
File "./credstash.py", line 482, in <module>
main()
File "./credstash.py", line 401, in main
if args.action == "delete":
AttributeError: 'Namespace' object has no attribute 'action'
Runs fine on 2.x.
Hey! I would like to use credstash to store secrets in different environments (staging, qa, prod). Initially, I thought I could use the context, and credstash would handle creating the duplicate rows and retrieving the correct credential given context. But after spending a bit of time seeing how the application works, I realize that this is not correct, and setting DBPASSWORD context=qa, then DBPASSWORD context=prod would cause a collision.
The present idea is to use different tables for each environment. This seems to be working, but I am wondering if there is a more recommended approach.
Thanks for this awesome utility!
Hello,
Working on getting this running for the first time and I seem to be stuck. I have done the following:
credstash setup
credstash put foo bar
KMS ERROR: Could not generate key using KMS key alias/credstash
I thought that it might have to do with alias/credstash, so I also tried credstash put foo bar -k credstash, but received the same error with the different alias name.
I seem to be missing something fundamental here, as I imagine this should just work.
There are several projects that provide credstash compatibility with other languages. These should be acknowledged & linked from the README:
Hi,
I love your project, very tidy/clean/simple!
However I'd appreciate your thoughts/rationale for creating credstash rather than using Hashicorp Vault.
TL;DR: When using getall -f csv the line endings are always set to \r\n, which is not appropriate depending on the environment. The issue seems to be related to the csv.writer used in the csv_dump() function in credstash. There is more information about the appropriate way to create those writers in the docs: https://docs.python.org/3/library/csv.html#csv.writer. There is a newline parameter which seems to be sort of required, but using the value specified there ('') doesn't seem to work either. I got it to work on my distro by setting newline to None. I still don't fully understand why.
Context and further details:
I've been trying to debug a really weird issue I found while trying to load all secrets at once.
I am launching my application using a script that loads the credstash secrets one by one, sets those on variables (with hardcoded names in the script) and exports those. Then it execs my application.
As I kept adding secrets, startup times started to lag, a few seconds per new variable. I found out that the issue was the Python interpreter's startup time: loading it once or twice is no problem, but loading it 10 or 12 times (once per key) was definitely noticeable.
I tried a new strategy: using getall and some scripting. Using a while loop to which I pipe my secrets, I export those variables. For simplicity the credstash key names are the same as the names of the variables being exported. Although I struggled a bit to get it to work (I hate shell scripting), in the end I managed with the following snippet (environment: Alpine Linux 3.4 using busybox's sh shell):
PARAMS="-r $AWS_DEFAULT_REGION -t $CREDSTASH_TABLE"
credstash $PARAMS getall -f csv > secrets.txt
while IFS=, read -r varname varvalue
do
if [ -z "$varname" ]
then
continue
fi
export "$varname=$varvalue"
done < secrets.txt
exec "$@"
(Further info: I tried not having to output the results of credstash to a file, using a here-doc instead of redirecting stdin from a file, but I've already lost many hours trying to figure out why on earth it doesn't work. Some help would be appreciated!)
The problem is that now all my variable values end in '\r'. Weird. I could trim it using tr or something like that, but I don't get why that happens when Python now technically uses universal newlines.
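A sketch of a writer-side fix that avoids the trailing \r entirely: csv.writer defaults its lineterminator to "\r\n" (per the csv docs), so setting it explicitly sidesteps the open()/newline question.

```python
import csv
import io

def csv_dump(secrets):
    # Build the CSV in memory with Unix line endings; csv.writer's default
    # lineterminator is "\r\n", which is where the stray "\r" comes from.
    buf = io.StringIO()
    writer = csv.writer(buf, lineterminator="\n")
    for name, value in sorted(secrets.items()):
        writer.writerow([name, value])
    return buf.getvalue()
```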
First: love your little utility!
Second: it seems that the code uses the default value (=1) for the initial counter value for CTR mode for each and every key and encrypted message, which is ... kind of OK.
Reusing the same initial counter value with the same key is bad, but credstash seems to use a new unique key for each encrypted secret it stores. That particular usage pattern doesn't introduce any vulnerability for the same counter value.
However, any possible future change to credstash, or any other application that uses the encrypted secrets in DDB and encrypts another secret with the same key, would have to change that practice and use different initial counter values.
Furthermore, any audit of the code will have security engineers cringe and they will have to spend additional effort to ensure that keys and counter values are not reused together.
Looking at the code, it feels that the changes to initialize the counter with a random value, and to store that value with the meta-data in the DDB-record, would not change the code significantly. This could even be implemented in a backwards compatible way (no record value for the initial counter value would default the value to 1).
Are you open to change the code to accommodate unique random initial counter values per key and per encrypted secret?
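A sketch of what per-secret randomization could look like (made-up helper; the integer would feed Counter.new(128, initial_value=...) and the hex form would be stored with the DynamoDB record, defaulting to 1 when absent for backwards compatibility):

```python
import os

def new_counter_iv():
    # 128 random bits for the CTR initial counter value, returned both as
    # the integer the cipher needs and as hex for storage with the record.
    raw = os.urandom(16)
    return int.from_bytes(raw, "big"), raw.hex()
```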
I'm trying to use credstash to retrieve credentials in an aws lambda function e.g.
from credstash import getSecret
password = getSecret(USERNAME)
But I'm getting the following error when the function runs:
Unable to import module '[module_name]': /var/task/Crypto/Cipher/_AES.so: invalid ELF header
I'm not sure what the cause is. My lambda function lives in a virtualenv, and when I deploy I copy the contents of lib/python2.7/site-packages/ into the zip. Do you have any ideas?
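The "invalid ELF header" usually means the native _AES.so in the zip was compiled on macOS (a Mach-O binary) while Lambda runs Linux. One hedged workaround is to build the dependencies inside a Linux container matching the Lambda runtime; the image and package names here are illustrative:

```shell
# Build pycrypto's native extension on Linux, then zip build/ into the
# Lambda package instead of the macOS-built site-packages copy.
docker run --rm -v "$PWD":/out amazonlinux \
  sh -c "yum -y install gcc python27-devel python27-pip && \
         pip install pycrypto credstash -t /out/build"
```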
Right now it appears to only work with the 'default' profile in ~/.aws/credentials, it'd be nice to support multiple credentials for those of us who have many AWS accounts to work with.
Great little utility! We are looking to utilize it but we need to constrain each service to specific encryption context. In your documentation you state the following:
"They are also useful for constraining access to a given credstash stored credential by using KMS Key Policy conditions and KMS Grant conditions. Doing so allows you to, for example, make sure that your database servers and web-servers can read the web-server DB user password but your database servers can not read your web-servers TLS/SSL certificate's private key."
Could you please elaborate or perhaps include an example of how one can use Key Grants to accomplish above?
Thanks,
Alex
A possibly good place to look at before defaulting to us-east-1
Hi there,
I've been looking for a tool like this or vault to manage the secrets for web applications.
It looks to be pretty popular and great tool. The one question I have is:
How do applications access the secrets using credstash?
Since this is a script that accesses DynamoDB and KMS to do the work, applications won't be using an API to get the secrets... What's the best or recommended approach to expose the secrets to applications?
Thanks,
In order to be able to provide access control to the credentials stored by credstash (somewhat like what's requested in Issue #13), credstash needs to be able to pass Encryption Contexts to the KMS API with get and put requests.
Doing so will allow for the creation of both KMS Key Policies and KMS Grants which constrain access to specific credentials.
I've put together PR #18 to provide this added functionality.
It looks like PyCrypto has not been maintained for more than a year.
Is there something we can replace pycrypto with?
It would be ideal to have an additional input value for the database to store a reference. This could be used to do things like tag who made a change or why the change is made.
For example assume I want to allow a secret to be created at an EC2 instance bootstrap and I want the instance to add the secret at bootstrap using an IAM role. I may want to provide a reference value of who owns the new secret based off an EC2 instance tag with the name of the instance owner.
As another example I may want to update a password due to a standard rotation policy or password compromise. I would like to have my put transaction allow a comment stating "Standard Rotation yyyy-mm-dd" or "password compromised"
I'm sorry, my change to using subparsers conflicts with the way the -i INFILE option works.
Previously, the value argument for the put action was optional. This got around the fact that it was only used for put (and not get, list, etc.) and that the value might be empty if the -i option was used.
Currently at credstash 1.3.1 if you execute
credstash put -i /etc/hosts myhostsfile
it will fail because it's expecting a value argument after myhostsfile.
So, fixes for this:
1. Make value optional again and ignore it if the -i argument is present. This sounds ugly.
2. If the value starts with an @ character, treat it as a filename pointing at the file containing the value. This can be seen in curl options like --data, --form, --write-out and others. This solves the problem, but isn't backwards compatible: anyone already using the -i syntax to pass file contents in as a value would stop working.
I'm happy to code this up (as the bug is my fault), but I want to find out what direction to go in first.