WeirdAAL (AWS Attack Library)
Documentation available on the wiki.
output is too unwieldy... needs to dump an HTML or text file listing what the key has permissions to do
need an s3 function to put a file into a specified s3 bucket
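A minimal sketch of such a put function, assuming boto3 creds are already configured elsewhere; the bucket/prefix names are whatever the caller supplies:

```python
import os


def build_s3_key(prefix, filename):
    """Join an optional key prefix and a filename into an S3 object key."""
    return "/".join(p.strip("/") for p in (prefix, filename) if p)


def put_file(bucket, local_path, prefix=""):
    """Upload local_path to s3://<bucket>/<prefix>/<basename>. Needs boto3."""
    import boto3  # lazy import so build_s3_key stays dependency-free
    key = build_s3_key(prefix, os.path.basename(local_path))
    boto3.client("s3").upload_file(local_path, bucket, key)
    return key
```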
Put together a file that works as a "hey, you know nothing except that you have access keys. Let's run this to get started"
Working on it in my fork: https://github.com/cktricky/weirdAAL/blob/master/weirdAAL.py
Could not connect to the endpoint URL: "https://discovery.us-east-1.amazonaws.com/"
[-] No discovery actions allowed [-]
none of the code handles session tokens... this might bite me in the ass later.
most likely need to add session=None in some places. it will be required for any assume-role work so we can use the creds it spits out
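One possible shape for that session handling, sketched with an optional token kwarg (the function names are just illustrative):

```python
def session_kwargs(access_key, secret_key, session_token=None):
    """Build kwargs for boto3.session.Session(), only including the token
    when one is present (e.g. from sts:AssumeRole output)."""
    kwargs = {
        "aws_access_key_id": access_key,
        "aws_secret_access_key": secret_key,
    }
    if session_token:
        kwargs["aws_session_token"] = session_token
    return kwargs


def get_session(access_key, secret_key, session_token=None):
    """Return a boto3 Session that works for both static and temp creds."""
    import boto3  # lazy import; session_kwargs above is pure/testable
    return boto3.session.Session(
        **session_kwargs(access_key, secret_key, session_token))
```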
boto3 will supposedly support a proxy...might be nice to support this for evasion
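boto3 clients do accept a botocore Config with a proxies mapping, so the evasion idea could be wired in roughly like this (proxy URL is a placeholder):

```python
def proxy_settings(proxy_url):
    """Proxy mapping in the shape botocore.config.Config(proxies=...) expects."""
    return {"http": proxy_url, "https": proxy_url}


def proxied_client(service, proxy_url, region="us-east-1"):
    """Build a boto3 client whose traffic is routed through proxy_url."""
    import boto3
    from botocore.config import Config
    return boto3.client(service, region_name=region,
                        config=Config(proxies=proxy_settings(proxy_url)))
```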
CG mentioned some git voodoo that will keep the config.py file but not track any updates to it.
Through some experimentation, we realized it would be beneficial to automate listing all topics and their subscribers at once.
The credential check should ideally only occur if you are actually doing something meaningful, i.e., working with a module. So that's the first change: moving that logic into the if (args.module) section.
Secondly, we need to determine if the module being called is for AWS or for GCP and then perform the relevant credential check. This is because we have no reason to perform an AWS credential check if we're using a GCP module and vice versa.
--list (list modules) doesn't work without a valid AWS key.
python3 weirdAAL.py --list
The AWS Access Keys are not valid/active
Check the above error message and fix to use weirdAAL
this should probably work or run before the access check. Thoughts, @cktricky?
python3 weirdAAL.py -m s3_download_file -a 'bucket.whatever', 'logs.sql.gz' -t test
Works as it downloads the file to the root of the 'loot' directory
python3 weirdAAL.py -m s3_download_file -a 'bucket.whatever/path/tofile/', 'logs.sql.gz' -t test
Fails - the module doesn't check for the existence of the path && create it if it doesn't exist
workaround if you get here before it's fixed is to just mkdir the file path and it will download
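The fix is basically os.makedirs with exist_ok=True before writing; a sketch of the path handling (the loot-dir layout is an assumption):

```python
import os


def download_path(loot_dir, bucket_path, filename):
    """Local destination for an S3 object, creating intermediate dirs
    (e.g. loot/bucket.whatever/path/tofile/) as needed."""
    target_dir = os.path.join(loot_dir, *bucket_path.strip("/").split("/"))
    os.makedirs(target_dir, exist_ok=True)  # no-op if it already exists
    return os.path.join(target_dir, filename)
```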
The corresponding wiki page for S3 usage examples does not seem to exist
need a requirements.txt and some virtualenv & install instructions
due to the way we developed the codebase, the client-exception handling is kinda all over the place. need to investigate building a single function to handle all the various exceptions. this should clean up the codebase and make things more consistent.
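One way to centralize it: a decorator that every lib function gets wrapped in. The sketch below is generic over exception types; the real thing would pass in botocore's ClientError / EndpointConnectionError:

```python
import functools


def handle_client_errors(*exc_types):
    """Decorator: catch the given exception types (intended for
    botocore.exceptions.ClientError etc.), print a consistent message,
    and return None instead of crashing the module."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except exc_types as e:
                print("[-] {}: {} [-]".format(func.__name__, e))
                return None
        return wrapper
    return decorator
```

Usage would then be a one-line change per lib function, e.g. `@handle_client_errors(ClientError)` above each `def`.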
make code pydoc friendly so we/people can generate docs for the project
https://stackoverflow.com/a/13050049
if you haven't created the dbs as part of setup, when you run modules they will run until it goes to write to the db (that doesn't exist). it's a graceful error but not awesome. we should probably check the db is setup when running anything except for help
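A possible pre-flight check before running anything except help (the table names here are guesses, not the actual weirdAAL schema):

```python
import sqlite3


def db_ready(db_name, required_tables=("awskeys", "services")):
    """True if the sqlite db exists and contains the expected tables.
    Table names are assumptions -- match them to the real setup script."""
    try:
        conn = sqlite3.connect(db_name)
        rows = conn.execute(
            "SELECT name FROM sqlite_master WHERE type='table'")
        existing = {r[0] for r in rows}
        conn.close()
        return all(t in existing for t in required_tables)
    except sqlite3.Error:
        return False
```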
We need dynamic global variables (already took care of the db_name)
Hi Guys,
Got the error message below, but there is no log file to see the actual error:
Check the above error message and fix to use weirdAAL
Need to include this - it's important to cut most notifications off at the knees. SNS is essentially the Achilles heel of AWS: most if not all services rely on it to warn administrators that something is amiss.
in aws_pwn you can call the metadata URL from EC2 if you have creds. that's useful. create a function/script to do that
see:
https://github.com/dagrz/aws_pwn/blob/master/elevation/bouncy_bouncy_cloudy_cloud.py
https://github.com/dagrz/aws_pwn/blob/master/elevation/assume_roles.py
useful functions around listing roles and assuming roles for an account
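The metadata call itself is simple enough; a stdlib-only sketch (this only resolves from inside an EC2 instance, and the paths shown are standard IMDSv1 ones):

```python
import urllib.request

METADATA_BASE = "http://169.254.169.254/latest/meta-data/"


def metadata_url(path=""):
    """URL for an instance metadata path, e.g. 'iam/security-credentials/'."""
    return METADATA_BASE + path.lstrip("/")


def get_metadata(path="", timeout=2):
    """Fetch instance metadata; only works from inside an EC2 instance."""
    with urllib.request.urlopen(metadata_url(path), timeout=timeout) as r:
        return r.read().decode()
```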
still have a few s3 modules in the old format. fix them up along with the s3 lib/modules
list_services_by_key only lists for the CURRENT key. it would be nice to be able to pass an arbitrary access key ID to the module via the -a argument, to list data for any key in the database
Making sure to leave a note re: @carnal0wnage's and my discussion last night re: conventions.
Background: now that we have both GCP and AWS functionality, we might have a situation where a method is named the same between the two. So imagine a GCP method named module_do_something and an AWS module also named module_do_something. This would be less than ideal.
So going forward, our thought was to do one of the following:
module_aws_do_something and module_gcp_do_something (prepending module_<cloudservice>), or
aws/module_do_something and gcp/module_do_something.
Anyways, recording this conversation in case we need to figure out later why we chose whatever convention we go with (which - I believe we're leaning towards module_aws_ or module_gcp_).
Could not connect to the endpoint URL: "https://devicefarm.us-east-1.amazonaws.com/"
[-] No devicefarm actions allowed [-]
need to verify which regions have DeviceFarm endpoints and test those in a loop
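A sketch of the loop; rather than hardcoding regions, boto3's session.get_available_regions('devicefarm') can supply the list of regions that actually advertise the service:

```python
def candidate_endpoints(service, regions):
    """Endpoint hostnames to probe for a service across regions."""
    return ["https://{}.{}.amazonaws.com/".format(service, r)
            for r in regions]


def regions_with_endpoint(service="devicefarm"):
    """Ask botocore's endpoint data which regions offer the service."""
    import boto3
    return boto3.session.Session().get_available_regions(service)
```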
right now only 2 of the functions in the EC2 module log to the DB. ALL the modules should log results.
do a quick trial of guardduty and run recon_all against it
if you just DL the repo, install deps, and run weirdAAL.py -h, you'll get something similar to the below
(weirdAAL-python) lookupfailed-2:weirdAAL-gcp CG$ python3 weirdAAL.py
Traceback (most recent call last):
File "weirdAAL.py", line 27, in <module>
exec("from %s import *" % module)
File "<string>", line 1, in <module>
File "/Users/CG/Documents/pentest/weirdAAL/modules/aws/cloudtrail.py", line 4, in <module>
from libs.cloudtrail import *
File "/Users/CG/Documents/pentest/weirdAAL/libs/cloudtrail.py", line 22, in <module>
AWS_ACCESS_KEY_ID = credentials.access_key
AttributeError: 'NoneType' object has no attribute 'access_key'
this is because nothing can be loaded since the .env load is failing. just do the
cp env.sample .env
and you should be able to get further with the start process
need initial setup docs and module docs
Could not connect to the endpoint URL: "https://data.mediastore.us-east-1.amazonaws.com/"
[-] No mediastore-data actions allowed [-]
added a function that should allow the deletion of an MFA device, need to test it
from lunch...
we will log the results of the recon module to the DB. we need some functions/modules that will check what services and sub-services the key has access to and either suggest or execute follow-on activity. once the data is in the DB, this should be relatively easy to do
possible flow...
weirdaal.py --recon
--populates awskey, service, sub_service
weirdaal.py --show_services
EC2, DescribeInstances
EC2, DescribeVolumes
EMR, DescribeRepositories
...
weirdaal.py --suggest
EC2, DescribeInstances, list instances module / action
...
EMR, DescribeRepositories, list repositories module / action
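The --suggest piece could be little more than a lookup table over the recon rows; the module names below are made up for illustration, not actual weirdAAL modules:

```python
# Map (service, sub_service) rows from the recon table to follow-up modules.
# Module names here are illustrative placeholders.
SUGGESTIONS = {
    ("EC2", "DescribeInstances"): "ec2_describe_instances",
    ("EC2", "DescribeVolumes"): "ec2_describe_volumes",
    ("EMR", "DescribeRepositories"): "emr_describe_repositories",
}


def suggest(recon_rows):
    """Yield (service, sub_service, suggested_module) for rows we know about."""
    for service, sub_service in recon_rows:
        module = SUGGESTIONS.get((service, sub_service))
        if module:
            yield service, sub_service, module
```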
raise ConnectTimeout(e, request=request)
botocore.vendored.requests.exceptions.ConnectTimeout: HTTPSConnectionPool(host='importexport.amazonaws.com', port=443): Max retries exceeded with url: / (Caused by ConnectTimeoutError(<botocore.awsrequest.AWSHTTPSConnection object at 0x10a76fb00>, 'Connection to importexport.amazonaws.com timed out. (connect timeout=60)'))
ec2
snapshot volume
snapshot memory
start ecs or ec2 with required tools and mount volume
steal the shit
s3
need a module to list all the targets in the database without having to sqlite3 into the database.
if you've been using weirdAAL against other targets, there's a good chance you don't remember the old ones
listing enabled
world write
world read
policy allows for change but bucket doesn't have the perms
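The target-listing module could be a short query; a sketch (the table/column names are guesses at the schema, so adjust to the real one):

```python
import sqlite3


def list_targets(db_name):
    """Distinct target names recorded in the db, sorted alphabetically.
    'recon'/'target' are assumed names -- match the real weirdAAL schema."""
    conn = sqlite3.connect(db_name)
    try:
        rows = conn.execute(
            "SELECT DISTINCT target FROM recon ORDER BY target")
        return [r[0] for r in rows]
    finally:
        conn.close()
```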
Could use a way to pass arguments into our step_ methods.
we should do our best to make sure this code works for both Python 2 and 3, or decide to make it Python 3-only right now
We should institute a -s (show all modules) type function. Chris suggested a -m show_all_modules module, and I proposed that in addition we add a command-line switch of -s for new users.
weirdAAL would say the key is invalid but move on to the module, which would usually throw an error if we weren't handling it correctly
now that Ken has the global targets stuff working, update the DBs to use it
write some code to determine what, if any, logging is going on.
cktricky's code does most of this and it's partially done
since buckets are unique to regions we need to loop thru them all
This is so unbelievably dirty that it would be awesome to have a working module to backdoor code and re-upload
update_function_code(**kwargs)
http://boto3.readthedocs.io/en/latest/reference/services/lambda.html#Lambda.Client.update_function_code
total wishlist item though
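A hedged sketch of the wishlist item: build the deployment zip in memory and hand it to update_function_code. The handler filename assumes a Python runtime, the function name is a placeholder, and this is obviously destructive against a real function:

```python
import io
import zipfile


def deployment_zip(handler_source, filename="lambda_function.py"):
    """In-memory zip bytes suitable for update_function_code(ZipFile=...)."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr(filename, handler_source)
    return buf.getvalue()


def backdoor_function(function_name, handler_source):
    """Overwrite a Lambda function's code (wishlist item; destructive)."""
    import boto3
    return boto3.client("lambda").update_function_code(
        FunctionName=function_name,
        ZipFile=deployment_zip(handler_source))
```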
Hi Chris and contributors,
Thank you for all your great work here! I was wondering if there is existing or planned support for leveraging the credentials file in the ~/.aws directory?
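For what it's worth, the shared credentials file is plain INI, so profiles can be enumerated with configparser (and boto3 can consume one directly via boto3.session.Session(profile_name=...)); a sketch assuming the default ~/.aws/credentials location:

```python
import configparser
import os


def read_profiles(path=None):
    """Profiles and their keys from the AWS shared credentials file."""
    path = path or os.path.expanduser("~/.aws/credentials")
    parser = configparser.ConfigParser()
    parser.read(path)  # silently yields no sections if the file is missing
    return {section: dict(parser[section]) for section in parser.sections()}
```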
describe all instances in lightsail
Using the module ec2_describe_instances_basic, it would be nice to know which regions were tested that resulted in errors like the ones below:
An error occurred (OptInRequired) when calling the DescribeInstances operation: You are not subscribed to this service. Please go to http://aws.amazon.com to subscribe
AWSKEYHERE : (AuthFailure) when calling the DescribeInstances -- key is invalid or no permissions for region.
We should also add the ability to add ourselves as a subscriber to a topic. I'm thinking just email at first. Maybe later we expand to all available options.
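The subscribe call itself is a one-liner in boto3; a sketch (topic ARN and email are placeholders, and note the target address receives a confirmation email that must be clicked before the subscription is active):

```python
# Common protocols SNS accepts for subscriptions; email first, per the note.
SNS_PROTOCOLS = {"http", "https", "email", "email-json", "sms", "sqs",
                 "application", "lambda"}


def subscribe_email(topic_arn, email, region="us-east-1"):
    """Subscribe an email address to an SNS topic."""
    import boto3
    client = boto3.client("sns", region_name=region)
    return client.subscribe(TopicArn=topic_arn, Protocol="email",
                            Endpoint=email)
```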
everyone seems to have
elasticbeanstalk:DescribeApplications
elasticbeanstalk:DescribeApplicationVersions
elasticbeanstalk:DescribeEnvironments
elasticbeanstalk:DescribeEvents
opsworks:DescribeStacks
route53:ListGeoLocations
sts:GetCallerIdentity
but rarely have anything there, except for the sts one.
write a check that covers these specific things so it can be a suggested follow-up.
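That check could be a table of (service, method) probes; the boto3 method names below correspond to the actions listed above, and anything that throws is treated as denied:

```python
# Read-only calls that most keys seem to be allowed; (boto3 service, method).
LOW_PRIV_CHECKS = [
    ("elasticbeanstalk", "describe_applications"),
    ("elasticbeanstalk", "describe_application_versions"),
    ("elasticbeanstalk", "describe_environments"),
    ("elasticbeanstalk", "describe_events"),
    ("opsworks", "describe_stacks"),
    ("route53", "list_geo_locations"),
    ("sts", "get_caller_identity"),
]


def run_checks(session, region="us-east-1"):
    """Try each call and collect the ones the key is allowed to make.
    `session` is a boto3.session.Session; any exception counts as denied."""
    allowed = []
    for service, method in LOW_PRIV_CHECKS:
        client = session.client(service, region_name=region)
        try:
            getattr(client, method)()
            allowed.append((service, method))
        except Exception:  # ClientError, endpoint errors, etc.
            pass
    return allowed
```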