
A tool to backup and restore AWS Connect, with some useful other utilities too

License: MIT License


connect-backup

A tool to back-up and restore AWS Connect. You can back-up to file, S3 or just to stdout.

usage: connect-backup --instance=INSTANCE [<flags>] <command> [<args> ...]

A tool to backup and restore your AWS Connect instance

Flags:
  -h, --help               Show context-sensitive help (also try --help-long and --help-man).
      --profile=PROFILE    AWS credentials/config file profile to use
      --region=REGION      AWS region
      --instance=INSTANCE  The AWS Connect instance id to backup
  -v, --version            Show application version.

Commands:
  help [<command>...]
    Show help.

  backup [<flags>]
    Backup your instance

  restore --type=TYPE [<flags>] <json>
    Restore a connect component

  rename-flows [<flags>]
    Rename all call flows with a suffix
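
For example, a typical backup of a single instance to a local directory looks something like this (the profile, region and instance id are placeholders; check backup --help for the full set of backup flags):

connect-backup --profile my-profile --region ap-southeast-2 \
               --instance 12345678-aaaa-bbbb-cccc-1234567890ab \
               backup --file ./connect-backups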

Getting connect-backup

The easiest way to install, if you're on a Mac or Linux (amd64 or arm64), is to use Homebrew.

Type:

brew tap sethkor/tap
brew install connect-backup

For other platforms, take a look at the releases on GitHub. I build binaries for:

OS            Architecture
Mac (Darwin)  amd64 (aka x86_64)
Linux         amd64, arm64, 386 (32 bit)
Windows       amd64, 386 (32 bit)

Let me know if you would like a particular OS/arch binary regularly built.

Lambda

If you'd rather set up a lambda to periodically trigger a backup, clone the repo as it contains all the lambda bits and a template to use with AWS SAM to deploy it. You will need to update the env.mk file with the values for your environment.

Then just simply:

make lambda
make sam-deploy

You can either set environment variables for the lambda or trigger the lambda with an event json that contains the connect instance id and S3 bucket URL like this:

{
  "ConnectInstanceId": "your-AWS-connect-instance-id",
  "S3DestURL": "s3://your-backup-bucket/whatever-prefix-you-want-like-the-instance-id",
  "FlowsRaw": true
}

FlowsRaw, which is boolean and doesn't need quotes, follows the same logic as --flows-raw on the command line (see below), where the contact flow is also written to its own file in S3 as pretty-printed json. If the value is omitted it is treated as false.

ConnectInstanceId is only required if you wish to back up a specific connect instance. Omitting it will back up all instances (IAM policy permitting).

The SAM template in lambda/template.yaml contains a single sample AWS::Events::Rule with an Input that constructs this JSON. You can add additional AWS::Events::Rule resources to back up other connect instances (or the same one to different destinations). If you are using the same backup bucket for multiple connect instances, try adding the connect instance id as a prefix in the S3DestURL value of the json.

You can also specify the connect instance id and S3 destination URL as environment vars and leave the event blank. This provides some backward compatibility with earlier generations of this lambda, which relied solely on environment vars.
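
To check the deployed lambda handles your event, one option is to invoke it directly with the AWS CLI. This is just a sketch: the function name below is a placeholder for whatever name your SAM stack gives the function, and event.json is the payload shown above saved to a file.

# Invoke the backup lambda once with a test event (placeholder function name)
aws lambda invoke \
  --function-name your-connect-backup-function \
  --cli-binary-format raw-in-base64-out \
  --payload file://event.json \
  out.json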

If you want to undeploy you can run:

make sam-remove

but remember to make sure your bucket is empty first (including all object versions), otherwise you won't be able to delete the stack.
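
If the bucket is versioned, deleting just the current objects is not enough. A minimal sketch (the bucket name is a placeholder):

# Remove the current objects (placeholder bucket name)
aws s3 rm s3://your-backup-bucket --recursive
# Versioned buckets also keep old object versions and delete markers; remove
# those too (e.g. via aws s3api list-object-versions / delete-objects)
# before running make sam-remove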

What is included in the backup

  • Published Call Flows (The AWS API restricts this to published flows only)
  • Raw Call flows as json objects without AWS Connect provisioning metadata
  • Flow Modules
  • Routing Profiles including Routing Profile Queues
  • User Data (except Passwords)
  • User Hierarchy Groups
  • User Hierarchy
  • Prompt Data (But not any wav files)
  • Hours of Operation
  • Quick Connects
  • Queues (except the default AGENT queue)
  • Instance
  • Instance Attributes
  • Lambda Functions ARN
  • Lex Bots ARN (But there seems to be a bug in the AWS API with no results being returned. It's only in preview)

For contact flows, the actual flow is a json object encapsulated within the connect flow json object. If you also wish to export just the flow as a json object, pass the --flows-raw flag and the contact flow itself will be written as a separate json in the flows-raw directory or prefix. This separate raw flow is for informational purposes only and is not involved in restoration.
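
For example (the instance id and destination are placeholders; check backup --help for the full set of backup flags):

connect-backup --instance 12345678-aaaa-bbbb-cccc-1234567890ab \
               backup --file ./connect-backups --flows-raw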

connect-backup uses a directory/prefix (see what I did there?) structure so everything is neat and tidy. If the structure is not there it will be created on the fly:

your-connect-backup-workspace
   └──your-connect-instance-id
       ├──common
       ├──flows
       ├──flows-raw
       ├──hours-of-operation
       ├──prompts
       ├──quick-connects
       ├──routing-profile-queues
       ├──routing-profiles
       ├──user-hierarchy-groups
       └──users

If you wish to only back up or export a single contact flow, pass --flow-name to the backup command.

The default behaviour is to back up every connect instance found unless you specify an instance with --instance.
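
For example (names and ids are placeholders; check backup --help for the exact flags on your version):

# Back up just one published flow by name
connect-backup --instance 12345678-aaaa-bbbb-cccc-1234567890ab \
               backup --file ./connect-backups --flow-name "My Inbound Flow"

# Omit --instance to back up every instance the credentials can see
connect-backup backup --file ./connect-backups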

Restoration

You can restore AWS Connect elements you have previously backed up:

  • Published Call Flows (The AWS API restricts this to published flows only)
  • Routing Profiles including Routing Profile Queues
  • User Data (except Passwords)
  • User Hierarchy Groups
  • User Hierarchy

The --create flag will allow you to create a new element, rather than overwriting the existing one.

If you choose to restore to a new call flow name via --create, you can only do this once for that name. If you wish to overwrite this new flow with another restore, omit --create as for a normal overwrite restoration.

When restoring Users, in order for the restoration to be reflected in the AWS Connect Console, you must refresh the User Management screen. This is because the console uses the listing on this screen as a cache of the underlying data.

You can use the restore function for a user to update the user's first/last name by editing the json backup file. You can't do this via the AWS Connect Console at all.
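
For example, a quick way to edit the name fields before restoring is with jq. The IdentityInfo field names here are an assumption about the backup format (that it mirrors the Connect User object), so inspect your own backup file first; the file path is a placeholder.

# Hypothetical field names - inspect your backup json to confirm
jq '.IdentityInfo.FirstName = "Jane" | .IdentityInfo.LastName = "Doe"' \
   users/some-user.json > some-user-updated.json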

If you use the --create flag when restoring a user, a new user will be created with the user id passed to --create. The password will be set to a long random string (64 chars, mixed case letters, symbols and numbers included) which won't be returned, so you will have to instruct the user to go through the password reset process to set it. If the user already exists, the user will not be recreated or updated.
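
A minimal restore sketch, assuming --type accepts a user type and --create takes the new user's login id as described above (check connect-backup restore --help for the exact type names and syntax; ids are placeholders):

connect-backup --instance 12345678-aaaa-bbbb-cccc-1234567890ab \
               restore --type=user --create=new.agent.login users/some-user.json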

Restoring to another connect instance

You can restore very simple flows to another connect instance using the --create flag with a flow name. Only flows that do not reference any other resources can be restored to another instance at the moment. Referenceable resources are:

  • Announcements with wav
  • Lambda functions
  • Queues
  • Lex bots
  • Anything else that has an ARN in it

This might make restoration to another instance seem pointless, as most contact flows incorporate at least one of these. The limitation is that each ARN contains things like the source AWS account, instance id and resource id, which all need to be manipulated before restoration is possible. I am working on a solution for this at the moment. It also means the types of resources that are referenced must themselves be restorable to a different instance.

Some resources, in particular wav files, can never be backed up; AWS Connect does not support this. Nor is there an API call available to create a new announcement with a WAV file; this can only be done from the AWS console. Before restoring any call flow that uses a WAV file, remember that YOU MUST manually create the announcements with the WAV via the AWS console first.

Dynamic IDs

Any dynamic id usage will require the logic you implement to generate the dynamic id to handle both the account the id belongs to and the resource id itself. This tool cannot assist with that; you should implement your dynamic id logic so that it is abstract enough to handle it.

Renaming all the contact flows

AWS Connect won't let you delete any contact flows. Ever. Every new instance you create also comes with a bunch of example contact flows you can never delete either, which jumbles the contact flows you create and work with every day in with the examples and makes things annoyingly hard to find. The rename-flows command adds a prefix to the default set of AWS demo contact flows created when your AWS Connect instance is first set up, which helps with sorting and puts all the example flows at the bottom of your contact flow list. If you wish to rename all contact flows, pass the --all-flows flag.

The default prefix is ~ (you can supply a different one to use), which means the renamed flows sort to the bottom. You can run this when you first create a connect instance or any time after. As the Name is really only metadata, renaming flows won't impact any references or live call flows.
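
For example (the instance id is a placeholder; see rename-flows --help for the flag that sets a custom prefix):

# Rename just the default AWS demo flows with the default ~ prefix
connect-backup --instance 12345678-aaaa-bbbb-cccc-1234567890ab rename-flows

# Rename every contact flow instead
connect-backup --instance 12345678-aaaa-bbbb-cccc-1234567890ab rename-flows --all-flows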

IAM Policy

You will need the following IAM access at a minimum. The Lambda example deploys this policy for you. The resource scope is left as * to cover the use case of backing up all connect instances, but the scope can be limited to a particular instance only (as per the commented-out Resource lines below).

            - Effect: Allow
              Action:
                - s3:PutObject
                - s3:PutObjectACL
              Resource: !GetAtt s3Bucket.Arn
            - Effect: Allow
              Action:
                - ds:DescribeDirectories
              Resource: "*"
            - Effect: Allow
              Action:
                - connect:ListInstances
              Resource: "*"
            - Effect: Allow
              Action:
                - connect:ListContactFlow
                - connect:ListRoutingProfiles
                - connect:ListUserHierarchyGroups
                - connect:ListUsers
                - connect:ListPrompts
                - connect:ListHoursOfOperations
                - connect:ListQueues
                - connect:ListLambdaFunctions
                - connect:DescribeUserHierarchyStructure
                - connect:DescribeInstance
                - connect:DescribeQueue
              Resource: !Sub "arn:aws:connect:${AWS::Region}:${AWS::AccountId}:instance/*"
#              Resource: !Sub "arn:aws:connect:${AWS::Region}:${AWS::AccountId}:instance/${connectInstanceId}"
            - Effect: Allow
              Action:
                - connect:DescribeContactFlow
                - connect:ListContactFlows
              Resource: !Sub "arn:aws:connect:${AWS::Region}:${AWS::AccountId}:instance/*/contact-flow/*"
#              Resource: !Sub "arn:aws:connect:${AWS::Region}:${AWS::AccountId}:instance/${connectInstanceId}/contact-flow/*"
            - Effect: Allow
              Action:
                - connect:DescribeUser
              Resource: !Sub "arn:aws:connect:${AWS::Region}:${AWS::AccountId}:instance/*/agent/*"
#              Resource: !Sub "arn:aws:connect:${AWS::Region}:${AWS::AccountId}:instance/${connectInstanceId}/agent/*"
            - Effect: Allow
              Action:
                - connect:DescribeRoutingProfile
              Resource: !Sub "arn:aws:connect:${AWS::Region}:${AWS::AccountId}:instance/*/routing-profile/*"
#              Resource: !Sub "arn:aws:connect:${AWS::Region}:${AWS::AccountId}:instance/${connectInstanceId}/routing-profile/*"
            - Effect: Allow
              Action:
                - connect:DescribeUserHierarchyGroup
              Resource: !Sub "arn:aws:connect:${AWS::Region}:${AWS::AccountId}:instance/*/agent-group/*"
#              Resource: !Sub "arn:aws:connect:${AWS::Region}:${AWS::AccountId}:instance/${connectInstanceId}/agent-group/*"
            - Effect: Allow
              Action:
                - connect:ListQuickConnects
              Resource: !Sub "arn:aws:connect:${AWS::Region}:${AWS::AccountId}:instance/*/transfer-destination/*"
#              Resource: !Sub "arn:aws:connect:${AWS::Region}:${AWS::AccountId}:instance/${connectInstanceId}/transfer-destination/*"
            - Effect: Allow
              Action:
                - connect:DescribeHoursOfOperation
              Resource: !Sub "arn:aws:connect:${AWS::Region}:${AWS::AccountId}:instance/*/operating-hours/*"
#              Resource: !Sub "arn:aws:connect:${AWS::Region}:${AWS::AccountId}:instance/${connectInstanceId}/operating-hours/*"

FAQ

Can I take a backup json and restore it manually via the AWS Connect Console?

No. The export/import function on the console uses a completely different format from the AWS API leveraged by connect-backup.

How about restoring a call flow export taken manually from the AWS Connect Console?

No. Like the question above, the formats are very different.

Can I restore to a different connect instance than the source?

No, not yet anyway. See Restoring to another connect instance above.

Can I back-up and restore saved flows?

No. Only published flows can be operated on. This is a limitation of the AWS API.

Why can't I restore routing profile queues?

The AWS API appears to have a bug with the UpdateRoutingProfileQueue API currently.

Why can't I restore a user hierarchy group to be empty?

The AWS API doesn't accept an empty or nil value for this currently.

What is the Raw Flow?

Contact flows are json objects stored within another json object. This means they are escaped and can't be parsed or read easily. The encapsulating json object also has other attributes (name, description etc.) that are needed for restoration. A raw flow takes this json object within the json object, unescapes it and pretty prints it so you have a better visual representation of your contact flow as a json object.
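
You can get much the same effect yourself with jq, assuming the inner flow sits in a Content field of the backed-up flow json (that field name is an assumption; inspect your backup file, and the path below is a placeholder):

# Pull the escaped inner flow out and pretty print it
jq -r '.Content' flows/my-flow.json | jq . > my-flow-raw.json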

I've found a bug, what do I do?

Report it and I'll take a look.

Do you accept feature requests?

Yes. Let me know what you would like to see and I'll consider adding it to the backlog.

Will you provide any other handy tools for AWS Connect?

Yes, now that we finally have an AWS API I'll add useful things over time. You may also want to take a look at my tool for provisioning AWS Lex chat bots via yaml/json, called Lexbelt.


connect-backup's Issues

backup/restore as part of ci/cd process?

Hi - first, thanks for creating this super useful tool.

I'm curious if it can be extended to support a traditional deployment model. For example, I would like to create Contact Flow(s) in a 'dev' AWS account ('dev' Connect instance), test/qa, and if approved, deploy the updated Contact Flow(s) to another AWS account ('prod' Connect instance). I see the json elements use the ARN, which includes the AWS account # and the Connect instance ID. Does the tool change these when doing a restore to another account/instance? If not, could that be added?

SSO profile failing

Hello. Thanks for making such a useful tool. Does connect-backup currently support SSO for aws profiles? I'm getting an error, but only when I try to use a profile I've signed in as using "aws configure sso":

2021/05/25 14:52:22 NoCredentialProviders: no valid providers in chain
caused by: EnvAccessKeyNotFound: failed to find credentials in the environment.
SharedCredsLoad: failed to load profile, MySSOProfileName.
EC2RoleRequestError: no EC2 instance role found
caused by: RequestError: send request failed

File Output Issues with Special Characters

During the backup of an instance, object names with certain special characters do not write to the file system properly.

Example:

A Connect instance with an Hours of Operation entry called "24/7" results in a failure, as it tries to write out a 24/7.json file and the OS interprets it as a path.

# connect-backup --instance REDACTED backup --file=ConnectBackup
2023/11/14 13:44:07 Backing up Hours
2023/11/14 13:44:07 open ConnectBackup//REDACTED/hours-of-operation/24/7.json: no such file or directory
2023/11/14 13:44:07 Failed to write to the destination

Suggestion: Escape or quote the file names before writing to disk, or strip path characters from them.
Suggestion 2: Change the name of the argument from --file to --directory or --folder

Unpublished contact flows interrupt the rest of the backup

Hi, thanks for making this tool.

I've encountered an issue when there is an unpublished contact flow. In the logs you get
2021/10/01 18:57:38 Failed to describe flow

When I ran describe on the contact flow manually, I found the reason was
An error occurred (ContactFlowNotPublishedException) when calling the DescribeContactFlow operation: Contact flow not published

Now when it hits this roadblock it doesn't back up the rest of the contact flows and moves straight on to backing up users. Is it possible to skip the unpublished contact flows and back up the other contact flows?
