dc-cli's Issues

Allow Import/Export/Info of Single Assets

This tool is written in a needlessly complicated way, seems unsupported, and is missing some pretty basic features, like dealing with single content-items. Would love to be proven wrong.

A massive quality-of-life improvement would be to add the ability to list content items by their folderId alone, "update" (i.e. import) content items by their UUID alone, and export content items given an array of UUIDs.
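
For illustration, the kind of invocations being requested might look like this (all of these options are hypothetical; they do not exist in the tool today):

dc-cli content-item list --folderId <folder-id>
dc-cli content-item export ./items --id <uuid-1>,<uuid-2>
dc-cli content-item import ./items --id <uuid>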

Attempting to use tool for republish of a particular content type

Attempting to perform a republish for a given content type (or group of content types) in order to trigger webhook background processing. The only way I can see to accomplish this with the available commands is an export followed by an import. But when I attempt the import with --republish, it fails due to the delivery key. If I exclude keys, a new content item is created instead, which is not the desired behaviour.
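
Roughly the sequence being attempted (flag names are approximate, based on the options mentioned above):

dc-cli content-item export ./to-republish --schemaId <content-type-schema-id>
dc-cli content-item import ./to-republish --republish
dc-cli content-item import ./to-republish --republish --excludeKeys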

Is this the correct process for accomplishing this task, or is there another method? This may be a feature request.

Can no longer export settings in 0.11

We were able to use settings export on 0.10 but this no longer works in 0.11, instead returning the following error:

Error: Cannot read property 'open' of undefined

This is on Windows; perhaps there has been a change to path resolution that means this no longer works?

Schema exports can overwrite one another

Schema exports can overwrite one another when you have two schema ids with the same ending, for example:

  1. https://host/segment1/test
  2. https://host/segment2/test

This will export the file .../content-type-schemas/test.json for both schemas; the one that is exported last is the one that remains in the file system post-export:

{
  "body": "./schemas/test-schema.json",
  "schemaId": "https://host/segment2/test",
  "validationLevel": "CONTENT_TYPE"
}

This also applies to the schema JSON files (see "body" above) and to exported content type files, where the segment is lost and exported files overwrite one another.

My suggestion would be to include the segment in the path so the exported naming is more like segment1-test and segment2-test.
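
A minimal sketch of the suggested naming, assuming the export filename is built from the schema ID's full URI path rather than only its last segment (the helper name is illustrative):

// Illustrative helper: derive an export filename such as "segment1-test.json"
// from a schema ID like "https://host/segment1/test", so schemas that share a
// final path segment no longer overwrite each other's files.
function exportFilename(schemaId: string): string {
  const segments = new URL(schemaId).pathname.split('/').filter(Boolean);
  return `${segments.join('-')}.json`;
}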

"dc-cli content item import --publish true" fails with "Failed to initiate publish for ..."

Issue
When you import content items and set the publish flag to true, the import of items works, but then you get the error message "Failed to initiate publish for ${item.label}: ${e.toString()}" for each content item.

Steps to replicate
Run the command dc-cli content item import --publish true {credentials}

Root cause
The auth client in src/common/import/publish-queue.ts:47 requires the AUTH_URL environment variable, which is not set by default.

Workaround
Set the AUTH_URL environment variable in your runtime environment.

Possible resolution paths
A) Initialise the auth client by using a configuration object, as happens within the ContentHub src/common/ch-api/ContentHub.ts:75.
B) Fall back to a default value in the client, as in src/common/ch-api/oauth2/services/OAuth2Client.ts:22
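
A minimal sketch of option B, assuming the client simply falls back to a default when AUTH_URL is absent (the default value shown is a placeholder, not the tool's actual endpoint):

// Illustrative fallback so publishing does not require AUTH_URL to be set.
// The real default should match whatever OAuth2Client.ts:22 already uses.
const DEFAULT_AUTH_URL = 'https://auth.example.com'; // placeholder, not the real endpoint
const authUrl = process.env.AUTH_URL ?? DEFAULT_AUTH_URL;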

Include export/import extensions for UI extensions

We're currently using the CLI to import/export content types, schemas and settings as a part of our deployment pipeline. The missing part for us is the import/export of UI extensions. It'd be great to get this functionality added as a part of the tool if possible.
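
For illustration, commands shaped like the existing export/import commands would cover this (hypothetical; extension commands do not exist in the tool today):

dc-cli extension export ./extensions
dc-cli extension import ./extensions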

Failing commands return a zero exit-code

When a command fails due to an error, we would expect this to also return a non-zero exit-code, so the error can be detected.

Currently it does not, so it "seems" like the command has succeeded when it hasn't. This is misleading in automated scenarios such as CI/CD pipelines, where we would want a pipeline to report an error if a dc-cli command fails.
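
A minimal sketch of the expected behaviour, assuming a single top-level dispatch point (runCommand here is a placeholder for the CLI's real command dispatch):

// Illustrative top-level handler: surface any command failure as a non-zero
// exit code so CI/CD pipelines can detect it.
async function runCommand(argv: string[]): Promise<void> {
  // placeholder for the real command dispatch
  throw new Error(`unknown command: ${argv.slice(2).join(' ')}`);
}

async function main(): Promise<void> {
  try {
    await runCommand(process.argv);
  } catch (err) {
    console.error(err instanceof Error ? err.message : String(err));
    process.exitCode = 1; // non-zero exit code marks the run as failed
  }
}

main();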

Improve content type syncing

Currently content type syncing requires an id but:

  1. This id isn't a value that is exported (understandable from a cross-environment perspective), and although it can be retrieved from the Dynamic Content URL, it would be nicer to allow it to also work with schemaId, like other parts of the CLI.
  2. There isn't an option to sync all content types, which would be really useful, for example when ensuring your environment is up to date (see the illustrative commands below).
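
For illustration, roughly the current form and the two requested forms (the --schemaId and --all options are hypothetical):

dc-cli content-type sync <content-type-id>
dc-cli content-type sync --schemaId <schema-id>
dc-cli content-type sync --all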

Include support for exporting webhooks

Currently the CLI has no commands for exporting webhooks, which would be really handy right now. As there are Management API calls for webhooks we could always run these ourselves, though it would be great to have this functionality as a simple CLI command.

Associated repos not updating

When running a Content Type export, it doesn't seem to update the associated repos list once it's been set.

I created the Content Type in the Hub with 3 repos ticked but realised later it should only have 2 ticked. Every time I run an export it returns 3, even though it's set to 2 in the Hub.

Facet by deliveryKey

Hi!
It would be nice to have a --facet option that checks whether an item has a deliveryKey, so you could export and re-import items with the --publish flag, e.g.:
dc-cli content-item export [dir] --facet "deliveryKey: *"

I've faced some issues while trying to clone an entire hub: items were moved correctly, but I couldn't publish them using the CLI tool.

feature request: allow import of single json file for content type (schema)

I'd like to request the ability to import a single content type / schema without importing a whole directory of json schemas.

This could probably make use of loadFileFromDirectory in import.service.ts (which seems to be completely unused at the moment).

Extend the commands/content-type-schemas/import.ts handler method to check whether the arg in argv is a directory or a JSON filename.

If the arg is a JSON filename:

import fs from 'fs';

// Wrap the single parsed file in the structure the directory-based importer expects
importedContentTypes = [
  new ContentTypeWithRepositoryAssignments(
    JSON.parse(fs.readFileSync(filename, 'utf-8'))
  )
];

or something idk

Mapping file can get out of sync

Since the mapping file is only written to the filesystem after a complete (and successful) import run, it gets out of sync when an error occurs during the import process, making it impossible to safely run the import again after an error.

After every push to Amplience, the mapping file should be saved to the filesystem.
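
A minimal sketch of the suggested fix, with hypothetical names for the mapping and the per-item import call (the real structures live in the content-item import code):

import { promises as fs } from 'fs';

// Hypothetical import loop: persist the source-to-destination id mapping after
// every successful push, so a failure part-way through keeps the file in sync.
async function importAll(
  items: { id: string }[],
  mapping: Map<string, string>,
  mappingFile: string,
  importItem: (item: { id: string }) => Promise<{ id: string }>
): Promise<void> {
  for (const item of items) {
    const created = await importItem(item);
    mapping.set(item.id, created.id);
    await fs.writeFile(mappingFile, JSON.stringify([...mapping.entries()], null, 2));
  }
}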

collisions on deliveryKey when importing a folder of content items

I keep running into collisions on deliveryKey when importing a folder of content items

running dc-cli --baseRepo 62b07a46c9e77c00010c38a5 content-item import ././wallpaperCSVs/wallpaper/MY_FOLDER_NAME --force --publish --batchPublish

The goal is to upsert NEW content items using dc-cli: if an item does not exist, create it; otherwise update the existing item (which is why I added --force).

I get this error on almost every 2nd or 3rd attempt at running the dc-cli import command. The only "fix" I know of is to remove the .json file that has the offending deliveryKey and attempt to re-import the folder again.

Existing mapping loaded from '/Users/robb/.amplience/imports/repo-62b07a46c9e77c00010c38a5.json', changes will be saved back to it.
Scanning structure and content in '././trbCSVs/blogs/trb-2019-schema-1680220742-clean-import' for repository 'Content'...
Done. Validating content...
Found 1 dependancy levels in 6 items, 0 referencing a circular dependancy.
Importing 6 content items...
ERROR: Failed creating Article - Ladies Turkey Hunt for Texas Rios:
Error: Request failed with status code 409: {"errors":[{"level":"ERROR","code":"CONTENT_ITEM_DELIVERY_KEY_DUPLICATE","message":"Required delivery key already exists on this hub","property":"body._meta.deliveryKey","entity":"ContentItem","invalidValue":"the-realblog-with-stephanie-mallory/ladies-turkey-hunt-for-texas-rios"}]}
Importing content item failed, aborting. Error: Error: Request failed with status code 409: {"errors":[{"level":"ERROR","code":"CONTENT_ITEM_DELIVERY_KEY_DUPLICATE","message":"Required delivery key already exists on this hub","property":"body._meta.deliveryKey","entity":"ContentItem","invalidValue":"the-realblog-with-stephanie-mallory/ladies-turkey-hunt-for-texas-rios"}]}
Log written to "/Users/robb/.amplience/logs/item-import-1680230116189.log".

Authorization token is not refreshed

When cloning or importing hubs with around 14k pages, the process never completes because the authorization token isn't refreshed.

Error: Request failed with status code 403: {"errors":[{"level":"ERROR","code":"FORBIDDEN","message":"Authorization required."}]}

clean hub does not clean the entire hub

Hello,

Unfortunately the hub clean command does not get rid of everything in the hub: statuses are not archived, and folders are not archived/deleted either. I'd expect clean hub to archive/delete everything.

Support `--force` for export

Export operations prompt if files need to be overwritten. It would be useful if the --force parameter could be used to bypass this prompt for the sake of scripting.
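
For example, the requested flag would allow scripted exports like these (--force on export is what this issue proposes, not an existing option):

dc-cli content-type-schema export ./schemas --force
dc-cli content-item export ./content --force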
