datocms / js-datocms-client
NodeJS/Browser client for DatoCMS
Home Page: https://www.datocms.com
License: MIT License
Here's the program:
const SiteClient = require('datocms-client').SiteClient;
const dato = new SiteClient("YOUR APIKEY");
dato.uploadFile('<path to a file larger than 10MB>')
.then(x => console.log(x));
Results in
(node:28475) UnhandledPromiseRejectionWarning: Error: Request body larger than maxBodyLength limit
at RedirectableRequest.write (/Users/mats-ola.persson/Work/yadayada/node_modules/follow-redirects/index.js:97:24)
at RedirectableRequest.end (/Users/mats-ola.persson/Work/yadayada/node_modules/follow-redirects/index.js:116:8)
at dispatchHttpRequest (/Users/mats-ola.persson/Work/yadayada/node_modules/axios/lib/adapters/http.js:272:11)
at new Promise (<anonymous>)
at httpAdapter (/Users/mats-ola.persson/Work/yadayada/node_modules/axios/lib/adapters/http.js:20:10)
at dispatchRequest (/Users/mats-ola.persson/Work/yadayada/node_modules/axios/lib/core/dispatchRequest.js:59:10)
at processTicksAndRejections (internal/process/task_queues.js:89:5)
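For context, the limit is enforced by axios' default `maxBodyLength`. If you control the axios request config (for instance, in a patched fork of the client), lifting the cap is straightforward — a minimal sketch; `withUnlimitedBody` is a hypothetical helper, not part of datocms-client:

```javascript
// axios rejects request bodies larger than maxBodyLength. Merging these
// options into the request config lifts that cap (and the response cap too).
// `withUnlimitedBody` is a hypothetical helper, not datocms-client API.
function withUnlimitedBody(config = {}) {
  return {
    ...config,
    maxBodyLength: Infinity,    // accept request bodies of any size
    maxContentLength: Infinity, // accept responses of any size
  };
}

// Hypothetical usage:
// axios.put(uploadUrl, fileStream, withUnlimitedBody({ headers }));
```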
I make a request with the SiteClient to update a field. This apparently leads to a 422 Unprocessable Entity
(no idea why). But the error I receive is:
TypeError: Cannot read property 'url' of undefined
in ApiException
I checked the library code, and this ApiException is thrown in generateClient:
js-datocms-client/src/utils/generateClient.js
Line 164 in c103377
But here, where this ApiException is thrown, the third argument to the constructor is missing, and that is what causes the TypeError. So I see this TypeError instead of the real Unprocessable Entity
error I should see, which makes debugging a bit harder :)
Hi, I am using the gatsby-source-datocms module and I get some connection errors from your module. My internet connection became unstable (I think my modem could be the cause, but I'm not sure), and I am getting this error very frequently:
FetchError: request to https://site-api.datocms.com/site?include=item_types%2Citem_types.fields failed, reason: Client network socket disconnected before secure TLS connection was established
I also encounter the Could not subscribe to real-time events!
error, which seems to come from the Pusher integration.
So, would it be possible to add some retry logic to both the API requests and the Pusher connections? I just wanted to discuss this first, and if it's OK with you, I'd happily try to make a PR.
Thanks.
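A sketch of the retry idea, assuming a fetch-like function that returns a promise; the attempt count and back-off delays are arbitrary choices, not library API:

```javascript
// Retry a promise-returning function a few times, with a linearly growing
// delay between attempts. Re-throws the last error if all attempts fail.
async function withRetry(fn, { attempts = 3, delayMs = 100 } = {}) {
  let lastError;
  for (let i = 0; i < attempts; i += 1) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // back off a little before the next attempt
      await new Promise(resolve => setTimeout(resolve, delayMs * (i + 1)));
    }
  }
  throw lastError;
}

// Hypothetical usage:
// const items = await withRetry(() => client.items.all());
```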
I have this code that is generating .md files:
module.exports = (dato, root) => {
root.directory("content/projects", projectsDir => {
dato.projects.forEach(project => {
projectsDir.createPost(`${project.slug}.md`, "yaml", {
frontmatter: {
title: project.title,
category: project.category.name,
photos: project.photos.forEach(photo => {
photo.url();
})
},
content: project.content
});
});
});
};
Everything works fine, except the photos front matter property, which outputs null. I used the snippet that exists in the documentation: https://www.datocms.com/docs/static-generators/other-ssg/fields#multiple-files-field
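The null likely comes from `forEach`, which always returns `undefined`; `map` returns the transformed array instead. A minimal sketch of the corrected front matter, with `project` stubbed in place of a real DatoCMS record:

```javascript
// The photos field ends up null because Array.prototype.forEach always
// returns undefined. Using map returns the array of photo URLs instead.
function buildFrontmatter(project) {
  return {
    title: project.title,
    category: project.category.name,
    photos: project.photos.map(photo => photo.url()), // was forEach
  };
}
```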
Is it just me, or is datocms-client not usable in the browser because of the browser: {"node-fetch": false}
entry in package.json?
When writing migration scripts I was initially confused about why my field definitions were not initialized as expected. It turned out I had typed the appeareance
key for a field as appearance
.
Unless this is a British English variant that I am unfamiliar with, I think appeareance is a misspelling of appearance
. It is found in this form throughout the DatoCMS client and gatsby-source.
I still cannot import from WordPress.
It just hangs on importing data, even from a brand new WordPress instance:
$ ./node_modules/.bin/dato wp-import --token=token --wpUrl=http://test.local/ --wpUser=user --wpPassword=pass
√ Fetching existing data
It seems that the current implementation of file upload with a URL downloads the file locally before uploading it via the CMA. Would it be possible to use streams, or just a URL, instead of the current logic of downloading the file? This is giving me out-of-memory exceptions on Google Cloud Functions with 2 GB of memory.
Hi, I'm trying to import data from an existing WordPress site with your client. I used the same command (with my own credentials) from your blog post: https://www.datocms.com/blog/wordpress-importer/ But the tool exits with a non-zero exit code at
Line 21 in c98f547
I'm not sure what the problem is, and I don't know how to extract debug information from the doctop
module. My purpose is to let you know about this problem.
I've temporarily set up minimist
for argument parsing. For anyone looking for a quick workaround, here it is:
// src/cli.js
// comment out or delete the original if (...) lines in the code first
const argv = require('minimist')(process.argv.slice(2))
if (argv._[0] === 'dump') {
dump(argv);
} else if (argv._[0] === 'check') {
check(argv);
} else if (argv._[0] === 'wp-import') {
const {
token,
wpUrl,
wpUser,
wpPassword,
} = argv;
wpImport(token, wpUrl, wpUser, wpPassword);
} else if (argv._[0] === 'contentful-import') {
  const {
    contentfulToken,
    contentfulSpaceId,
    datoCmsToken,
    skipContent,
  } = argv;
contentfulImport(
contentfulToken,
contentfulSpaceId,
datoCmsToken,
skipContent,
);
}
Following these directions: https://www.datocms.com/docs/other/#passing-the-api-token-as-environment-variable
Running on Node.js v10, v12, or v14, v3.0.41 of the CLI produces the following message on CentOS:
/usr/bin/env: node --async-stack-traces: No such file or directory
Workaround is to install v3.0.39 instead:
npm i datocms-client@3.0.39 -g
Over at hike-one we're using the Dato client to dump data. After upgrading from version 0.6.1 to 0.6.2, running dato dump
returns:
❌ Error dumping data TypeError: Cannot read property 'path' of undefined
at File.get (/home/siilwyn/codeground/hike-one/node_modules/datocms-client/lib/local/fields/File.js:60:26)
at File.url (/home/siilwyn/codeground/hike-one/node_modules/datocms-client/lib/local/fields/File.js:42:52)
at image (/home/siilwyn/codeground/hike-one/node_modules/datocms-client/lib/utils/seoTagsBuilder.js:137:23)
at /home/siilwyn/codeground/hike-one/node_modules/datocms-client/lib/utils/seoTagsBuilder.js:148:18
at Array.reduce (<anonymous>)
at seoTagsBuilder (/home/siilwyn/codeground/hike-one/node_modules/datocms-client/lib/utils/seoTagsBuilder.js:147:34)
at Item.get (/home/siilwyn/codeground/hike-one/node_modules/datocms-client/lib/local/Item.js:230:44)
at Item.toMap (/home/siilwyn/codeground/hike-one/node_modules/datocms-client/lib/local/Item.js:105:16)
at /home/siilwyn/codeground/hike-one/node_modules/datocms-client/lib/local/fields/Links.js:34:17
at Array.map (<anonymous>)
Comparing these two versions, not a lot has changed, but I guess this new filter causes something to be undefined? Thoughts?
Sidenote: the latest version, 0.6.3, has the issue too.
SiteClient#uploadFile(imageUrl)
throws TypeError: Cannot read property 'relationships' of undefined
unexpectedly with normal usage.
The code previously worked, then randomly stopped functioning, both server-side and locally.
Setup:
datocms-client version "2.0.10"
The documented example fails:
await cmsClient.uploadFile('http://i.giphy.com/NXOF5rlaSXdAc.gif');
Full error output:
TypeError: Cannot read property 'relationships' of undefined
at findInfoForProperty (<dir>/node_modules/datocms-client/lib/utils/findInfoForProperty.js:18:49)
at jsonSchemaRelationships (<dir>/node_modules/datocms-client/lib/utils/jsonSchemaRelationships.js:29:58)
at deserializeJsonApi (<dir>/node_modules/datocms-client/lib/utils/deserializeJsonApi.js:120:64)
at deserialize (<dir>/node_modules/datocms-client/lib/utils/generateClient.js:114:80)
at <dir>/node_modules/datocms-client/lib/utils/generateClient.js:125:28
at run (<dir>/node_modules/core-js/modules/es6.promise.js:75:22)
at <dir>/node_modules/core-js/modules/es6.promise.js:92:30
at flush (<dir>/node_modules/core-js/modules/_microtask.js:18:9)
at process._tickCallback (internal/process/next_tick.js:61:11)
Edit: images are uploading successfully, but the method seems to throw while processing the response, so the resulting image ID isn't returned.
Hi, I'm integrating DatoCMS with Nuxt.
I get the result from GraphQL (via vanilla fetch()).
coverImage is a field of type single asset.
Here is what DatoCMS returns to me:
"coverImage": { "id": "1187106", "filename": "6lSja6y.jpg" }
How can I get the full URL to display the file in the browser?
I've been struggling for a few hours of googling, without success. Since I'm using vanilla JS fetch(), the url() function from this article doesn't exist for me:
https://www.datocms.com/docs/static-generators/other-ssg/fields
Thank you.
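With the GraphQL API, asset fields can expose more than id and filename; if the schema follows the standard DatoCMS upload type, you can request url directly instead of reconstructing it. A sketch, assuming a post model with a coverImage asset field (treat the endpoint and field names as assumptions):

```javascript
// Ask the GraphQL endpoint for the asset's url field directly,
// rather than piecing a URL together from id + filename.
const query = `
  query {
    post {
      coverImage {
        id
        filename
        url
      }
    }
  }
`;

// Build the fetch() options for a DatoCMS GraphQL request.
function buildRequest(token) {
  return {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${token}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ query }),
  };
}

// Hypothetical usage:
// fetch('https://graphql.datocms.com/', buildRequest(API_TOKEN))
//   .then(res => res.json())
//   .then(({ data }) => console.log(data.post.coverImage.url));
```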
Steps to reproduce:
Observed Behavior:
Expected Behavior:
We would like to make project creation a part of our scaffolding process so it would be great if we could automate this.
I’ve been looking through the documentation but it seems like all API documentation already expects a project to exist.
dato new site --name=<name> --plan=<plan>
This assumes billing is set up and verified; otherwise I'd expect the command to fail with an error like "Payment not set up yet".
I'm using the SiteClient
to seed my CMS with values from a spreadsheet. When I upload a PNG it works just fine. No error is thrown when I upload an SVG; however, when I try to access it from the upload URL, it is served as Content-Type: text/html
and is unusable. It also is not visible as a preview in the CMS. It works fine after I upload it manually.
Dependencies:
"datocms-client": "0.8.15",
My code:
const newImageId = await client.uploadImage(url);
Source image: https://picturestart.dev.stinkstudios.la/static/images/partners/Bron_Logo.svg
Error in browser:
Cross-Origin Read Blocking (CORB) blocked cross-origin response https://www.datocms-assets.com/10992/1555085121-bronlogo.svg with MIME type text/plain. See https://www.chromestatus.com/feature/5629709824032768 for more details.
I think the title sums it all up ;)
It's crashing on this createPost loop:
root.directory('./_posts', dir => {
dato.posts.forEach(post => {
dir.createPost(
`${post.slug}.md`,
'yaml',
{
frontmatter: {
title: post.title,
featureimage: post.featureimage && post.featureimage.toMap(),
permalink: `/blog/${post.slug}`,
author: post.author && post.author.toMap(),
excerpt: post.excerpt,
layout: "blog-page",
categories: post.category.map(tag => tag.title),
date: post.date,
seo_meta_tags: post.seo
},
content: post.body
}
);
});
});
\ Writing content
<--- Last few GCs --->
[10712:0000024753D4E7A0] 116879 ms: Scavenge 1378.4 (1434.3) -> 1363.3 (1436.3) MB, 3.2 / 0.0 ms (average mu = 0.282, current mu = 0.213) allocation failure
[10712:0000024753D4E7A0] 116903 ms: Scavenge 1378.7 (1436.3) -> 1363.6 (1438.8) MB, 3.0 / 0.0 ms (average mu = 0.282, current mu = 0.213) allocation failure
[10712:0000024753D4E7A0] 116925 ms: Scavenge 1379.0 (1438.8) -> 1363.9 (1440.8) MB, 3.1 / 0.0 ms (average mu = 0.282, current mu = 0.213) allocation failure
<--- JS stacktrace --->
==== JS stack trace =========================================
0: ExitFrame [pc: 000003240515C5C1]
Security context: 0x02375da9e6e9
1: walker [000000F7DD1C2BE9] [C:\Users\dthre\Documents\GitHub\eleventy-dev\node_modules\traverse\index.js:~116] [pc=00000324051D389D](this=0x02c00808d5e1 ,node_=0x02375daccfc1 <String[5]: field>)
2: /* anonymous */ [000000397BE3B6F9] [C:\Users\dthre\Documents\GitHub\eleventy-dev\node_modules\traverse\index.js:~203] [pc=000003240516...
FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
1: 00007FF73722F04A v8::internal::GCIdleTimeHandler::GCIdleTimeHandler+5114
2: 00007FF73720A0C6 node::MakeCallback+4518
3: 00007FF73720AA30 node_module_register+2032
4: 00007FF7374920EE v8::internal::FatalProcessOutOfMemory+846
5: 00007FF73749201F v8::internal::FatalProcessOutOfMemory+639
6: 00007FF7379B2BC4 v8::internal::Heap::MaxHeapGrowingFactor+9556
7: 00007FF7379A9C46 v8::internal::ScavengeJob::operator=+24310
8: 00007FF7379A829C v8::internal::ScavengeJob::operator=+17740
9: 00007FF7379B0F87 v8::internal::Heap::MaxHeapGrowingFactor+2327
10: 00007FF7379B1006 v8::internal::Heap::MaxHeapGrowingFactor+2454
11: 00007FF73756CDB7 v8::internal::Factory::NewFillerObject+55
12: 00007FF737602CC6 v8::internal::WasmJs::Install+29414
13: 000003240515C5C1
Hi,
Since the datocms-client
v3.10 upgrade, dato dump
ignores fields ending with a number.
v3.2.0
does not fix the issue.
v3.0.44
works fine.
We can reproduce the issue with the following configuration file: on impacted versions, undefined is printed.
dato.config.js
module.exports = (dato, root, i18n) => {
console.log(dato.content.score1)
console.log(dato.content.score2)
}
Hi,
If the dump command fails with an HTTP error, it returns exit code 0.
It would be really useful for CI if the dump command could return 1 in this case.
Here is an example use case:
✖ Fetching content from DatoCMS
FetchError: invalid json response body at https://site-api.datocms.com/items?version=published&page%5Boffset%5D=300&page%5Blimit%5D=100 reason: Unexpected token < in JSON at position 0
- body.js:48
[opo-gdpr]/[node-fetch]/lib/body.js:48:31
- next_tick.js:189 process._tickCallback
internal/process/next_tick.js:189:7
Done in 78.60s.
I can submit a PR. Any ideas on how/where to implement that?
My point is to implement some retry in case of error.
But maybe it would be more appropriate to implement the retry in this library.
It depends on how long we should wait before trying another request, I guess.
What's your opinion on this?
Thanks
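A sketch of the exit-code handling this request describes; the names are illustrative, not the actual dump internals:

```javascript
// Illustrative only: wrap the dump in a try/catch so an HTTP failure is
// surfaced to CI through a non-zero exit code instead of being swallowed.
async function runDump(dump) {
  try {
    await dump();
    return 0; // success
  } catch (err) {
    console.error('Dump failed:', err.message);
    return 1; // caller would set process.exitCode to this value
  }
}
```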
Hello folks,
The docs for the Content Management API (https://www.datocms.com/docs/content-management-api/resources/item/update) mention updating the meta fields. I also notice there is an HTTP example for the same.
I tried updating my record using:
client.items
.update(itemId, {
meta: {
createdAt: '',
updatedAt: '',
}
})
But this doesn't seem to work as expected; I presume the payload body updates only the record fields, not the meta fields?
Is there a possibility to update meta fields using this library?
Thank you :)
In /docs/site-api-client.md
usage is documented like this:
await client.fields.create({
articleType.id,
apiKey: 'title',
fieldType: 'string',
appeareance: { type: 'title' },
label: 'Title',
localized: false,
position: 99,
hint: '',
validators: { required: {} },
});
The tests say it should be like:
await client.fields.create(
articleType.id,
{
apiKey: 'title',
fieldType: 'string',
appeareance: { type: 'title' },
label: 'Title',
localized: false,
position: 99,
hint: '',
validators: { required: {} },
});
Also, the version in the docs is a syntax error:
{
articleType.id,
//...
}
After including the Node module with
var SiteClient = require('datocms-client').SiteClient;
const client = new SiteClient(" ");
I get this error on middleman build, both locally and on Netlify:
events.js:174
throw er; // Unhandled 'error' event
^
Error: Line 1720: Unexpected token function
at constructError (/Users/milemodic/Sites/uec/node_modules/esprima/esprima.js:2407:21)
at createError (/Users/milemodic/Sites/uec/node_modules/esprima/esprima.js:2426:17)
at unexpectedTokenError (/Users/milemodic/Sites/uec/node_modules/esprima/esprima.js:2500:13)
at throwUnexpectedToken (/Users/milemodic/Sites/uec/node_modules/esprima/esprima.js:2505:15)
at consumeSemicolon (/Users/milemodic/Sites/uec/node_modules/esprima/esprima.js:2620:13)
at parseStatement (/Users/milemodic/Sites/uec/node_modules/esprima/esprima.js:4816:9)
at parseStatementListItem (/Users/milemodic/Sites/uec/node_modules/esprima/esprima.js:3989:16)
at parseFunctionSourceElements (/Users/milemodic/Sites/uec/node_modules/esprima/esprima.js:4869:23)
at parseFunctionExpression (/Users/milemodic/Sites/uec/node_modules/esprima/esprima.js:5074:16)
at parsePrimaryExpression (/Users/milemodic/Sites/uec/node_modules/esprima/esprima.js:3273:24)
Emitted 'error' event at:
at DestroyableTransform.onerror (/Users/milemodic/Sites/uec/node_modules/readable-stream/lib/_stream_readable.js:640:52)
at DestroyableTransform.emit (events.js:198:13)
at DestroyableTransform._transform (/Users/milemodic/Sites/uec/node_modules/gulp-strip-debug/index.js:22:9)
at DestroyableTransform.Transform._read (/Users/milemodic/Sites/uec/node_modules/readable-stream/lib/_stream_transform.js:184:10)
at DestroyableTransform.Transform._write (/Users/milemodic/Sites/uec/node_modules/readable-stream/lib/_stream_transform.js:172:83)
at doWrite (/Users/milemodic/Sites/uec/node_modules/readable-stream/lib/_stream_writable.js:428:64)
at writeOrBuffer (/Users/milemodic/Sites/uec/node_modules/readable-stream/lib/_stream_writable.js:417:5)
at DestroyableTransform.Writable.write (/Users/milemodic/Sites/uec/node_modules/readable-stream/lib/_stream_writable.js:334:11)
at DestroyableTransform.ondata (/Users/milemodic/Sites/uec/node_modules/readable-stream/lib/_stream_readable.js:619:20)
at DestroyableTransform.emit (events.js:198:13)
While working with the new command dato environment destroy
I noticed it always fails, because the environmentId
is improperly destructured from the options.
We need to use <environmentId>
instead of environmentId
.
It seems that the implementation of Image
was changed in 0.5.0 (when it was moved to the new File.js) and it now returns different data.
This is the result of an API call with 0.4.6:
https://jsoneditoronline.org/?id=07e14f7ae68d4c29a853dcd9363af4f0
And this is what is returned in 0.5.0+ (tested all the way up to 0.5.6):
https://jsoneditoronline.org/?id=9aa5ac585a7d42a2947a04f680ad2bb9
You can see that "featuredImage" used to be an entire image record but is now just the id of the record.
The JS client code remains unchanged:
const fetchNews = () => {
const client = new SiteClient(process.env.DATO_API_TOKEN);
const fetchEntries = () =>
client.items.all({
'filter[type]': 'news_entry',
'page[limit]': 500
});
const fetchCategories = () =>
client.items.all({ 'filter[type]': 'news_category', 'page[limit]': 500 });
return Promise.all([fetchCategories(), fetchEntries()]);
};
I am getting an error from the WordPress API importing tool. See the console error below:
Error: No header link found with rel="https://api.w.org/"
at locateAPIRootHeader (/usr/local/lib/node_modules/datocms-client/node_modules/wpapi/lib/autodiscovery.js:32:8)
at processTicksAndRejections (internal/process/task_queues.js:93:5)
(node:19548) UnhandledPromiseRejectionWarning: Error: Autodiscovery failed
at /usr/local/lib/node_modules/datocms-client/node_modules/wpapi/wpapi.js:465:10
at processTicksAndRejections (internal/process/task_queues.js:93:5)
(node:19548) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). (rejection id: 1)
(node:19548) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
I can successfully create an upload by getting permission:
https://www.datocms.com/content-management-api/#upload_request-0
And creating an upload:
https://www.datocms.com/content-management-api/#upload-2
I get a valid upload ID back, but the upload shows up as an empty image in the media library.
I assume I need to manually upload the file to the S3 URL provided by the upload permission request, but I keep getting 403 errors when I try something like:
// get upload permission
let req = await client.uploadRequest
.create({
filename
})
.catch(error => {
console.log(error);
});
// send upload
fs.createReadStream(path.resolve(__dirname, filename)).pipe(
request.put(req.url, {
headers: {
'Content-Type': 'image/png'
}
})
);
Any examples of how to do this?
ApiException: 422 INVALID_FIELD (details: {"field":"content.de","field_id":"2307375","field_label":"Content","errors":[],"code":"INVALID_FORMAT","message":"Value not acceptable for this field type","failing_value":{"node_type":"document","data":{},"content":[{"node_type":"embedded-entry-block","content":[],"data":{"target":{"sys":{"id":"4D13dWbsy40GwIBLgrQv25","type":"Link","link_type":"Entry"}}}},{"node_type":"embedded-entry-block","content":[],"data":{"target":{"sys":{"id":"6Xvf378m1qaDd42XSFaTmF","type":"Link","link_type":"Entry"}}}},{"node_type":"paragraph","content":[{"node_type":"text","value":"","marks":[],"data":{}}],"data":{}}]}})
It would be helpful to have an example of browser use. I am failing to import and make use of the non-Node version.
Using the API, I forgot to specify an itemType when creating a record. This seems to have created several records with ids of "null" and itemTypes of "null". I can't see them in the admin to delete them, or fetch / delete them using the js client because of their null ids.
I've been using the CLI a lot more intensively now and am happy to start adding to/improving some of the existing commands.
I do want to open a discussion about the ordering/structuring of the CLI arguments.
What I would like to propose is a "Git style" structuring of the CLI, to get more predictable patterns when interacting with it.
Let's start out with these 2 commands:
dato wp-import
dato contentful-import
Both of these commands import data from a source: one from WordPress, the other from Contentful.
For these commands I'd propose something like this:
dato import wordpress --token=<datoApiToken> --wpUrl=<url> --wpUser=<user> --wpPassword=<password> [--environment=<datoEnvironment>]
dato import contentful --token=<datoApiToken> --contentfulToken=<apiToken> --contentfulSpaceId=<spaceId> [--environment=<datoEnvironment>] [--skipContent] [(--includeOnly <contentType>...)]
For the migrations, I'm still a bit in doubt about the best way to group these operations.
Right now we can easily place these commands in the "migration" group:
dato new migration <name>
dato migrate
As an initial proposal:
dato migration create <name> #...
dato migration apply [--source=<environment>] [--destination=<environment>] [--inPlace] [--migrationModel=<apiKey>] [--migrationsDir=<directory>] [--token=<apiToken>]
Maybe migration apply
should even be environment create
, since that is what the command actually does.
I can imagine quite a few scenarios where the CLI will interact with environments/sandboxes, which is why I think a group for this is justified.
Current command:
dato environment list
dato environment destroy <environmentId>
dato environment create <environmentId>
Since there are also some miscellaneous commands such as maintenance
and dump
, I would imagine a site
group being present to join those commands.
New commands:
dato site dump [--watch] [--verbose] [--preview] [--token=<apiToken>] [--environment=<environment>] [--config=<file>]
dato site maintenance (on|off)
Starting from version 2.0.0, it is not possible to access multiple links from the dump file.
languages
is a Multiple links
field on the single-instance model text
.
dato.config.js
module.exports = (dato, root, i18n) => {
console.log(dato.text.languages)
}
Expected output (from version 1.0.5)
[
Item {
entity: JsonApiEntity {
payload: [Object],
repo: [EntitiesRepo],
name: 'French',
key: 'fr',
updatedAt: '2018-05-16T21:39:03.836+02:00',
createdAt: '2018-05-16T21:39:03.829+02:00',
itemType: [Getter],
creator: [Getter]
},
itemsRepo: ItemsRepo {
entitiesRepo: [EntitiesRepo],
collectionsByType: [Object],
itemsById: [Object],
itemsByParentId: {},
itemTypeMethods: [Object]
},
key: [Getter],
name: [Getter]
},
Item {
entity: JsonApiEntity {
payload: [Object],
repo: [EntitiesRepo],
name: 'English',
key: 'en',
updatedAt: '2018-05-16T21:39:08.375+02:00',
createdAt: '2018-05-16T21:39:08.369+02:00',
itemType: [Getter],
creator: [Getter]
},
itemsRepo: ItemsRepo {
entitiesRepo: [EntitiesRepo],
collectionsByType: [Object],
itemsById: [Object],
itemsByParentId: {},
itemTypeMethods: [Object]
},
key: [Getter],
name: [Getter]
}
]
Actual output (version 2.0.0)
[ undefined, undefined ]
My dato.config.js file is as follows:
module.exports = (dato, root, i18n) => {
// ...iterate over the "Blog post" records...
dato.posts.forEach((post) => {
// ...and create a markdown file for each article!
root.createPost(`src/articles/${post.slug}.md`,"yaml", {
frontmatter: {
title: post.title,
slug: post.slug,
excerpt: post.excerpt,
date: post.publishedAt,
postType: "post",
status: "publish",
categories: post.categories.toMap(),
tags: post.tags.toMap(),
header_caption: post.headerImageCaption,
image: post.featuredImage.url()
},
content: post.content
}
);
});
};
It all works fine except for featuredImage. I've tried retrieving the URL, path, name, format... all of them result in a crash related to File.js:
× Fetching content from DatoCMS
TypeError: Cannot read property 'path' of undefined
- File.js:60 File.get
[nodejs]/[datocms-client]/lib/local/fields/File.js:60:26
- File.js:42 File.url
[nodejs]/[datocms-client]/lib/local/fields/File.js:42:52
- dato.config.js:18 dato.posts.forEach
A:/programs/nodejs/dato.config.js:18:39
- Array.forEach
- dato.config.js:5 Function.module.exports
A:/programs/nodejs/dato.config.js:5:16
- dump.js:77 collectOperations
[nodejs]/[datocms-client]/lib/dump/dump.js:77:3
- dump.js:98 start
[nodejs]/[datocms-client]/lib/dump/dump.js:98:20
- dump.js:144 _callee$
[nodejs]/[datocms-client]/lib/dump/dump.js:144:30
- runtime.js:65 tryCatch
[nodejs]/[regenerator-runtime]/runtime.js:65:40
- runtime.js:303 Generator.invoke [as _invoke]
[nodejs]/[regenerator-runtime]/runtime.js:303:22
Any assistance would be appreciated.
The image URL returned from the API previously ended with ?
. This has changed with the latest API updates, which potentially breaks URLs if query strings were appended to them (this just happened to me).
Was this intentional, or will it revert to the previous response?
Any async/await code run with node is crashing due to regeneratorRuntime
not being defined :(
Would be happy to try to submit a PR to fix this, just wanted to file the issue first!
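A common cause is Babel transpiling async/await down to generator functions without the regenerator runtime being loaded. Assuming the build goes through @babel/preset-env, one workaround is to target the running Node version so native async/await is preserved and regeneratorRuntime is never referenced — a sketch of a possible .babelrc fragment:

```json
{
  "presets": [
    ["@babel/preset-env", { "targets": { "node": "current" } }]
  ]
}
```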
I use a Lambda function and want to upload an image to DatoCMS, but I receive the image inside the Lambda function as multipart/form-data. How do I upload a picture in this case?
client.uploadImage('http://i.giphy.com/NXOF5rlaSXdAc.gif')
Hi there:
I'm trying to use datocms-client as documented in the official reference to retrieve data using a dato.config.js file, without success.
Here is the code I'm using, with a read-only API token and an about_page
single-instance model that has one record:
module.exports = (dato, root, i18n) => {
var about = dato.aboutPage
root.createPost('src/data/about.json', 'json', about)
}
Running dato dump
, the script generates an empty file.
Using REST API calls with datocms-client I can get the content without issues, though.
I'd like to use the --preview
feature, which I don't know how to use with the REST API.
My system: node 8.9.0, dato 0.4.6, Mac OS X 10.10.5
Thanks for any help
Two points, but I think they are fairly interconnected, so I've only filed one issue.
It appears that there is partial support for this in the library, but it's not documented. Since all of your site's JSON gets returned from one API call, this shouldn't be hard to do either. Are you open to the idea of adding an option that would do nested link resolution from within this client, so that we don't have to traverse our objects in each implementation of the client? Ideally this would look like contentful.js' option, but it could also be part of the .get() or .all() functions. In practice, all anyone ever wants is their fully resolved data, so I could even imagine this option being true
by default.
While looking for this link-resolution functionality I came across src/local/Loader.js
, which I think does link resolution but returns a class object with no way to serialize it to JSON. The assumption here seems to be that this avoids the circular references that might occur if it were JSON. As part of the link-resolution
request, some kind of smart JSON serialization that avoids the infinite loops makes sense.
This seems like an extremely common use case, and we've avoided filing this issue in the past because we've done our own link resolution within our spike-datocms
plugin, but IMO this really should be handled by the client.
cc @jescalan
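On the serialization point, circular references can be handled without giving up JSON output. A sketch of the idea, independent of the Loader.js internals:

```javascript
// Serialize an object graph to JSON, replacing any object already visited
// with null so circular references don't make JSON.stringify throw.
// Note: this also nulls out repeated (non-circular) references, which is
// a deliberate simplification for the sketch.
function safeStringify(value, space) {
  const seen = new WeakSet();
  return JSON.stringify(value, (key, val) => {
    if (typeof val === 'object' && val !== null) {
      if (seen.has(val)) return null; // break the cycle
      seen.add(val);
    }
    return val;
  }, space);
}
```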
While setting up our CI process to create sandbox environments for each of our MRs (merge requests), I noticed there was no way to remove/destroy the sandbox when the MR gets merged/closed.
By adding a simple command that calls the environments.destroy
method of the JS client, we could easily destroy sandboxes using the same tool that created them.
dato environment destroy <environmentId>
I just received a PR from greenkeeper for the 0.5.0 update, and the build is failing with the following error:
6:34:01 PM: ✖ Fetching content from DatoCMS
6:34:01 PM: TypeError: _this.image is not a function
6:34:01 PM:
6:34:01 PM: - build.js:83
6:34:01 PM: [repo]/[datocms-client]/lib/local/fields/build.js:83:20
6:34:01 PM:
6:34:01 PM: - Array.map
6:34:01 PM:
6:34:01 PM: - build.js:82 Object.gallery
6:34:01 PM: [repo]/[datocms-client]/lib/local/fields/build.js:82:32
6:34:01 PM:
6:34:01 PM: - build.js:121 build
6:34:01 PM: [repo]/[datocms-client]/lib/local/fields/build.js:121:38
6:34:01 PM:
6:34:01 PM: - Item.js:153 Item.readAttribute
6:34:01 PM: [repo]/[datocms-client]/lib/local/Item.js:153:34
6:34:01 PM:
6:34:01 PM: - Item.js:50 Item.get [as images]
6:34:02 PM: [repo]/[datocms-client]/lib/local/Item.js:50:23
6:34:02 PM:
6:34:02 PM: - dato.config.js:113 dato.works.forEach.work
6:34:02 PM: /opt/build/repo/scripts/dato.config.js:113:24
6:34:02 PM:
6:34:02 PM: - Array.forEach
6:34:02 PM:
6:34:02 PM: - dato.config.js:103 root.directory.workDir
6:34:02 PM: /opt/build/repo/scripts/dato.config.js:103:16
6:34:02 PM:
6:34:02 PM: - dump.js:75 collectOperations
6:34:02 PM: [repo]/[datocms-client]/lib/dump/dump.js:75:3
See the build log from Netlify here and the PR here.
Since this shouldn't be a breaking change, I'm assuming this is a bug?
When creating or updating a field, the defaultValue
attribute is not used.
I have a Contentful account and wanted to migrate to DatoCMS, but not everything, nor do I want to delete content in Contentful. So I wanted to implement, or have provided, a new option for dato contentful-import that takes a list of content types.
Not sure if this was discussed before and not implemented.
Hi guys, I was migrating a WP website to Dato and encountered this issue with the media gallery files. It seems the script tries to download the media objects using only their relative path (i.e. /wp-content/uploads/2017/10/031c3b66-8693-35ac-a766-397dc4b0c898.jpg
instead of https://www.wp-site-url.tld/wp-content/uploads/2017/10/031c3b66-8693-35ac-a766-397dc4b0c898.jpg
).
This is the error it throws after the "Fetching media" phase.
Cannot import: /wp-content/uploads/2017/10/031c3b66-8693-35ac-a766-397dc4b0c898.jpg
Error: ENOENT: no such file or directory, access '/wp-content/uploads/2017/10/031c3b66-8693-35ac-a766-397dc4b0c898.jpg'
at Object.accessSync (fs.js:206:3)
at node (/Users/notmymac/Documents/workspace/wp-website/js-datocms-client/lib/upload/adapters/node.js:26:15)
at Proxy.uploadFile (/Users/notmymac/Documents/workspace/wp-website/js-datocms-client/lib/upload/uploadFile.js:10:10)
at _callee$ (/Users/notmymac/Documents/workspace/wp-website/js-datocms-client/lib/wpImport/import/media.js:71:47)
at tryCatch (/Users/notmymac/Documents/workspace/wp-website/js-datocms-client/node_modules/regenerator-runtime/runtime.js:65:40)
at Generator.invoke [as _invoke] (/Users/notmymac/Documents/workspace/wp-website/js-datocms-client/node_modules/regenerator-runtime/runtime.js:303:22)
at Generator.prototype.<computed> [as next] (/Users/notmymac/Documents/workspace/wp-website/js-datocms-client/node_modules/regenerator-runtime/runtime.js:117:21)
at asyncGeneratorStep (/Users/notmymac/Documents/workspace/wp-website/js-datocms-client/lib/wpImport/import/media.js:12:103)
at _next (/Users/notmymac/Documents/workspace/wp-website/js-datocms-client/lib/wpImport/import/media.js:14:194)
at /Users/notmymac/Documents/workspace/wp-website/js-datocms-client/lib/wpImport/import/media.js:14:364 {
errno: -2,
syscall: 'access',
code: 'ENOENT',
path: '/wp-content/uploads/2017/10/031c3b66-8693-35ac-a766-397dc4b0c898.jpg'
}
I managed to overcome the issue locally by overwriting the mediaItemUrl
const in the media.js file like this:
const mediaItemUrl = `https://www.wp-site-url.tld${mediaItem.source_url}`;
Originally it was: const mediaItemUrl = mediaItem.source_url;
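A slightly more robust variant of the same workaround uses the WHATWG URL constructor, which resolves relative paths against a base but leaves already-absolute URLs untouched (wp-site-url.tld is a placeholder for your actual WordPress origin):

```javascript
// Base URL of the WordPress site being imported (placeholder value).
const SITE_ORIGIN = 'https://www.wp-site-url.tld';

// Resolve a possibly-relative media path against the site origin.
// `new URL(input, base)` ignores `base` when `input` is already absolute.
function absoluteMediaUrl(sourceUrl) {
  return new URL(sourceUrl, SITE_ORIGIN).toString();
}

console.log(absoluteMediaUrl('/wp-content/uploads/2017/10/photo.jpg'));
// → https://www.wp-site-url.tld/wp-content/uploads/2017/10/photo.jpg
console.log(absoluteMediaUrl('https://cdn.example.com/other.jpg'));
// → https://cdn.example.com/other.jpg
```

This way the override keeps working even if some mediaItem.source_url values are already full URLs.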
Cheers!
I would like to generate Markdown files without front matter, since the static site generator that I am using (Harp) can process partials as Markdown but outputs the front matter verbatim.
My dato.config.js
file contains the following:
// Create a `public/_content/sections` directory (or empty it if it already exists)...
root.directory('public/_content/sections', dir => {
// ...and for each of the sections stored online...
dato.sections.forEach((section, index) => {
// ...create a markdown file with all the metadata in the frontmatter
if (section.content) {
dir.createPost(`${section.slug}.md`, 'yaml', {
frontmatter: {
title: section.title,
slug: section.slug,
background_image: section.backgroundImage,
heading: section.heading,
lead: section.lead,
content: section.content,
button_text: section.buttonText,
button_url: section.buttonUrl,
video_url: section.videoUrl,
type: section.sectionType
},
content: section.content
});
}
if (section.content) {
dir.createPost(`${section.slug}-content.md`, 'toml', {
content: section.content
});
}
if (section.lead) {
dir.createPost(`${section.slug}-lead.md`, 'yaml', {
content: section.lead
});
}
});
});
The generated Markdown files still contain the front matter delimiters. For TOML:
+++
+++
and for YAML:
---
null
---
As I mentioned, this frontmatter actually gets parsed as Markdown by the static site generator when added through a partial, which is less than ideal.
Perhaps there could even be another file format, 'md', for Markdown-only output.
By the way, I have a possible solution. At least, it appears to be working for me. (Pull request to come.)
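The idea behind that fix can be sketched as a serializer for a hypothetical 'md' format that writes only the content string and skips the front-matter block entirely (this is an illustration, not the actual dump code):

```javascript
// Hypothetical serializer for a front-matter-less 'md' format:
// only the `content` key is written; any `frontmatter` key is ignored.
function serializeMarkdownOnly({ content }) {
  return content == null ? '' : String(content);
}

console.log(serializeMarkdownOnly({ content: '# Hello' })); // → # Hello
console.log(serializeMarkdownOnly({})); // → (empty string)
```

With such a format, dir.createPost(`${section.slug}-content.md`, 'md', { content: section.content }) would emit a plain Markdown partial with no +++/--- delimiters.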
When I use the root.* methods with an absolute path, the files are not created. With a relative path it works.
The following does not work:
var path = require('path')
var pathToSiteYml = path.resolve(__dirname, 'src/data/site.yml')
root.createDataFile(pathToSiteYml, 'yaml', { title: 'My Site' })
The following does work:
var pathToSiteYml = 'src/data/site.yml'
root.createDataFile(pathToSiteYml, 'yaml', { title: 'My Site' })
This relates specifically to this endpoint:
https://www.datocms.com/docs/content-management-api/resources/item/instances
When calling all()
without a filter, the client ignores the user role associated with the API key and lists everything, while other functions like find() seem to respect the user role as expected.
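Until the server-side behavior is fixed, one stopgap is to filter the unfiltered all() result client-side against the item types the role is actually allowed to read. This is a sketch: roleAllowedTypes is a hypothetical list you would maintain yourself, and itemType is assumed to hold the record's model identifier.

```javascript
// Assumption: the item types the API key's role is allowed to read.
const roleAllowedTypes = ['article', 'author'];

// Keep only records whose model is in the allowed list.
function restrictToRole(items) {
  return items.filter(item => roleAllowedTypes.includes(item.itemType));
}

const filtered = restrictToRole([
  { id: '1', itemType: 'article' },
  { id: '2', itemType: 'secret_model' },
]);
console.log(filtered); // keeps only the article record
```

This obviously doesn't reduce what the API returns over the wire; it only hides records the role shouldn't see from downstream code.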