byron / google-apis-rs
A binding and CLI generator for all Google APIs
Home Page: http://byron.github.io/google-apis-rs
License: Other
This should ONLY affect crate names, everything else stays as is.
Remember to adjust the library docs accordingly, possibly fitting the new RFC already.
See hyperium/hyper#428 for more information.
See mapsengine1:
- `GeoJsonGeometry`
The 'Go' implementation supports this already, so should I. An enum seems to be the native data-type, even though JSON serialization would certainly need to be tested beforehand.
Remember to update the documentation urls - they are hardcoded to structs.
Should tell people how to modify their Cargo.toml. This should possibly be put in front of the complete usage example.
Additionally, the documentation should be changed to use youtube3::new() instead of youtube3::YouTube::new(...) in all code samples.
Currently `Option` is used all the time to aid encoding and decoding to JSON. However, some schemas do not need that, for example response values, and maybe others as well.
Do some research and implement it accordingly.
This is supported in some YouTube APIs at least, and we should be sure to either support it or reject it right away.
While at it, we may want to check other parameters that we cannot support, for example, as they change the encoding of the return value.
Currently these are pulled from the google go api client repository, but that's not the original source of the information.
Turns out there is a discovery service which can easily be used to get the JSON files! Write something to help doing this, ideally with make to support simple parallelization.
As Docs are a first-class citizen, we should integrate them into our central doc index right away.
For the CLI, I imagine mkdocs should be used to generate beautiful static documentation from possibly shared markdown files.
Also, rustdoc shouldn't be invoked for the CLI at all - it's all done by mkdocs. deps.mako may have to learn how to handle that.
As the subject suggests.
Currently the system only wants to support API client code generators.
See subject.
Currently it's a stub - make sure it's a complete landing page which may also serve as 'index.md', to which it is already linked.
We might want to call the delegate instead; maybe it wants to retry.
Seeking happens a lot; we might want a function or macro for it.
Maybe related to #39
Other parties should be able to generate the APIs themselves, which requires some documentation of the make based build system (which as we know is self-documenting).
Make sure to cover special cases, like how to get new APIs, which requires deletion of the .api.deps file.
When calling the delegate, currently the only information it gets about the method it should handle is its name.
As the delegate can be seen as state-machine, it would benefit from knowing more about what's going on. There should be information about a method that is passed to it right at the beginning, and it should be informed as well when the method is done executing, passing in the result for inspection.
Even though the begin()/end() calls are not strictly required, as users know when they are calling into the API and returning from it, this will nonetheless make handling API work more convenient.
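The shape described above could be sketched as follows. All names here (`MethodInfo`, `begin`, `finished`) are assumptions for illustration, not the crate's actual API:

```rust
// Sketch only: a delegate that is told when a method starts and finishes.
pub struct MethodInfo<'a> {
    pub id: &'a str,          // e.g. "youtube.videos.list"
    pub http_method: &'a str, // e.g. "GET"
}

pub trait Delegate {
    // called right before the API method executes
    fn begin(&mut self, _info: &MethodInfo) {}
    // called once it is done, with the outcome for inspection
    fn finished(&mut self, _success: bool) {}
}
```

The empty default bodies keep existing delegate implementations compiling while letting state-machine-like delegates hook in.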
Figure out how the grammar should look like, which basically specifies how the CLI can be controlled. Think about future addition of a Queue, and prefer designs allowing to add new functionality easily.
It's clear that the command-subcommand pattern is to be used here, and that it needs to be able to set all the values supported by the API.
Even though documentation is provided separately via mkdocs
, the command must be self-documenting when invoked from the commandline (ideally providing a URL to the public HTML docs as well).
This ticket could theoretically be tackled by writing the README file that contains the command overview, including the respective MAKO code. This basically makes it half-way towards implementing the CLI parser (which probably wants to be docopt for ease-of-use).
That way, one can spare all the 'if let Some(d) = delegate' calls.
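One way to get there, sketched with hypothetical names (`DefaultDelegate`, `CallContext`, `http_error` are illustrations, not the real types): preset the field with a no-op delegate so every call site can invoke it unconditionally.

```rust
// Sketch: a default no-op delegate removes Option from the call path.
pub trait Delegate {
    fn http_error(&mut self, _error: &str) {} // default: do nothing
}

pub struct DefaultDelegate;
impl Delegate for DefaultDelegate {}

pub struct CallContext<'a> {
    delegate: &'a mut dyn Delegate,
}

impl<'a> CallContext<'a> {
    fn on_error(&mut self, msg: &str) {
        // no `if let Some(d) = delegate` needed here
        self.delegate.http_error(msg);
    }
}
```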
And abort if uploaded resource exceeds the limit. We have that information, and should (now) use it.
Add a make target to generate all API documentation and link it together in a html front page that just links to the respective sub-directories using relative paths.
The documentation must be self-contained for upload to gh-pages. For the latter, there must be a target as well using gh-import, to help simply pushing the latest docs to github.
APIs like 'freebase' support methods which are not tied to resources. However, our system really wants that right now, and there should be a good way to upgrade it to deal with that case gracefully.
These 'methods' are added to the existing ones, but might actually be callable directly from the hub.
This includes
Hey, even though it's no longer Friday 13th, I just found a 404 on my first try to click on a link on http://byron.github.io/google-apis-rs/ ;)
The link for qpxexpress v1 is to qpxexpress1/index.html, which doesn't exist. Looking at the gh-pages branch tells me the correct path is qpxExpress1/index.html (with a capitalized E).
Currently make doesn't know anything about whether or not a crate is already on crates.io. To track releases, one should create a central 'publish' command, which depends on the latest generated libraries (together), and publish files that are checked in.
Such a file could look like this: etc/api/youtube/v3/crates/0.1.0+2014040201.
With the latest rustc, the dependencies don't compile anymore. Make it work again.
And re-release with a new patch-level.
The crates currently online wouldn't work anymore.
- `# Arguments` descriptions for required parameters in `MethodBuilder`
- `upload_resumable()` method docs
- the `MethodBuilder` trait and `CallBuilder` trait
The documentation for the request value should represent the request value itself and document every single (nested) property.
Be sure you think about how we will later parse these values to finally build the API call - the same framework/functions should be used in these cases.
Especially `Result` must just not be in cmn - ideally cmn goes away entirely.
It should be clear for which version (crate + API) the documentation was generated.
Also, place link to the repository.
Just to get the `Result` changes online, and improve the docs.
This includes 'multi-part' uploads which are not resumable. It will possibly change the structure of our current implementation quite a bit, as it needs more looping and multiple requests.
This RFC was used as a basis for the implementation.
This should allow about 95% of the operations already, and conclude the 0.1.0 milestone.
Also, re-enable all linters, and fix what needs to be fixed.
If a structure is unused, we shouldn't actually emit it. Therefore, the `UnusedType` marker can be removed as well.
It should only work on a subset of all APIs, but besides that, test doc generation and actual unit-tests.
It should be a field on the Hub, which is preset with the default one, but which can be set by the user to affect all future calls.
That way, the library will be suited to other programs which just want to use it, keeping their own user agent or allowing them to maintain their own identity towards the google servers.
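A minimal sketch of that field, assuming hypothetical names (`Hub`, the setter, and the default string are illustrations, not the generated code):

```rust
// Sketch: the hub carries a user-agent string, preset to a default but
// overridable by the user to affect all future calls.
pub struct Hub {
    user_agent: String,
}

impl Hub {
    pub fn new() -> Hub {
        Hub { user_agent: "google-api-rust-client/0.1.0".to_string() }
    }

    /// Override the user agent sent with all future calls; returns the
    /// previous value so callers can restore it if they want.
    pub fn user_agent(&mut self, agent: String) -> String {
        std::mem::replace(&mut self.user_agent, agent)
    }
}
```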
We control crate versions with one version for all, which should only change if the generator changes, and thus produces different code. Standard semver rules apply.
To ensure we don't lose track of the exact API revision the crate represents, we should append build metadata, to get versions such as 0.1.0+20150309.
That way, we will only produce new crate versions if something truly changed.
Actually, none of the currently generated structures will encode or decode into anything useful, as their identifiers have been mangled - yet the wire names must remain unchanged. As this is not possible with rustc_serialize, this task should be given to serde.
Instead, the resource-to-activity map should keep fully qualified activity names.
This will make the code cleaner and easier to understand/maintain.
There is a TODO in the library usage section.
Besides that, add a new index under 'Features' listing all methods supporting uploads and downloads respectively.
This will allow them to be seen when updating github pages for example.
Somewhat relates to #50 , as the latter seems to be the last big doc-related thing to do.
That way, matches will be more hierarchical: `Err` contains another enum with all possible failure states.
Also, let `query_transfer_state()` return a `std::result::Result`, which is natural to it.
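The structure could look roughly like this. Variant names and the `query_transfer_state` signature are assumptions for illustration:

```rust
// Sketch: one error enum nests all failure states, so a single top-level
// match on Ok/Err can then match hierarchically on the failure kind.
#[derive(Debug)]
pub enum Error {
    HttpError(String),
    JsonDecodeError(String),
    UploadSizeLimitExceeded(u64),
}

pub type Result<T> = std::result::Result<T, Error>;

// Hypothetical: returning a plain std::result::Result feels natural here.
fn query_transfer_state(bytes_sent: u64, limit: u64) -> Result<u64> {
    if bytes_sent > limit {
        Err(Error::UploadSizeLimitExceeded(bytes_sent))
    } else {
        Ok(bytes_sent)
    }
}
```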
It seems that copying all files into one folder doesn't cut it, as it will also overwrite data related to the search index. This was done initially just to reduce redundancy, as all dependent crate docs are duplicated once per API.
The good news is that Git will deduplicate for us, even though the demand in space will rise on github.
A fix could be to just copy all doc-data into its own subdirectory. The benefit will be that there is no special case anymore (in terms of URL) for the CLI and the API docs.
This episode gives you an idea of why this project exists, and of course, how it got as far as it did.
Some of the topics are:
- the mako template engine
- make for dependency handling and process invocation
- serde for JSON encoding and decoding

Currently we just panic, yet there may be various reasons for deserialization to fail. Allow the delegate to have the final word on what to do if that happens. Never panic.
What I want is names like
Sometimes, versions look different and should be converted as shown:
v1beta4 -> name1_beta4
v1.3 -> name1p3
directory_v1 -> name1_directory
directory_v1.3 -> name1p3_directory
v1management -> name1_management
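The mapping above could be implemented along these lines. This is a hypothetical helper, not the generator's actual code, and it only covers the cases listed:

```rust
// Sketch: derive the crate-name suffix from an API version string.
// "name" is prepended by the caller, e.g. "name" + "1_beta4".
fn version_to_suffix(version: &str) -> String {
    // split off an optional "<module>_" prefix, e.g. "directory_v1.3"
    let (module, vpart) = match version.find("_v") {
        Some(i) => (Some(&version[..i]), &version[i + 1..]),
        None => (None, version),
    };
    let rest = vpart.trim_start_matches('v');
    // an alphabetic tail like "beta4" or "management" is separated by '_'
    let split = rest.find(|c: char| c.is_alphabetic()).unwrap_or(rest.len());
    let (num, tail) = rest.split_at(split);
    let mut out = num.replace('.', "p"); // "1.3" -> "1p3"
    if !tail.is_empty() {
        out.push('_');
        out.push_str(tail);
    }
    // a module prefix moves to the end: "directory_v1" -> "1_directory"
    if let Some(m) = module {
        out.push('_');
        out.push_str(m);
    }
    out
}
```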
Something to consider here is how the delegate is involved to store the upload information to allow actually resuming it in another call.
Supporting this might ripple through our method builder a bit, as it might need more information to allow this. Or the delegate provides it at runtime, which would certainly be prepared.
Goal is to support storing the current result, and resuming a few minutes/hours later without issues, i.e. the API must allow to do that natively.
illegal recursive struct type; wrap the inner value in a box to make it representable
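For context, this compiler error appears whenever a struct contains itself directly, giving it infinite size; the general fix is to put the inner value behind a `Box`. The type below is a made-up illustration, not one of the generated schemas:

```rust
// Sketch: a directly recursive field like `reply: Option<Comment>` would be
// rejected as an "illegal recursive struct type"; boxing it stores a heap
// pointer of known size instead.
struct Comment {
    text: String,
    reply: Option<Box<Comment>>,
}
```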
It is currently pretending there is a resource of such name, but instead should just be aware of what it is.
A particularly good example is oauth2_v1.
Currently the virtualenv call to set up the initial environment uses the default python executable of the system. This breaks the build when the system default points to a python3.x executable because some make modules require python2.x modules which are not found in this case.
Have a look at this and you see what I mean: http://byron.github.io/google-apis-rs/compute1/struct.Operation.html .
I think it would be best to fully qualify activities by default, and print them similarly as done in the overview on the landing page.
Drive2, for example, supports downloading files, and we should make sure that this actually works within the bounds of our API.