
pseudomuto / protoc-gen-doc

2.6K stars, 27 watchers, 460 forks, 854 KB

Documentation generator plugin for Google Protocol Buffers

License: MIT License

Makefile 5.70% Shell 0.46% Go 93.49% Dockerfile 0.35%
protobuf documentation-tool protoc hacktoberfest hacktoberfest2021 golang go


protoc-gen-doc's Issues

CentOS yum repo doesn't appear accessible

Hello,

I get a 404 error when trying to download your CentOS 7 rpm via YUM. Is the yum repo still available?

I can build from source fine, but was just curious.

Thanks!

Build Windows .zip on Appveyor

AppVeyor is the Travis CI of Windows builds. We can take a look at e.g. Tiled for how to build Qt apps there. From that config, it seems it's possible to trigger an upload of releases to a release draft on GitHub on tagging, which would be very convenient.

Support for including all proto files within subdirectories

At present we need to provide proto files individually or use the package/*.proto form.
If the project has multiple packages in different modules, there is no way to run protoc on the parent directory with something like **/*.proto to generate documentation for all the proto files at once.
I'm not sure whether this needs to be raised with the protoc project itself.
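
As a shell-level workaround in the meantime (a sketch, assuming a POSIX find and a single common import root), the recursive expansion can be done outside protoc:

# Pass every .proto found under the current tree to protoc in one invocation.
# Assumes the current directory is also the import root; adjust -I as needed.
protoc -I. --doc_out=html,index.html:docs $(find . -name '*.proto')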

How to build on linux without having protobuf installed

How can I build this from source without having to install protobuf first?

In my current project, I build gRPC from source, including its third-party projects such as protobuf.
And I do not want to install protobuf on my build system for various reasons.

So, how may I override the existing build rules in the qmake project file for linux, so that it will use the protobuf build from grpc?

Not able to resolve include paths

I get this error:

protoc -I/protobuf/ -Idata-contracts/proto/ data-contracts/proto/services/events.proto \
  --doc_out=markdown,events.md:.

--doc_out: services/events.proto: services/events.proto: No such file or directory

I am using these versions:

protoc: 3.1.0
protoc-gen-doc: 0.8-1 (from prebuilt in Ubuntu directions)

All the internal code-generators are working, so I don't think it's an issue with protoc:

protoc -I/protobuf/ -Idata-contracts/proto/ data-contracts/proto/services/events.proto \
  --cpp_out=. \
  --csharp_out=. \
  --java_out=. \
  --js_out=. \
  --objc_out=. \
  --php_out=. \
  --python_out=. \
  --ruby_out=.

Am I doing something wrong?
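
One thing that may be worth trying (a sketch, not a confirmed fix): run protoc from inside the import root, so that the relative path the plugin reports (services/events.proto) also resolves from the working directory:

# Run from the proto import root so services/events.proto resolves both as an
# import path and as a plain relative path.
cd data-contracts/proto
protoc -I/protobuf/ -I. services/events.proto --doc_out=markdown,events.md:.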

Order of messages

Looks like by default generated html is using alphabetical order to display messages, is there any way to follow proto file order instead?

Add support for documentation header file

The --doc-out option should be changed to have the following format

--doc_out=docbook|html|markdown|<TEMPLATE_FILE>,<OUT_FILE>[,<HEADER_FILE>]:<OUT_DIR>

Where the optional <HEADER_FILE> specifies a file whose content will be included verbatim at the top of the generated documentation, provided the Mustache template in use supports it. The built-in templates should all be updated to support this.
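
A hypothetical invocation of the proposed syntax (the header.html name is just an example here) could look like:

# Proposed usage: header.html would be prepended verbatim to the generated output.
protoc --doc_out=html,index.html,header.html:docs *.proto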

Parameter to consider or not @exclude

I have a feature suggestion. In our case we need to generate both internal (developer) and external (client) documentation. For internal use, we need documentation with all messages and attributes (ignoring @exclude). But for clients, we need documentation without some messages (honoring @exclude).

My suggestion is to add a parameter to the generation step that tells the plugin whether or not to honor @exclude.
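
A hypothetical command-line shape for such a switch (the ignore_exclusions token is invented here purely for illustration) might be:

# Internal docs: hypothetical flag that makes the plugin ignore @exclude tags.
protoc --doc_out=html,internal.html,ignore_exclusions:docs *.proto

# Client docs: default behaviour, @exclude is honored.
protoc --doc_out=html,external.html:docs *.proto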

Scalar Value Types section

Is it possible to add a flag controlling the "Scalar Value Types" section of the generated output? We are only interested in the C# and JavaScript type columns, not the default C++, Java and Python columns. Alternatively, give the ability to exclude that section altogether if the user isn't interested in seeing it as a footnote. The first option would be preferable for us, as there is value in the information provided.

Officially support building with MinGW

This would include:

  • Make sure it builds.
  • Create a protoc-gen-doc-win32-zip.pri equivalent for MinGW.
  • Update .pro to use the above on MinGW builds.
  • Update .appveyor.yml.
  • Update BUILDING.md.

Also see #4.

Mustache Template or custom Messages

Hello Estan,

If I need to add a sample message under the description, how do I do that? Or if I want to add an image, how do I capture it in the template? For example, say I have:

/*
 * comment ABC
 */
message abc {
  optional string id = 1;
}

and an example message for it:

abc {
  [{id: 0}]
}

plus some image I want to display in the HTML.

How do I add this in Mustache? I see we already have {{message_description}}, but it only prints "comment ABC".

Say I need to add an example message (or some image) in the HTML after the message field table - how do I capture this in the Mustache template? Please let me know.

Custom markdown template is missing message fields

Hello,

I have slightly adapted the template/markdown.mustache file:

# MY TITLE
<a name="top"/>

{{#files}}
<a name="{{file_name}}"/>
<p align="right"><a href="#top">Top</a></p>

## {{file_name}}

{{#file_description}}{{& file_description}}{{/file_description}}

{{#file_messages}}
<a name="{{message_full_name}}"/>
### {{message_long_name}}
{{& message_description}}

{{#message_has_fields}}
| Field | Type | Label | Description |
| ----- | ---- | ----- | ----------- |
{{#message_fields}}
| {{field_name}} | [{{field_long_type}}](#{{field_full_type}}) | {{field_label}} | {{#nobr}}{{& field_description}}{{#field_default_value}} Default: {{field_default_value}}{{/field_default_value}}{{/nobr}} |
{{/message_fields}}
{{/message_has_fields}}

{{#message_has_extensions}}
| Extension | Type | Base | Number | Description |
| --------- | ---- | ---- | ------ | ----------- |
{{#message_extensions}}
| {{extension_name}} | {{extension_long_type}} | {{extension_containing_long_type}} | {{extension_number}} | {{#nobr}}{{& extension_description}}{{#extension_default_value}} Default: {{extension_default_value}}{{/extension_default_value}}{{/nobr}} |
{{/message_extensions}}
{{/message_has_extensions}}

{{/file_messages}}

{{#file_enums}}
<a name="{{enum_full_name}}"/>
### {{enum_long_name}}
{{& enum_description}}

| Name | Number | Description |
| ---- | ------ | ----------- |
{{#enum_values}}
| {{value_name}} | {{value_number}} | {{#nobr}}{{& value_description}}{{/nobr}} |
{{/enum_values}}

{{/file_enums}}

{{#file_has_extensions}}
<a name="{{file_name}}-extensions"/>
### File-level Extensions
| Extension | Type | Base | Number | Description |
| --------- | ---- | ---- | ------ | ----------- |
{{#file_extensions}}
| {{extension_name}} | {{extension_long_type}} | {{extension_containing_long_type}} | {{extension_number}} | {{#nobr}}{{extension_description}}{{#extension_default_value}} Default: {{extension_default_value}}{{/extension_default_value}}{{/nobr}} |
{{/file_extensions}}
{{/file_has_extensions}}

{{#file_services}}
<a name="{{service_full_name}}"/>
### {{service_name}}
{{& service_description}}

| Method Name | Request Type | Response Type | Description |
| ----------- | ------------ | ------------- | ------------|
{{#service_methods}}
| {{method_name}} | [{{method_request_long_type}}](#{{method_request_full_type}}) | [{{method_response_long_type}}](#{{method_response_full_type}}) | {{#nobr}}{{& method_description}}{{/nobr}} |
{{/service_methods}}

{{/file_services}}

{{/files}}

I have removed the table of contents and the scalar value types, and changed the title of the file. When I use this Mustache template with protoc-gen-doc (usage like this: protoc --doc_out=my_templates/custom_markdown.mustache,index.md:doc/api api/*.proto), the output is missing the table with the field descriptions of a message. The same issue happens when I use a copy of template/markdown.mustache.

But when I use the default markdown template (usage like this: protoc --doc_out=markdown,index.md:doc/api api/*.proto), the field tables are generated as expected.

So I guess it is not related to my protobuf files, but to my Mustache template and the one provided in this repo in template/markdown.mustache.

Can you please help me or confirm that there is a bug when using custom Mustache templates?

I am using the latest version of protoc (3.3) and the latest version of protoc-gen-doc. The issue occurs on Ubuntu 16.04 and Windows 7.

Thank you.

Add support for protobuf 3.0

Hello Elvis,
First of all, many thanks for your project.
Are you going to support the new protobuf 3.0? I believe it's close to release.

Comments not working

I can't get the comments to show up in the description field on Windows. I've tried using protoc version 2.6.1 and version 3.0. They don't show up in any output type. Any ideas?

Thanks

Extensions don't work

Maybe I am doing it wrong somehow, but I don't think the extension related fields are working. For example, I have a proto with a file-level extension, but {{file_has_extensions}} isn't defined and {{file_extensions}} is empty. Same with message extensions, etc. I am using the binary I downloaded and protoc version 2.6.1, if that matters.

Any more detailed instructions on how to install on Mac?

I am trying to install this on Mac OS X. I searched for qmake through Homebrew but could not find it. Can someone give more detailed instructions?

Currently, the doc says:

    $ export PROTOBUF_PREFIX=/path/to/protobuf-2.6.1
    $ qmake
    $ make
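
For what it's worth, a possible route on macOS (a sketch, assuming Homebrew's qt and protobuf formulae provide a suitable qmake and the protobuf headers) would be:

# Install the build dependencies with Homebrew.
brew install qt protobuf

# Make Homebrew's qmake visible and point the build at Homebrew's protobuf.
export PATH="$(brew --prefix qt)/bin:$PATH"
export PROTOBUF_PREFIX="$(brew --prefix protobuf)"
qmake
make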

Installation instructions on Ubuntu 16.10 are broken

Hi

On a fresh Ubuntu 16.10 Docker image, following the instructions at:
https://software.opensuse.org/download.html?project=home%3Aestan%3Aprotoc-gen-doc&package=protoc-gen-doc

I'm getting:

The following packages have unmet dependencies:
 protoc-gen-doc : Depends: libprotobuf10 but it is not installable
                  Depends: libprotoc10 but it is not installable
                  Depends: libqt5core5a (>= 5.6.0~beta) but it is not going to be installed
E: Unable to correct problems, you have held broken packages.

Add homebrew support

@estan this package is fantastic, but I'd love to make it a bit easier to install for folks using OS X. I've written a few Homebrew formulae before; would one for protoc-gen-doc be of interest? Let me know!

Enums in the same order

Hello,

Is there any way to display enums under each message rather than as separate sections?

For example: message1 and its enums, then message2 and its enums.

Right now the output is message1, message2, and then all the enums.

Support JSON output

It's great that the tool can generate various doc formats, but we need to actually do something with the docs, so we need them in a machine-readable format. I'm trying to craft a Mustache template that outputs JSON, but generating valid JSON that way is going to be tricky and will probably require some post-processing of what is generated.
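
For reference, recent (Go-based) releases of protoc-gen-doc ship a built-in json output type alongside docbook, html and markdown, so something along these lines should produce machine-readable output directly (a sketch; the option spelling differs between older and newer versions):

# Built-in JSON renderer in recent protoc-gen-doc releases.
protoc --doc_out=./docs --doc_opt=json,docs.json *.proto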

Multi-line comment bug with markdown output format

Hi,
I have built protoc-gen-doc from source on OS X (10.11.6) (from master, commit 97718ab) following the given steps. HTML output seems correct, but markdown output has a bug with multi-line comments where the last word on one line is concatenated with the first word on the next line. Here is a minimal example:

test.proto

syntax = "proto3";

message Foo {
  /**
   * Some
   * long
   * description
   */
  bool bar = 1;
}

After running

protoc --doc_out=markdown,test.md:. test.proto

The file test.md is as follows - the important thing is the appearance of Somelongdescription with no spaces between the words.

# Protocol Documentation
<a name="top"/>

## Table of Contents
* [test.proto](#test.proto)
 * [Foo](#Foo)
* [Scalar Value Types](#scalar-value-types)

<a name="test.proto"/>
<p align="right"><a href="#top">Top</a></p>

## test.proto



<a name="Foo"/>
### Foo


| Field | Type | Label | Description |
| ----- | ---- | ----- | ----------- |
| bar | [bool](#bool) | optional | Somelongdescription |

...

GitHub compatible markdown?

The markdown generated by this is not compatible with GitHub.

It would require some additional changes, for example a lambda that turns a full_name to lower case (for the named anchors generated by GitHub), or one that transforms any whitespace to - and so on.

At the very least, one must insert a newline between the <a .../> named anchors in the document and the following heading in order to make this work.

Care for a PR for making this happen?
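
As a stopgap until the template changes, the missing newline can be patched in after generation; a rough sketch with GNU sed (assuming the anchors look exactly like <a name="..."/>):

# Append a newline after each named anchor so GitHub treats the following
# heading as a heading. GNU sed syntax; BSD/macOS sed needs -i '' and a
# literal newline in the replacement instead of \n.
sed -i 's|<a name="[^"]*"/>|&\n|' docs.md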

Include the "files" key name in generated json

When generating json, it would be better to include the "files" key followed by the value, instead of just the value, e.g.

enum FileEnum {
    A = 1;
    B = 2;
}

message MyMessage {
    enum MessageEnum {
        C = 3;
        D = 4;
    }
}

currently generates -

[
    {
        "file_description": "",
        "file_enums": [
            {
                "enum_description": "",
                "enum_full_name": "FileEnum",
                "enum_long_name": "FileEnum",
                "enum_name": "FileEnum",
                "enum_values": [
                    {
                        "value_description": "",
                        "value_name": "A",
                        "value_number": 1
                    },
                    {
                        "value_description": "",
                        "value_name": "B",
                        "value_number": 2
                    }
                ]
            },
            {
                "enum_description": "",
                "enum_full_name": "MyMessage.MessageEnum",
                "enum_long_name": "MyMessage.MessageEnum",
                "enum_name": "MessageEnum",
                "enum_values": [
                    {
                        "value_description": "",
                        "value_name": "C",
                        "value_number": 3
                    },
                    {
                        "value_description": "",
                        "value_name": "D",
                        "value_number": 4
                    }
                ]
            }
        ],
        "file_extensions": [
        ],
        "file_has_extensions": false,
        "file_has_services": false,
        "file_messages": [
            {
                "message_description": "",
                "message_extensions": [
                ],
                "message_fields": [
                ],
                "message_full_name": "MyMessage",
                "message_has_extensions": false,
                "message_long_name": "MyMessage",
                "message_name": "MyMessage"
            }
        ],
        "file_name": "test.proto",
        "file_package": "",
        "file_services": [
        ]
    }
]

It would be better to generate -

"files" : [ ... ]

This way the generated json can be piped directly to an external program such as an external Mustache engine.
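
Until that changes, a simple post-processing step (a sketch, assuming jq is available) can wrap the array under a "files" key:

# Wrap the top-level JSON array in an object with a "files" key.
jq '{files: .}' docs.json > docs.wrapped.json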

Add support for RPC services

The following .proto (assuming FooRequest and FooResponse are defined) does not appear to generate any documentation today. I can try looking at implementing it once I understand the code a little better.

service FooService {
    rpc FooMethod (FooRequest) returns (FooResponse);
}

The service keyword is described on the Google page for proto2 or proto3

protoc doc generator issue with macOS

Hello,

I am trying to install protoc-gen-doc.

I followed the steps from the GitHub instructions, but I keep getting the error below:
protoc-gen-doc: program not found or is not executable
--doc_out: protoc-gen-doc: Plugin failed with status code 1.
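
Two things that may be worth checking (a sketch, not a confirmed fix): that the protoc-gen-doc binary is executable and on PATH, or that protoc is told explicitly where to find it via --plugin:

# Make the plugin executable and visible to protoc...
chmod +x /path/to/protoc-gen-doc
export PATH="/path/to:$PATH"

# ...or point protoc at the plugin binary explicitly.
protoc --plugin=protoc-gen-doc=/path/to/protoc-gen-doc \
       --doc_out=html,index.html:docs *.proto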

make fails on OS X Yosemite

I tried the instructions listed for Mac:

INb:protoc-gen-doc steam$ export PROTOBUF_PREFIX=/usr/local/Cellar/protobuf/2.6.1/
INb:protoc-gen-doc steam$ qmake
/Users/steam/projects/gauge/protoc-gen-doc/protoc-gen-doc.pro:22: Unknown replace function: getenv
Project ERROR: You must set the PROTOBUF_PREFIX environment variable!
INb:protoc-gen-doc steam$ echo $PROTOBUF_PREFIX
/usr/local/Cellar/protobuf/2.6.1/
INb:protoc-gen-doc steam$

Despite setting the PROTOBUF_PREFIX value, it still complains about it.

Any tips?
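
The "Unknown replace function: getenv" message suggests the qmake being picked up may be too old to know qmake's getenv() function, so a first diagnostic step (just a sketch) is to confirm which qmake and Qt version are actually on the PATH:

# Confirm which qmake is being run and which Qt version it belongs to.
which qmake
qmake --version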

Two issues on macOS 10.12.3

Hi,

I used to be able to build successfully, but with the latest macOS updates I started to have issues.

I am following the steps in https://github.com/estan/protoc-gen-doc/blob/master/.travis.yml

first issue

when I ran qmake, I got the SDK path error:

> qmake                                                                                                                                                               
Project ERROR: Could not resolve SDK Path for 'macosx'

After searching online, it turns out this is because the xcode-select developer directory changed with the OS upgrade. (By the way, make sure you already have Xcode and the command line tools installed.) See details in this post; simply running this command solved the issue:

sudo xcode-select -s /Applications/Xcode.app/Contents/Developer

second issue

when I ran make, the PDF doc was not generated; the docbook, html and md files are fine. It might be that the fop command has a problem:

> make                                                                                                                                               master [225664a] deleted untracked
protoc --doc_out=markdown,example.md:doc proto/*.proto
[libprotobuf WARNING google/protobuf/compiler/parser.cc:546] No syntax specified for the proto file: proto/Booking.proto. Please use 'syntax = "proto2";' or 'syntax = "proto3";' to specify a syntax version. (Defaulted to proto2 syntax.)
[libprotobuf WARNING google/protobuf/compiler/parser.cc:546] No syntax specified for the proto file: proto/Customer.proto. Please use 'syntax = "proto2";' or 'syntax = "proto3";' to specify a syntax version. (Defaulted to proto2 syntax.)
[libprotobuf WARNING google/protobuf/compiler/parser.cc:546] No syntax specified for the proto file: proto/Vehicle.proto. Please use 'syntax = "proto2";' or 'syntax = "proto3";' to specify a syntax version. (Defaulted to proto2 syntax.)
protoc --doc_out=html,example.html:doc proto/*.proto
[libprotobuf WARNING google/protobuf/compiler/parser.cc:546] No syntax specified for the proto file: proto/Booking.proto. Please use 'syntax = "proto2";' or 'syntax = "proto3";' to specify a syntax version. (Defaulted to proto2 syntax.)
[libprotobuf WARNING google/protobuf/compiler/parser.cc:546] No syntax specified for the proto file: proto/Customer.proto. Please use 'syntax = "proto2";' or 'syntax = "proto3";' to specify a syntax version. (Defaulted to proto2 syntax.)
[libprotobuf WARNING google/protobuf/compiler/parser.cc:546] No syntax specified for the proto file: proto/Vehicle.proto. Please use 'syntax = "proto2";' or 'syntax = "proto3";' to specify a syntax version. (Defaulted to proto2 syntax.)
protoc --doc_out=docbook,example.docbook:doc proto/*.proto
[libprotobuf WARNING google/protobuf/compiler/parser.cc:546] No syntax specified for the proto file: proto/Booking.proto. Please use 'syntax = "proto2";' or 'syntax = "proto3";' to specify a syntax version. (Defaulted to proto2 syntax.)
[libprotobuf WARNING google/protobuf/compiler/parser.cc:546] No syntax specified for the proto file: proto/Customer.proto. Please use 'syntax = "proto2";' or 'syntax = "proto3";' to specify a syntax version. (Defaulted to proto2 syntax.)
[libprotobuf WARNING google/protobuf/compiler/parser.cc:546] No syntax specified for the proto file: proto/Vehicle.proto. Please use 'syntax = "proto2";' or 'syntax = "proto3";' to specify a syntax version. (Defaulted to proto2 syntax.)
fop -xml doc/example.docbook \
		-xsl /usr/share/xml/docbook/xsl-stylesheets-1.79.1/fo/docbook.xsl \
		-param use.extensions 0 \
		-param fop1.extensions 1 \
		-param paper.type A4 \
		-param page.orientation landscape \
		-pdf doc/example.pdf

USAGE
fop [options] [-fo|-xml] infile [-xsl file] [-awt|-pdf|-mif|-rtf|-tiff|-png|-pcl|-ps|-txt|-at [mime]|-print] <outfile>
 [OPTIONS]
  -version          print FOP version and exit
  -x                dump configuration settings
  -c cfg.xml        use additional configuration file cfg.xml
  -l lang           the language to use for user information
  -nocs             disable complex script features
  -r                relaxed/less strict validation (where available)
  -dpi xxx          target resolution in dots per inch (dpi) where xxx is a number
  -s                for area tree XML, down to block areas only
  -v                run in verbose mode (currently simply print FOP version and continue)

  -o [password]     PDF file will be encrypted with option owner password
  -u [password]     PDF file will be encrypted with option user password
  -noprint          PDF file will be encrypted without printing permission
  -nocopy           PDF file will be encrypted without copy content permission
  -noedit           PDF file will be encrypted without edit content permission
  -noannotations    PDF file will be encrypted without edit annotation permission
  -nofillinforms    PDF file will be encrypted without fill in interactive form fields permission
  -noaccesscontent  PDF file will be encrypted without extract text and graphics permission
  -noassembledoc    PDF file will be encrypted without assemble the document permission
  -noprinthq        PDF file will be encrypted without print high quality permission
  -a                enables accessibility features (Tagged PDF etc., default off)
  -pdfprofile prof  PDF file will be generated with the specified profile
                    (Examples for prof: PDF/A-1b or PDF/X-3:2003)

  -conserve         enable memory-conservation policy (trades memory-consumption for disk I/O)
                    (Note: currently only influences whether the area tree is serialized.)

  -cache            specifies a file/directory path location for the font cache file
  -flush            flushes the current font cache file

 [INPUT]
  infile            xsl:fo input file (the same as the next)
                    (use '-' for infile to pipe input from stdin)
  -fo  infile       xsl:fo input file
  -xml infile       xml input file, must be used together with -xsl
  -atin infile      area tree input file
  -ifin infile      intermediate format input file
  -imagein infile   image input file (piping through stdin not supported)
  -xsl stylesheet   xslt stylesheet

  -param name value <value> to use for parameter <name> in xslt stylesheet
                    (repeat '-param name value' for each parameter)

  -catalog          use catalog resolver for input XML and XSLT files
 [OUTPUT]
  outfile           input will be rendered as PDF into outfile
                    (use '-' for outfile to pipe output to stdout)
  -pdf outfile      input will be rendered as PDF (outfile req'd)
  -pdfa1b outfile   input will be rendered as PDF/A-1b compliant PDF
                    (outfile req'd, same as "-pdf outfile -pdfprofile PDF/A-1b")
  -awt              input will be displayed on screen
  -rtf outfile      input will be rendered as RTF (outfile req'd)
  -pcl outfile      input will be rendered as PCL (outfile req'd)
  -ps outfile       input will be rendered as PostScript (outfile req'd)
  -afp outfile      input will be rendered as AFP (outfile req'd)
  -tiff outfile     input will be rendered as TIFF (outfile req'd)
  -png outfile      input will be rendered as PNG (outfile req'd)
  -txt outfile      input will be rendered as plain text (outfile req'd)
  -at [mime] out    representation of area tree as XML (outfile req'd)
                    specify optional mime output to allow the AT to be converted
                    to final format later
  -if [mime] out    representation of document in intermediate format XML (outfile req'd)
                    specify optional mime output to allow the IF to be converted
                    to final format later
  -print            input file will be rendered and sent to the printer
                    see options with "-print help"
  -out mime outfile input will be rendered using the given MIME type
                    (outfile req'd) Example: "-out application/pdf D:\out.pdf"
                    (Tip: "-out list" prints the list of supported MIME types and exits)
  -svg outfile      input will be rendered as an SVG slides file (outfile req'd)
                    Experimental feature - requires additional fop-sandbox.jar.

  -foout outfile    input will only be XSL transformed. The intermediate
                    XSL-FO file is saved and no rendering is performed.
                    (Only available if you use -xml and -xsl parameters)


 [Examples]
  fop foo.fo foo.pdf
  fop -fo foo.fo -pdf foo.pdf (does the same as the previous line)
  fop -xml foo.xml -xsl foo.xsl -pdf foo.pdf
  fop -xml foo.xml -xsl foo.xsl -foout foo.fo
  fop -xml - -xsl foo.xsl -pdf -
  fop foo.fo -mif foo.mif
  fop foo.fo -rtf foo.rtf
  fop foo.fo -print
  fop foo.fo -awt

Feb 25, 2017 11:19:42 AM org.apache.fop.cli.Main startFOP
SEVERE: Exception
java.io.FileNotFoundException: Error: xsl file /usr/share/xml/docbook/xsl-stylesheets-1.79.1/fo/docbook.xsl not found
	at org.apache.fop.cli.CommandLineOptions.checkSettings(CommandLineOptions.java:952)
	at org.apache.fop.cli.CommandLineOptions.parse(CommandLineOptions.java:172)
	at org.apache.fop.cli.Main.startFOP(Main.java:169)
	at org.apache.fop.cli.Main.main(Main.java:217)

make: *** [doc/example.pdf] Error 1

environment that I am using:

protobuf 3.2.0
fop 2.1

It seems that the docbook.xsl file is not found, or the path is not correctly configured on macOS. There are several docbook.xsl files in the docbook-xsl package installed by Homebrew, but I am not sure which one is the correct one to use. Maybe you can fix this path issue so that fop will generate the PDF correctly?

/usr/local/Cellar/docbook-xsl/1.79.1/docbook-xsl/epub/docbook.xsl
/usr/local/Cellar/docbook-xsl/1.79.1/docbook-xsl/epub3/docbook.xsl
/usr/local/Cellar/docbook-xsl/1.79.1/docbook-xsl/epub3/profile-docbook.xsl
/usr/local/Cellar/docbook-xsl/1.79.1/docbook-xsl/fo/docbook.xsl
/usr/local/Cellar/docbook-xsl/1.79.1/docbook-xsl/fo/profile-docbook.xsl
/usr/local/Cellar/docbook-xsl/1.79.1/docbook-xsl/html/docbook.xsl
/usr/local/Cellar/docbook-xsl/1.79.1/docbook-xsl/html/profile-docbook.xsl
/usr/local/Cellar/docbook-xsl/1.79.1/docbook-xsl/manpages/docbook.xsl
/usr/local/Cellar/docbook-xsl/1.79.1/docbook-xsl/manpages/profile-docbook.xsl
/usr/local/Cellar/docbook-xsl/1.79.1/docbook-xsl/xhtml/docbook.xsl
/usr/local/Cellar/docbook-xsl/1.79.1/docbook-xsl/xhtml/profile-docbook.xsl
/usr/local/Cellar/docbook-xsl/1.79.1/docbook-xsl/xhtml-1_1/docbook.xsl
/usr/local/Cellar/docbook-xsl/1.79.1/docbook-xsl/xhtml-1_1/profile-docbook.xsl
/usr/local/Cellar/docbook-xsl/1.79.1/docbook-xsl/xhtml5/docbook.xsl
/usr/local/Cellar/docbook-xsl/1.79.1/docbook-xsl/xhtml5/profile-docbook.xsl
/usr/local/Cellar/docbook-xsl/1.79.1/docbook-xsl/xhtml5/xhtml-docbook.xsl
/usr/local/Cellar/docbook-xsl/1.79.1/docbook-xsl/xhtml5/xhtml-profile-docbook.xsl
/usr/local/Cellar/docbook-xsl/1.79.1/docbook-xsl-ns/epub/docbook.xsl
/usr/local/Cellar/docbook-xsl/1.79.1/docbook-xsl-ns/epub3/docbook.xsl
/usr/local/Cellar/docbook-xsl/1.79.1/docbook-xsl-ns/epub3/profile-docbook.xsl
/usr/local/Cellar/docbook-xsl/1.79.1/docbook-xsl-ns/fo/docbook.xsl
/usr/local/Cellar/docbook-xsl/1.79.1/docbook-xsl-ns/fo/profile-docbook.xsl
/usr/local/Cellar/docbook-xsl/1.79.1/docbook-xsl-ns/html/docbook.xsl
/usr/local/Cellar/docbook-xsl/1.79.1/docbook-xsl-ns/html/profile-docbook.xsl
/usr/local/Cellar/docbook-xsl/1.79.1/docbook-xsl-ns/manpages/docbook.xsl
/usr/local/Cellar/docbook-xsl/1.79.1/docbook-xsl-ns/manpages/profile-docbook.xsl
find: docbook: No such file or directory
/usr/local/Cellar/docbook-xsl/1.79.1/docbook-xsl-ns/xhtml/docbook.xsl
/usr/local/Cellar/docbook-xsl/1.79.1/docbook-xsl-ns/xhtml/profile-docbook.xsl
/usr/local/Cellar/docbook-xsl/1.79.1/docbook-xsl-ns/xhtml-1_1/docbook.xsl
/usr/local/Cellar/docbook-xsl/1.79.1/docbook-xsl-ns/xhtml-1_1/profile-docbook.xsl
/usr/local/Cellar/docbook-xsl/1.79.1/docbook-xsl-ns/xhtml5/docbook.xsl
/usr/local/Cellar/docbook-xsl/1.79.1/docbook-xsl-ns/xhtml5/profile-docbook.xsl
/usr/local/Cellar/docbook-xsl/1.79.1/docbook-xsl-ns/xhtml5/xhtml-docbook.xsl
/usr/local/Cellar/docbook-xsl/1.79.1/docbook-xsl-ns/xhtml5/xhtml-profile-docbook.xsl
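
In case it helps, the fo/docbook.xsl from the Homebrew docbook-xsl package listed above looks like the XSL-FO stylesheet fop expects, so a manual invocation along these lines may work (a sketch, assuming the Homebrew install paths shown above):

# Point fop at Homebrew's DocBook XSL-FO stylesheet instead of the
# /usr/share/xml path hard-coded in the Makefile.
fop -xml doc/example.docbook \
    -xsl "$(brew --prefix docbook-xsl)/docbook-xsl/fo/docbook.xsl" \
    -param use.extensions 0 \
    -param fop1.extensions 1 \
    -param paper.type A4 \
    -param page.orientation landscape \
    -pdf doc/example.pdf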

Include comment at top of .proto file

Feature request:

If a proto file contains a "/** ... */" comment at the top of the file, that comment should be included in the generated documentation.

Enhancement - detect comments

https://github.com/tensorflow/tensorflow/blob/master/tensorflow/core/example/example.proto

spits out -

# Protocol Documentation

Table of Contents

Top

example.proto

### Example

| Field | Type | Label | Description |
| ----- | ---- | ----- | ----------- |
| features | Features | optional |  |

### SequenceExample

| Field | Type | Label | Description |
| ----- | ---- | ----- | ----------- |
| context | Features | optional |  |
| feature_lists | FeatureLists | optional |  |

## Scalar Value Types

| .proto Type | Notes | C++ Type | Java Type | Python Type |
| ----------- | ----- | -------- | --------- | ----------- |
| double |  | double | double | float |
| float |  | float | float | float |
| int32 | Uses variable-length encoding. Inefficient for encoding negative numbers – if your field is likely to have negative values, use sint32 instead. | int32 | int | int |
| int64 | Uses variable-length encoding. Inefficient for encoding negative numbers – if your field is likely to have negative values, use sint64 instead. | int64 | long | int/long |
| uint32 | Uses variable-length encoding. | uint32 | int | int/long |
| uint64 | Uses variable-length encoding. | uint64 | long | int/long |
| sint32 | Uses variable-length encoding. Signed int value. These more efficiently encode negative numbers than regular int32s. | int32 | int | int |
| sint64 | Uses variable-length encoding. Signed int value. These more efficiently encode negative numbers than regular int64s. | int64 | long | int/long |
| fixed32 | Always four bytes. More efficient than uint32 if values are often greater than 2^28. | uint32 | int | int |
| fixed64 | Always eight bytes. More efficient than uint64 if values are often greater than 2^56. | uint64 | long | int/long |
| sfixed32 | Always four bytes. | int32 | int | int |
| sfixed64 | Always eight bytes. | int64 | long | int/long |
| bool |  | bool | boolean | boolean |
| string | A string must always contain UTF-8 encoded or 7-bit ASCII text. | string | String | str/unicode |
| bytes | May contain any arbitrary sequence of bytes. | string | ByteString | str |

but if you look at the comments, there is more data/text in the proto file's comments that could be surfaced.

Binary for CentOS

I have been looking for a good protobuf documentation generator tool. Can you provide binaries for CentOS?

does not capture comments or default values

Protobuf comments use a double forward slash (//), not the "triple" forward slash (///) expected by the current protoc-gen-doc. As a result, none of my comments end up in the "description" fields. In addition, default values are also ignored.

Message enums are included in file_enums

Enums nested inside a message definition should be included as part of a new key "message_enums" instead of being included at the top file level "file_enums".

enum FileEnum {
    A = 1;
    B = 2;
}

message MyMessage {
    enum MessageEnum {
        C = 3;
        D = 4;
    }
}
This currently generates:

[
    {
        "file_description": "",
        "file_enums": [
            {
                "enum_description": "",
                "enum_full_name": "FileEnum",
                "enum_long_name": "FileEnum",
                "enum_name": "FileEnum",
                "enum_values": [
                    {
                        "value_description": "",
                        "value_name": "A",
                        "value_number": 1
                    },
                    {
                        "value_description": "",
                        "value_name": "B",
                        "value_number": 2
                    }
                ]
            },
            {
                "enum_description": "",
                "enum_full_name": "MyMessage.MessageEnum",
                "enum_long_name": "MyMessage.MessageEnum",
                "enum_name": "MessageEnum",
                "enum_values": [
                    {
                        "value_description": "",
                        "value_name": "C",
                        "value_number": 3
                    },
                    {
                        "value_description": "",
                        "value_name": "D",
                        "value_number": 4
                    }
                ]
            }
        ],
        "file_extensions": [
        ],
        "file_has_extensions": false,
        "file_has_services": false,
        "file_messages": [
            {
                "message_description": "",
                "message_extensions": [
                ],
                "message_fields": [
                ],
                "message_full_name": "MyMessage",
                "message_has_extensions": false,
                "message_long_name": "MyMessage",
                "message_name": "MyMessage"
            }
        ],
        "file_name": "test.proto",
        "file_package": "",
        "file_services": [
        ]
    }
]

File exclusion of imports

We have google.protobuf.Empty (empty.proto) included in many of our files. We're using protobuf 3, but with proto2 syntax. This include from Google is in proto3 syntax, and contains options that reference plugins not available on all platforms. Therefore, we want to exclude it from documentation. There doesn't seem to be a way to flag an import statement as "don't follow" or "exclude." Can this feature be added?

File exclusion should happen before parsing

As in issue #74: We have google.protobuf.Empty (empty.proto) included in many of our files. We're using protobuf 3, but with proto2 syntax. This include from Google is in proto3 syntax, and contains options that reference plugins not available on all platforms. Therefore, we want to exclude it from documentation.

When we put ///@exclude at the top of the file, it was excluded from the documentation. However, the parser still crashed on syntax="proto3";. When we changed this to proto2, it then complained about unsupported options. We commented out options until the doc_out command ran successfully, and the file was excluded. Why is the parser insisting on parsing a file it's been instructed to exclude? Can we move the exclusion check earlier so that a file excluded due to incompatible syntax won't crash the whole job?

graphviz dot

I wrote a Mustache template for outputting a Graphviz dot file for the proto. It's pretty rough, but I thought others who come by might be interested in a graphical representation of their proto, as I was. If there is interest, I can open a PR to add it.
dot.txt

The new build does not take effect

Step 1: I forked this repository and modified the function in main.cpp so that it uses single-line (//) documentation comments in place of the original single-line (///) documentation comments.

Step 2: I built a protoc-gen-doc binary from my git source code.

Step 3: I tried to generate the HTML. Generation succeeds, but the documentation comments are not displayed.

sudo protoc-gen-doc --doc_out=html,proto.html:./ ./*.proto

Q1: How should the code be modified correctly? Thanks!
Background: my team's project has lots of proto files, and the comments are written as single-line (//) comments.

message AddFinancialPlanSaleResponse
{
    optional    proto.basic.ResponseBasic   basic               = 1; //demo field
    optional    uint64                      sale_id             = 2;  //sold id
}

Here is my modified code:

while (!stream.atEnd()) {
        QString line = stream.readLine().trimmed();
        if (line.isEmpty()) {
            continue;
        } else if (line.startsWith("//")) {
            while (!stream.atEnd() && line.startsWith("//")) {
                description += line.mid(line.startsWith("// ") ? 3 : 2) + '\n';
                line = stream.readLine().trimmed();
            }
            description = description.left(description.size() - 1);
        } else if (line.startsWith("/**") && !line.startsWith("/***/")) {
            line = line.mid(2);
            int start, end;
            while ((end = line.indexOf("*/")) == -1) {
                start = 0;
                if (line.startsWith("*")) ++start;
                if (line.startsWith("* ")) ++start;
                description += line.mid(start) + '\n';
                line = stream.readLine().trimmed();
            }
            start = 0;
            if (line.startsWith("*") && !line.startsWith("*/")) ++start;
            if (line.startsWith("* ")) ++start;
            description += line.mid(start, end - start);
        }
        break;
    }

import without package namespace not working

I have a setup like this:

proto files are stored in src/main/foo/bar/User.proto and src/main/foo/bar/Name.proto

//User.proto
import "Name.proto";
package foo.bar;
option java_package = "foo.bar";
option java_outer_classname = "UserProtos";
message User {
  ...
}

//Name.proto
package foo.bar;
option java_package = "foo.bar";
option java_outer_classname = "NameProtos";
message Name {
  ...
}

Then, when I compile the protos, I use this and it compiles fine:

protoc --java_out=src/main/java \
-Isrc/main/foo/bar/ \
src/main/foo/bar/User.proto \
src/main/foo/bar/Name.proto

When I use protoc-gen-doc on this setup, it gives an error about not being able to find the proto files to import. I tried specifying src/main/foo/bar/ as the import search path, but protoc-gen-doc does not seem to support that option.

markdown.template lists are not correctly formatted

In the markdown template there is a small bug that leads to incorrectly formatted lists.

This is also visible in the markdown example.

In the current output, the nested Table of Contents entries are not rendered as a nested list, but they should be. The template can be fixed by simply adding one additional space of indentation in front of the nested list entries inside the {{#files}} section of templates/markdown.mustache.

I tried to create a custom template based on templates/markdown.mustache with the change described above, but then the output is missing the whole {{#message_fields}} section.

Thank you.

Support for HTML links inside comments

I am using protoc-gen-doc to generate HTML documentation. I would like to add HTML links in the comments that appear as clickable links in the documentation.

Example:

required int64 myValue = 1;  /// More info on http://www.google.com

Unfortunately the link is not clickable in the resulting documentation. So I tried to add HTML code in the comment like

required int64 myValue = 1;  /// More info on <a href="http://www.google.com">Google</a>

but that gets escaped and also does not work.

Is there any other way in order to get clickable links?

Message/Field descriptions not generated on Windows

The latest Windows release generates only the file description, but not the message or field descriptions. The result is the same with the latest AppVeyor CI .zip version.

E.g. for the booking.proto in the examples, the following markdown is generated (note that no descriptions appear for messages and fields):

Protocol Documentation

Table of Contents

Top

booking.proto

Booking related messages.

This file is really just an example. The data model is completely
fictional.

Author: Elvis Stansvik

### Booking

| Field | Type | Label | Description |
| ----- | ---- | ----- | ----------- |
| vehicle_id | int32 | required |  |
| customer_id | int32 | required |  |
| status | BookingStatus | required |  |
| confirmation_sent | bool | required |  |
| payment_received | bool | required |  |

### BookingStatus

| Field | Type | Label | Description |
| ----- | ---- | ----- | ----------- |
| id | int32 | required |  |
| description | string | required |  |

### BookingService

| Method Name | Request Type | Response Type | Description |
| ----------- | ------------ | ------------- | ----------- |
| BookVehicle | Booking | BookingStatus |  |

## Scalar Value Types

| .proto Type | Notes | C++ Type | Java Type | Python Type |
| ----------- | ----- | -------- | --------- | ----------- |
| double |  | double | double | float |
| float |  | float | float | float |
| int32 | Uses variable-length encoding. Inefficient for encoding negative numbers – if your field is likely to have negative values, use sint32 instead. | int32 | int | int |
| int64 | Uses variable-length encoding. Inefficient for encoding negative numbers – if your field is likely to have negative values, use sint64 instead. | int64 | long | int/long |
| uint32 | Uses variable-length encoding. | uint32 | int | int/long |
| uint64 | Uses variable-length encoding. | uint64 | long | int/long |
| sint32 | Uses variable-length encoding. Signed int value. These more efficiently encode negative numbers than regular int32s. | int32 | int | int |
| sint64 | Uses variable-length encoding. Signed int value. These more efficiently encode negative numbers than regular int64s. | int64 | long | int/long |
| fixed32 | Always four bytes. More efficient than uint32 if values are often greater than 2^28. | uint32 | int | int |
| fixed64 | Always eight bytes. More efficient than uint64 if values are often greater than 2^56. | uint64 | long | int/long |
| sfixed32 | Always four bytes. | int32 | int | int |
| sfixed64 | Always eight bytes. | int64 | long | int/long |
| bool |  | bool | boolean | boolean |
| string | A string must always contain UTF-8 encoded or 7-bit ASCII text. | string | String | str/unicode |
| bytes | May contain any arbitrary sequence of bytes. | string | ByteString | str |
