cbonte / haproxy-dconv
HAProxy documentation converter
Home Page: http://cbonte.github.io/haproxy-dconv/
License: Apache License 2.0
Unfortunately the sidebar blocks out content and also has CSS z-index irregularities. Therefore it's currently impossible, or at least very hard, to browse the docs on iPhones.
First of all, thank you so much for providing these docs. I'm a frequent user for years and they provide great value.
Since 1.6, the CSV and socket documentation seems to have moved from configuration.txt to management.txt in HAProxy. So far I usually open the 1.5 documentation to quickly look up a definition. It would be very helpful to have access to a nicely rendered version of the 1.6 documentation as well. Is there any chance you could parse management.txt too? I couldn't find it under https://cbonte.github.io/haproxy-dconv/management-1.6.html or linked in the menu.
This applies to v2.2-dev.
It's not an issue but a feature request. Would it be possible to implement an export to PDF?
Thanks ;)
Please also export the other documentation files:
Please also provide the documentation in EPUB format for e-readers.
I tried pandoc, but the result is not usable because it crops the right-hand part of the sentences:
pandoc -f html -t epub3 -o intro.epub intro.html
Currently, the parser can't detect keywords or ACLs with a '+' inside.
This prevents an ACL like 'base32+src' from being referenced and searchable.
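For illustration, here is a minimal sketch of the kind of fix this would need. The patterns below are assumptions for the sake of the example, not dconv's actual keyword regex: a character class that omits '+' stops matching at the plus sign, so 'base32+src' is only partially recognized.

```python
import re

# Hypothetical keyword patterns; dconv's real regex may differ.
# The old class stops at '+', truncating 'base32+src' to 'base32'.
OLD_KEYWORD = re.compile(r"[a-z][a-z0-9_.-]*")
# Adding '+' to the class lets the full ACL name match.
NEW_KEYWORD = re.compile(r"[a-z][a-z0-9_.+-]*")

print(OLD_KEYWORD.match("base32+src").group())  # base32
print(NEW_KEYWORD.match("base32+src").group())  # base32+src
```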
Hello.
I found a mistake on the manual page:
http://cbonte.github.io/haproxy-dconv/configuration-1.5.html#9.2-shutdown%20sessions
The command "shutdown sessions /" needs to be
"shutdown sessions server /".
I'm seeing the 'status code' as -1 in haproxy logs, whereas the documentation specifies:
"The status code is always 3-digit."
I do see the 'normal' result codes, but I also see a lot of -1's.
Add a minimal robots.txt and generate a sitemap.xml based on the URL list shown on the start page.
That way, all searches about HAProxy configuration on Google/Bing will point correctly to these pages, because the documentation will be indexed.
Minimal robots.txt:
User-agent: *
Disallow:
Sitemap: https://cbonte.github.io/haproxy-dconv/sitemap.xml
Sample sitemap (generate it at the same time as the head page, https://cbonte.github.io/haproxy-dconv/):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<url>
<loc>https://cbonte.github.io/haproxy-dconv/</loc>
</url>
<url>
<loc>https://cbonte.github.io/haproxy-dconv/2.6/intro.html</loc>
</url>
<url>
<loc>https://cbonte.github.io/haproxy-dconv/2.6/configuration.html</loc>
</url>
<url>
<loc>https://cbonte.github.io/haproxy-dconv/2.6/management.html</loc>
</url>
<url>
<loc>https://cbonte.github.io/haproxy-dconv/2.5/intro.html</loc>
</url>
<url>
<loc>https://cbonte.github.io/haproxy-dconv/2.5/configuration.html</loc>
</url>
<url>
<loc>https://cbonte.github.io/haproxy-dconv/2.5/management.html</loc>
</url>
....
</urlset>
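A sitemap like the one above could be generated from the same URL list as the start page. Here is a minimal sketch; the version and file lists are illustrative assumptions, not the converter's actual data source:

```python
# Illustrative version/file lists; the real generator would reuse
# the URL list that already drives the start page.
versions = ["2.6", "2.5"]
files = ["intro.html", "configuration.html", "management.html"]
base = "https://cbonte.github.io/haproxy-dconv/"

urls = [base] + [f"{base}{v}/{f}" for v in versions for f in files]

lines = ['<?xml version="1.0" encoding="UTF-8"?>',
         '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
lines += [f"  <url><loc>{u}</loc></url>" for u in urls]
lines.append("</urlset>")
sitemap = "\n".join(lines)
print(sitemap)
```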
Also, use https in place of http everywhere in the documentation and in the About page (Google prefers https):
https://cbonte.github.io/haproxy-dconv/
instead of http://cbonte.github.io/haproxy-dconv/
Finally, submit the sitemap URL (https://cbonte.github.io/haproxy-dconv/sitemap.xml) to the Google/Bing search consoles:
The "What HAProxy is and isn't" section of the manual (http://cbonte.github.io/haproxy-dconv/2.1/intro.html#3.1) describes things HAProxy does not perform (which is fine). However, one of the sentences there is pretty puzzling for the beginner I am :
HAProxy is not :
- a packet-based load balancer : it will not see IP packets nor UDP datagrams, will not perform NAT or even less DSR.
Not knowing what "DSR" is, I asked DuckDuckGo, which gave me 2 definitions:
Considering "DSR" is something HAProxy does not perform, I didn't expect it to be referred to further in the manual, but actually found these (possibly obsolete) :
Could you please clarify this "...will not perform NAT or even less DSR" sentence?
The following example is in the manual for version 1.5.16:
Append 'www.' prefix in front of all hosts not having it
http-request redirect code 301 location www.%[hdr(host)]%[req.uri]
unless { hdr_beg(host) -i www }
This is wrong in two ways:
1: req.uri should be capture.req.uri
2: http:// or https:// should be prepended to avoid a redirect loop with a relative URL.
http-request redirect code 301 location http://www.%[hdr(host)]%[capture.req.uri]
unless { hdr_beg(host) -i www }
If you look at https://cbonte.github.io/haproxy-dconv/ you see a list of all documented HAProxy branches. 2.1 is listed as 2.1-dev, even though it's been released for a while now.
In Configuration, section 2, "2.4 Exemples" should be "2.5 Exemples".
In the HAProxy documentation it is specified that:
Currently, HAProxy is capable of
generating codes 200, 400, 401, 403, 404, 405, 407, 408, 410,
413, 425, 429, 500, 501, 502, 503, and 504.
But further down in https://www.haproxy.com/documentation/haproxy-configuration-manual/latest/#1.4.1 it is specified only:
HAProxy may emit the following status codes by itself :
....
410 when the requested resource is no longer available and will not
be available again
500 when HAProxy encounters an unrecoverable internal error, such as a
memory allocation failure, which should never happen
....
Moreover, no information is provided about when 413 is generated. However, HAProxy can and will return 413 in some cases:
2024 IP https~ https/<NOSRV> -1/-1/-1/-1/0 413 0 - - PR-- 10/10/0/0/0 0/0 "<BADREQ>"
The only way to find out more about the 413 code is to look at the HAProxy source code, where more information is provided along with a configuration option for a potential fix:
/* Reject HTTP/1.0 GET/HEAD/DELETE requests with a payload except if
 * accept_payload_with_any_method global option is set.
 * There is a payload if the C-L header is not null or the payload is
 * chunk-encoded. A parsing error is reported but a
 * 413-Payload-Too-Large is returned instead of a 400-Bad-Request.
 */
if (!accept_payload_with_any_method &&
    !(h1m->flags & (H1_MF_RESP|H1_MF_VER_11)) &&
    (((h1m->flags & H1_MF_CLEN) && h1m->body_len) || (h1m->flags & H1_MF_CHNK)) &&
    (h1sl.rq.meth == HTTP_METH_GET || h1sl.rq.meth == HTTP_METH_HEAD || h1sl.rq.meth == HTTP_METH_DELETE)) {
	h1s->flags |= H1S_F_PARSING_ERROR;
	htx->flags |= HTX_FL_PARSING_ERROR;
	h1s->h1c->errcode = 413;
}
Can you add information about 413 code into documentation?
My version is 1.5.4.
I enabled the multi-process model with a number of stats sockets, allocated as follows:
nbproc 4
ulimit-n 231097
stats bind-process 1
stats socket /tmp/haproxy1.sock level admin
stats bind-process 2
stats socket /tmp/haproxy2.sock level admin
stats bind-process 3
stats socket /tmp/haproxy3.sock level admin
stats bind-process 4
stats socket /tmp/haproxy4.sock level admin
I would like to know: when I fetch the idle value from one of these sockets, is it the idle value of that individual process or of HAProxy as a whole?
The server name in this example should probably be imap rather than mail:
use-server imap if { req_ssl_sni -i imap.example.com }
server mail 192.168.0.1:993 weight 0
becomes:
use-server imap if { req_ssl_sni -i imap.example.com }
server imap 192.168.0.1:993 weight 0
Apparently dconv has issues with the "Quoting and escaping" chapter. It is cut off at the table:
https://cbonte.github.io/haproxy-dconv/2.5/configuration.html#2.2
Discussed here: incorrectly displayed opening single quotes
At HAProxy 2.6 - Configuration Manual, pretty much all opening (or rather the first on the line) single quote characters in any examples are shown as ' instead of '.
Examples:
.diag "WTA/2021-05-07: replace 'redirect' with 'return' after switch to 2.4"
7.3.1. Converters - regsub(<regex>,<subst>[,<flags>])
http-request set-header x-path "%[hdr(x-path),regsub('/+','/','g')]"
http-request set-header x-query "%[query,regsub([?;&]*,'')]"
http-request redirect location %[url,'regsub("(foo|bar)([0-9]+)?","\2\1",i)']
etc. (just search for the ' string)
Compare that with the text version of the doc http://www.haproxy.org/download/2.6/doc/configuration.txt where no such issue can be seen.
This also affects older doc versions.
Can this be fixed so it wouldn’t be confusing?
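For illustration only, here is a sketch of the kind of typographic pass that could produce this symptom; this is an assumption about the cause, not dconv's actual code. A quote preceded by whitespace or an opening delimiter is treated as an "opening" quote and replaced by a typographic character; the fix would be to skip this replacement inside example/code blocks entirely.

```python
import re

# Hypothetical smart-quote pass: replace a straight quote that
# follows whitespace, '(', ',' or '"' with a typographic quote.
# Applied to config examples, this corrupts the opening quotes
# of regsub() arguments, as described in the report above.
def smart_quotes(text):
    return re.sub(r"(?<=[\s(,\"])'", "\u2019", text)

print(smart_quotes("regsub('/+','/','g')"))
```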
For my project, it must be possible to check arbitrary bytes from \x00 to \xFF via tcp-check expect rstring. But it is not possible to check NULL bytes: everything after the first NULL byte is ignored.
My test configuration:
tcp-check send-binary 9C00000800870100
tcp-check expect rstring ^\x9a\x00\x00.{5}\x00{13}\x10\x00\x04
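As a sanity check, the expect pattern itself is fine at the regex level. The sketch below applies it to raw bytes in Python: \x00 matches without issue, so the truncation described above presumably happens in NUL-terminated string handling, not in the regex engine.

```python
import re

# The rstring pattern from the report, compiled against raw bytes.
pattern = re.compile(rb"^\x9a\x00\x00.{5}\x00{13}\x10\x00\x04", re.DOTALL)
# A payload shaped like the expected response: 3 header bytes,
# 5 arbitrary bytes, 13 NUL bytes, then a 3-byte trailer.
payload = b"\x9a\x00\x00ABCDE" + b"\x00" * 13 + b"\x10\x00\x04"
print(pattern.match(payload) is not None)  # True
```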
It would be amazing if HAProxy also supported load balancing of UDP packets.
HAProxy would then be the perfect load balancer for products like Graylog (load balancing syslog UDP messages, for example; many devices don't support syslog over TCP).
NGINX can do it, but it does not support health checks in the free version, and the paid version is too expensive for just this feature.
Sorry if this is not the right place for feature requests.
The unset-var(<var name>) entry in 7.3.1. Converters is unparsed (perhaps because of the space character in the variable name).
Currently, each txt file generates only one corresponding HTML page, which is unfriendly at times.
For example, when using browser translation, the translation is very slow because of the large amount of text.
Could we add an option (or make it the default) to convert each section to a separate page?
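As a rough illustration of the idea, a converted text could be split into one chunk per top-level section, each chunk then becoming its own page. The heading format assumed below (a number followed by a dot at line start) is an assumption about the document layout, not dconv's actual parsing:

```python
import re

# Split a document into per-section chunks keyed by section number.
def split_sections(text):
    parts = re.split(r"(?m)^(\d+)\. ", text)
    it = iter(parts[1:])  # parts[0] is the preamble before section 1
    return {num: body for num, body in zip(it, it)}

doc = "preamble\n1. Quick reminder\nHTTP basics...\n2. Configuring HAProxy\nsyntax..."
print(sorted(split_sections(doc)))  # ['1', '2']
```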
Hello Cyril,
You do not list the Python dependencies needed to use your project.
I found that at least Mako should be installed with pip.
Olivier
The currently generated manual for 1.4 documents redirect scheme,
which is only released in HAProxy 1.5. In the 1.4 branch, it is only available in master but has not yet been released.
The following files do NOT document this option:
configuration.txt
It is, however, documented and implemented in the current master of the haproxy-1.4 branch. Having the option documented here now is rather surprising, given that most people won't find it in the code version they use.
Thus, at least the 1.4 docs should be generated from the latest released version, not from the master branch.
References:
As the title says.
Would it be possible to have some kind of hint showing which section of the document I'm currently viewing? For example, if I'm currently scrolled to the be_server_timeout documentation, a hint somewhere could tell me I am in section 7.3.3, "Fetching samples at layer 4".
In the section about capture request header <name> len <length>, at the end there is the following example:
capture request header Host len 15
capture request header X-Forwarded-For len 15
capture request header Referrer len 15
The last line is wrong, as the header name should be Referer (i.e. the historically misspelled version with a single r, instead of the correct English spelling with a double r).
(Later on in the documentation, it is written "correctly" with a single r.)
If a keyword contains spaces, e.g. option forwardfor, the generated linked anchor, as well as the <a name="…"> links, contain spaces. This at least leads to issues when copying and pasting the link.
As a solution, you could either urlencode() the link or replace all non-word characters with an underscore in the anchors.
I'm really sorry for not providing a pull request, but I'm not exactly sure where the elements are actually generated...
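Both fixes suggested above can be sketched in a few lines; the function names and anchor scheme here are illustrative, not dconv's actual code:

```python
import re
from urllib.parse import quote

# Option 1: replace each run of non-word characters with an underscore.
def anchor_underscore(keyword):
    return re.sub(r"\W+", "_", keyword)

# Option 2: percent-encode the keyword so the anchor survives copy/paste.
def anchor_urlencode(keyword):
    return quote(keyword)

print(anchor_underscore("option forwardfor"))  # option_forwardfor
print(anchor_urlencode("option forwardfor"))   # option%20forwardfor
```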
haproxy2.0.6
/etc/haproxy/haproxy.cfg
frontend ss-in
timeout client 1m
bind *:10000-50000
default_backend ss-out
backend ss-out
timeout connect 10s
timeout server 1m
server server1 2001:470:c:f56::2 maxconn 20480
haproxy -D -f /etc/haproxy/haproxy.cfg
: parsing [/etc/haproxy/haproxy.cfg:9] : 'server server1' : could not resolve address '2001:470:c:f56:'.
: Failed to initialize server(s) addr.
With a single port it works fine.
Hello,
I've just noticed missing words in a sentence in the converted document. I think it is related to the handling of Example sections.
In the source document, the sentence is in "7.2. Using ACLs to form conditions":
..................., with a space before and after each brace (because
the braces must be seen as independent words). Example :
The following rule :
....................
In the converted HTML, the words "the braces must be seen as independent words)" are absent, while the Example is properly decorated.
Check here: http://cbonte.github.io/haproxy-dconv/configuration-1.5.html#7.2
Maybe there are some other occurrences; I did not check.
Is it possible to fix this?
Thanks.