gorilla / http
Package gorilla/http is an alternative HTTP client implementation for Go.
Home Page: https://gorilla.github.io
License: BSD 3-Clause "New" or "Revised" License
Client should grow the ability to transparently handle redirects internally
A good HTTP client should also support retries. For example, users could configure a custom retry policy:
maxRetryNum
retryStatusCodes
interval
etc.
I compiled the curl example and found that it doesn't work for some URLs. I'm wondering if I missed anything (./curl is the example executable):
$ ./curl https://google.com      // works
$ ./curl https://api.github.com  // doesn't work
2013/10/15 06:39:51 unable to fetch "https://api.github.com": dial tcp 192.30.252.137:80: connection refused
$ ./curl https://github.com      // hangs forever
$ curl https://api.github.com    // works
It seems that passing a bytes.Reader causes an invalid request to be written to the wire, which hangs the handler. The test case below demonstrates this issue.
package http

import (
	"bytes"
	"io/ioutil"
	"net/http"
	"net/http/httptest"
	"strings"
	"testing"
)

func TestChunked_gorilla_bytes(t *testing.T) {
	t.Parallel()
	postdata := []byte("FOO_bytes")
	var gotPostdata []byte
	handled := make(chan struct{}, 1)
	mux := http.NewServeMux()
	mux.Handle("/", http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		gotPostdata, _ = ioutil.ReadAll(r.Body)
		if !bytes.Equal(postdata, gotPostdata) {
			t.Errorf("want postdata == %q, got %q", postdata, gotPostdata)
		}
		handled <- struct{}{}
	}))
	server := httptest.NewServer(mux)
	defer server.Close()
	err := Post(server.URL, bytes.NewReader(postdata))
	if err != nil {
		t.Error("Post: ", err)
	}
	<-handled
}

func TestChunked_gorilla_string(t *testing.T) {
	t.Parallel()
	postdata := []byte("FOO_string")
	var gotPostdata []byte
	handled := make(chan struct{}, 1)
	mux := http.NewServeMux()
	mux.Handle("/", http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		gotPostdata, _ = ioutil.ReadAll(r.Body)
		if !bytes.Equal(postdata, gotPostdata) {
			t.Errorf("want postdata == %q, got %q", postdata, gotPostdata)
		}
		handled <- struct{}{}
	}))
	server := httptest.NewServer(mux)
	defer server.Close()
	err := Post(server.URL, strings.NewReader(string(postdata)))
	if err != nil {
		t.Error("Post: ", err)
	}
	<-handled
}

func TestChunked_stdlib(t *testing.T) {
	t.Parallel()
	postdata := []byte("FOO_stdlib")
	var gotPostdata []byte
	handled := make(chan struct{}, 1)
	mux := http.NewServeMux()
	mux.Handle("/", http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		gotPostdata, _ = ioutil.ReadAll(r.Body)
		if !bytes.Equal(postdata, gotPostdata) {
			t.Errorf("want postdata == %q, got %q", postdata, gotPostdata)
		}
		handled <- struct{}{}
	}))
	server := httptest.NewServer(mux)
	defer server.Close()
	resp, err := http.Post(server.URL, "text/plain", bytes.NewReader(postdata))
	if err != nil {
		t.Error("Post: ", err)
	}
	defer resp.Body.Close()
	<-handled
}
Running it produces:
$ go test -test.run=TestChunked -test.v -test.p=8
=== RUN TestChunked_gorilla_bytes-8
=== RUN TestChunked_gorilla_string-8
=== RUN TestChunked_stdlib-8
--- PASS: TestChunked_gorilla_string-8 (0.00 seconds)
--- PASS: TestChunked_stdlib-8 (0.00 seconds)
... (hanging on TestChunked_gorilla_bytes) ...
^Cexit status 2
FAIL github.com/sqs/http 0.678s
I tried debugging the use of ChunkedWriter but I couldn't figure out the problem. Any pointers?
As suggested by @davecheney, the following could be added to the http package:
type StatusError Status
func (s *StatusError) Error() string
This would allow us to convert a status code into an error.
All the infrastructure is there: a buffer should be inserted when the writer moves to the request-line phase and flushed when StartBody() is called.
Currently, if you want to receive gzip'd content you need to pass a header to Get/Do. This should be made a property of the Client and should be enabled for the DefaultClient.
If the application calls
client.Get("http://example.com/%2f")
then the package should write
GET /%2f HTTP/1.1
to the network. The standard http package writes
GET // HTTP/1.1
to the network. This is rarely what an application wants.
The standard http package generates the request URI using the following lines of code:
u, err := url.Parse(urlStr)
...
req.URL = u
...
ruri := req.URL.RequestURI()
Because url.Parse() decodes escape sequences, the RequestURI() method cannot recover the URI as specified by the application.
A lot of REST APIs will spit out an error reason in the body of the response and return a 400 or 406 status code. Would it not make sense to expose this somehow?
if _, err := http.Get(os.Stdout, "http://localhost:8000/"); err != nil {
	fmt.Println(body) // ?
	panic(err)
}
Provide a knob for enabling the Expect: 100-continue header and the corresponding handling of the request body.
I want to use Sourcegraph for http code search, browsing, and usage examples. Can an admin enable Sourcegraph for this repository? Just go to https://sourcegraph.com/github.com/gorilla/http. (It should only take 30 seconds.)
Thank you!
On the server, an application receives data from a peer using io.Reader and sends data to the peer using an io.Writer. On the client, we handle send differently. The application provides an io.Reader that is copied to the peer.
I think the server model for sending data is conceptually easier to work with. The application drives the generation of data instead of being called to generate the data.
The break with net/http.Request opens up the possibility of using an io.Writer for sending data to the peer. Here's what the code might look like:
requestBody, err := client.Start("POST", "http://example.com/api/endpoint", headers)
err = json.NewEncoder(requestBody).Encode(someStruct)
status, headers, responseBody, err := requestBody.Finish()
Unfortunately, the Start / Finish method calls are more complicated than Do. This extra complication is very noticeable for GET requests and in cases where the application already has the request body on hand as an io.Reader, []byte or a string.
I cannot think of a way to make this palatable. I am throwing this out in case one of you has some ideas about it.
Suggestion from dsal for a Conditional GET helper
type ReaderMaker func() (io.Reader, error)
// ConditionalGet writes the body of url to w if the server provides a 200 response.
// If the response is 304, f is called to provide an io.Reader which will provide the body.
// If f is nil, and a 304 is encountered, StatusError{304} will be returned.
func ConditionalGet(w io.Writer, url, etag string, modtime time.Time, f ReaderMaker) (int64, error)
Have the authors considered using golang.org/x/net/context as a mechanism for controlling timeouts? The idea would be that the caller would pass in a context.Context, and the implementation would cancel in-flight requests when the <-chan struct{} returned by ctx.Done() fires.
For example, Do would add a context.Context as its first argument:
func (c *Client) Do(ctx context.Context, method, url string, headers map[string][]string, body io.Reader) (client.Status, map[string][]string, io.ReadCloser, error)
And usage would look like:
// spend no more than 1 second from request initiation to response return
ctx, cancel := context.WithTimeout(context.Background(), 1*time.Second)
defer cancel()
_, err := http.Do(ctx, "POST", "http://www.gorillatoolkit.org/", make(map[string][]string), strings.NewReader("POST body goes here"))
// ...
I'm happy to submit a proposal pull request for this if interested, wanted to float the idea here first.
Reference material on context: https://blog.golang.org/context (see the bottom, near the httpDo func).
Authenticators to support, in order of importance / difficulty
If an application can specify the read deadline when reading the response body, then the application can easily detect dropped connections for Twitter streaming endpoints and similar services.
The following snippet of code shows how an application could use the feature. The Twitter streaming endpoints write a blank keep-alive line every 30 seconds.
scanner := bufio.NewScanner(body)
body.SetReadDeadline(time.Now().Add(40 * time.Second))
for scanner.Scan() {
	p := scanner.Bytes()
	if len(p) > 0 {
		processTweet(p)
	}
	body.SetReadDeadline(time.Now().Add(40 * time.Second))
}
var DefaultClient = Client{
	dialer:          new(dialer),
	FollowRedirects: true,
}
why not
var DefaultClient = Client{
	dialer:          new(dialer),
	FollowRedirects: true,
	Jar:             Cookiesjar,
}
In the Client struct:
type Client struct {
	dialer Dialer
	// FollowRedirects instructs the client to follow 301/302 redirects when idempotent.
	FollowRedirects bool
}
The dialer attribute is not exported, it is not set by the library, and there is no way to set it from outside the package. However, it is referenced in the code, which causes the application to crash with a nil pointer dereference.
P.S. Seriously, I see this project is not developed anymore, but you could at least remove it from your website if it contains bugs like this (especially when the code comments encourage people to create their own client, which is not actually possible).
The current http.conn and http.dialer implementations do not reuse connections. The connection is closed, rather than being pooled, after each request.