Comments (28)
No worries.
So libxml likes to hold onto memory for things. Once all the workers are done processing, you should try calling the following:
import "github.com/moovweb/gokogiri/help"
help.LibxmlCleanUpParser()
Note that once you call the above function, you can no longer use gokogiri to parse any more documents. It should clean up a lot of memory, but it renders the parser unusable until your program exits.
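For example, the shutdown path of a worker pool might look like this (a minimal sketch; everything except help.LibxmlCleanUpParser is illustrative):

package main

import (
    "sync"

    "github.com/moovweb/gokogiri/help"
)

func main() {
    urls := []string{ /* ... */ }
    var wg sync.WaitGroup
    for _, u := range urls {
        wg.Add(1)
        go func(u string) {
            defer wg.Done()
            // fetch u and parse it with gokogiri here
        }(u)
    }
    wg.Wait()
    // Only safe once ALL workers are done; nothing can be parsed after
    // this call until the process exits.
    help.LibxmlCleanUpParser()
}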
Right, that explains why my program held onto a few hundred MB after it finished.
So are you saying a long-running process (parsing tens of thousands of large pages) with gokogiri is not possible, because memory will keep growing?
Sorry it took me so long to get back to you. Do you have a copy of the 42k list of URLs you're using? I'd like to replicate the issue if possible to investigate further.
No problem, I linked to the 42K URL list in the original post.
woops, sorry, I'm silly =x
Also, what version of go are you using?
go1.1.1 amd64
Hey @JakeAustwick, sorry, I haven't had too much time to investigate.
However, I did manage to do this:
I ran the program you provided with the URLs you provided, and though the memory did keep climbing for a while, it seemed to stabilize around 300-400MB on my machine. I understand that's a rather high memory footprint for what the program is doing, but my intuition is that Go's garbage collection may not be as aggressive as it could be when cleaning up resources.
Have you tried tweaking the runtime garbage collection parameters?
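For example, something like this (a sketch; debug.SetGCPercent is the programmatic equivalent of the GOGC environment variable):

package main

import (
    "fmt"
    "runtime/debug"
)

func main() {
    // Collect when the heap grows 20% past the live set instead of the
    // default 100%, i.e. the same effect as running with GOGC=20.
    old := debug.SetGCPercent(20)
    fmt.Println("previous GC percent:", old)
}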
I'd happily accept 300-400MB; my problem was that it rose past 1GB and kept rising.
Did you run through the entire list?
I've noticed the memory jump all the way up to 700MB at some points, but after going through the entire list of URLs you provided, it settled down to 430MB, at least on my machine, which makes me think it's a garbage collection issue.
You could try forcing the garbage collector to run at each iteration and see how that affects it; however, that'll probably take quite a while to run =/
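For example (a sketch; debug.FreeOSMemory both forces a collection and returns freed memory to the OS, which makes the numbers easier to interpret):

package main

import (
    "runtime"
    "runtime/debug"
)

func main() {
    for i := 0; i < 10; i++ {
        // ... parse one page here ...
        runtime.GC()         // force a full collection every iteration
        debug.FreeOSMemory() // also return freed memory to the OS
    }
}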
Hmm, I seem to remember my memory going much higher. I'll give it another run and let you know.
@JakeAustwick @mdayaram Did you find a solution to this? I'm looking at doing something similar with gokogiri and have run into similar issues with other libs in the past.
I never continued with this project in Go; I simply rewrote it in Python with gevent and lxml. Sorry.
Unfortunately I was never able to replicate the issue. One thing we never clarified was which operating system @JakeAustwick was running on. I ran all my tests on Linux/Ubuntu; however, I know there are some memory issues on Windows. These issues could also have disappeared with the introduction of go1.2.
As far as I can tell, it seems specific to the environment. My advice for you, @jwarzech, would be to run the test program in this issue with the URLs and see how your memory is handled. If it's doing OK (around the 400MB range), then I wouldn't be concerned about it.
Here at Moovweb we have several production boxes constantly parsing HTML pages with gokogiri without any major memory issues, but all of those boxes are also running Linux and go1.2.
Just for reference, I was running on Ubuntu 12.04, with my version of libxml installed from the Ubuntu repos.
Would implementing this inside the lib help? http://golang.org/pkg/runtime/#SetFinalizer
I'm using valgrind to test and it seems to help so far.
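Roughly what I have in mind (a sketch; it assumes ParseHtml returns an *html.HtmlDocument whose Free method releases the libxml memory):

package example

import (
    "runtime"

    "github.com/moovweb/gokogiri"
    "github.com/moovweb/gokogiri/html"
)

// parseWithFinalizer attaches a finalizer so the document's libxml
// memory is freed once the Go object becomes unreachable.
func parseWithFinalizer(page []byte) (*html.HtmlDocument, error) {
    doc, err := gokogiri.ParseHtml(page)
    if err != nil {
        return nil, err
    }
    runtime.SetFinalizer(doc, func(d *html.HtmlDocument) {
        d.Free() // callers must not also call Free, or it's a double free
    })
    return doc, nil
}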
SetFinalizer would only help if explicitly calling the Free helper methods helps alleviate the problems. However, from what I can tell, @JakeAustwick attempted that and still got memory issues which I could not reproduce =(
We did have an implementation of SetFinalizer a while back, but it was decided that it was better form to provide explicit freeing of objects, giving the programmer more control over when memory is cleaned up. Unfortunately I was not part of those early conversations, so I don't know the specifics of why the decision was made.
Hello, I was wondering if this project is still active? It's been 4 months since the last commit, and an open memory leak is a pretty scary thing for server use. Either way, thank you for this project; it's great work.
No one has been able to reproduce this memory leak (it should probably be closed by @mdayaram for that reason).
If it can't be reproduced, we'll close it.
Hi @mdayaram, I had the same issue using "doc, err := gokogiri.ParseHtml(page)" on my Mac Pro with go version go1.4 darwin/amd64.
I ran a heavy test parsing more than 46557 websites. My memory usage kept climbing to 120GB without declining; then the process was killed and my machine ran out of memory.
Note: I simply used gokogiri.ParseHtml(page). I tried running runtime.GC() every minute, but it did not help.
This still appears to be popping up intermittently, so I'll take a look this weekend and see if I can turn up more info. Reopening for now.
@akhleung @JakeAustwick
I just tried the following code, parsing the same URL again and again. Memory usage keeps climbing to a few GBs within a minute.
url = "http://www.bestbuy.com/site/withings-pulse-o2-tracker-black/5686019.p?id=1219148342760&skuId=5686019"
resp, _ := http.Get(url)
page, _ := ioutil.ReadAll(resp.Body)
for {
gokogiri.ParseHtml(page)
runtime.GC()
}
Thanks for the test case! I'll try it out.
@weil I ran your example, and it did indeed leak very badly. However, when I modified the loop to assign the document to a variable and call its Free() method, the memory usage stabilized. Can you try that and see if it solves the problem? For reference, here's what I ran:
package main

import (
    "io/ioutil"
    "net/http"
    "runtime"

    "github.com/moovweb/gokogiri"
)

func main() {
    url := "http://www.bestbuy.com/site/withings-pulse-o2-tracker-black/5686019.p?id=1219148342760&skuId=5686019"
    resp, _ := http.Get(url)
    page, _ := ioutil.ReadAll(resp.Body)
    for {
        doc, _ := gokogiri.ParseHtml(page)
        doc.Free() // release the C-side memory for each document
        runtime.GC()
    }
}
If this indeed fixes the leak, the documentation and examples for ParseHtml (and ParseXml) should be updated to include a call to defer doc.Free() immediately after the call to ParseHtml/ParseXml. This would make it clearer that freeing the returned document must be explicitly managed by the programmer.
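For example (a sketch of the suggested pattern):

package example

import "github.com/moovweb/gokogiri"

// processPage frees the document's C-allocated memory as soon as the
// function returns; Go's garbage collector will not do that by itself.
func processPage(page []byte) error {
    doc, err := gokogiri.ParseHtml(page)
    if err != nil {
        return err
    }
    defer doc.Free()
    // ... work with doc here ...
    return nil
}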
I'm seeing this as well, and can reproduce it with @weil's code.
Hi, is this issue still happening? I'm new to Go and I want to use this library to process some small HTML pages.