
Comments (6)

mieczyslaw commented on August 15, 2024

It means that the number of file descriptors in use exceeds the system limit. On Linux, this limit can be changed with 'ulimit'.
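The same limit can be inspected and raised from within Python via the standard resource module; this is a minimal sketch (Unix-only) and is equivalent to `ulimit -n` in the shell:

```python
import resource

# Query the soft and hard limits on open file descriptors (RLIMIT_NOFILE);
# the soft limit is the one a "Too many open files" error has hit.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"soft={soft} hard={hard}")

# Raise the soft limit to the hard limit, for this process only.
# (Raising the hard limit itself requires root; skip if it is unlimited,
# since some platforms reject setting the soft limit to infinity.)
if hard != resource.RLIM_INFINITY:
    resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))
```

Note this only changes the limit for the current process and its children; a system-wide change still goes through `ulimit` or /etc/security/limits.conf.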

It may also be an issue with the code, when files are not closed or temporary files are created at a low level. (I filed a bug report with the Python developers, as the tempfile documentation doesn't mention that a low-level temporary file should both be removed after use and have its file descriptor closed, even if the file was never explicitly opened.)
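The low-level pitfall being described can be shown with `tempfile.mkstemp()`, which returns an already-open descriptor; a sketch of the correct cleanup:

```python
import os
import tempfile

# tempfile.mkstemp() returns an already-OPEN file descriptor plus a path.
# The descriptor must be closed (os.close, or os.fdopen, which takes
# ownership) even if the path is never open()ed again -- otherwise every
# call leaks one descriptor until the process hits its limit.
fd, path = tempfile.mkstemp()
try:
    with os.fdopen(fd, "w") as handle:  # closes fd when the block exits
        handle.write("scratch data")
finally:
    os.remove(path)  # the file itself must also be removed after use
```

A loop that calls `mkstemp()` without both steps will eventually exhaust the descriptor limit, which matches the symptom in this issue.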

from haddock-tools.

iamysk commented on August 15, 2024

Thanks for the prompt response @amjjbonvin. I'll check my implementation. By the way, I need to run 735 complexes instead of the 100 mentioned above, and it ran well until around 400 complexes, after which I started getting these errors.

Anyway, thanks again for your response, and for building and providing such a powerful tool.


iamysk commented on August 15, 2024

Most likely a Python limitation. It is probably more efficient to run fewer complexes at a time rather than starting all 100 at once.

Thanks for the suggestion @amjjbonvin . I am not running 100 complexes simultaneously. My setup looks like the following.

# Input PDB directory
set z="processed_pdb_batch2/*"

@ run=1
# Iterate over all the proteins in the directory
foreach i (`ls -d $z`)
 # Make run.param file for current complex
 python makeParam.py --pdb $i --run $run
 # 1st haddock initialization
 haddock2.4
 # Enter run directory
 cd run$run
 # Edit the run.cns file to use 128 cores
 python ../editCNS.py --filename run.cns
 # 2nd haddock initialization
 haddock2.4
 cd ..
 @ run+=1
 echo $i
 echo $run
end

It runs well for the first few complexes, but then arbitrarily freezes with the mentioned error, or skips the current run and moves on to the next complex (inferred from the small size, ~400 MB, of such a run directory compared to a complete run, which weighs exactly 1.1 GB). I suspect some temporary files are not being closed properly during the three-stage (it0, it1, water) docking process for these incomplete runs, leading to the eventual bottleneck. Please suggest a way to overcome this, or an alternative way to automate HADDOCK runs for multiple complexes.
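One way to confirm the suspected leak is to log the number of open descriptors after each complex; a steadily climbing count pinpoints which stage is leaking. A minimal Linux-only sketch (it reads /proc/self/fd, and the `open_fd_count` helper name is hypothetical):

```python
import os

def open_fd_count() -> int:
    """Count descriptors open in this process by listing /proc/self/fd
    (minus the descriptor used for the listing itself). Linux-only."""
    return len(os.listdir("/proc/self/fd")) - 1

# Hypothetical usage in a batch loop: record the count before and after
# each complex is processed, and report any difference.
baseline = open_fd_count()
probe = open("/dev/null")            # simulate one leaked descriptor
leaked = open_fd_count() - baseline
probe.close()
print(f"leaked descriptors: {leaked}")
```

Printing this count once per iteration of the csh loop above (e.g. from a small wrapper script) would show whether the limit is approached gradually, around complex 400, rather than by a single bad run.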

Thanks.

