diracgrid / dirac

DIRAC Grid

Home Page: http://diracgrid.org

License: GNU General Public License v3.0

Python 98.90% Shell 1.10%
distributed-computing grid workflow-management workload-management interware

dirac's Introduction

DIRAC


DIRAC is an interware, meaning a software framework for distributed computing.

DIRAC provides a complete solution for user communities requiring access to distributed resources. DIRAC builds a layer between users and resources, offering a common interface to a number of heterogeneous providers and integrating them in a seamless manner, providing interoperability together with optimized, transparent and reliable usage of the resources.

DIRAC was started by the LHCb collaboration, which still maintains it. It is now used by several communities (also known as VOs, Virtual Organizations) for their distributed computing workflows.

DIRAC is written in Python 3.9.

Status rel-v8r0 series (stable, recommended):

Basic Tests Status Pilot Wrapper Status Integration Tests Status Documentation Status

Status integration branch (devel):

Basic Tests Status Pilot Wrapper Status Integration Tests Status Documentation Status

Important links

Install

There are basically two types of installations: client and server.

For DIRAC client installation instructions, see the web page.

For DIRAC server installation instructions, see the web page.

DIRAC 8.0 drops support for Python 2 based clients and servers.

There are three available options for installation:

  1. DIRACOS2: This is the only fully supported method, see the DIRACOS 2 documentation.

  2. Conda / Mamba from conda-forge: We recommend making a new environment for DIRAC using

    mamba create --name my-dirac-env -c conda-forge dirac-grid
    conda activate my-dirac-env
  3. Pip: Provided suitable dependencies are available, DIRAC can be installed with pip install DIRAC. Support for installing the dependencies should be sought from the upstream projects.

Development

For the full development guide see here; some of the most important details are included below.

Contributing

DIRAC is a fully open source project, and you are welcome to contribute to it. A list of its main authors can be found here. A detailed explanation of how to contribute to DIRAC can be found on this page. For a quick-and-dirty guide on how to contribute, simply:

  • Fork the project inside the GitHub UI

  • Clone locally and create a branch for each change

    git clone [email protected]:$GITHUB_USERNAME/DIRAC.git
    cd DIRAC
    git remote add upstream [email protected]:DIRACGrid/DIRAC.git
    git fetch --all
    git checkout upstream/integration
    git checkout -b my-feature-branch
    git push -u origin my-feature-branch
  • Create a Pull Request, targeting the "integration" branch.

Code quality

To ensure the code meets DIRAC's coding conventions, we recommend installing pre-commit system-wide using your operating system's package manager. Alternatively, pre-commit is included in the Python 3 development environment; see the development guide for details on how to create one.

Once pre-commit is installed you can enable it by running:

pre-commit install --allow-missing-config

Code formatting will now be automatically applied before each commit.

Testing

Unit tests are provided within the source code and can be run using pytest. Integration, regression and system tests live instead in the DIRAC/tests/ directory.

Acknowledgements

This work is co-funded by the EOSC-hub project (Horizon 2020) under grant number 777536.


dirac's People

Contributors

acasajus, aldbr, andresailer, arrabito, atsareg, chaen, chrisburr, cinzialu, closier, fstagni, graciani, krzysztofciba, lcdgit, marianne013, martynia, miloszz, mirguest, phicharp, pujanm, remenska, rupozzi, sbalbp, sfayer, sposs, taykyoku, taykyoku2, ubeda, vipersec, wirespecter, wkrzemien


dirac's Issues

default port not defined in Mail.py utility

The bug manifests itself in the following traceback:

from DIRAC.Core.Utilities.Mail import Mail
m = Mail()
m._subject ='subject'
m._message = 'body'
m._mailAddress = '[email protected]'
m._send()
Traceback (most recent call last):
File "", line 1, in
File "DIRAC/Core/Utilities/Mail.py", line 25, in _send
self.connect()
File "/opt/dirac/pro/Linux_x86_64_glibc-2.5/lib/python2.6/smtplib.py",
line 293, in connect
if not port: port = self.default_port
AttributeError: Mail instance has no attribute 'default_port'
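The issue does not include the actual fix; a minimal sketch of the kind of guard _send() is missing (the function name here is illustrative, not DIRAC's real code) is to fall back to smtplib's standard port whenever none was configured:

```python
import smtplib

def resolve_smtp_port(port=None):
    # Guard absent from the original _send(): fall back to the standard
    # SMTP port (25, smtplib.SMTP_PORT) instead of relying on a
    # default_port attribute that may not exist on the instance.
    return port or smtplib.SMTP_PORT
```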

_secVO in BaseSecurity

Please check whether this data member can be removed (it is not used) or should be taken from the Registry; the latter might require checking the proxy first.

dirac-install command switches

The dirac-install command should have the following switches defined (among others):
-l specifies that the software project is to be installed with the version specified by the -r switch;
alternatively,
-V specifies that dirac-install should install software packages as specified in the community defaults.
The -l and -V switches are not compatible; giving both on the command line must result in a meaningful error message.
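The required incompatibility check can be sketched with argparse (dirac-install has its own option machinery, so this is only an illustration of the behaviour, not the actual implementation):

```python
import argparse

def build_parser():
    # Illustrative sketch: a mutually exclusive group makes giving both
    # -l and -V an error with a meaningful message, as the issue asks.
    parser = argparse.ArgumentParser(prog="dirac-install")
    group = parser.add_mutually_exclusive_group()
    group.add_argument("-l", dest="project",
                       help="install the project with the version given by -r")
    group.add_argument("-V", dest="installation",
                       help="install packages as specified in the community defaults")
    parser.add_argument("-r", dest="release", help="release version, used with -l")
    return parser
```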

Define OPENSSL_CONF in DIRAC env

openssl >= 1.0 requires the OPENSSL_CONF env var to exist, to avoid looking in the built-in directory (which doesn't exist when OpenSSL is installed somewhere other than where it was compiled).
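A sketch of setting the variable before OpenSSL-dependent code runs; the etc/ subdirectory and the function name are assumptions for illustration, not DIRAC's actual layout:

```python
import os

def ensure_openssl_conf(dirac_root):
    # Point OPENSSL_CONF at a directory inside the relocated installation
    # instead of OpenSSL's compile-time default path. setdefault() leaves
    # any value the administrator already exported untouched.
    os.environ.setdefault("OPENSSL_CONF", os.path.join(dirac_root, "etc"))
    return os.environ["OPENSSL_CONF"]
```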

Deleting waiting pilots when they do not have the right version

It could be useful to have an agent that cancels pilots whose DIRAC version differs from the one in the CS. It would cancel only those scheduled or waiting. It could be refined to delete only those running with a certain role (e.g. Role=production). That would save recovery time when plenty of pilots with the wrong version are scheduled at a given site.

[Framework] ConfigTemplate.cfg not read

I created a ConfigTemplate.cfg in LHCbDIRAC/DataManagementSystem to add information about a new agent, but when I tried to install the new agent with dirac-admin-sysadmin-cli my ConfigTemplate.cfg was not read. Does this mean that ConfigTemplate.cfg can only be in DIRAC/DataManagementSystem, or is it a bug?

[lxplus314] x86_64-slc5-gcc43-opt /afs/cern.ch/lhcb/software/DEV/LHCBDIRAC> dirac-admin-sysadmin-cli --host volhcb03.cern.ch
DIRAC Root Path = /afs/cern.ch/lhcb/software/DEV/DIRAC/DIRAC_v5r13p9
volhcb03.cern.ch >install agent DataManagement StorageHistoryAgent
AT >>> agent DataManagement StorageHistoryAgent Certification ['LHCbWeb', 'LHCb']
Loading configuration template /afs/cern.ch/lhcb/software/DEV/DIRAC/DIRAC_v5r13p9/DIRAC/DataManagementSystem/ConfigTemplate.cfg
Can not find Agents/StorageHistoryAgent in template
{'Message': 'Can not find Agents/StorageHistoryAgent in template', 'OK': False}
ERROR: Can not find Agents/StorageHistoryAgent in template

[lxplus314] x86_64-slc5-gcc43-opt /afs/cern.ch/lhcb/software/DEV/LHCBDIRAC> grep StorageHistoryAgent /afs/cern.ch/lhcb/software/DEV/LHCBDIRAC/LHCBDIRAC_v6r5-pre10/LHCbDIRAC/DataManagementSystem/ConfigTemplate.cfg
StorageHistoryAgent

FC: return subset of entries and use "yield" (generator) like method

Again a feature request: for queries returning a very large number of entries, it would be useful to return only a subset of entries (a user parameter, defaulting to all), as one does not always need all the entries. A method returning the next set would also be useful. The point is to reduce packet size (transmit less data per call).
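The requested behaviour can be sketched as a generator yielding fixed-size chunks (names and the default batch size are illustrative, not part of the FileCatalog API):

```python
def iter_in_batches(entries, batch_size=1000):
    # Yield catalogue entries in fixed-size chunks so a caller can stop
    # early, or ask for "the next set", instead of receiving the whole
    # result in one oversized reply.
    batch = []
    for entry in entries:
        batch.append(entry)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # final, possibly shorter, chunk
        yield batch
```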

ResolveSE

Check that returned SE is active.

FC: meta1=one AND NOT meta2=two query

Let's say that meta2 has several values (one, two, three), and meta1 is a metadata tag on a parent directory. Now one wants to get all files corresponding to meta1 but exclude those with meta2=two. This is not possible now (or I have missed something).

FC Failed to create directory

dirac-dms-add-file LFN:/vo.formation.idgrilles.fr/user/v/vhamar/test1.txt test.txt CPPM-disk

Deprecation warning: DIRAC.Core.Security.Misc will not be available in next release,
use DIRAC.Core.Security.ProxyInfo instead.

ReplicaManager.putAndRegister: Failed to register file. /vo.formation.idgrilles.fr/user/v/vhamar/test1.txt {'FileCatalog': 'Failed to create directory for file'}
{'Failed': {'/vo.formation.idgrilles.fr/user/v/vhamar/test1.txt': {'register': {'Addler': '97b75b83',
'GUID': 'ED690EED-DCBD-9EDD-12E4-7EE43A19B6D5',
'LFN': '/vo.formation.idgrilles.fr/user/v/vhamar/test1.txt',
'PFN': 'srm://marsedpm.in2p3.fr/dpm/in2p3.fr/home//vo.formation.idgrilles.fr/user/v/vhamar/test1.txt',
'Size': 247,
'TargetSE': 'CPPM-disk'}}},

'Successful': {'/vo.formation.idgrilles.fr/user/v/vhamar/test1.txt': {'put': 13.008781909942627}}}

2011-06-08 07:53:38 UTC DataManagement/FileCatalog/MySQL DEBUG: _query: SELECT LEVEL,LPATH1,LPATH2,LPATH3,LPATH4,LPATH5,LPATH6,LPATH7,LPATH8,LPATH9,LPATH10,LPATH11,LPATH12,LPATH13,LPATH14,LPATH15 FROM FC_DirectoryLevelTree WHERE DirID=1
2011-06-08 07:53:38 UTC DataManagement/FileCatalog/MySQL DEBUG: _query: Excution failed. 1054: Unknown column 'LPATH11' in 'field list'

[vanessa@mardirac3 DMS]$ dirac-dms-filecatalog-cli

Deprecation warning: DIRAC.Core.Security.Misc will not be available in next release,
use DIRAC.Core.Security.ProxyInfo instead.

Starting DIRAC FileCatalog client

File Catalog Client $Revision: 1.17 $Date:

FC:/> ls
FC:/> mkdir vo.formation.idgrilles.fr
Failed to create directory: Excution failed.: ( 1054: Unknown column 'LPATH11' in 'field list' )

Threaded agents interacting with LFC client

The connections initiated by the LCGFileCatalogClient in DIRAC are not thread-safe because the lfc module used is not a thread-safe one.

The fix put in the current production system is a global lock in the LCGFileCatalogClient, acquired before each call and released just after its execution. A better solution is foreseen.

[Framework] dirac-admin-sysadmin-cli crash

If I run the dirac-admin-sysadmin-cli command with the wrong credentials I get this crash with DIRAC v5r13p9. I suspect that we should get an error message instead of a crash.

[lxplus314] x86_64-slc5-gcc43-opt /afs/cern.ch/lhcb/software/DEV/LHCBDIRAC/LHCBDIRAC_v6r5-pre10> dirac-admin-sysadmin-cli --host volhcb03.cern.ch
DIRAC Root Path = /afs/cern.ch/lhcb/software/DEV/DIRAC/DIRAC_v5r13p9
volhcb03.cern.ch >
volhcb03.cern.ch >
volhcb03.cern.ch >install agent DataManagementSystem StorageUsageAgent
Error: Unauthorized query to Framework/SystemAdministrator:getInfo
Traceback (most recent call last):
File "/afs/cern.ch/lhcb/software/DEV/DIRAC/DIRAC_v5r13p9/InstallArea/scripts/dirac-admin-sysadmin-cli", line 18, in
cli.cmdloop()
File "/afs/cern.ch/sw/lcg/external/Python/2.6.5/x86_64-slc5-gcc43-opt/lib/python2.6/cmd.py", line 142, in cmdloop
stop = self.onecmd(line)
File "/afs/cern.ch/sw/lcg/external/Python/2.6.5/x86_64-slc5-gcc43-opt/lib/python2.6/cmd.py", line 219, in onecmd
return func(arg)
File "/afs/cern.ch/lhcb/software/DEV/DIRAC/DIRAC_v5r13p9/InstallArea/python/DIRAC/FrameworkSystem/Client/SystemAdministratorClientCLI.py", line 380, in do_install
hostSetup = result['Value']['Setup']
KeyError: 'Value'

Changing default lcgBundle version with SystemAdministrator

On the server side we have:
types_updateSoftware = [ StringTypes ]
def export_updateSoftware( self, version, rootPath = "", gridVersion = "2009-08-13" ):

and on the client:

print "Software update can take a while, please wait ..."
result = client.updateSoftware( version )

this means in particular that there is no way to update a server to a newer version of the lcgBundle via the SystemAdministrator.

I would suggest that the default is set to "", in this case whatever is on the dirac.cfg (from the initial server installation will be taken), and that the client is fixed to pass the same arguments that the server can use.
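The suggested change could look like the sketch below; lookup_cfg and the LocalInstallation/LcgVer option name are hypothetical stand-ins for reading dirac.cfg, not DIRAC's actual helpers:

```python
_LOCAL_CFG = {"LocalInstallation/LcgVer": "2009-08-13"}  # stand-in for dirac.cfg

def lookup_cfg(option, default=""):
    # Hypothetical helper standing in for the local configuration lookup.
    return _LOCAL_CFG.get(option, default)

def export_updateSoftware(version, rootPath="", gridVersion=""):
    # Suggested behaviour: an empty gridVersion means "use whatever the
    # initial server installation recorded", while clients may still
    # request a newer lcgBundle explicitly.
    if not gridVersion:
        gridVersion = lookup_cfg("LocalInstallation/LcgVer")
    return {"OK": True, "Value": {"Version": version, "GridVersion": gridVersion}}
```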

Pilot Monitor web page

It would be nice that from the pilot monitor page, when clicking on a pilot to show the job, it opened a new tab instead of replacing the current tab.

Problem to upload user proxies (v6r0-pre3)

[vanessa@mardirac3 ~]$ proxy-init -g dirac_user -d
Enter Certificate password:
Contacting CS...
New connection -> 127.0.0.1:9135
Checking DN /O=GRID-FR/C=FR/O=CNRS/OU=CPPM/CN=Vanessa Hamar
Username is vhamar
Creating proxy for vhamar@dirac_user (/O=GRID-FR/C=FR/O=CNRS/OU=CPPM/CN=Vanessa Hamar)
Traceback (most recent call last):
File "/home/vanessa/DIRAC-v6r0-pre3/DIRAC/FrameworkSystem/scripts/proxy-init.py", line 135, in
success = uploadProxyToDIRACProxyManager( cliParams )
File "/home/vanessa/DIRAC-v6r0-pre3/DIRAC/FrameworkSystem/scripts/proxy-init.py", line 87, in uploadProxyToDIRACProxyManager
params.debugMsg( "Uploading user pilot proxy with group %s..." % ( params.getDIRACGroup() ) )
AttributeError: CLIParams instance has no attribute 'debugMsg'

Problems to add a file to DIRAC SEs

[vanessa@mardirac3 DMS]$ dirac-dms-add-file LFN:/vo.formation.idgrilles.fr/user/v/vhamar/110611.txt test.txt DIRAC-USER --debug
New connection -> 127.0.0.1:9135

Deprecation warning: DIRAC.Core.Security.Misc will not be available in next release,
use DIRAC.Core.Security.ProxyInfo instead.

New connection -> 127.0.0.1:9152
New connection -> 127.0.0.1:9152
Trying to load from root path DIRAC
Looking for file /home/vanessa/DIRAC-v6r0-pre4/DIRAC/Resources/Catalog/FileCatalogClient.py
New connection -> 127.0.0.1:9197
Trying to load from root path DIRAC
Looking for file /home/vanessa/DIRAC-v6r0-pre4/DIRAC/Resources/Catalog/FileCatalogClient.py
New connection -> 127.0.0.1:9197
New connection -> 127.0.0.1:9197
Trying to load from root path DIRAC
Looking for file /home/vanessa/DIRAC-v6r0-pre4/DIRAC/Resources/Catalog/FileCatalogClient.py
New connection -> 127.0.0.1:9197
ReplicaManager.putAndRegister: Checksum information not provided. Calculating adler32.
ReplicaManager.putAndRegister: Checksum calculated to be 97b75b83.
New connection -> 127.0.0.1:9197
Trying to load from root path DIRAC
Looking for file /home/vanessa/DIRAC-v6r0-pre4/DIRAC/Resources/Storage/DIPStorage.py
StorageElement.isValid: Determining whether the StorageElement DIRAC-USER is valid for use.
StorageElement.isValid: The 'operation' argument is not supplied. It should be supplied in the future.
StorageElement.getStorageElementName: The Storage Element name is DIRAC-USER.
StorageElement.__executeFunction: Attempting to perform 'putFile' operation with 1 pfns.
StorageElement.isValid: Determining whether the StorageElement DIRAC-USER is valid for use.
StorageElement.isLocalSE: Determining whether DIRAC-USER is a local SE.
StorageElement.__executeFunction: Generating 1 protocol PFNs for DIP.
StorageElement.__executeFunction: Attempting to perform 'putFile' for 1 physical files.
New connection -> 127.0.0.1:9148
New connection -> 127.0.0.1:9133
ReplicaManager.putAndRegister: Sending accounting took 0.5 seconds
ReplicaManager.putAndRegister: Failed to put file to Storage Element. test.txt: Server error while serving fromClient: 'Port'
Problem during putAndRegister call
ERROR ReplicaManager.putAndRegister: Failed to put file to Storage Element. Server error while serving fromClient: 'Port'

[RSS] dirac-rss-renew-token

The script crashes because of a bug in ResourceStatusHandler.export_extendToken: the variable tokenNewExpiration is not declared properly; it must be something like a datetime
(but which datetime by default?)

Failed to remove directory from the catalog

FC:/vo.formation.idgrilles.fr/user/v/vhamar>mkdir newDir
Successfully created directory: /vo.formation.idgrilles.fr/user/v/vhamar/newDir
FC:/vo.formation.idgrilles.fr/user/v/vhamar>ls -la
-rwxrwxr-x 0 vhamar user 1126 2011-06-11 12:08:46 bashrc
drwxrwxr-x 0 vhamar user 0 2011-06-11 12:36:53 newDir
-rwxrwxr-x 0 vhamar dirac_user 30 2011-06-11 09:30:53 test.txt
FC:/vo.formation.idgrilles.fr/user/v/vhamar>rmdir newDir
lfn: /vo.formation.idgrilles.fr/user/v/vhamar/newDir
Failed to remove directory from the catalog
Server error while serving removeDirectory: string indices must be integers, not str
FC:/vo.formation.idgrilles.fr/user/v/vhamar>ls -la
-rwxrwxr-x 0 vhamar user 1126 2011-06-11 12:08:46 bashrc
-rwxrwxr-x 0 vhamar dirac_user 30 2011-06-11 09:30:53 test.txt

About Proxies (v6r0-pre4)

Some notes about proxies in this version:
1) proxy-init -g <dirac_group> fails to upload the proxy

proxy-init -g dirac_user -d
Enter Certificate password:
Contacting CS...
New connection -> 127.0.0.1:9135
Checking DN /O=GRID-FR/C=FR/O=CNRS/OU=CPPM/CN=Vanessa Hamar
Username is vhamar
Creating proxy for vhamar@dirac_user (/O=GRID-FR/C=FR/O=CNRS/OU=CPPM/CN=Vanessa Hamar)
Proxy will be uploaded to ProxyManager

Deprecation warning: DIRAC.Core.Security.Misc will not be available in next release,
use DIRAC.Core.Security.ProxyInfo instead.

Uploading user pilot proxy with group dirac_pilot...
Traceback (most recent call last):
File "/home/vanessa/DIRAC-v6r0-pre4/DIRAC/FrameworkSystem/scripts/proxy-init.py", line 135, in
success = uploadProxyToDIRACProxyManager( cliParams )
File "/home/vanessa/DIRAC-v6r0-pre4/DIRAC/FrameworkSystem/scripts/proxy-init.py", line 89, in uploadProxyToDIRACProxyManager
retVal = uploadProxy( params )
File "/home/vanessa/DIRAC-v6r0-pre4/DIRAC/FrameworkSystem/Client/ProxyUpload.py", line 106, in uploadProxy
params.debugMsg( "Loading user proxy" )

AttributeError: CLIParams instance has no attribute 'debugMsg'

  2. I created a proxy and uploaded it with dirac-proxy-init and dirac-proxy-upload (the default validity is 24 hours), submitted a job, and got this error in the TaskQueueDirector logs:

2011-06-11 09:49:53 UTC WorkloadManagement/TaskQueueDirector DEBUG: Authenticated peer (/O=GRID-FR/C=FR/O=CNRS/OU=CPPM/CN=mardirac3.in2p3.fr)
2011-06-11 09:49:53 UTC WorkloadManagement/TaskQueueDirector DEBUG: New session connecting to server at ('mardirac3.in2p3.fr', 9152)
2011-06-11 09:49:53 UTC WorkloadManagement/TaskQueueDirector VERB: New connection -> 127.0.0.1:9152
2011-06-11 09:49:53 UTC WorkloadManagement/TaskQueueDirector DEBUG: Closing socket
2011-06-11 09:49:53 UTC WorkloadManagement/TaskQueueDirector/gLitePilotDirector ERROR: Can't get a proxy for 432000 seconds: myproxy is disabled
2011-06-11 09:49:53 UTC WorkloadManagement/TaskQueueDirector/gLitePilotDirector ERROR: No proxy Available User "/O=GRID-FR/C=FR/O=CNRS/OU=CPPM/CN=Vanessa Hamar", Group "dirac_user"
2011-06-11 09:49:54 UTC WorkloadManagement/TaskQueueDirector INFO: Number of pilots to be Submitted 22
2011-06-11 09:49:54 UTC WorkloadManagement/TaskQueueDirector ERROR: submitPilot Failed: No proxy Available
2011-06-11 09:49:54 UTC WorkloadManagement/TaskQueueDirector INFO: Number of pilots Submitted 0
2011-06-11 09:49:54 UTC WorkloadManagement/TaskQueueDirector/Monitoring DEBUG: Adding mark to CPU

To resolve the problem I created a proxy using validity time for 720 hours (dirac-proxy-init -g dirac_user -v 720:00) and I uploaded it again to the server and the pilot jobs were submitted.

  3. I also had to upload the proxy for the dirac_pilot group. Before, this was done automatically.

FC: removing directories does not clean metadata tables

When deleting files (and folders), the metadata associated with those directories is not deleted; therefore, when querying for metadata, the system fails to find a directory.
The query also seems to completely ignore the other metadata tags in that case.

Same sandboxes for different user groups

It sometimes happens (especially in tutorials) that users submit identical test jobs while changing their DIRAC group. If a job has already been submitted with one group, its input sandbox is registered with that group; an attempt to submit the same job with another group results in an error saying that the sandbox exists but belongs to another group. A solution could be to make the group name part of the sandbox contents so that the two hashes differ.
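The proposed fix can be sketched by mixing the group into the sandbox identifier (function name and hash choice are illustrative, not the SandboxStore's actual scheme):

```python
import hashlib

def sandbox_id(sandbox_bytes, group):
    # Mix the owning group into the hash so that identical sandbox
    # contents submitted under two different DIRAC groups produce two
    # distinct identifiers, avoiding the ownership clash.
    digest = hashlib.sha256()
    digest.update(group.encode("utf-8"))
    digest.update(sandbox_bytes)
    return digest.hexdigest()
```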

Inconsistent (sub)request states in the RMS

The removal transformation 10410 seems stuck with many files still to be deleted. Looking at the associated tasks, I find several of them in the following state:

TaskID: 48 (created 2011-05-04 12:42:30, updated 2011-05-04 13:51:24) - 2 files- Request: 1453087 - Status: Waiting - TargetSE: CERN_M-DST,CNAF_M-DST,IN2P3-DST
Request status: {'SubRequestStatus': 'Done', 'RequestStatus': 'Waiting'}
Request info: (1453087L, 'Waiting', '00010410_00000048', None, '', '', '', None, None, datetime.datetime(2011, 5, 4, 13, 4, 54), datetime.datetime(2011, 5, 4, 13, 4, 54))
Waiting: 2 files

The 2 associated files are not deleted, and the request seems in a Waiting state...
Firstly why is SubRequestStatus "Done"? Which subrequest? :(((
Secondly this request appears in the Dirac portal as Assigned and not Waiting:

What can we do, and why is there this (apparent) inconsistency?

bug in getSiteSummaryWeb service call

The error message is:
{'Message': 'Server error while serving getSiteSummaryWeb: too many values to unpack', 'OK': False, 'rpcStub':(('WorkloadManagement/WMSAdministrator', {'delegatedDN': 'SKIP', 'timeout': 600, 'skipCACheck': False, 'setup': u'LHCb-Production', 'delegatedGroup': 'lhcb_prod'}), 'getSiteSummaryWeb', ({}, [], 0, 500))

Although on the service side the number of arguments is correct:
def export_getSiteSummaryWeb(self, selectDict, sortList, startItem, maxItems):
result = jobDB.getSiteSummaryWeb(selectDict, sortList, startItem, maxItems)

using of native version for externals

It would be nice if, during the DIRAC installation procedure, one could choose between the externals shipped with DIRAC and their native counterparts already installed on the box.

Up to now DIRAC uses only the first set, which is fine if you want to install it on a grid box. But for development there is no need to install several dozen MB of binaries duplicating those already on the system. Hence the installation script should (on the user's request, not by default) be clever enough to check whether a library/application/Python module of the required, compliant version exists on the system, and build only the missing ones.

Updated dirac-proxy-init

The functionality now available in proxy-init, dirac-proxy-init and dirac-upload-proxy should be collected into a single command, dirac-proxy-init; the other commands will be dropped. The following is to be implemented:

  1. dirac-proxy-init does not upload the long proxy to the ProxyManager by default. To upload the long proxy, a special -u (--uploadProxy) switch must be given;
  2. dirac-proxy-init without the -u switch must nevertheless check the availability and validity period of the proxy in the ProxyManager and of the user certificate. The output of the command must give a warning message if the long proxy, the certificate, or both have less than one week of validity left;
  3. dirac-proxy-init should be able to generate a proxy even if there is no local configuration defined yet, e.g. right after installation. This proxy can still be used to access the Configuration Service with a default group defined on the server;
  4. It should be possible to define in the local configuration a default group, which will be taken into account if dirac-proxy-init is given without the -g switch.

Error installing Web

2011-06-08 23:13:33 UTC dirac-install [ERROR] Post installation script /opt/dirac/versions/v6r0-pre3_1307574810/Web/dirac-postInstall.py failed. Check /opt/dirac/versions/v6r0-pre3_1307574810/Web/dirac-postInstall.py.err

[volhcb13] /home/dirac/scripts > cat /opt/dirac/versions/v6r0-pre3_1307574810/Web/dirac-postInstall.py.err
Traceback (most recent call last):
File "/opt/dirac/versions/v6r0-pre3_1307574810/Web/dirac-postInstall.py", line 60, in ?
zFile.extractall( publicDir )
AttributeError: ZipFile instance has no attribute 'extractall'

ConfigTemplate.cfg vs CS

When installing a new agent or service, if that agent or service is defined in the CS but not in ConfigTemplate.cfg, we get an error. I think this should be corrected.

bug in RequestClient?

Hi Folks,

During re-factoring of TransferAgent I found a suspicious line in RequestClient:

https://github.com/DIRACGrid/DIRAC/blob/integration/RequestManagementSystem/Client/RequestClient.py#L121

where the server URL is added to returned S_OK["Server"]. I believe it rather should land under S_OK["Value"]["Server"], like in here:

https://github.com/DIRACGrid/DIRAC/blob/integration/RequestManagementSystem/Client/RequestClient.py#L158

This bug is present in master and integration, so you'd better fix it asap. ;)

Cheers,
Krzysztof

Bug in DownloadInputData.py

File "/scratch/ctagrid2361.ccwl0924/tmp/https_3a_2f_2flapp-lb01.in2p3.fr_3a9000_2fFwehStqM7ad5RyNt5yD0gA/DIRAC/WorkloadManagementSystem/Client/DownloadInputData.py", line 234, in __getLFN
fileName = os.path.basename( result['Value'] )

Script Workflow module

It does not support environment variables in the executable. It probably needs other fixes, and we should create other modules as examples.

install_site.sh

When integrated into master, the URL of the dirac_install.py download has to be updated to point to master rather than integration, and the documentation at AdministratorGuide/InstallingDIRACService/index.html also has to be updated.

New issue: mention the dirac version concerned

It would be good practice for reported issues to mention the DIRAC version concerned in the title. That would help the devs prioritize, and make it easier for users to look up issues affecting their own version. Maybe a tag could be designed for that.

APIs job submission error (v6r0-pre4)

[vanessa@mardirac3 APIs]$ python
Python 2.6.6 (r266:84292, Mar 24 2011, 16:35:10)
[GCC 4.1.2 20080704 (Red Hat 4.1.2-50)] on linux2
Type "help", "copyright", "credits" or "license" for more information.

from DIRAC.Interfaces.API.Dirac import Dirac

Deprecation warning: DIRAC.Core.Security.Misc will not be available in next release,
use DIRAC.Core.Security.ProxyInfo instead.

from DIRAC.Interfaces.API.Job import Job

j = Job()
j.setCPUTime(500)
{'OK': True, 'Value': ''}
j.setExecutable('/bin/echo hello')
{'OK': True, 'Value': ''}
j.setExecutable('/bin/ls',arguments='-l')
{'OK': True, 'Value': ''}
j.setExecutable('/bin/echo hello again')
{'OK': True, 'Value': ''}
j.setName('API')
{'OK': True, 'Value': ''}

dirac = Dirac()
result = dirac.submit(j)
2011-06-17 10:17:57 UTC /DiracAPI ERROR: Job submission failure No value for key "
2011-06-17 10:17:57 UTC /DiracAPI ERROR: }
print 'Submission Result: ',result
Submission Result: {'Message': 'No value for key "\n }', 'OK': False, 'rpcStub': (('WorkloadManagement/JobManager', {'skipCACheck': False, 'keepAliveLapse': 150, 'delegatedGroup': 'dirac_user', 'delegatedDN': '/O=GRID-FR/C=FR/O=CNRS/OU=CPPM/CN=Vanessa Hamar', 'timeout': 120}), 'submitJob', ('[ \n Origin = "DIRAC";\n ParametricInputData = "";\n Executable = "$DIRACROOT/scripts/dirac-jobexec";\n OutputSandbox = \n {\n \n };\n "\n };\n JobName = "API";\n StdError = "std.err";\n LogLevel = "info";\n Site = "ANY";\n SystemConfig = "ANY";\n Priority = "1";\n InputSandbox = \n {\n \n };\n "\n };\n ParametricInputSandbox = "";\n Arguments = "jobDescription.xml -o LogLevel=info";\n JobGroup = "vo.formation.idgrilles.fr";\n MaxCPUTime = "500";\n StdOutput = "std.out";\n InputData = "";\n JobType = "User"\n]',))}

CE2CS init exception (v6r0-pre4)

Deprecation warning: DIRAC.Core.Security.Misc will not be available in next release,
use DIRAC.Core.Security.ProxyInfo instead.

2011-06-14 08:22:49 UTC Configuration/CE2CSAgent EXCEPT: Can't load agent Configuration/CE2CSAgent
2011-06-14 08:22:49 UTC Configuration/CE2CSAgent EXCEPT: == EXCEPTION ==
2011-06-14 08:22:49 UTC Configuration/CE2CSAgent EXCEPT: <type 'exceptions.TypeError'>:init() takes exactly 4 arguments (3 given)
2011-06-14 08:22:49 UTC Configuration/CE2CSAgent EXCEPT: File "/opt/dirac/pro/DIRAC/Core/Base/AgentReactor.py", line 126, in loadAgentModule
2011-06-14 08:22:49 UTC Configuration/CE2CSAgent EXCEPT: agent = agentClass( fullName, self.__baseAgentName )
2011-06-14 08:22:49 UTC Configuration/CE2CSAgent EXCEPT: ===============
2011-06-14 08:22:49 UTC Configuration/CE2CSAgent ERROR: Error while loading agent module Can't load agent Configuration/CE2CSAgent in root modules DIRAC
26204: old priority 0, new priority 19
2011-06-14 08:22:51 UTC Configuration/CE2CSAgent INFO: Loading Configuration/CE2CSAgent
2011-06-14 08:22:51 UTC Configuration/CE2CSAgent VERB: Trying to load from root module DIRAC
2011-06-14 08:22:51 UTC Configuration/CE2CSAgent VERB: Looking for file /opt/dirac/versions/v6r0-pre4_1307882881/DIRAC/ConfigurationSystem/Agent/CE2CSAgent.py
2011-06-14 08:22:51 UTC Configuration/CE2CSAgent DEBUG: Trying to load DIRAC.ConfigurationSystem.Agent.CE2CSAgent

Deprecation warning: DIRAC.Core.Security.Misc will not be available in next release,
use DIRAC.Core.Security.ProxyInfo instead.

Support for file format in the DIRAC File Catalog

As is already the case in LHCb, files can be written in different formats even if they have the same extension. It would be nice to have the possibility of making this another default metadata field for any file, like size, owner, ...

Sort requests for RequestDBFile

When we use the RequestDBFile, requests are not taken in chronological order. This feature is mandatory for treating requests in the LHCb online context.

In the file DIRAC/RequestManagementSystem/DB/RequestDBFile.py, one solution is to sort the listing obtained at line 119:

requestNames = os.listdir( reqDir )   # line 119
requestNames.sort()                   # proposed addition
for requestName in requestNames:      # line 120

Using a single entry point for TransformationType

A list of "TransformationTypes" (sometimes called "TransType") is used by the agents:

  • WorkflowTaskAgent
  • ValidateOutputDataAgent
  • TransformationCleaningAgent

When there is a new transformation type, all of them have to be updated. This could probably go in a single place in the CS; there is an Operation/JobDescription/AllowedJobTypes list that can likely be used.

Better error notifications

It would be helpful to have better error notification: at the moment a lot of services send their errors to [email protected]. It would be useful to have a notification area for admins within the web portal. Another solution is that, instead of sending e-mail, the Notification service could publish to an RSS feed. This should be marked as a feature request.

FC: resetting or deleting metadata tag (D v5r12P28)

I made a mistake in the definition of my metadata tags for one directory: I set a tag that should belong to a daughter directory. So I want to remove that tag from the directory, but I cannot, as there is no method for that (at least not in the CLI).

Missing vo variable in TaskQueueDirector

2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT: == EXCEPTION ==
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT: <type 'exceptions.NameError'>:global name 'vo' is not defined
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT: File "/opt/dirac/pro/DIRAC/Core/Base/AgentModule.py", line 307, in am_secureCall
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT: result = functor( *args )
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT:
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT: File "/opt/dirac/pro/DIRAC/WorkloadManagementSystem/Agent/TaskQueueDirector.py", line 196, in execute
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT: self.__checkSubmitPools()
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT:
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT: File "/opt/dirac/pro/DIRAC/WorkloadManagementSystem/Agent/TaskQueueDirector.py", line 350, in __checkSubmitPools
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT: self.__configureDirector( submitPool )
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT:
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT: File "/opt/dirac/pro/DIRAC/WorkloadManagementSystem/Agent/TaskQueueDirector.py", line 443, in __configureDirector
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT: director.configure( self.am_getModuleParam( 'section' ), submitPool )
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT:
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT: File "/opt/dirac/pro/DIRAC/WorkloadManagementSystem/private/gLitePilotDirector.py", line 48, in configure
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT: GridPilotDirector.configure( self, csSection, submitPool )
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT:
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT: File "/opt/dirac/pro/DIRAC/WorkloadManagementSystem/private/GridPilotDirector.py", line 71, in configure
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT: PilotDirector.configure( self, csSection, submitPool )
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT:
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT: File "/opt/dirac/pro/DIRAC/WorkloadManagementSystem/private/PilotDirector.py", line 138, in configure
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT: self.installInstallation = gConfig.getValue( '/Operations/%s/%s/Versions/PilotInstallation' % ( vo, setup ),
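The last frame shows `vo` being interpolated in `PilotDirector.configure` without ever being assigned there. A minimal self-contained reproduction of the failure and of the obvious fix (function names and the default VO are hypothetical, not DIRAC's actual code):

```python
def configure_broken(setup):
    """Mirrors PilotDirector.configure: `vo` is used but never defined,
    so this raises NameError at runtime."""
    return '/Operations/%s/%s/Versions/PilotInstallation' % (vo, setup)

def configure_fixed(setup, vo="lhcb"):
    """Fix: pass (or look up from the CS) the VO before using it."""
    return '/Operations/%s/%s/Versions/PilotInstallation' % (vo, setup)

try:
    configure_broken('Production')
except NameError as exc:
    print('NameError:', exc)

print(configure_fixed('Production'))
```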
