This shell script tool allows you to run a single SAS program or a batch of SAS programs from a Bash environment, with various interactive and non-interactive execution modes. It is ideal for a SAS environment that is restrictive by design, where you need a batch execution script/tool, especially when you have a batch of SAS Data Integration Studio (DI) jobs.
This was observed in a production environment. It seems that skipped jobs are displayed ill-formatted on the console (decorators are missing, etc.) until the run reaches a job that previously failed.
This is not a major issue: the core functionality is unaffected, and in "batch" mode, which is the likely configuration in a production environment, it does not matter.
Currently, all the information related to a batch run is captured by runSAS in file-based key-value stores; these hidden script files can be browsed under the .tmp directory in the script root.
One of the users has requested that batch run information (such as <batchid>, <jobid>, flow/job runtimes, etc.) be pushed to the SAS environment (possibly as a SAS dataset), so that batch runs initiated/managed by runSAS can be viewed, tracked, and reported on.
NOTE: Users can run ./runSAS.sh --last OR ./runSAS.sh --log to see the details of the last batch run.
This definitely sounds like a good idea, so here is a possible solution:
Add support for -sysparm to allow parameters to be passed from the Bash session to the SAS program/script/job.
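As a minimal sketch of this step, the Bash side could pack the batch context into a single name=value string and hand it to SAS via the standard -sysparm invocation option (the SAS program then reads it back through the &SYSPARM automatic macro variable). The variable names, job file name, and values below are illustrative assumptions, not the actual runSAS implementation:

```shell
#!/usr/bin/env bash
# Hypothetical batch context; runSAS would supply the real identifiers.
batchid=20240101
jobid=42

# Pack the context as space-delimited name=value pairs.
# SAS exposes the whole string to the program via &SYSPARM.
sysparm="batchid=${batchid} jobid=${jobid}"

# Invoke SAS only if it is installed, so the sketch is safe to run anywhere.
if command -v sas >/dev/null 2>&1; then
    sas myjob.sas -sysparm "${sysparm}" -log myjob.log
fi

echo "${sysparm}"
```

Passing everything through one -sysparm string keeps the interface to a single documented SAS option, at the cost of having to split the pairs apart again on the SAS side.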
runSAS then creates a "runSASBatchStatus.sas" SAS program (either locally or in a specified deployment directory) to create/update a batch status/info dataset, "runsas_batch_status".
The dataset will only be written to by runSAS; it is not to be used as a control table for the runSAS script.
<batchid> must be the primary key everywhere to stay consistent with the runSAS topology. Potentially, <jobid> can be used to extend this and store the batch run details at the job/script level, if needed.
Invoke this SAS program to create/update the batch status record(s).
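The generate-and-invoke steps above could be sketched as follows. Bash writes a runSASBatchStatus.sas program via a heredoc and then runs it with the batch context passed through -sysparm. The deployment directory, the runsas libref, the dataset columns, and the status value are all assumptions for illustration, not the actual runSAS design:

```shell
#!/usr/bin/env bash
# Assumed deployment directory for the generated program (sketch only).
deploy_dir="./.tmp"
mkdir -p "${deploy_dir}"

# Generate a SAS program that appends one record per (batchid, jobid)
# to the runsas_batch_status dataset; values arrive via -sysparm.
cat > "${deploy_dir}/runSASBatchStatus.sas" <<'EOF'
/* Split the space-delimited name=value pairs in &SYSPARM into macro vars */
data _null_;
    length pair $200;
    do i = 1 to countw("&sysparm", " ");
        pair = scan("&sysparm", i, " ");
        call symputx(scan(pair, 1, "="), scan(pair, 2, "="));
    end;
run;

/* Build the new status record; batchid is the primary key */
data work.runsas_batch_status_new;
    length batchid jobid 8 status $20 updated 8;
    batchid = &batchid;
    jobid   = &jobid;
    status  = "&status";
    updated = datetime();
    format updated datetime20.;
run;

/* Append to (or implicitly create) the status dataset.
   The "runsas" libref is an assumed pre-assigned library. */
proc append base=runsas.runsas_batch_status
            data=work.runsas_batch_status_new force;
run;
EOF

# Invoke the generated program with the batch context (guarded so the
# sketch is harmless on machines without SAS installed).
if command -v sas >/dev/null 2>&1; then
    sas "${deploy_dir}/runSASBatchStatus.sas" \
        -sysparm "batchid=1001 jobid=7 status=SUCCESS" \
        -log "${deploy_dir}/runSASBatchStatus.log"
fi
```

Because runSAS is the only writer, an append-only dataset like this stays a reporting artifact rather than a control table, matching the constraint stated above.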