See Decorators for more decorators

Ruffus Functions

There are four functions for Ruffus pipelines:

  • pipeline_run
  • pipeline_printout
  • pipeline_printout_graph
  • pipeline_get_task_names

and a helper function to run jobs via drmaa:

  • run_job

pipeline_run

pipeline_run (target_tasks = [], forcedtorun_tasks = [], multiprocess = 1, logger = stderr_logger, gnu_make_maximal_rebuild_mode = True, verbose = 1, runtime_data = None, one_second_per_job = True, touch_files_only = False, exceptions_terminate_immediately = None, log_exceptions = None, history_file = None, checksum_level = None, multithread = 0, verbose_abbreviated_path = None)

Purpose:

Runs all specified pipelined functions if they or any antecedent tasks are incomplete or out-of-date.

Example:

#
#   Run task2 whatever its state, and also task1 and antecedents if they are incomplete
#   Do not log pipeline progress messages to stderr
#
pipeline_run([task1, task2], forcedtorun_tasks = [task2], logger = blackhole_logger)

Parameters:

  • target_tasks

    Pipeline functions and any necessary antecedents (specified implicitly or with @follows) which should be invoked with the appropriate parameters if they are incomplete or out-of-date.

  • forcedtorun_tasks

    Optional. These pipeline functions will be invoked regardless of their state. Any antecedent tasks will also be executed if they are out-of-date or incomplete.

  • multiprocess

    Optional. The number of processes used to run independent tasks, and independent jobs within each task, in parallel. If multiprocess is set to 1, the pipeline will execute in the main process.

  • multithread

    Optional. The number of threads used to run independent tasks, and independent jobs within each task, in parallel. Should be used only with drmaa; otherwise the CPython global interpreter lock (GIL) will slow down your pipeline.

  • logger

    For logging messages indicating the progress of the pipeline in terms of tasks and jobs. Defaults to outputting to sys.stderr. Setting logger=blackhole_logger will prevent any logging output.
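
    Any object supporting the standard python logging interface can be passed in. For example, a minimal sketch (the logger name and log file name are illustrative) that writes pipeline progress to a file:

    import logging

    # write pipeline progress messages to pipeline.log
    my_logger = logging.getLogger("my_pipeline")
    my_logger.setLevel(logging.DEBUG)
    my_logger.addHandler(logging.FileHandler("pipeline.log"))

    pipeline_run([task2], logger = my_logger)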

  • gnu_make_maximal_rebuild_mode

    Warning

    This is a dangerous option. Use rarely and with caution

    Optional parameter governing how Ruffus determines which part of the pipeline is out of date and needs to be re-run. If set to False, Ruffus will work back from the target_tasks and only execute the pipeline after the first up-to-date tasks that it encounters. For example, if there are five tasks:

    #
    #   task1 -> task2 -> task3 -> task4 -> task5
    #
    target_tasks = [task5]
    

    If task3() is up-to-date, then only task4() and task5() will be run. This will be the case even if task2() and task1() are incomplete.

    This allows you to remove all intermediate results produced by task1 -> task3.
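
    For example, to run only the out-of-date tail of the pipeline sketched above:

    # runs task4 and task5 only, even though task1 and task2 are incomplete
    pipeline_run([task5], gnu_make_maximal_rebuild_mode = False)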

  • verbose

    Optional parameter indicating the verbosity of the messages sent to logger: (Defaults to level 1 if unspecified)

    • level 0 : nothing
    • level 1 : Out-of-date Task names
    • level 2 : All Tasks (including any task function docstrings)
    • level 3 : Out-of-date Jobs in Out-of-date Tasks, no explanation
    • level 4 : Out-of-date Jobs in Out-of-date Tasks, with explanations and warnings
    • level 5 : All Jobs in Out-of-date Tasks (up-to-date tasks are only listed by name)
    • level 6 : All Jobs in All Tasks, whether out of date or not
    • level 10: messages useful only for debugging the Ruffus pipeline code itself

    Messages at verbose >= 10 are intended for debugging Ruffus by the developers, and their details are liable to change from release to release.
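
    For example, to see which individual jobs are out of date and why they will be rerun:

    pipeline_run([task5], verbose = 4)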

  • runtime_data

    Experimental feature for passing data to tasks at run time

  • one_second_per_job

    To work around poor file timestamp resolution on some file systems. Defaults to True if checksum_level is 0, forcing each Task to take a minimum of 1 second to complete. If your file system has coarse-grained time stamps, you can turn on this delay by setting one_second_per_job to True.
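
    Conversely, if checksum_level is 0 but your file system has fine-grained time stamps, the delay can be switched off:

    # no need to pause between jobs on this file system
    pipeline_run([task5], one_second_per_job = False)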

  • touch_files_only

    Create or update output files only to simulate the running of the pipeline. Does not invoke real task functions to run jobs. This is most useful to force a pipeline to acknowledge that a particular part is now up-to-date.

    This will not work properly if the identities of some files are not known beforehand and depend on run time conditions. In other words, it is not recommended if @split or custom parameter generators are being used.
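
    For example, to mark everything up to task5 as up-to-date without running any task functions:

    # only creates / touches output files; no task code is run
    pipeline_run([task5], touch_files_only = True)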

  • exceptions_terminate_immediately

    If set to True, the first exception causes immediate termination of the pipeline.

  • log_exceptions

    If set to True, exceptions are printed to the logger as soon as they occur.

  • history_file

    The database file which stores checksums and file timestamps for input/output files. Defaults to .ruffus_history.sqlite if unspecified
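
    For example, to keep a separate checksum database per project (the file name is illustrative):

    pipeline_run([task5], history_file = "my_project.ruffus_history.sqlite")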

  • checksum_level

    Several options for checking up-to-dateness are available. Defaults to level 1.

    • level 0 : Use only file timestamps
    • level 1 : above, plus timestamp of successful job completion
    • level 2 : above, plus a checksum of the pipeline function body
    • level 3 : above, plus a checksum of the pipeline function default arguments and the additional arguments passed in by task decorators
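
    For example, to rely on file time stamps alone, as traditional make does:

    # no checksums: out-of-dateness is judged purely by file time stamps
    pipeline_run([task5], checksum_level = 0)
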
  • verbose_abbreviated_path

    Whether input and output paths are abbreviated. Defaults to 2 if unspecified

    • level 0: The full (expanded, abspath) input or output path
    • level > 1: The number of subdirectories to include. Abbreviated paths are prefixed with [,,,]/
    • level < 0: Input / Output parameters are truncated to MMM letters where verbose_abbreviated_path == -MMM. Subdirectories are first removed to see if this allows the paths to fit within the specified limit. Otherwise abbreviated paths are prefixed by <???>
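
    For example, to abbreviate each input and output path down to its last two subdirectories:

    # paths appear abbreviated, e.g. [,,,]/data/sample1/a.txt
    pipeline_run([task5], verbose = 3, verbose_abbreviated_path = 2)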

pipeline_printout

pipeline_printout (output_stream = sys.stdout, target_tasks = [], forcedtorun_tasks = [], verbose = 1, indent = 4, gnu_make_maximal_rebuild_mode = True, wrap_width = 100, runtime_data = None, checksum_level = None, history_file = None, verbose_abbreviated_path = None)

Purpose:

Prints out all the pipelined functions which will be invoked given the specified target_tasks, without actually running the pipeline. Because this is a simulation, some of the job parameters may be incorrect. For example, the results of a @split operation are not predetermined and will only be known after the pipelined function splits up the original data. The parameters of all downstream pipelined functions will change depending on this initial operation.

Example:

#
#   Simulate running task2 whatever its state, and also task1 and antecedents
#     if they are incomplete
#   Print out results to STDOUT
#
pipeline_printout(sys.stdout, [task1, task2], forcedtorun_tasks = [task2], verbose = 1)

Parameters:

  • output_stream

    Where to printout the results of simulating the running of the pipeline.

  • target_tasks

    As in pipeline_run: Pipeline functions and any necessary antecedents (specified implicitly or with @follows) which should be invoked with the appropriate parameters if they are incomplete or out-of-date.

  • forcedtorun_tasks

    As in pipeline_run: these pipeline functions will be invoked regardless of their state. Any antecedent tasks will also be executed if they are out-of-date or incomplete.

  • verbose

    Optional parameter indicating the verbosity of the output sent to output_stream (defaults to level 4 if unspecified):

    • level 0 : nothing
    • level 1 : Out-of-date Task names
    • level 2 : All Tasks (including any task function docstrings)
    • level 3 : Out-of-date Jobs in Out-of-date Tasks, no explanation
    • level 4 : Out-of-date Jobs in Out-of-date Tasks, with explanations and warnings
    • level 5 : All Jobs in Out-of-date Tasks (up-to-date tasks are only listed by name)
    • level 6 : All Jobs in All Tasks, whether out of date or not
    • level 10: messages useful only for debugging the Ruffus pipeline code itself

    Messages at verbose >= 10 are intended for debugging Ruffus by the developers, and their details are liable to change from release to release.

  • indent

    Optional parameter governing the indentation when printing out the component job parameters of each task function.

  • gnu_make_maximal_rebuild_mode

    Warning

    This is a dangerous option. Use rarely and with caution

    See explanation in pipeline_run.

  • wrap_width

    Optional parameter governing the length of each line before it starts wrapping around.

  • runtime_data

    Experimental feature for passing data to tasks at run time

  • history_file

    The database file which stores checksums and file timestamps for input/output files. Defaults to .ruffus_history.sqlite if unspecified

  • checksum_level

    Several options for checking up-to-dateness are available. Defaults to level 1.

    • level 0 : Use only file timestamps
    • level 1 : above, plus timestamp of successful job completion
    • level 2 : above, plus a checksum of the pipeline function body
    • level 3 : above, plus a checksum of the pipeline function default arguments and the additional arguments passed in by task decorators

  • verbose_abbreviated_path

    Whether input and output paths are abbreviated. Defaults to 2 if unspecified

    • level 0: The full (expanded, abspath) input or output path
    • level > 1: The number of subdirectories to include. Abbreviated paths are prefixed with [,,,]/
    • level < 0: Input / Output parameters are truncated to MMM letters where verbose_abbreviated_path == -MMM. Subdirectories are first removed to see if this allows the paths to fit within the specified limit. Otherwise abbreviated paths are prefixed by <???>

pipeline_printout_graph

pipeline_printout_graph (stream, output_format = None, target_tasks = [], forcedtorun_tasks = [], ignore_upstream_of_target = False, skip_uptodate_tasks = False, gnu_make_maximal_rebuild_mode = True, test_all_task_for_update = True, no_key_legend = False, minimal_key_legend = True, user_colour_scheme = None, pipeline_name = "Pipeline", size = (11,8), dpi = 120, runtime_data = None, checksum_level = None, history_file = None)

Purpose:

Prints out a flowchart of all the pipelined functions which will be invoked given the specified target_tasks, without actually running the pipeline.

See Flowchart colours

Example:
pipeline_printout_graph("flowchart.jpg", "jpg", [task1, task16],
                            forcedtorun_tasks = [task2],
                            no_key_legend = True)

Customising appearance:

The user_colour_scheme parameter can be used to change the flowchart colours, overriding the default Colour Schemes. An example of customising the flowchart appearance is available (see code).

Parameters:

  • stream

    The file or file-like object to which the flowchart should be printed. If a string is provided, it is assumed that this is the name of the output file which will be opened automatically.

  • output_format

    If missing, defaults to the extension of the stream file name (i.e. jpg for a.jpg)

    • If the programme dot can be found on the execution path, this can be any number of formats supported by Graphviz, including, for example, jpg, png, pdf, svg etc.
    • Otherwise, Ruffus will only output without error in the dot format, which is a plain-text graph description language.
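
    For example, if Graphviz is not installed on the machine running the pipeline, the plain-text dot output can be written and rendered elsewhere (the file names are illustrative):

    # writes the graph description only; Graphviz is not needed for this step
    pipeline_printout_graph("flowchart.dot", "dot", [final_task])

    The resulting file can then be rendered on another machine with the Graphviz command line, e.g. dot -Tpng flowchart.dot -o flowchart.png
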
  • target_tasks

    As in pipeline_run: Pipeline functions and any necessary antecedents (specified implicitly or with @follows) which should be invoked with the appropriate parameters if they are incomplete or out-of-date.

  • forcedtorun_tasks

    As in pipeline_run:These pipeline functions will be invoked regardless of their state. Any antecedents tasks will also be executed if they are out-of-date or incomplete.

  • draw_vertically

    Draw flowchart in vertical orientation

  • ignore_upstream_of_target

    Start drawing flowchart from specified target tasks. Do not draw tasks which are downstream (subsequent) to the targets.

  • skip_uptodate_tasks

    Do not draw up-to-date / completed tasks in the flowchart unless they lie on the execution path of the pipeline.

  • gnu_make_maximal_rebuild_mode

    Warning

    This is a dangerous option. Use rarely and with caution

    See explanation in pipeline_run.

  • test_all_task_for_update

    Indicates whether intermediate tasks should be checked to see if they are out of date or not. Normally Ruffus will stop checking dependent tasks for completion, or for whether they are out-of-date, once it has discovered the maximal extent of the pipeline which has to be run. For displaying the flow of the pipeline, this early cut-off is not very informative.

  • no_key_legend

    Do not include key legend explaining the colour scheme of the flowchart.

  • minimal_key_legend

    Do not include unused task types in key legend.

  • user_colour_scheme

    Dictionary specifying colour scheme for flowchart

    See complete list of Colour Schemes.

    • Colours can be names, e.g. "black", or quoted hex, e.g. '"#F6F4F4"' (note the extra quotes)
    • Default values will be used unless specified

    Key                                Subkey            Purpose
    'colour_scheme_index'                                Index of the default colour scheme,
                                                         0-7; defaults to 0 unless specified

    'Final target'                     'fillcolor'       Colours / attributes for
    'Explicitly specified task'        'fontcolor'       each task type
    'Task to run'                      'color'
    'Down stream'                      'dashed' = 0/1
    'Up-to-date Final target'
    'Up-to-date task forced to rerun'
    'Up-to-date task'
    'Vicious cycle'

    'Vicious cycle'                    'linecolor'       Colours for arrows between tasks
    'Task to run'
    'Up-to-date'

    'Pipeline'                         'fontcolor'       Flowchart title colour

    'Key'                              'fontcolor'       Legend colours
                                       'fillcolor'

    Example:

    Use colour scheme index = 1

    pipeline_printout_graph ("flowchart.svg", "svg", [final_task],
                             user_colour_scheme = {
                                                    "colour_scheme_index" :1,
                                                    "Pipeline"      :{"fontcolor" : '"#FF3232"' },
                                                    "Key"           :{"fontcolor" : "Red",
                                                                      "fillcolor" : '"#F6F4F4"' },
                                                    "Task to run"   :{"linecolor" : '"#0044A0"' },
                                                    "Final target"  :{"fillcolor" : '"#EFA03B"',
                                                                      "fontcolor" : "black",
                                                                      "dashed"    : 0           }
                                                   })
    
  • pipeline_name

    Specify title for flowchart

  • size

    Size in inches for flowchart

  • dpi

    Resolution in dots per inch. Ignored for svg output

  • runtime_data

    Experimental feature for passing data to tasks at run time

  • history_file

    The database file which stores checksums and file timestamps for input/output files. Defaults to .ruffus_history.sqlite if unspecified

  • checksum_level

    Several options for checking up-to-dateness are available: Default is level 1.

    • level 0 : Use only file timestamps
    • level 1 : above, plus timestamp of successful job completion
    • level 2 : above, plus a checksum of the pipeline function body
    • level 3 : above, plus a checksum of the pipeline function default arguments and the additional arguments passed in by task decorators

pipeline_get_task_names

pipeline_get_task_names ()

Purpose:

Returns a list of all task names in the pipeline, without running the pipeline or checking whether the tasks are connected correctly.

Example:

Given:

from ruffus import *

@originate([])
def create_data(output_files):
    pass

@transform(create_data, suffix(".txt"), ".task1")
def task1(input_files, output_files):
    pass

@transform(task1, suffix(".task1"), ".task2")
def task2(input_files, output_files):
    pass

Produces a list of three task names:

>>> pipeline_get_task_names ()
['create_data', 'task1', 'task2']

run_job

Note

drmaa_wrapper is not exported automatically by ruffus and must be specified explicitly:

# import ruffus.drmaa_wrapper explicitly
from ruffus.drmaa_wrapper import run_job, error_drmaa_job

run_job (cmd_str, job_name = None, job_other_options = None, job_script_directory = None, job_environment = None, working_directory = None, logger = None, drmaa_session = None, retain_job_scripts = False, run_locally = False, output_files = None, touch_only = False)

Purpose:

ruffus.drmaa_wrapper.run_job dispatches a command with arguments to a cluster or Grid Engine node and waits for the command to complete.

It is the semantic equivalent of calling os.system or subprocess.check_output.

Example:

from ruffus.drmaa_wrapper import run_job, error_drmaa_job
import drmaa
my_drmaa_session = drmaa.Session()
my_drmaa_session.initialize()

run_job("ls",
        job_name = "test",
        job_other_options="-P mott-flint.prja -q short.qa",
        job_script_directory = "test_dir",
        job_environment={ 'BASH_ENV' : '~/.bashrc' },
        retain_job_scripts = True, drmaa_session=my_drmaa_session)
run_job("ls",
        job_name = "test",
        job_other_options="-P mott-flint.prja -q short.qa",
        job_script_directory = "test_dir",
        job_environment={ 'BASH_ENV' : '~/.bashrc' },
        retain_job_scripts = True,
        drmaa_session=my_drmaa_session,
        working_directory = "/gpfs1/well/mott-flint/lg/src/oss/ruffus/doc")

#
#   catch exceptions
#
# initialise so that the except clause can report stdout / stderr
# even if run_job raises before they are assigned
stdout_res, stderr_res = "", ""
try:
    stdout_res, stderr_res  = run_job(cmd,
                                      job_name          = job_name,
                                      logger            = logger,
                                      drmaa_session     = drmaa_session,
                                      run_locally       = options.local_run,
                                      job_other_options = get_queue_name())

# relay all the stdout, stderr, drmaa output to diagnose failures
except error_drmaa_job as err:
    raise Exception("\n".join(map(str,
                        ["Failed to run:",
                         cmd,
                         err,
                         stdout_res,
                         stderr_res])))

my_drmaa_session.exit()

Parameters:

  • cmd_str

    The command which will be run remotely including all parameters

  • job_name

    A descriptive name for the command. This will be displayed by SGE qstat, for example. Defaults to "ruffus_job"

  • job_other_options

    Other drmaa parameters can be passed verbatim as a string.

    Examples for SGE include project name (-P project_name), parallel environment (-pe parallel_environ), account (-A account_string), resource (-l resource=expression), queue name (-q a_queue_name), queue priority (-p 15).

    These are parameters which you normally need to include when submitting jobs interactively, for example via SGE qsub or SLURM (srun)

  • job_script_directory

    The directory where drmaa temporary script files will be found. Defaults to the current working directory.

  • job_environment

    A dictionary of environment variable key / value pairs, e.g. {'BASH_ENV': '~/.bashrc'}

  • working_directory

    • Sets the working directory.
    • Should be a fully qualified path.
    • Defaults to the current working directory.
  • retain_job_scripts

    If set to True, temporary script files containing the drmaa commands will not be deleted. This is useful for debugging and for running the scripts directly on the command line, and can provide a useful record of the commands.

  • logger

    For logging messages indicating the progress of the pipeline in terms of tasks and jobs. Takes objects with the standard python logging module interface.

  • drmaa_session

    A shared drmaa session created and managed separately.

    Somewhere in the main part of your Ruffus pipeline script there should be code looking like this:

    #
    #   start shared drmaa session for all jobs / tasks in pipeline
    #
    import drmaa
    drmaa_session = drmaa.Session()
    drmaa_session.initialize()
    
    
    #
    #   pipeline functions
    #
    
    if __name__ == '__main__':
        cmdline.run (options, multithread = options.jobs)
        drmaa_session.exit()
    
  • run_locally

    If set to True, commands will be run locally using the standard python subprocess module rather than being dispatched remotely. This allows scripts to be debugged easily.

  • touch_only

    If set to True, will create or update Output files only to simulate the running of the pipeline. Does not dispatch commands remotely or locally. This is most useful to force a pipeline to acknowledge that a particular part is now up-to-date.

    See also: pipeline_run(touch_files_only=True)

  • output_files

    A list of output file names which will be created or updated if touch_only = True
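
    For example, a minimal sketch (the command and file names are illustrative) combining run_locally and touch_only to debug a pipeline on a machine without a cluster:

    # debug locally: the command runs via subprocess instead of drmaa
    stdout_res, stderr_res = run_job("echo hello", run_locally = True)

    # or simply mark the expected output as complete without running anything
    run_job("count_reads input.bam",
            touch_only   = True,
            output_files = ["input.bam.counts"])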