Python client to SMRT Link Services and Service Data Models
class pbcommand.services.ServiceAccessLayer(base_url, port, debug=False, sleep_time=2)
Bases: object
General client access layer for interfacing with the job types on SMRT Link Analysis Services. This API only supports insecure (HTTP) access to localhost.
As of 10-02-2018, this should only be used (minimally) for internal purposes. All access to the Services should be done via SmrtLinkAuthClient.
Parameters:
- base_url – base URL of the SL server. This MUST be either ‘localhost’ or ‘http://localhost’
- port – port of the SL server
- debug – enable more verbose debugging output on Services request failures
- sleep_time – sleep time (in seconds) between polling for job status
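The base_url constraint above can be enforced up front; a minimal sketch, assuming a hypothetical helper name (validate_base_url is not part of pbcommand):

```python
def validate_base_url(base_url):
    """Enforce the documented constraint: base_url must be
    'localhost' or 'http://localhost' (insecure HTTP only)."""
    allowed = ("localhost", "http://localhost")
    if base_url not in allowed:
        raise ValueError(
            "base_url must be one of {!r}, got {!r}".format(allowed, base_url))
    # Normalize to a full HTTP URL for later request building
    return base_url if base_url.startswith("http://") else "http://" + base_url
```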
JOB_DEFAULT_TIMEOUT = 1800
ROOT_DS = '/smrt-link/datasets'
ROOT_JM = '/smrt-link/job-manager'
ROOT_JOBS = '/smrt-link/job-manager/jobs'
ROOT_MJOBS = '/smrt-link/job-manager/multi-jobs'
ROOT_PT = '/smrt-link/resolved-pipeline-templates'
ROOT_RUNS = '/smrt-link/runs'
ROOT_SAMPLES = '/smrt-link/samples'
ROOT_SL = '/smrt-link'
create_by_pipeline_template_id(name, pipeline_template_id, epoints, task_options=(), tags=())
Creates and runs a pbsmrtpipe pipeline by pipeline template id.
Parameters: tags – tags should be a set of strings
get_analysis_job_by_id(job_id)
Get an analysis job by id or UUID, or return None.
Return type: ServiceJob
get_analysis_job_datastore_file_download(job_id, dsf_uuid, output_file=None)
Download a DataStore file to an output file.
Parameters:
- job_id –
- dsf_uuid –
- output_file – if None, the file name from the server (content-disposition) will be used.
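When output_file is None, the filename comes from the server's Content-Disposition header; a rough sketch of such parsing (an illustration, not pbcommand's actual code):

```python
import re

def filename_from_content_disposition(header, default_name):
    # Handles e.g. 'attachment; filename="datastore-report.json"'
    # Falls back to default_name when the header is absent or unparseable
    m = re.search(r'filename="?([^";]+)"?', header or "")
    return m.group(1) if m else default_name
```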
get_analysis_job_report_obj(job_id, report_uuid)
Fetch a SMRT Link Report instance from a job id and report UUID.
There are inconsistencies in the API, hence the naming of the method is a bit verbose.
Return type: Report
get_analysis_job_reports(job_id)
Get a list of DataStore ReportFile types output from a (pbsmrtpipe) analysis job.
get_analysis_job_reports_objs(job_id)
Get a list of Report instances.
Parameters: job_id –
Return type: list[Report]
Returns: list of Reports
get_analysis_job_tasks(job_id_or_uuid)
Get all the Tasks associated with a job, by UUID or int id.
get_analysis_jobs()
Return type: list[ServiceJob]
get_dataset_by_id(dataset_type, int_or_uuid)
Get a dataset using the DataSetMetaType and the (int|uuid) of the dataset.
get_dataset_by_uuid(int_or_uuid, ignore_errors=False)
The recommended model is to look up the DataSet type by explicit MetaType.
Returns None if the dataset was not found.
get_dataset_children_jobs(dataset_id)
Get a list of child jobs for the DataSet.
Parameters: dataset_id (int | string) – DataSet int id or UUID
Return type: list[ServiceJob]
get_fasta_convert_jobs()
Return type: list[ServiceJob]
get_import_dataset_job_datastore(job_id)
Get a list of Service DataStore files from an import-dataset job.
get_import_dataset_jobs()
Return type: list[ServiceJob]
get_merge_dataset_jobs()
Return type: list[ServiceJob]
import_fasta(fasta_path, name, organism, ploidy)
Convert a FASTA file to a ReferenceSet and import it. Returns a Job.
log_progress_update(job_type_id, job_id, message, level, source_id)
This is the generic job logging mechanism.
run_by_pipeline_template_id(name, pipeline_template_id, epoints, task_options=(), time_out=1800, tags=())
Blocks and runs a job with a timeout.
run_import_fasta(fasta_path, name, organism, ploidy, time_out=1800)
Convert a FASTA file to a ReferenceSet and import it, blocking until the job completes or the timeout is reached.
run_import_local_dataset(path)
Import a file from the filesystem that is local to where the services are running.
Returns a JobResult instance.
Return type: JobResult
to_summary()
Returns a summary of system status, DataSets, and Jobs in the system.
Return type: str
uri
class pbcommand.services.ServiceJob(ix, job_uuid, name, state, path, job_type, created_at, settings, is_active=True, smrtlink_version=None, created_by=None, updated_at=None, error_message=None, imported_at=None, job_updated_at=None, created_by_email=None, is_multi_job=False, tags='', parent_multi_job_id=None, workflow=None, project_id=1, job_started_at=None, job_completed_at=None, sub_job_type_id=None)
Bases: object
Parameters:
- ix (int) – Job Integer Id
- job_uuid (str) – Globally unique Job UUID
- name (str) – Display name of job
- state (str) – Job State
- path – Absolute Path to the job directory
- job_type (str) – Job Type
- created_at (DateTime) – when the job was created
- settings (dict) – dict of job specific settings
- is_active (bool) – If the Job is active (only active jobs are displayed in the SL UI)
- smrtlink_version (str | None) – SMRT Link Version (if known)
- created_by (str | None) – User that created the job
- updated_at (DateTime | None) – when the last update of the job occurred
- error_message (str | None) – Error message if the job has failed
- job_started_at – Job start time (if the job has started running)
- job_completed_at – Job completed time (if the job has completed)
execution_time_sec
Return the job execution time (in sec) for completed jobs, or None for non-completed jobs.
Note, for jobs from SL < 6.0.0 this was not defined and will always return None.
Return type: None | int
run_time_sec
Note, prior to SL 6.0.x, jobs did not have a well-defined job start/complete mechanism, and the Job entity timestamps were used instead. This assumes the job started when it was created, which is often not true.
For completed jobs from SL version >= 6.0.x, use execution_time_sec.
Return type: None | int
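The execution-time value is essentially the delta between the job_started_at and job_completed_at fields; a sketch of that arithmetic (assuming both are datetime objects, with None for jobs that have not started or completed):

```python
from datetime import datetime

def compute_execution_time_sec(job_started_at, job_completed_at):
    # Non-completed (or never-started) jobs have no execution time
    if job_started_at is None or job_completed_at is None:
        return None
    return int((job_completed_at - job_started_at).total_seconds())
```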
class pbcommand.services.JobTypes
Bases: object
SMRT Link Analysis Job Types
CONVERT_FASTA = 'convert-fasta-reference'
IMPORT_DS = 'import-dataset'
IMPORT_DSTORE = 'import-datastore'
MERGE_DS = 'merge-datasets'
MOCK_PB_PIPE = 'mock-pbsmrtpipe'
PB_PIPE = 'pbsmrtpipe'
class pbcommand.services.JobStates
Bases: object
Allowed SMRT Link Service Job states
ALL = ('RUNNING', 'CREATED', 'FAILED', 'SUCCESSFUL', 'SUBMITTED')
ALL_COMPLETED = ('FAILED', 'SUCCESSFUL')
CREATED = 'CREATED'
FAILED = 'FAILED'
RUNNING = 'RUNNING'
SUBMITTED = 'SUBMITTED'
SUCCESSFUL = 'SUCCESSFUL'
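A common use of these constants is deciding whether a job has reached a terminal state; a minimal sketch mirroring the documented values (is_terminal is a hypothetical helper, not part of pbcommand):

```python
# Mirrors the documented JobStates constants
ALL = ('RUNNING', 'CREATED', 'FAILED', 'SUCCESSFUL', 'SUBMITTED')
ALL_COMPLETED = ('FAILED', 'SUCCESSFUL')

def is_terminal(state):
    """True when the job will make no further progress."""
    if state not in ALL:
        raise ValueError("Unknown job state: {!r}".format(state))
    return state in ALL_COMPLETED
```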