A high-level client to the SMRT Link Services is accessible from ServiceAccessLayer in pbcommand.services.
Example:
In [1]: from pbcommand.services import ServiceAccessLayer
In [2]: s = ServiceAccessLayer("smrtlink-alpha", 8081)
In [3]: s.get_status()
Out[3]:
{u'id': u'smrtlink_analysis',
u'message': u'Services have been up for 141 hours, 37 minutes and 13.138 seconds.',
u'uptime': 509833138,
u'user': u'secondarytest',
u'uuid': u'12e1c62a-99a4-46c1-b616-a327dc38525f',
u'version': u'0.1.8-3a66e4a'}
In [4]: jobs = s.get_analysis_jobs()
In [5]: j = s.get_analysis_job_by_id(3)
In [6]: j.state, j.name
Out[6]: ('SUCCESSFUL', 'sirv_isoseq')
In [7]: import pbcommand; pbcommand.get_version()
Out[7]: '0.4.9'
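Judging by the message field in the example above, the uptime value returned by get_status() is in milliseconds. A small helper (a sketch, not part of the API) reproduces the server's human-readable form:

```python
# Convert the 'uptime' field (milliseconds) from get_status() into a
# human-readable duration matching the server's 'message' field.
def format_uptime(uptime_ms):
    seconds = uptime_ms / 1000.0
    hours, rem = divmod(seconds, 3600)
    minutes, secs = divmod(rem, 60)
    return "%d hours, %d minutes and %.3f seconds" % (hours, minutes, secs)

print(format_uptime(509833138))  # -> 141 hours, 37 minutes and 13.138 seconds
```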
Warning
This functionality has been migrated to Scala in smrtflow. Support for the Python client layer API will remain; however, the Python command-line tool will be deprecated in a future version.
Tool to import datasets, convert/import FASTA files, and run analysis jobs
usage: pbservice [-h] [--version] [--log-file LOG_FILE]
[--log-level {DEBUG,INFO,WARNING,ERROR,CRITICAL} | --debug | --quiet | -v]
{status,import-dataset,import-fasta,run-analysis,emit-analysis-template,get-job,get-jobs,get-dataset,get-datasets}
...
--version                Show the program's version number and exit.
--log-file LOG_FILE      Write the log to a file. Default (None) writes to stdout.
--log-level LEVEL        Set the log level. Choices: DEBUG, INFO, WARNING, ERROR, CRITICAL. Default: INFO.
--debug                  Alias for setting the log level to DEBUG.
--quiet                  Alias for setting the log level to CRITICAL to suppress output.
-v, --verbose            Set the verbosity level.
Get System Status, DataSet and Job Summary
usage: pbservice status [-h] [--host HOST] [--port PORT] [--log-file LOG_FILE]
[--log-level {DEBUG,INFO,WARNING,ERROR,CRITICAL}]
[--debug] [--quiet]
--host HOST              Server host. Default: http://localhost. Override with the env var PB_SERVICE_HOST.
--port PORT              Server port. Default: 8070. Override with the env var PB_SERVICE_PORT.
--log-file LOG_FILE      Write the log to a file. Default (None) writes to stdout.
--log-level LEVEL        Set the log level. Choices: DEBUG, INFO, WARNING, ERROR, CRITICAL. Default: INFO.
--debug                  Alias for setting the log level to DEBUG.
--quiet                  Alias for setting the log level to CRITICAL to suppress output.
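How the --host/--port defaults combine with their PB_SERVICE_HOST/PB_SERVICE_PORT environment overrides can be sketched as follows. The precedence shown (explicit flag over environment variable over built-in default) is an assumption about the CLI's behavior, not documented above:

```python
import os

# Resolve host/port the way the option descriptions suggest:
# explicit value, else environment variable, else documented default.
def resolve_host_port(cli_host=None, cli_port=None):
    host = cli_host or os.environ.get("PB_SERVICE_HOST", "http://localhost")
    port = cli_port or int(os.environ.get("PB_SERVICE_PORT", 8070))
    return host, port

# With no flags and no env vars set, this yields the documented defaults.
print(resolve_host_port())
print(resolve_host_port(cli_host="smrtlink-alpha", cli_port=8081))
```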
Import Local DataSet XML. The file location must be accessible from the host where the Services are running (often on a shared file system)
usage: pbservice import-dataset [-h] [--host HOST] [--port PORT]
[--log-file LOG_FILE]
[--log-level {DEBUG,INFO,WARNING,ERROR,CRITICAL}]
[--debug] [--quiet]
xml_or_dir
xml_or_dir               Directory or DataSet XML file.
--host HOST              Server host. Default: http://localhost. Override with the env var PB_SERVICE_HOST.
--port PORT              Server port. Default: 8070. Override with the env var PB_SERVICE_PORT.
--log-file LOG_FILE      Write the log to a file. Default (None) writes to stdout.
--log-level LEVEL        Set the log level. Choices: DEBUG, INFO, WARNING, ERROR, CRITICAL. Default: INFO.
--debug                  Alias for setting the log level to DEBUG.
--quiet                  Alias for setting the log level to CRITICAL to suppress output.
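The xml_or_dir positional accepts either a single DataSet XML file or a directory. The sketch below illustrates one plausible way a directory could be expanded to the XML files it contains; the actual traversal pbservice performs is an assumption here:

```python
import os
from fnmatch import fnmatch

# Illustrative expansion of the xml_or_dir argument: a file is used as-is,
# a directory is walked recursively for *.xml files.
def collect_dataset_xmls(xml_or_dir):
    if os.path.isfile(xml_or_dir):
        return [xml_or_dir]
    found = []
    for root, _dirs, files in os.walk(xml_or_dir):
        for name in files:
            if fnmatch(name, "*.xml"):
                found.append(os.path.join(root, name))
    return sorted(found)
```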
Import a FASTA file (and convert it to a ReferenceSet). The file location must be accessible from the host where the Services are running (often on a shared file system)
usage: pbservice import-fasta [-h] --name NAME --organism ORGANISM --ploidy
PLOIDY [--block] [--host HOST] [--port PORT]
[--log-file LOG_FILE]
[--log-level {DEBUG,INFO,WARNING,ERROR,CRITICAL}]
[--debug] [--quiet]
fasta_path
fasta_path               Path to the FASTA file.
--name NAME              Name of the ReferenceSet.
--organism ORGANISM      Organism.
--ploidy PLOIDY          Ploidy.
--block                  Block until the import process completes. Default: False.
--host HOST              Server host. Default: http://localhost. Override with the env var PB_SERVICE_HOST.
--port PORT              Server port. Default: 8070. Override with the env var PB_SERVICE_PORT.
--log-file LOG_FILE      Write the log to a file. Default (None) writes to stdout.
--log-level LEVEL        Set the log level. Choices: DEBUG, INFO, WARNING, ERROR, CRITICAL. Default: INFO.
--debug                  Alias for setting the log level to DEBUG.
--quiet                  Alias for setting the log level to CRITICAL to suppress output.
Run a Secondary Analysis Pipeline using an analysis.json file
usage: pbservice run-analysis [-h] [--host HOST] [--port PORT]
[--log-file LOG_FILE]
[--log-level {DEBUG,INFO,WARNING,ERROR,CRITICAL}]
[--debug] [--quiet] [--block]
json_path
json_path                Path to the analysis.json file.
--host HOST              Server host. Default: http://localhost. Override with the env var PB_SERVICE_HOST.
--port PORT              Server port. Default: 8070. Override with the env var PB_SERVICE_PORT.
--log-file LOG_FILE      Write the log to a file. Default (None) writes to stdout.
--log-level LEVEL        Set the log level. Choices: DEBUG, INFO, WARNING, ERROR, CRITICAL. Default: INFO.
--debug                  Alias for setting the log level to DEBUG.
--quiet                  Alias for setting the log level to CRITICAL to suppress output.
--block                  Block until the analysis job completes. Default: False.
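The shape of an analysis.json file can be sketched in Python. This is illustrative only: the authoritative template comes from 'pbservice emit-analysis-template', and the field names used below (pipelineId, entryPoints, taskOptions) as well as the pipeline id are assumptions of this sketch, not documented above:

```python
import json

# Hypothetical analysis.json payload; generate the real template with
# 'pbservice emit-analysis-template' and edit it rather than trusting
# these field names.
analysis = {
    "name": "sirv_isoseq",
    "pipelineId": "pbsmrtpipe.pipelines.sa3_sat",   # assumed pipeline id
    "entryPoints": [
        {"entryId": "eid_subread", "datasetId": 1}  # assumed entry-point form
    ],
    "taskOptions": [],
}

with open("analysis.json", "w") as f:
    json.dump(analysis, f, indent=2)
```

The resulting file would then be submitted with `pbservice run-analysis analysis.json`.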
Emit an analysis.json template to stdout that can be run using 'run-analysis'
usage: pbservice emit-analysis-template [-h] [--log-file LOG_FILE]
[--log-level {DEBUG,INFO,WARNING,ERROR,CRITICAL}]
[--debug] [--quiet]
--log-file LOG_FILE      Write the log to a file. Default (None) writes to stdout.
--log-level LEVEL        Set the log level. Choices: DEBUG, INFO, WARNING, ERROR, CRITICAL. Default: INFO.
--debug                  Alias for setting the log level to DEBUG.
--quiet                  Alias for setting the log level to CRITICAL to suppress output.
Get Job Summary by Job Id
usage: pbservice get-job [-h] [--host HOST] [--port PORT]
[--log-file LOG_FILE]
[--log-level {DEBUG,INFO,WARNING,ERROR,CRITICAL}]
[--debug] [--quiet]
job_id
job_id                   Job id or UUID.
--host HOST              Server host. Default: http://localhost. Override with the env var PB_SERVICE_HOST.
--port PORT              Server port. Default: 8070. Override with the env var PB_SERVICE_PORT.
--log-file LOG_FILE      Write the log to a file. Default (None) writes to stdout.
--log-level LEVEL        Set the log level. Choices: DEBUG, INFO, WARNING, ERROR, CRITICAL. Default: INFO.
--debug                  Alias for setting the log level to DEBUG.
--quiet                  Alias for setting the log level to CRITICAL to suppress output.
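Fetching a job once is often combined with polling until it finishes. The sketch below builds on the client API from the example at the top: only get_analysis_job_by_id and the 'SUCCESSFUL' state are taken from that example; the other terminal state names are assumptions.

```python
import time

# Assumed terminal job states; 'SUCCESSFUL' is confirmed by the example above.
TERMINAL_STATES = frozenset(["SUCCESSFUL", "FAILED", "TERMINATED"])

def wait_for_job(get_job, job_id, poll_secs=10, timeout_secs=3600):
    """Poll until the job reaches a terminal state; return the final job."""
    deadline = time.time() + timeout_secs
    while time.time() < deadline:
        job = get_job(job_id)
        if job.state in TERMINAL_STATES:
            return job
        time.sleep(poll_secs)
    raise RuntimeError("Timed out waiting for job %s" % job_id)

# Usage against a live server (sketch):
# s = ServiceAccessLayer("smrtlink-alpha", 8081)
# job = wait_for_job(s.get_analysis_job_by_id, 3)
```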
Get a list of Job Summaries
usage: pbservice get-jobs [-h] [-m MAX_ITEMS] [--host HOST] [--port PORT]
[--log-file LOG_FILE]
[--log-level {DEBUG,INFO,WARNING,ERROR,CRITICAL}]
[--debug] [--quiet]
-m MAX_ITEMS, --max-items MAX_ITEMS
                         Maximum number of jobs to list. Default: 25.
--host HOST              Server host. Default: http://localhost. Override with the env var PB_SERVICE_HOST.
--port PORT              Server port. Default: 8070. Override with the env var PB_SERVICE_PORT.
--log-file LOG_FILE      Write the log to a file. Default (None) writes to stdout.
--log-level LEVEL        Set the log level. Choices: DEBUG, INFO, WARNING, ERROR, CRITICAL. Default: INFO.
--debug                  Alias for setting the log level to DEBUG.
--quiet                  Alias for setting the log level to CRITICAL to suppress output.
Get DataSet Summary by DataSet Id or UUID
usage: pbservice get-dataset [-h] [--host HOST] [--port PORT]
[--log-file LOG_FILE]
[--log-level {DEBUG,INFO,WARNING,ERROR,CRITICAL}]
[--debug] [--quiet]
id_or_uuid
id_or_uuid               DataSet Id or UUID.
--host HOST              Server host. Default: http://localhost. Override with the env var PB_SERVICE_HOST.
--port PORT              Server port. Default: 8070. Override with the env var PB_SERVICE_PORT.
--log-file LOG_FILE      Write the log to a file. Default (None) writes to stdout.
--log-level LEVEL        Set the log level. Choices: DEBUG, INFO, WARNING, ERROR, CRITICAL. Default: INFO.
--debug                  Alias for setting the log level to DEBUG.
--quiet                  Alias for setting the log level to CRITICAL to suppress output.
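The id_or_uuid positional accepts either an integer DataSet id or a UUID. A client-side helper might distinguish the two forms like this; it is an illustration, not pbservice's actual parsing code (the UUID shown is the one from the status example above):

```python
import uuid

# Return an int for a numeric id, otherwise parse the value as a UUID.
# Raises ValueError for strings that are neither.
def parse_dataset_id(value):
    try:
        return int(value)
    except ValueError:
        return uuid.UUID(value)

print(parse_dataset_id("3"))
print(parse_dataset_id("12e1c62a-99a4-46c1-b616-a327dc38525f"))
```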
Get DataSet List Summary by DataSet Type
usage: pbservice get-datasets [-h] [--host HOST] [--port PORT]
[--log-file LOG_FILE]
[--log-level {DEBUG,INFO,WARNING,ERROR,CRITICAL}]
[--debug] [--quiet] [-m MAX_ITEMS]
[-t DATASET_TYPE]
--host HOST              Server host. Default: http://localhost. Override with the env var PB_SERVICE_HOST.
--port PORT              Server port. Default: 8070. Override with the env var PB_SERVICE_PORT.
--log-file LOG_FILE      Write the log to a file. Default (None) writes to stdout.
--log-level LEVEL        Set the log level. Choices: DEBUG, INFO, WARNING, ERROR, CRITICAL. Default: INFO.
--debug                  Alias for setting the log level to DEBUG.
--quiet                  Alias for setting the log level to CRITICAL to suppress output.
-m MAX_ITEMS, --max-items MAX_ITEMS
                         Maximum number of DataSets to show. Default: 25.
-t DATASET_TYPE, --dataset-type DATASET_TYPE
                         DataSet meta type. Default: subreads.
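The short names passed to -t/--dataset-type correspond to full PacBio DataSet meta types. The mapping below is illustrative: 'subreads' is the documented default, while the full meta-type strings and the other short names are assumptions based on the PacBio DataSet conventions, not on this page:

```python
# Assumed mapping from pbservice short names to DataSet meta types.
DATASET_TYPES = {
    "subreads": "PacBio.DataSet.SubreadSet",
    "references": "PacBio.DataSet.ReferenceSet",
    "alignments": "PacBio.DataSet.AlignmentSet",
}

def to_meta_type(short_name):
    """Look up the full meta type for a -t/--dataset-type short name."""
    return DATASET_TYPES[short_name]
```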