From 76a245ce93a6c5506b9e273da1498131c8c71ce5 Mon Sep 17 00:00:00 2001 From: marcodeltufo <marco.deltufo@exact-lab.it> Date: Tue, 4 Jul 2023 15:34:11 +0200 Subject: [PATCH] . --- docs/index.rst | 3 +- .../custom-import.md | 60 +++ .../legacy-server-side-extensions/index.rst | 10 + .../processing-plugins.md | 247 ++++++++++ .../reporting-plugins.md | 449 ++++++++++++++++++ .../search-domain-services.md | 129 +++++ 6 files changed, 897 insertions(+), 1 deletion(-) create mode 100644 docs/software-developer-documentation/legacy-server-side-extensions/custom-import.md create mode 100644 docs/software-developer-documentation/legacy-server-side-extensions/index.rst create mode 100644 docs/software-developer-documentation/legacy-server-side-extensions/processing-plugins.md create mode 100644 docs/software-developer-documentation/legacy-server-side-extensions/reporting-plugins.md create mode 100644 docs/software-developer-documentation/legacy-server-side-extensions/search-domain-services.md diff --git a/docs/index.rst b/docs/index.rst index adba874b85f..057f54f64ab 100644 --- a/docs/index.rst +++ b/docs/index.rst @@ -23,6 +23,7 @@ The complete solution for managing your research data. APIS </software-developer-documentation/apis/index> Server-Side Extensions </software-developer-documentation/server-side-extensions/index> Client-Side Extensions </software-developer-documentation/client-side-extensions/index> + Legacy Server-Side Extensions </software-developer-documentation/legacy-server-side-extensions/index> User Documentation ^^^^^^^^^^^^^^^^^^ @@ -37,7 +38,7 @@ Software Developer Documentation - :doc:`APIS </software-developer-documentation/apis/index>` - :doc:`Server-Side Extensions </software-developer-documentation/server-side-extensions/index>` - :doc:`Client-Side Extensions </software-developer-documentation/client-side-extensions/index>` - - Legacy Server-side Extensions + - :doc:`Legacy Server-Side Extensions </software-developer-documentation/legacy-server-side-extensions/index>` System Admin Documentation ^^^^^^^^^^^^^^^^^^^^^^^^^^ diff --git a/docs/software-developer-documentation/legacy-server-side-extensions/custom-import.md b/docs/software-developer-documentation/legacy-server-side-extensions/custom-import.md new file mode 100644 index 00000000000..8cc4d6d1508 --- /dev/null +++ b/docs/software-developer-documentation/legacy-server-side-extensions/custom-import.md @@ -0,0 +1,60 @@ +Custom Import +============= + +### Introduction + +`Custom Import` is a feature designed to give web users a chance to +import a file via `Jython Dropboxes`. + +### Usage + +To upload a file via `Custom Import`, the user should +choose `Import -> Custom Import` in openBIS top menu. The +`Custom Import` tab will be opened, and the user will get the combo box +filled with the list of configured imports. After selecting the desired +`Custom Import, the` user will be asked to select a file. After +selecting a file and clicking `the Save` button, the import will start. +The user should be aware, that the import is done in a synchronous way, +sometimes it might take a while to import data (it depends on the +dropbox code). + +If a template file has been configured a download link will appear. The +downloaded template file can be used to create the file to be imported. + +### Configuration + +To have the possibility to use a `Custom Import` functionality, this +needs an AS [core plugin](/display/openBISDoc2010/Core+Plugins) of type +custom-imports. 
The `plugin.properties` of each plugin has several +parameters: + +|parameter name |description | +|------------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| +|name |The value of this parameter will be used as a name of Custom Import in web UI. | +|dss-code |This parameter needs to specify the code of the datastore server running the dropbox which should be used by the Custom Import. | +|dropbox-name |The value is the name of the dropbox that is used by the Custom Import. | +|description |Specifies a description of the Custom Import. The description is shown as a tooltip in the web UI. | +|template-entity-kind |Custom import templates are represented in OpenBIS as entity attachments. To make a given file available as a custom import template create an attachment with this file and refer to this attachment with template-entity-kind, template-entity-permid, template-attachment-name parameters, where: template-entity-kind is the kind of the entity the attachment has been added to (allowed values: PROJECT, EXPERIMENT, SAMPLE), template-entity-permid is the perm id of that entity and template-attachment-name is the file name of the attachment.| +|template-entity-permid | | +|template-attachment-name| | + + +#### Example configuration + +**plugin.properties** + + name = Example custom import + dss-code = DSS1 + dropbox-name = jython-dropbox-1 + description = This is an example custom import + template-entity-kind = PROJECT + template-entity-permid = 20120814111307034-82319 + template-attachment-name = project_custom_import_template.txt + +The dropbox needs to be defined on `the DSS` side as a `RPC dropbox`: + +**service.properties** + + dss-rpc.put.<DATA_SET_TYPE> = jython-dropbox-1 + + \ No newline at end of file diff --git a/docs/software-developer-documentation/legacy-server-side-extensions/index.rst b/docs/software-developer-documentation/legacy-server-side-extensions/index.rst new file mode 100644 index 00000000000..2066db32b9f --- /dev/null +++ b/docs/software-developer-documentation/legacy-server-side-extensions/index.rst @@ -0,0 +1,10 @@ +Legacy Server-Side Extensions +============================= + +.. toctree:: + :maxdepth: 4 + + custom-import + processing-plugins + reporting-plugins + search-domain-services \ No newline at end of file diff --git a/docs/software-developer-documentation/legacy-server-side-extensions/processing-plugins.md b/docs/software-developer-documentation/legacy-server-side-extensions/processing-plugins.md new file mode 100644 index 00000000000..063ce7e4586 --- /dev/null +++ b/docs/software-developer-documentation/legacy-server-side-extensions/processing-plugins.md @@ -0,0 +1,247 @@ +Processing Plugins +================== + +## Introduction + +A processing plugin runs on the DSS. It processes a specified set of +data sets. The user can trigger a processing plugin in the openBIS Web +application. After processing an e-mail is sent to the user. 
A processing plugin is best configured on the DSS by introducing a [core
plugin](/display/openBISDoc2010/Core+Plugins) of type
`processing-plugins`. All processing plugins have the following
properties in common:

|Property Key|Description|
|--- |--- |
|class|The fully-qualified Java class name of the processing plugin. The class has to implement IProcessingPluginTask.|
|label|The label. It will be shown in the GUI.|
|dataset-types|Comma-separated list of regular expressions. The plugin can process only data sets of types matching one of the regular expressions. If new data set types are registered with openBIS, the DSS will need to be restarted before the new data set types are known to the processing plugins.|
|properties-file|Path to an optional file with additional properties.|
|allowed-api-parameter-classes|A comma-separated list of regular expressions for fully-qualified class names. Any class matching one of the regular expressions is allowed as a class of a Java parameter object of a remote API call. For more details see API Security.|
|disallowed-api-parameter-classes|A comma-separated list of regular expressions for fully-qualified class names. Any class matching one of the regular expressions is not allowed as a class of a Java parameter object of a remote API call. For more details see API Security.|

## Multiple Processing Queues

By default only one processing plugin task is processed at a time. All
other scheduled tasks have to wait in a queue. This can be inconvenient
if there is a mixture of long tasks (taking hours or even days) and
short tasks (taking only seconds or minutes).

The DSS can be configured to run more than one processing queue. Each
queue (except the default one) has a name (which also appears in the
log file) and an associated regular expression. When a processing
plugin task is scheduled, the appropriate queue is selected by the ID
of the processing plugin (this is either a name in the
property `processing-plugins` of `service.properties` of the DSS or the
name of the core-plugin folder). If the ID matches the regular
expression, the task is added to the corresponding queue. If none of
the regular expressions matches, the default queue is used.

The queues have to be specified by the
property `data-set-command-queue-mapping`. It contains a comma-separated
list of queue definitions. Each definition has the form

`<queue name>:<regular expression>`

### Archiving

If archiving is enabled (i.e. `archiver.class` in `service.properties`
of the DSS is defined or a core-plugin of type `miscellaneous` with
ID `archiver` is defined) there will be three processing plugins with
the following IDs: `Archiving`, `Copying data sets to archive`, and
`Unarchiving`.

## Generic Processing Plugins

### RevokeLDAPUserAccessMaintenanceTask

**NOTE: This maintenance task should only be used if the server uses
LDAP as its only authentication service; it will treat users from other
authentication services as missing.**

**Description**: Renames and deactivates users that are no longer
available in LDAP and deletes all their roles, following this algorithm:

- Grab all active users.
- Users that meet all of the following criteria are
  renamed to userId-YYYY.MM.DD and deactivated:
  - They are not a system user.
  - They don't have the ETL\_SERVER role.
  - They don't have an LDAP principal.
+ +**Configuration**: + +|Property Key|Description| +|--- |--- | +|server-url|LDAP server URL.| +|security-principal-distinguished-name|LDAP principal distinguished name.| +|security-principal-password|LDAP principal password.| + +**Example**: + +**plugin.properties** + + class = ch.systemsx.cisd.openbis.generic.server.task.RevokeLDAPUserAccessMaintenanceTask + interval = 60 s + server-url = ldap://d.ethz.ch/DC=d,DC=ethz,DC=ch + security-principal-distinguished-name = CN=cisd-helpdesk,OU=EthUsers,DC=d,DC=ethz,DC=ch + security-principal-password = ****** + +### DataSetCopierForUsers + +### DataSetCopier + +**Description**: Copies all files of the specified data sets to another +(remote) folder. The actual copying is done by the rsync command. + +**Configuration**: + +|Property Key|Description| +|--- |--- | +|destination|Path to the destination folder. This can be a path to a local/mounted folder or to a remote folder accessible via SSH. In this case the name of the host has to appear as a prefix. General syntax: [<host>:][<rsync module>:]<path>| +|hard-link-copy|If true hard links are created for each file of the data sets. This works only if the share which stores the data set is in the same local file system as the destination folder. Default: false.| +|rename-to-dataset-code|If true the copied data set will be renamed to the data set code. Default: false.| +|rsync-executable|Optional path to the executable command rsync.| +|rsync-password-file|Path to the rsync password file. It is only needed if an rsync module is used.| +|ssh-executable|Optional path to the executable command ssh. SSH is only needed for not-mounted folders which are accessible via SSH.| +|ln-executable|Optional path to the executable command ln. The ln command is only needed when hard-link-copy = true.| + +**Example**: + +**plugin.properties** + + class = ch.systemsx.cisd.openbis.dss.generic.server.plugins.standard.DataSetCopier + label = Copy to analysis incoming folder + dataset-types = MS_DATA, UNKNOWN + destination = analysis-server:analysis-incoming-data + rename-to-dataset-code = true + +### DataSetCopierForUsers + +**Description**: Copies all files of the specified data sets to a +(remote) user folder. The actual copying is done by the rsync command. + +**Configuration**: + +|Property Key|Description| +|--- |--- | +|destination|Path template to the destination folder. It should contain ${user} as a placeholder for the user ID. +The path can point to a local/mounted folder or to a remote folder accessible via SSH. In this case the name of the host has to appear as a prefix. General syntax: [<host>:][<rsync module>:]<path>| +|hard-link-copy|If true hard links are created for each file of the data sets. This works only if the share which stores the data set is in the same local file system as the destination folder. Default: false.| +|rename-to-dataset-code|If true the copied data set will be renamed to the data set code. Default: false.| +|rsync-executable|Optional path to the executable command rsync.| +|rsync-password-file|Path to the rsync password file. It is only needed if an rsync module is used.| +|ssh-executable|Optional path to the executable command ssh. SSH is only needed for not-mounted folders which are accessible via SSH.| +|ln-executable|Optional path to the executable command ln. 
The ln command is only needed when hard-link-copy = true.|

**Example**:

**plugin.properties**

    class = ch.systemsx.cisd.openbis.dss.generic.server.plugins.standard.DataSetCopierForUsers
    label = Copy to user playground
    dataset-types = MS_DATA, UNKNOWN
    destination = tmp/playground/${user}/data-sets
    hard-link-copy = true
    rename-to-dataset-code = true

### JythonBasedProcessingPlugin

**Description**: Invokes a Jython script to do the processing. For more
details see [Jython-based Reporting and Processing
Plugins](/display/openBISDoc2010/Jython-based+Reporting+and+Processing+Plugins).

**Configuration**:

|Property Key|Description|
|--- |--- |
|script-path|Path to the Jython script.|

**Example**:

**plugin.properties**

    class = ch.systemsx.cisd.openbis.dss.generic.server.plugins.jython.JythonBasedProcessingPlugin
    label = Calculate some numbers
    dataset-types = MS_DATA, UNKNOWN
    script-path = script.py

### ReportingBasedProcessingPlugin

**Description**: Runs a Jython-based reporting plugin of type
TABLE\_MODEL and sends the result table as a TSV file to the user.

**Configuration**:

|Property Key|Description|
|--- |--- |
|script-path|Path to the Jython script.|
|single-report|If true only one report will be sent. Otherwise a report for each data set will be sent. Default: false|
|email-subject|Subject of the e-mail to be sent. Default: None|
|email-body|Body of the e-mail to be sent. Default: None|
|attachment-name|Name of the attached TSV file. Default: report.txt|

**Example**:

**plugin.properties**

    class = ch.systemsx.cisd.openbis.dss.generic.server.plugins.jython.ReportingBasedProcessingPlugin
    label = Create monthly report
    dataset-types = MS_DATA, UNKNOWN
    script-path = script.py
    email-subject = DSS Monthly Report

### DataSetAndPathInfoDBConsistencyCheckProcessingPlugin

**Description**: This processing task checks the consistency between
the data store and the meta information stored in the `PathInfoDB`. It
checks for:

- existence (i.e. exists in PathInfoDB but not on the file system, or
  exists on the file system but not in PathInfoDB)
- file size
- CRC32 checksum

If it finds any deviations, it sends out an e-mail which contains all
differences found.

**Configuration**: Properties common to all processing plugins (see
Introduction).

**Example**:

**plugin.properties**

    class = ch.systemsx.cisd.openbis.dss.generic.server.plugins.standard.DataSetAndPathInfoDBConsistencyCheckProcessingPlugin
    label = Check consistency between data store and path info database
    dataset-types = .*

## Screening Processing Plugins

### ScreeningReportingBasedProcessingPlugin

**Description**: Runs a Jython-based reporting plugin of type
TABLE\_MODEL and sends the result table as a TSV file to the user. There
is some extra support for screening.

**Configuration**:

|Property Key|Description|
|--- |--- |
|script-path|Path to the Jython script.|
|single-report|If true only one report will be sent. Otherwise a report for each data set will be sent. Default: false|
|email-subject|Subject of the e-mail to be sent. Default: None|
|email-body|Body of the e-mail to be sent. Default: None|
|attachment-name|Name of the attached TSV file.
Default: report.txt| + +**Example**: + +**plugin.properties** + + class = ch.systemsx.cisd.openbis.dss.screening.server.plugins.jython.ScreeningReportingBasedProcessingPlugin + label = Create monthly report + dataset-types = HCS_IMAGE + script-path = script.py + email-subject = DSS Monthly Report \ No newline at end of file diff --git a/docs/software-developer-documentation/legacy-server-side-extensions/reporting-plugins.md b/docs/software-developer-documentation/legacy-server-side-extensions/reporting-plugins.md new file mode 100644 index 00000000000..fd962f4f88e --- /dev/null +++ b/docs/software-developer-documentation/legacy-server-side-extensions/reporting-plugins.md @@ -0,0 +1,449 @@ +Reporting Plugins +================= + +Introduction +------------ + +A reporting plugin runs on the DSS. It creates a report as a table or an +URL for a specified set of data sets or key-value pairs. The user can +invoke a reporting plugin in the openBIS Web application. The result +will be shown as a table or a link. + +A reporting plugin is one of the three following types. The differences +are the type of input and output: + +- TABLE\_MODEL: *Input*: A set of data sets. *Output*: A table +- DSS\_LINK: *Input*: One data set. *Output*: An URL +- AGGREGATION\_TABLE\_MODEL: *Input*: A set of key-value pairs. + *Output*: A table + +A reporting plugin is configured on the DSS best by introducing a [core +plugin](/display/openBISDoc2010/Core+Plugins) of type +`reporting-plugins`. All reporting plugins have the following properties +in common: + +|Property Key|Description| +|--- |--- | +|class|The fully-qualified Java class name of the reporting plugin. The class has to implement IReportingPluginTask.| +|label|The label. It will be shown in the GUI.| +|dataset-types|Comma-separated list of regular expressions. The plugin can create a report only for the data sets of types matching one of the regular expressions. If new data set types are registered with openBIS, the DSS will need to be restarted before the new data set types are known to the processing plugins. This is a mandatory property for reporting plugins of type TABLE_MODEL and DSS_LINK. It will be ignored if the type is AGGREGATION_TABLE_MODEL.| +|properties-file|Path to an optional file with additional properties.| +|servlet.<property>|Properties for an optional servlet. It provides resources referred by URLs in the output of the reporting plugin. +This should be used if the servlet is only needed by this reporting plugin. If other plugins also need this servlet it should be configured as a core plugin of type services.| +|allowed-api-parameter-classes|A comma-separated list of regular expression for fully-qualified class names. Any classes matching on of the regular expressions is allowed as a class of a Java parameter object of a remote API call. For more details see API Security.| +|disallowed-api-parameter-classes|A comma-separated list of regular expression for fully-qualified class names. Any classes matching on of the regular expressions is not allowed as a class of a Java parameter object of a remote API call. 
For more details see API Security.| + +Generic Reporting Plugins +------------------------- + +### DecoratingTableModelReportingPlugin + +**Type**: TABLE\_MODEL + +**Description**: Modifies the output of a reporting plugin of type +TABLE\_MODEL + +**Configuration**: + +|Property Key|Description| +|--- |--- | +|reporting-plugin.class|The fully-qualified Java class name of the wrapped reporting plugin of type TABLE_MODEL| +|reporting-plugin.<property>|Property of the wrapped reporting plugin.| +|transformation.class|The fully-qualified Java class name of the transformation. It has to implement ITableModelTransformation.| +|transformation.<property>|Property of the transformation to be applied.| + +**Example**: + +**plugin.properties** + + class = ch.systemsx.cisd.openbis.dss.generic.server.plugins.standard.DecoratingTableModelReportingPlugin + label = Analysis Summary + dataset-types = HCS_IMAGE_ANALYSIS_DATA + reporting-plugin.class = ch.systemsx.cisd.openbis.dss.generic.server.plugins.standard.TSVViewReportingPlugin + reporting-plugin.separator = , + transformation.class = ch.systemsx.cisd.openbis.dss.generic.server.plugins.standard.EntityLinksDecorator + transformation.link-columns = BARCODE, GENE + transformation.BARCODE.entity-kind = SAMPLE + transformation.BARCODE.default-space = DEMO + transformation.GENE.entity-kind = MATERIAL + transformation.GENE.material-type = GENE + +##### Transformations + +###### EntityLinksDecorator + +**Description**: Changes plain columns into entity links. + +**Configuration**: + +|Property Key|Description| +|--- |--- | +|link-columns|Comma-separated list of column keys.| +|<column key>.entity-kind|Entity kind of column <column key>. Possible values are MATERIAL and SAMPLE.| +|<column key>.default-space|Optional space code for SAMPLE columns. It will be used if the column value contains only the sample code.| +|<column key>.material-type|Mandatory type code for MATERIAL columns.| + +### GenericDssLinkReportingPlugin + +**Type**: DSS\_LINK + +**Description**: Creates an URL for a file inside the data set. + +**Configuration**: + +|Property Key|Description| +|--- |--- | +|download-url|Base URL. Contains protocol, domain, and port.| +|data-set-regex|Optional regular expression which specifies the file.| +|data-set-path|Optional relative path in the data set to narrow down the search.| + +**Example**: + +**plugin.properties** + + class = ch.systemsx.cisd.openbis.dss.generic.server.plugins.standard.GenericDssLinkReportingPlugin + label = Summary + dataset-types = MS_DATA + download-url = https://my.domain.org:8443 + data-set-regex = summary.* + data-set-path = report + +### AggregationService + +Import Note on Authorization + +In AggregationServices and IngestionServices, the service programmer +needs to ensure proper authorization by himself. He can do so by using +the methods from +[IAuthorizationService](http://svnsis.ethz.ch/doc/openbis/current/ch/systemsx/cisd/openbis/dss/generic/shared/api/internal/authorization/IAuthorizationService.html). +The user id, which is needed when calling these methods, can be obtained +from `DataSetProcessingContext` (when using Java), or the +variable `userId` (when using Jython). + +** +** + +**Type: **AGGREGATION\_TABLE\_MODEL + +**Description**: An abstract superclass for aggregation service +reporting plugins. An aggregation service reporting plugin takes a hash +map containing user parameters as an argument and returns tabular data +(in the form of a TableModel). 
The +JythonBasedAggregationServiceReportingPlugin below is a subclass that +allows for implementation of the logic in Jython. + +**Configuration**: Dependent on the subclass. + +To implement an aggregation service in Java, define a subclass +of `ch.systemsx.cisd.openbis.dss.generic.server.plugins.standard.AggregationService`. +This subclass must implement the method + + TableModel createReport(Map<String, Object>, DataSetProcessingContext). + +**Example**: + +**ExampleAggregationServicePlugin** + + package ch.systemsx.cisd.openbis.dss.generic.server.plugins.standard; + import java.io.File; + import java.util.Map; + import java.util.Properties; + import ch.systemsx.cisd.openbis.dss.generic.shared.DataSetProcessingContext; + import ch.systemsx.cisd.openbis.generic.shared.basic.dto.TableModel; + import ch.systemsx.cisd.openbis.generic.shared.util.IRowBuilder; + import ch.systemsx.cisd.openbis.generic.shared.util.SimpleTableModelBuilder; + /** + * @author Chandrasekhar Ramakrishnan + */ + public class ExampleAggregationServicePlugin extends AggregationService + { + private static final long serialVersionUID = 1L; + /** + * Create a new plugin. + * + * @param properties + * @param storeRoot + */ + public ExampleAggregationServicePlugin(Properties properties, File storeRoot) + { + super(properties, storeRoot); + } + @Override + public TableModel createReport(Map<String, Object> parameters, DataSetProcessingContext context) + { + SimpleTableModelBuilder builder = new SimpleTableModelBuilder(true); + builder.addHeader("String"); + builder.addHeader("Integer"); + IRowBuilder row = builder.addRow(); + row.setCell("String", "Hello"); + row.setCell("Integer", 20); + row = builder.addRow(); + row.setCell("String", parameters.get("name").toString()); + row.setCell("Integer", 30); + return builder.getTableModel(); + } + } + +**plugin.properties** + + class = ch.systemsx.cisd.openbis.dss.generic.server.plugins.standard.ExampleAggregationServicePlugin + label = My Report + +#### JythonAggregationService + +**Type:** AGGREGATION\_TABLE\_MODEL + +**Description**: Invokes a Jython script to create an aggregation +service report. For more details see [Jython-based Reporting and +Processing +Plugins](/display/openBISDoc2010/Jython-based+Reporting+and+Processing+Plugins). + +**Configuration**: + +|Property Key|Description| +|--- |--- | +|script-path|Path to the jython script.| + +**Example**: + +**plugin.properties** + + class = ch.systemsx.cisd.openbis.dss.generic.server.plugins.jython.JythonAggregationService + label = My Report + script-path = script.py + +### IngestionService + +**Type: **AGGREGATION\_TABLE\_MODEL + +**Description**: An abstract superclass for aggregation service +reporting plugins that modify entities in the database. A db-modifying +aggregation service reporting plugin takes a hash map containing user +parameters and a transaction as arguments and returns tabular data (in +the form of a TableModel). The transaction is an +[IDataSetRegistrationTransactionV2](http://svnsis.ethz.ch/doc/openbis/current/ch/systemsx/cisd/etlserver/registrator/api/v2/IDataSetRegistrationTransactionV2.html), +the same interface that is used by +[dropboxes](/display/openBISDoc2010/Dropboxes) to register and modify +entities. The JythonBasedDbModifyingAggregationServiceReportingPlugin +below is a subclass that allows for implementation of the logic in +Jython. + +**Configuration**: Dependent on the subclass. 
+ +To implement an aggregation service in Java, define a subclass +of `ch.systemsx.cisd.openbis.dss.generic.server.plugins.standard.IngestionService`. +This subclass must implement the method + + TableModel process(IDataSetRegistrationTransactionV2 transaction, Map<String, Object> parameters, DataSetProcessingContext context) + +**Example**: + +**ExampleDbModifyingAggregationService.java** + + package ch.systemsx.cisd.openbis.dss.generic.server.plugins.standard; + import java.io.File; + import java.util.Map; + import java.util.Properties; + import ch.systemsx.cisd.etlserver.registrator.api.v2.IDataSetRegistrationTransactionV2; + import ch.systemsx.cisd.openbis.dss.generic.shared.DataSetProcessingContext; + import ch.systemsx.cisd.openbis.dss.generic.shared.dto.DataSetInformation; + import ch.systemsx.cisd.openbis.generic.shared.basic.dto.TableModel; + import ch.systemsx.cisd.openbis.generic.shared.util.IRowBuilder; + import ch.systemsx.cisd.openbis.generic.shared.util.SimpleTableModelBuilder; + /** + * An example aggregation service + * + * @author Chandrasekhar Ramakrishnan + */ + public class ExampleDbModifyingAggregationService extends IngestionService<DataSetInformation> + { + private static final long serialVersionUID = 1L; + /** + * @param properties + * @param storeRoot + */ + public ExampleDbModifyingAggregationService(Properties properties, File storeRoot) + { + super(properties, storeRoot); + } + @Override + public TableModel process(IDataSetRegistrationTransactionV2 transaction, + Map<String, Object> parameters, DataSetProcessingContext context) + { + transaction.createNewSpace("NewDummySpace", null); + SimpleTableModelBuilder builder = new SimpleTableModelBuilder(true); + builder.addHeader("String"); + builder.addHeader("Integer"); + IRowBuilder row = builder.addRow(); + row.setCell("String", "Hello"); + row.setCell("Integer", 20); + row = builder.addRow(); + row.setCell("String", parameters.get("name").toString()); + row.setCell("Integer", 30); + return builder.getTableModel(); + } + } + +**plugin.properties** + + class = ch.systemsx.cisd.openbis.dss.generic.server.plugins.standard.ExampleDbModifyingAggregationService + label = My Report + +#### JythonIngestionService + +**Type: **AGGREGATION\_TABLE\_MODEL + +**Description**: Invokes a Jython script to register and modify entities +and create an aggregation service report. The script receives a +transaction as an argument. For more details see [Jython-based Reporting +and Processing +Plugins](/display/openBISDoc2010/Jython-based+Reporting+and+Processing+Plugins). + +**Configuration**: + +|Property Key|Description| +|--- |--- | +|script-path|Path to the jython script.| +|share-id|Optional, defaults to 1 when not stated otherwise.| + +**Example**: + +**plugin.properties** + + class = ch.systemsx.cisd.openbis.dss.generic.server.plugins.jython.JythonIngestionService + label = My Report + script-path = script.py + +### JythonBasedReportingPlugin + +**Type:** TABLE\_MODEL** +** + +**Description**: Invokes a Jython script to create the report. For more +details see [Jython-based Reporting and Processing +Plugins](/display/openBISDoc2010/Jython-based+Reporting+and+Processing+Plugins). 
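As a rough illustration only, a minimal reporting script might look like the following sketch; the `describe(dataSets, tableBuilder)` entry point, the data set accessor and the table-builder methods are assumptions based on the page linked above, not definitions from this page:

**script.py (hypothetical sketch)**

    # Hypothetical Jython reporting script; names are assumptions, for illustration only.
    def describe(dataSets, tableBuilder):
        # One row per data set, with two columns.
        tableBuilder.addHeader("Data Set")
        tableBuilder.addHeader("Comment")
        for dataSet in dataSets:
            row = tableBuilder.addRow()
            row.setCell("Data Set", dataSet.getDataSetCode())
            row.setCell("Comment", "processed by script.py")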
+ +**Configuration**: + +|Property Key|Description| +|--- |--- | +|script-path|Path to the jython script.| + +**Example**: + +**plugin.properties** + + class = ch.systemsx.cisd.openbis.dss.generic.server.plugins.jython.JythonBasedReportingPlugin + label = My Report + dataset-types = MS_DATA, UNKNOWN + script-path = script.py + +### TSVViewReportingPlugin + +**Type:** TABLE\_MODEL** +** + +**Description**: Presents the main data set file as a table. The main +file is specified by the Main Data Set Pattern and the Main Data Set +Path of the data set type. The file can be a CSV/TSV file or an Excel +file. This reporting plugin works only for one data set. + +**Configuration**: + +|Property Key|Description| +|--- |--- | +|separator|Separator character. This property will be ignored if the file is an Excel file. Default: TAB character| +|ignore-comments|If true all rows starting with '#' will be ignored. Default: true| +|ignore-trailing-empty-cells|If true trailing empty cells will be ignored. Default: false| +|excel-sheet|Name or index of the Excel sheet used. This property will only be used if the file is an Excel file. Default: 0| +|transpose|If true transpose the original table, that is exchange rows with columns. Default: false| + +**Example**: + +**plugin.properties** + + class = ch.systemsx.cisd.openbis.dss.generic.server.plugins.standard.TSVViewReportingPlugin + label = My Report + dataset-types = MS_DATA, UNKNOWN + separator = ; + +Screening Reporting Plugins +--------------------------- + +### ScreeningJythonBasedAggregationServiceReportingPlugin + +**Type:** AGGREGATION\_TABLE\_MODEL** +** + +**Description**: Invokes a Jython script to create an aggregation +service report. For more details see [Jython-based Reporting and +Processing +Plugins](/display/openBISDoc2010/Jython-based+Reporting+and+Processing+Plugins). +There is some extra support for screening. + +**Configuration**: + +|Property Key|Description| +|--- |--- | +|script-path|Path to the jython script.| + +**Example**: + +**plugin.properties** + + class = ch.systemsx.cisd.openbis.dss.screening.server.plugins.jython.ScreeningJythonBasedReportingPlugin + label = My Report + dataset-types = HCS_IMAGE + script-path = script.py + +### ScreeningJythonBasedDbModifyingAggregationServiceReportingPlugin + +**Type: **AGGREGATION\_TABLE\_MODEL** +** + +**Description**: Invokes a Jython script to register and modify entities +and create an aggregation service report. The screening-specific version +has access to the screening facade for queries to the imaging database +and is given a screening transaction that supports registering plate +images and feature vectors. For more details see [Jython-based Reporting +and Processing +Plugins](/display/openBISDoc2010/Jython-based+Reporting+and+Processing+Plugins). + +**Configuration**: + +|Property Key|Description| +|--- |--- | +|script-path|Path to the jython script.| + +**Example**: + +**plugin.properties** + + class = ch.systemsx.cisd.openbis.dss.screening.server.plugins.jython.ScreeningJythonBasedReportingPlugin + label = My Report + dataset-types = HCS_IMAGE + script-path = script.py + +### ScreeningJythonBasedReportingPlugin + +**Type:** TABLE\_MODEL** +** + +**Description**: Invokes a Jython script to create the report. For more +details see [Jython-based Reporting and Processing +Plugins](/display/openBISDoc2010/Jython-based+Reporting+and+Processing+Plugins). +There is some extra support for screening. 
**Configuration**:

|Property Key|Description|
|--- |--- |
|script-path|Path to the Jython script.|

**Example**:

**plugin.properties**

    class = ch.systemsx.cisd.openbis.dss.screening.server.plugins.jython.ScreeningJythonBasedReportingPlugin
    label = My Report
    dataset-types = HCS_IMAGE
    script-path = script.py
\ No newline at end of file
diff --git a/docs/software-developer-documentation/legacy-server-side-extensions/search-domain-services.md b/docs/software-developer-documentation/legacy-server-side-extensions/search-domain-services.md
new file mode 100644
index 00000000000..ce8a5bc63de
--- /dev/null
+++ b/docs/software-developer-documentation/legacy-server-side-extensions/search-domain-services.md
@@ -0,0 +1,129 @@
Search Domain Services
======================

A search domain service is a DSS plugin which allows querying
domain-specific search services, for example a search service on a
database of nucleotide sequences. Currently only one search service
is supported: searching of local BLAST databases for nucleotide and/or
protein sequences.

## Configuring a Service

To configure a service a
[core-plugin](/display/openBISDoc2010/Core+Plugins) of
type `search-domain-services` has to be created. The minimum
configuration for `plugin.properties` reads:

|Property Key|Description|
|--- |--- |
|class|Fully qualified name of a Java class implementing ch.systemsx.cisd.openbis.dss.generic.shared.api.internal.v2.ISearchDomainService|
|label|The label. Can be used in user interfaces.|

## Querying a Service

Search domain services can be accessed via `IGeneralInformationService`.
The method `listAvailableSearchDomains` returns all available services.

A service can be queried by the method `searchOnSearchDomain`. Besides
the `sessionToken` it has the following parameters:

- `preferredSearchDomainOrNull`: This can be `null` if there is only
  one service configured. Otherwise the name of the core-plugin
  specifies the preferred service. If no such service has been
  configured, or it is not available, the first available service will
  be used. If there is no available service the search will return an
  empty list.
- `searchString`: This is the string to search for.
- `optionalParametersOrNull`: This is a map of string-string key-value
  pairs of optional parameters. Can be `null`. The semantics of these
  parameters depend on the service used.

The method returns a list of `SearchDomainSearchResult` instances which
contain the following attributes: a description of the search domain
(class `SearchDomain`), the location
(interface `ISearchDomainResultLocation`), and a score. The result list
is sorted by score in descending order. The location has information on
where the sequence is stored in openBIS and where it matches the search
string.

## Service Implementations

### BlastDatabase

**Description**: This implementation requires the
[BLAST+](http://blast.ncbi.nlm.nih.gov/Blast.cgi) tools. The latest
versions can be downloaded from
[here](ftp://ftp.ncbi.nlm.nih.gov/blast/executables/blast+/LATEST/).
Note that this service is only available if the BLAST+ tools have been
installed. Only the tools `blastn` (for nucleotide search) and `blastp`
(for protein search) are used.

In order to build up a local BLAST database the maintenance task
[BlastDatabaseCreationMaintenanceTask](/display/openBISDoc2010/Maintenance+Tasks#MaintenanceTasks-BlastDatabaseCreationMaintenanceTask)
has to be configured.
Because the maintenance task which creates the BLAST databases often
runs only once per day, a change in entity properties or a registration
of a data set will not immediately be reflected in the search results.
That is, new sequences aren't found yet and changed/deleted sequences
are still found.

**Configuration**:

|Property Key|Description|
|--- |--- |
|blast-tools-directory|Path to the directory with the BLAST+ command line tools. If defined it will be prepended to the commands blastn and blastp. If undefined it is assumed that the tools can be found via the PATH environment variable.|
|blast-databases-folder|Path to the folder where all BLAST databases are stored. Default: <data store root>/blast-databases|

**Example**:

**plugin.properties**

    class = ch.systemsx.cisd.openbis.dss.generic.server.api.v2.sequencedatabases.BlastDatabase
    label = BLAST database

#### Optional Query Parameters

The following optional query parameters (i.e. entries of the service
method parameter `optionalParametersOrNull` as described above) are
understood and used as command line parameters of the BLAST+ tools:

|Name |Description |
|---------|-------------------------------------------------------------------------------------|
|evalue |Defines the threshold for the so-called "Expect Value" of found matches (for details see http://www.ncbi.nlm.nih.gov/blast/Blast.cgi?CMD=Web&PAGE_TYPE=BlastDocs&DOC_TYPE=FAQ#expect and http://homepages.ulb.ac.be/~dgonze/TEACHING/stat_scores.pdf). Higher values mean more found matches. Default value is 10.|
|word_size|Word size for the initial match. Decreasing the word size results in an increasing number of matches. Default values (if the task parameter hasn't been specified): 11 for blastn and 3 for blastp.|
|task |Defines values for a set of parameters of the tools blastn and blastp. The possible values are listed in the tables below. Default values: blastn for blastn and blastp for blastp.|
|ungapped |If specified (with an empty string value) only ungapped matches are returned. Will be ignored for blastp.|

Possible values of the task parameter for blastn:

|Value|Description|Default value of word_size|
|--- |--- |--- |
|blastn|Traditional blastn requiring an exact match of 11|11|
|blastn-short|blastn program optimized for sequences shorter than 50 bases|7|
|megablast|Traditional megablast used to find very similar (e.g., intraspecies or closely related species) sequences|28|
|dc-megablast|Discontiguous megablast used to find more distant (e.g., interspecies) sequences|11|

Possible values of the task parameter for blastp:

|Value|Description|Default value of word_size|
|--- |--- |--- |
|blastp|Traditional blastp to compare a protein query to a protein database|3|
|blastp-short|blastp optimized for queries shorter than 30 residues|2|

For more details about these parameters see
<http://www.ncbi.nlm.nih.gov/books/NBK1763/>.
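To illustrate how these parameters can be used, here is a hypothetical Jython snippet that passes them in the `optionalParametersOrNull` map of `searchOnSearchDomain`; the `service` and `sessionToken` variables, the search domain name `blast-db`, the query sequence and the exact argument order are assumptions made up for the example:

**Example (hypothetical)**

    # "service" is assumed to be an IGeneralInformationService reference and
    # "sessionToken" a valid session token obtained elsewhere.
    from java.util import HashMap

    parameters = HashMap()
    parameters.put("evalue", "0.001")       # stricter expect-value threshold than the default of 10
    parameters.put("word_size", "7")        # smaller word size, more matches
    parameters.put("task", "blastn-short")  # parameter preset for short nucleotide queries

    results = service.searchOnSearchDomain(sessionToken, "blast-db", "GATTACAGATTACA", parameters)
    for result in results:
        print result.getScore().getScore()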
#### Search Results

A search result has either a `DataSetFileBlastSearchResultLocation` or
an `EntityPropertyBlastSearchResultLocation` instance depending on
whether the result has been found in a sequence of a FASTA or FASTQ file
of a data set or in a sequence stored as a property of an experiment, a
sample or a data set. In any case the following information can be
retrieved for each match:

|BLAST output column|Access in Java|Description|
|--- |--- |--- |
|score|SearchDomainSearchResult.getScore().getScore()|Score. See http://homepages.ulb.ac.be/~dgonze/TEACHING/stat_scores.pdf for an explanation of score, bit-score and evalue.|
|bitscore|SearchDomainSearchResult.getScore().getBitScore()||
|evalue|SearchDomainSearchResult.getScore().getEvalue()||
|sstart|SearchDomainSearchResult.getResultLocation().getAlignmentMatch().getSequenceStart()|Start of alignment in the found sequence.|
|send|SearchDomainSearchResult.getResultLocation().getAlignmentMatch().getSequenceEnd()|End of alignment in the found sequence.|
|qstart|SearchDomainSearchResult.getResultLocation().getAlignmentMatch().getQueryStart()|Start of alignment in the search string.|
|qend|SearchDomainSearchResult.getResultLocation().getAlignmentMatch().getQueryEnd()|End of alignment in the search string.|
|mismatch|SearchDomainSearchResult.getResultLocation().getAlignmentMatch().getNumberOfMismatches()|Number of mismatches.|
|gaps|SearchDomainSearchResult.getResultLocation().getAlignmentMatch().getTotalNumberOfGaps()|Total number of gaps.|
\ No newline at end of file
-- GitLab