User Guide

Service Parameters

To run the service, specify the source object storage and identify the input data set.

REQUIRED: "source"

Identify the transform source object storage, where the input resides. The source object storage details appear in the Model9 agent configuration file.

Required Keywords for "source"

{
  "source": {
    "url": "<URL>",
    "api": "<API>",
    "bucket": "<USER_BUCKET>",
    "user": "<USERID>",
    "password": "<PASSWORD>"
  }
}

Optional Keywords for "source"

{
  "source": {
    "useS3V4Signatures": "false|true"
  }
}

OPTIONAL: "target"

Identify the transform target object storage. Values not specified will be taken from the "source" parameter.

{
  "target": {
    "url": "<URL>",
    "api": "<API>",
    "bucket": "<USER-BUCKET>",
    "user": "<USERID>",
    "password": "<PASSWORD>",
    "useS3V4Signatures": "false|true"
  }
}
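Because unspecified "target" values fall back to "source", the effective target settings behave like a simple dictionary merge. The sketch below illustrates the documented fallback only; it is not the service's actual implementation, and the bucket name "transform-bucket" is hypothetical:

```python
# Sketch of the documented fallback: any key missing from "target"
# is taken from "source". Illustrative only.

def resolve_target(params: dict) -> dict:
    """Return the effective target object storage settings."""
    source = params.get("source", {})
    target = params.get("target", {})
    # Keys present in "target" win; everything else comes from "source".
    return {**source, **target}

params = {
    "source": {
        "url": "https://s3.amazonaws.com",
        "api": "aws-s3",
        "bucket": "prod-bucket",
        "user": "<USERID>",
        "password": "<PASSWORD>",
    },
    "target": {
        "bucket": "transform-bucket"
    },
}

effective = resolve_target(params)
# effective["bucket"] is "transform-bucket"; url, api, user, and
# password all come from "source".
```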

REQUIRED: "input"

Identify the input data set to transform. If you specify VSAM keywords for a sequential input data set, the transform is performed and a warning message is issued.

Required Keywords for "input"

{
  "input": {
    "name": "<DSN>",
    "complex": "<group-SYSPLEX>"
  }
}

Optional Keywords for "input"

{
  "input": {
    "type": "backup|archive|import",
    "entry": "0|<N>",
    "prefix": "model9|<USER-PREFIX>",
    "recordBinary": "false|true",
    "recordCharset": "<CHARSET>",
    "vsam": {
      "keyBinary": "false|true",
      "keyCharset": "<CHARSET>"
    }
  }
}

OPTIONAL: "output"

The output is the transformed data of the mainframe data set, accessible as an S3 object.

  • When transforming a file with the same name as an existing file in the target, the existing file will be replaced by the newly transformed file.

    Note that the service does not delete previously transformed files but rather overwrites files with the same name. When re-transforming a file using the "split" function, be sure to remove any previously transformed files to avoid mixing split files from different versions.

  • When splitting a file, wait for the successful completion of the transform function before continuing with processing, to ensure that all parts of the file were created.

  • Specifying "text" format for a "binary" input will cause the transform to fail.

{
  "output": {
    "prefix": "model9|<USER-PREFIX>",
    "compression": "none|gzip",
    "format": "JSON|text|CSV|RAW",
    "charset": "UTF8",
    "endWithNewLine": "false|true",
    "splitBySize": "<nnnnb/m/g>",
    "splitByRecords": "<n>"
  }
} 
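The "splitBySize" value combines a number with a b/m/g unit suffix. A small helper can turn such a value into a byte count. The helper name and the exact unit semantics (bytes, megabytes, gigabytes) are assumptions for illustration based on the "<nnnnb/m/g>" placeholder above:

```python
# Hypothetical helper: interpret a "splitBySize" value such as "500m"
# as a byte count. Suffixes: b = bytes, m = megabytes, g = gigabytes
# (assumed semantics of the <nnnnb/m/g> placeholder).

def split_size_bytes(value: str) -> int:
    units = {"b": 1, "m": 1024 ** 2, "g": 1024 ** 3}
    number, unit = value[:-1], value[-1].lower()
    if unit not in units or not number.isdigit():
        raise ValueError(f"invalid splitBySize value: {value!r}")
    return int(number) * units[unit]

print(split_size_bytes("500m"))  # 524288000
```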

Service parameters samples

Transforming a plain text data set

Transform the latest backup of a plain text data set, charset IBM-1047, converted to UTF8 and compressed.

{
  "input": {
    "name": "SAMPLE.TEXT",
    "complex": "group-PLEX1"
  },
  "output": {
    "format": "text"
  },
  "source": {
    "url": "https://s3.amazonaws.com",
    "api": "aws-s3",
    "bucket": "prod-bucket",
    "user": "sdsdDVDCsxadA43TERVGFBSDSSDff",
    "password": "ddferdscsdW4REFEBA33DSffss344gbs4efe7"
  }
}
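Since the service is invoked as an HTTP request, a payload like the one above can be posted with any HTTP client. Below is a minimal sketch using Python's standard library; the endpoint URL is a placeholder for wherever your transform service is deployed:

```python
import json
import urllib.request

def build_request(endpoint: str, params: dict) -> urllib.request.Request:
    """Build an HTTP POST carrying the transform service parameters."""
    return urllib.request.Request(
        endpoint,
        data=json.dumps(params).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def call_transform(endpoint: str, params: dict) -> dict:
    """Send the request and decode the JSON response."""
    with urllib.request.urlopen(build_request(endpoint, params)) as resp:
        return json.load(resp)

params = {
    "input": {"name": "SAMPLE.TEXT", "complex": "group-PLEX1"},
    "output": {"format": "text"},
    "source": {
        "url": "https://s3.amazonaws.com",
        "api": "aws-s3",
        "bucket": "prod-bucket",
        "user": "<USERID>",
        "password": "<PASSWORD>",
    },
}

# Hypothetical endpoint; substitute your own deployment's address:
# result = call_transform("https://transform.example.com/transform", params)
```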

Transforming an unloaded DB2 table

Transform the latest backup of an unloaded DB2 table, charset IBM-1047, converted to UTF8 and compressed, located with a specific prefix:

{
  "input": {
    "name": "DB2.UNLOADED.SEQ",
    "complex": "group-PLEX1"
  },
  "output": {
    "format": "text"
  },
  "source": {
    "url": "https://s3.amazonaws.com",
    "api": "aws-s3",
    "bucket": "prod-bucket",
    "user": "sdsdDVDCsxadA43TERVGFBSDSSDff",
    "password": "ddferdscsdW4REFEBA33DSffss344gbs4efe7"
  },
  "output": {
    "prefix": "DBprodCustomers"
  }
}

Transforming a VSAM file using the defaults

When transforming a VSAM file, the defaults are a text key and binary data, transforming to a JSON output file:

{
  "input": {
    "name": "SAMPLE.VSAM",
    "complex": "group-PLEX1"
  },
  "source": {
    "url": "https://s3.amazonaws.com",
    "api": "aws-s3",
    "bucket": "prod-bucket",
    "user": "sdsdDVDCsxadA43TERVGFBSDSSDff",
    "password": "ddferdscsdW4REFEBA33DSffss344gbs4efe7"
  }
}

Transforming a VSAM text file to CSV

Specify text data, transforming to a CSV output file:

{
  "input": {
    "name": "SAMPLE.VSAM",
    "complex": "group-PLEX1"
  },
  "vsam": {
    "keyBinary": "false|true",
    "keyCharset": "<CHARSET>"
  },
  "output": {
    "format": "CSV"
  },
  "source": {
    "url": "https://s3.amazonaws.com",
    "api": "aws-s3",
    "bucket": "prod-bucket",
    "user": "sdsdDVDCsxadA43TERVGFBSDSSDff",
    "password": "ddferdscsdW4REFEBA33DSffss344gbs4efe7"
  }
}

Transforming on Azure Storage using OAuth2

When transforming data on Azure Blob storage with OAuth2, set the "api" to "azureblob-oauth2" and use the "azureOauth" section to specify the Azure OAuth arguments, as follows:

{
  "input": {
    "name": "SAMPLE.PS",
    "complex": "group-PLEX1"
  },
  "vsam": {
    "keyBinary": "false|true",
    "keyCharset": "<CHARSET>"
  },
  "output": {
    "format": "CSV"
  },
  "source": {
    "api": "azureblob-oauth2",
    "url": "https://<azure-storage-account>.blob.core.windows.net",
    "bucket": "<azure-container-name>",
    "user": "<azure-application-uuid>",
    "password": "<azure-application-client-secret>",
    "azureOauth": {
      "oauthEndpoint": "<azure-oauth-endpoint>",
      "storageAccount": "<azure-storage-account>",
      "oauthAudience": "<azure-oauth-audience>",
      "credentialType": "<azure-credential-type>"
    }
  }
}

Table: Azure OAuth2 Arguments

Service response and log

The transform service is invoked as an HTTP request. It returns:

HTTP status

HTTP response

{
  "status": "OK|WARNING|ERROR",
  "outputName": "<OUTPUT-NAME>",
  "inputName": "<DSN>",
  "outputCompression": "none|gzip",
  "outputSizeInBytes": "<SIZE-IN_BYTES>",
  "outputFormat": "JSON|text|CSV"
}

In case of a WARNING or an ERROR, the HTTP response will also contain log messages.

Informational messages are printed only to the service log, not to the HTTP response. The service log can be viewed on the AWS console when executing the service from AWS, or in the Docker log when executing the service on-premises.

Log

{
  "log": [
    "<INFO-MESSAGE>",
    "<WARNING-MESSAGE>",
    "<ERROR-MESSAGE>"
  ]
} 
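A caller can branch on the returned status and surface the log messages. The sketch below is a client-side illustration only; raising on ERROR is an assumed policy, not service behavior:

```python
# Sketch of client-side response handling: return any log messages,
# raise when the service reports ERROR. Illustrative policy only.

def check_response(response: dict) -> list:
    messages = response.get("log", [])
    if response.get("status") == "ERROR":
        raise RuntimeError("; ".join(messages) or "transform failed")
    return messages

warning_response = {
    "status": "WARNING",
    "log": [
        "ZM9K001I Transform service started",
        "ZM9K108W Specifying input parameter vsam is ignored for input data set with DSORG PS",
    ],
}

for message in check_response(warning_response):
    print(message)
```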

Service response and log samples

Status OK sample

{
  "status": "OK",
  "outputName": "transform/QA.SMS.MCBK.SG1QNOBK.DSERV.TXT.TMPPS!uuid=a641d670-2d05-41e7-9dd3-7815e1b2d4c4",
  "inputName": "QA.SMS.MCBK.SG1QNOBK.DSERV.TXT.TMPPS",
  "outputCompression": "NONE",
  "outputSizeInBytes": 97,
  "outputFormat": "JSON"
}

Status WARNING sample

{
  "log": [
    "ZM9K001I Transform service started",
    "ZM9K108W Specifying input parameter vsam is ignored for input data set with DSORG PS",
    "ZM9K002I Transform service completed successfully, output is transform/QA.SMS.MCBK.SG1QNOBK.DSERV.TXT.TMPPS!uuid=d779fbf9-da6b-495b-b6b9-de7583905f19"
  ],
  "status": "WARNING",
  "outputName": "transform/QA.SMS.MCBK.SG1QNOBK.DSERV.TXT.TMPPS!uuid=d779fbf9-da6b-495b-b6b9-de7583905f19",
  "inputName": "QA.SMS.MCBK.SG1QNOBK.DSERV.TXT.TMPPS",
  "outputCompression": "NONE",
  "outputSizeInBytes": 97,
  "outputFormat": "JSON"
}

Status ERROR sample

{
  "status": "ERROR",
  "log": [
    "ZM9K001I Transform service started",
    "ZM9K008E The input was not found: name QA.SMS.MCBK.DSERV.TXT.NON, archive false, entry (0)"
  ]
}

Input format support

Supported formats

  • SMS-managed data sets

  • Non-SMS managed data sets

  • Sequential and extended-sequential data sets with the following RECFM:

    • V

    • VB

    • F

    • FB

    • FBA

  • Non-extended VSAM KSDS data sets

Unsupported formats

  • RRDS, VRRDS, LINEAR, ESDS

  • Extended format data sets with compression or encryption

  • PDS data sets

  • RECFM values not mentioned above (for example, U)

Output format support

  • Text

  • JSON

  • CSV

DB2 Image Copy Transform Guide

Configuration

  1. Make sure that <M9_HOME>/scripts/transform-service.sh has execute permissions. If not, add it by using chmod a+x <M9_HOME>/scripts/transform-service.sh.

  2. Copy M9XFDB2 from Model9's SAMPLIB data set to a PDS data set of your choosing.

  3. Edit M9XFDB2 and replace the placeholders enclosed with angle brackets with the following:

Table: Placeholders

  4. Replace the remaining placeholders in the JCL as described in this manual.

Execute and verify results

When done, submit the job and make sure it ends with MAXCC of 0.

Via SDSF, verify that the transform service was in fact called and completed successfully. Successful output would look something like this:

{
  "status": "OK",
  "outputNames": [
    "transform-output/M9.SHY.DB2.IMGCPY.M9DB.M9SEG4"
  ],
  "inputName": "M9.SHY.DB2.IMGCPY.M9DB.M9SEG4",
  "outputCompression": "NONE",
  "outputSizeInBytes": 1064,
  "outputFormat": "CSV"
}

Supported DB2 Column Types

Table 5. Supported DB2 Column Types for Transformation

Last updated