
Fusion Edge Compiler

= Overview =

The Fusion Edge Compiler is a command line client, written in Java, that runs on Windows or UNIX operating systems. Its responsibility is to compile SDMX data, structure, and metadata files for dissemination by the Fusion Edge Server. The Fusion Edge Compiler provides three functions:


  1. To pull content from SDMX web services (for example, the Fusion Registry web services) in order to populate a local file system of content to publish
  2. To compile the content in the local file system to create a new ‘environment’ which can be consumed by the Fusion Edge Server
  3. To publish the environment to an Amazon S3 bucket from which distributed Fusion Edge Servers can take their content, if configured to do so

The second function, compile, is the main function of the compiler. The other two functions can be performed manually if required; the Fusion Edge Compiler provides them so that the full data extract, transform, and load process can be fully automated.

= General Arguments =

The command line client provides three scripts, for pull, compile, and publish. Each script has a UNIX (.sh) version and a Windows (.bat) version. Each script can take a number of command line arguments; the following are common to all scripts:

  1. The properties file (prop argument). Each script can read one or more properties files: JSON files that contain configuration options. A properties file contains the same configuration options that can be passed directly to the script as command line arguments, so the script can read arguments from a file, as direct arguments, or both. When both are provided, the script merges the arguments from the properties file with the command line arguments; if a configuration option is passed as a command line argument but also appears in the properties file, the command line argument takes precedence (see the example after this list). Because command line arguments and properties files can be used in conjunction with one another, the arguments are always marked as optional. However, this document notes which arguments are required and must exist as either a command line argument or a properties file argument.
  2. Ledger location (lgr argument). This must be the root location of the ledger. It can be provided as a path to a folder on a file system, as the http(s) URL of the root folder if the ledger is hosted on a web service, or prefixed with s3:bucketname if using Amazon S3 as the file store.
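For example, a minimal sketch of the merge behaviour (the file path and bucket names are illustrative): given a properties file /home/props.json containing

{
   "Ledger" : "s3:mybucket",
   "TgtDir" : "/home/compiler/target"
}

the invocation below overrides the ledger location from the command line, because command line arguments take precedence over properties file values:

./buildFileSystem.sh -prop "/home/props.json" -lgr "s3:otherbucket"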


= Pull Content =

buildFileSystem.sh (UNIX) or buildFileSystem.bat (Windows)

The Fusion Edge Compiler queries an SDMX web service for structural metadata, data, and reference metadata content, based on what it has been requested to pull. It can work against a Fusion Registry web service as well as any other SDMX web service that complies with the SDMX specification.

The Fusion Edge Compiler pulls the content to build a target directory of files, in the correct structure for the compile process to operate on.

== Command Line Arguments ==

'''Note''': arguments can be provided on the command line, via a JSON properties file, or as a mix of both.

{| class="wikitable"
|-
! Argument !! Example !! Description
|-
| prop || -prop "/home/props.json" || A reference to one or more properties files (separated by a space)
|-
| api || -api "https://yourorg.org/sdmx" || The URL of the web service to pull the content from
|-
| apict || -apict 500 || Connect timeout to the API in seconds (default 200)
|-
| apirt || -apirt 500 || Read timeout from the API in seconds (default 200)
|-
| apiua || -apiua "EdgeCL" || User Agent sent in the HTTP request header to the API
|-
| tgt || -tgt "/home/compiler/target" || The target directory to write the files and folders to
|-
| lgr || -lgr "s3:mybucket" || The location of the current live ledger. If this is provided, the last compile time of the ledger will be used as the updated-after time when pulling data
|-
| df || -df "ACY:DF_ID(1.0)" "ACY:DF2(1.0)" || A reference to one or more Dataflows to pull data for (separated by a space). If Dimension filters are to be applied to a Dataflow, the properties file should be used. The keyword all can be used to pull data for all Dataflows. A Dataflow argument includes both the data and the structural metadata (the Dataflow plus all descendants) in the output
|-
| str || -str "Codelist=ACY:CL_FREQ(1.0),ACY:CL_AGE(1.0) CategoryScheme=ACY:*(*)" <br/> -str all || A list of structures to include in the output, in addition to those included automatically based on the Dataflows in the output. The structure and all descendants of that structure will be included in the output. The * wildcards the Agency, Id, and Version parameters. All structures can be obtained by using the all keyword as the class type
|-
| upd || -upd "2020-01-30T00:00.00" || Pull data that was updated after this time. This is applied by using the updatedAfter web service query parameter against the target web service
|-
| replace || -replace || If present, all the files in the target directory will be deleted before the pull is run
|-
| metadata || -metadata || If present, the pull process will query for all Reference Metadata and include it in the output
|-
| zip || -zip || If present, the output files will all be in zip format
|-
| usr || -usr "myusername" || Username to authenticate with the REST API; if using the Fusion Registry, it should correspond to a user account in the Fusion Registry
|-
| pwd || -pwd "mypassword" || Password to authenticate with the REST API
|-
| s3rgn || -s3rgn "us-east-1" || Amazon S3 region – required if the Ledger is hosted on Amazon S3
|-
| s3sec || -s3sec "azxzcvbnm" || Amazon S3 Secret – required if the Ledger is hosted on Amazon S3
|-
| s3acc || -s3acc "azxzcvbnm" || Amazon S3 Access Key – required if the Ledger is hosted on Amazon S3
|-
| tmp || -tmp /tmp || Temporary directory to use for transient files. If not provided, the java.io.tmpdir JVM variable is used, usually defaulting to the user tmp directory
|-
| rmtmp || -rmtmp || If present, all files in the tmp directory will be deleted before the run starts
|-
| report || -report || If present, the report will be written to a file called report.json in the target (tgt) directory
|-
| h || -h || Display the help information
|}
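A representative pull invocation (the URL, path, and Dataflow reference are placeholders) might look like this:

./buildFileSystem.sh -api "https://yourorg.org/sdmx" -tgt "/home/compiler/target" -df "ACY:DF_ID(1.0)" -zip -report

This pulls the data and structural metadata for the single Dataflow ACY:DF_ID(1.0), writes the output files in zip format, and writes the report to report.json in the target directory.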


== Properties File ==

If providing configuration parameters via a properties file, the file must be in JSON format and have the following structure:

{
   "Ledger" : 	"s3:mybucket",
   "TgtDir" :  	"/home/compiler/target",
   "SdmxAPI" : 	"https://demo.metadatatechnology.com/FusionRegistry/ws/public/sdmxapi/rest",
   "UpdatedAfter" :	 "2010",
   "Username" : 	"myuser",
   "Password" : 	"pwd",
   "AllData" : 	true,
   "FullReplace" : 	true,
   "Zip" : 		true,
   "Metadata" : 	true,
   "S3Region":	"us-east-1",
   "S3SecretKey":	"azxasdasfcvbn",
   "S3AccessKey":	"sxcvbnmu",
   "SubCubes":{
      "ECB:EXR(1.0)" : {
         "SubCube1" : {
            "Include" : {
               "FREQ":["A","M"],
               "REF_AREA":["UK"]
            }
         }
      },
      "WB:POVERTY(1.0)":{ }
   },
   "Structures":{
      "Codelist": ["ECB,EXR,1.0"]
      "HierarchicalCodelist": ["ECB", "BIS"]
      "all": ["SDMX"]
      ]
   }
}

'''Note''': the properties file supports the same information as the command line arguments; in addition, it supports the definition of sub-cubes of data and the inclusion of additional structures.


== Sub Cubes ==

The SubCubes section of the properties file can be used to define Dimension filters for each Dataflow. If the df command line argument defines a single Dataflow to be exported, and a properties file containing sub-cube definitions for multiple Dataflows is also given, the compiler will honour the df argument and only export data for that single Dataflow; it will, however, use the sub-cube filters for that Dataflow if they are present in the properties file.

A sub-cube with no filters, as shown in the WB:POVERTY example above, will result in the full dataset being exported.
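To illustrate the interaction between the df argument and sub-cube definitions, consider this sketch (the file path /home/cubes.json is hypothetical; the Dataflow references are taken from the example above). The properties file defines sub-cubes for two Dataflows:

{
   "SubCubes":{
      "ECB:EXR(1.0)" : {
         "SubCube1" : {
            "Include" : {
               "FREQ":["A","M"]
            }
         }
      },
      "WB:POVERTY(1.0)":{ }
   }
}

./buildFileSystem.sh -prop "/home/cubes.json" -df "ECB:EXR(1.0)"

Only ECB:EXR(1.0) data is pulled, restricted to annual and monthly frequencies; the WB:POVERTY(1.0) sub-cube definition is ignored because that Dataflow is not named in the df argument.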

When the compiler is compiling data for all Dataflows ("AllData": true in the properties file, or -df "all" as a command line argument) it will still use the sub-cube definitions, if they exist, to filter the Dataflow contents.

== Structures ==

The Structures section of the properties file defines which structural metadata should be included in the outputs. Note that when outputting data for a Dataflow, the Dataflow and all of its descendants (DSD, Codelist, Concept Scheme, Agency Scheme) are automatically included in the generated structure metadata and do not need to be explicitly specified. The Structures section provides the means to include additional structures that are not directly related to a Dataflow; if exporting structures only, this section must be present. The keys are the structure types (the same as the path parameter on the REST API, e.g. Codelist). Each structure type takes an array of structure filters in the format AgencyId,Id,Version, where each part is optional, and the absence of a part means all. The keyword all can be used as a structure type to indicate all structures; it can also take the agency, id, and version filters.
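For example, a minimal sketch of a Structures section combining full, partial, and keyword filters (the agency and structure IDs are illustrative):

"Structures":{
   "Codelist": ["ECB,EXR,1.0", "ECB"],
   "all": ["SDMX"]
}

Here "ECB,EXR,1.0" selects one specific Codelist; "ECB" (with Id and Version absent) selects all Codelists maintained by ECB; and the all structure type with the "SDMX" filter selects all structures maintained by the SDMX agency.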

'''Note''': whenever a structure is included in the export, all the descendants of that structure will be included automatically. For example, if a Hierarchical Codelist is included in the export, the related Codelists and Agency Scheme(s) will also be included, without having to be explicitly mentioned.

== Report ==

A JSON report is written to System.out or, if the -report argument is present, to a file called report.json in the target directory. The report can be used to determine which settings were used and which datasets were successfully obtained from the API.

An example report is given below (note: all durations are in milliseconds):

{
	"Header": {
		"Prepared": "2021-09-07T08:37:32.104Z",
		"API": "https://server/ws/public/sdmxapi/rest",
		"Ledger": null
	},
	"RESTSettings": {
		"ConnectTimeout": 700,
		"ReadTimeout": 500,
		"UserAgent": "FusionEdgeCL"
	},
	"ExplicitStructures": [],
	"Datasetstructures": ["Dataflow=all:all(all)"],
	"Datasets": {
		"ECB:EXR(1.0)": {
			"all": {
				"Success": true,
				"Duration": 9808
			}
		},
		"ECB:TRD(1.0)": {
			"all": {
				"Success": false,
				"Duration": 673
			}
		}
	},
	"Duration": 20724
}
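Because the report is plain JSON, an automated pipeline can inspect it with a standard tool such as jq. A minimal sketch (assuming the report was written to report.json and the datasets use the default all sub-cube key shown above), listing the datasets whose pull failed:

jq -r '.Datasets | to_entries[] | select(.value.all.Success == false) | .key' report.json

Against the example report above, this would print ECB:TRD(1.0).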

= Compile Content =

'''Note''': arguments can be provided on the command line, via a JSON properties file, or as a mix of both.

{| class="wikitable"
|-
! Argument !! Example !! Description
|-
| prop || -prop "/home/props.json" || A reference to one or more properties files (separated by a space)
|-
| src || -src "/home/compiler/source" || The source directory that contains the files to be compiled (this is the tgt directory in the build file system script)
|-
| tgt || -tgt "/home/compiler/compiled" || The target directory to write the compiled environment to
|-
| lgr || -lgr "s3:mybucket" || The location of the current live ledger; this is required for the following purposes:
# The last compile time of the ledger is used to determine which files to include in the compile, by comparing file timestamps with the last compile time
# The ledger is used to build the next ledger entry and to determine the version number given to the new environment
# If running an update compile (not a full replace), any live environment files that are required and unchanged in the new environment will be pulled from the live environment
# If updating a dataset, the live dataset will be pulled from the live environment as the base dataset to update with the new series / observations
|-
| sgn || -sgn "my_signature" || A secret signature to sign the generated files with. The same signature should be used for all compilations. The Fusion Edge Server must be given the same secret signature via its properties file so it is able to verify that the content in an environment was not corrupted or tampered with
|-
| liv || -liv "2020-01-30T00:00.00" || The go-live time. This is used if the new environment is to go live at a particular point in time. The Fusion Edge Server will not make the environment live until this timestamp
|-
| f || -f || If present, a full recompile of all the files in the src directory will be performed. Note this will still merge datasets with those read from the ledger, unless the rd argument is also present
|-
| rd || -rd || If present, any data files read in will replace the existing datasets read from the ledger; if a dataset is not in the ledger, or if no ledger is provided, the dataset will be created in the output
|-
| s3rgn || -s3rgn "us-east-1" || Amazon S3 region – required if the Ledger is hosted on Amazon S3
|-
| s3sec || -s3sec "azxzcvbnm" || Amazon S3 Secret – required if the Ledger is hosted on Amazon S3
|-
| s3acc || -s3acc "azxzcvbnm" || Amazon S3 Access Key – required if the Ledger is hosted on Amazon S3
|-
| tmp || -tmp /tmp || Temporary directory to use for transient files. If not provided, the java.io.tmpdir JVM variable is used, usually defaulting to the user tmp directory
|-
| rmtmp || -rmtmp || If present, all files in the tmp directory will be deleted before the run starts
|-
| h || -h || Display the help information
|}
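A representative update compile invocation might look like the following sketch. The script name compile.sh is an assumption made for illustration (this page does not name the compile script; check the distribution for the actual file name), and the paths, bucket, signature, and keys are placeholders:

./compile.sh -src "/home/compiler/target" -tgt "/home/compiler/compiled" -lgr "s3:mybucket" -sgn "my_signature" -s3rgn "us-east-1" -s3acc "myaccesskey" -s3sec "mysecretkey"

Because neither -f nor -rd is given, this performs an update compile: only files changed since the last compile time recorded in the ledger are recompiled, and unchanged files and base datasets are pulled from the live environment.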