Revision as of 00:18, 20 October 2021
Fusion Edge Compiler
Overview
The Fusion Edge Compiler is a command line client, written in Java, that runs on Windows or UNIX operating systems. Its responsibility is to compile SDMX data, structure, and metadata files for dissemination by the Fusion Edge Server. The Fusion Edge Compiler provides three functions:
- To pull content from SDMX web services (for example, Fusion Registry web services) in order to populate a local file system of content to publish
- To compile content in the local file system to create a new ‘environment’ which can be consumed by the Fusion Edge Server
- To publish the environment to an Amazon S3 bucket from which distributed Fusion Edge Servers can take their content, if configured to do so
The second function, compile, is the main function of the compiler. The other two functions can be performed manually if required; however, the Fusion Edge Compiler provides them so that the full data extract, transform, and load process can be fully automated.
General Arguments
The command line client provides three scripts: pull, compile, and publish. Each script has a UNIX (.sh) file and a Windows (.bat) file. Each script can take a number of command line arguments; some arguments are common to all scripts. These are:
- The properties file (prop argument). Each script can read one or more properties files; a properties file is a JSON file that contains configuration options. It supports all the same configuration options that can be passed directly to the script as command line arguments, so arguments can be read from a file, passed directly, or both. When both are provided, the script merges the arguments from the properties file with the command line arguments; if a configuration option is passed as a command line argument and also appears in the properties file, the command line argument takes precedence. Because command line arguments and properties files can be used in conjunction with one another, the arguments are always marked as optional. However, this document notes which arguments are required and must be present as either a command line argument or a properties file argument.
- Ledger location (lgr argument). This must be the root location of the ledger, it can be provided as either a path to a folder on a file system, the http(s) URL to the root folder if hosting the ledger on a web service, or prefixed with s3:bucketname if using Amazon S3 as the file store.
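The merge-and-precedence rule described above can be sketched as follows (illustrative Python, not the compiler's actual implementation):

```python
# Illustrative sketch of the argument-merge rule: values from the properties
# file are used only where no equivalent command line argument was supplied.
def merge_arguments(cli_args: dict, properties: dict) -> dict:
    merged = dict(properties)   # start from the properties file
    merged.update(cli_args)     # command line arguments take precedence
    return merged

props = {"Ledger": "s3:mybucket", "Zip": True}
cli = {"Ledger": "/data/ledger"}
print(merge_arguments(cli, props))  # {'Ledger': '/data/ledger', 'Zip': True}
```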
Pull Content
buildFileSystem.sh (UNIX) or buildFileSystem.bat (Windows)
The Fusion Edge Compiler queries an SDMX web service for structural metadata, data, and reference metadata content based on what it has been requested to pull. It can work against a Fusion Registry web service, as well as any other SDMX web service that complies with the SDMX specification.
The Fusion Edge Compiler pulls the content to build a target directory of files in the correct structure for the compile process to operate.
Command Line Arguments
Note: Arguments can be provided by the command line, or alternatively via a JSON properties file, or a mix of both.
Argument | Example | Description |
---|---|---|
prop | -prop “/home/props.json” | A reference to one or more properties files (separated by a space) |
api | -api “https://yourorg.org/sdmx” | The URL of the web service to pull the content from |
apict | -apict 500 | Connect Timeout to the API in seconds (default 200) |
apirt | -apirt 500 | Read Timeout from the API in seconds (default 200) |
apiua | -apiua “EdgeCL” | User Agent sent in HTTP Header request to API |
tgt | -tgt “/home/compiler/target” | The target directory to write the files and folders to |
lgr | -lgr “s3:mybucket” | The location of the current live ledger. If this is provided then the last compile time of the ledger will be used as the updated after time to use when pulling data |
df | -df “ACY:DF_ID(1.0)” “ACY:DF2(1.0)” | A reference to one or more Dataflows to pull data for (separated by a space). If Dimension filters are to be applied to the dataflow, then the properties file should be used. The keyword all can be used to pull data for all Dataflows. A Dataflow argument will include both the data and structural metadata (Dataflow plus all descendants) in the output. |
str | -str “Codelist=ACY:CL_FREQ(1.0),ACY_CL_AGE(1.0) CategoryScheme=ACY:*(*)” -str all | A list of structures to include in the output, in addition to those that are included automatically based on the Dataflows included in the output. The structure and all descendants of that structure will be included in the output. The * is used to wildcard the Agency, Id, and Version parameters. All structures can be obtained by using the all keyword as the class type |
upd | -upd “2020-01-30T00:00.00” | Pull data that was updated after this time. This will be applied by using the updatedAfter web service query parameter against the target web service |
replace | -replace | If present, all the files in the target directory will be deleted before the pull content is run |
metadata | -metadata | If present, the pull process will query for all Reference Metadata and include this in the output |
zip | -zip | If present, the output files will all be in zip format |
usr | -usr “myusername” | Username to authenticate with the REST API, if using the Fusion Registry it should correspond to a user account in the Fusion Registry |
pwd | -pwd “mypassword” | Password to authenticate with the REST API |
s3rgn | -s3rgn “us-east-1” | Amazon S3 region – required if the Ledger is hosted on Amazon S3 |
s3sec | -s3sec “azxzcvbnm” | Amazon S3 Secret – required if the Ledger is hosted on Amazon S3 |
s3acc | -s3acc “azxzcvbnm” | Amazon S3 Access Key – required if the Ledger is hosted on Amazon S3 |
tmp | -tmp /tmp | Temporary directory to use for transient files. If not provided, the java.io.tmpdir JVM variable is used, usually defaulting to the user tmp directory |
rmtmp | -rmtmp | If present will delete all files in the tmp directory before start |
report | -report | If present will output the report to a File called report.json in the target (tgt) directory |
h | -h | Display the help information |
Properties File
If providing configuration parameters via a properties file, it must be in JSON format and have the following structure:
{
    "Ledger" : "s3:mybucket",
    "TgtDir" : "/home/compiler/target",
    "SdmxAPI" : "https://demo.metadatatechnology.com/FusionRegistry/ws/public/sdmxapi/rest",
    "UpdatedAfter" : "2010",
    "Username" : "myuser",
    "Password" : "pwd",
    "AllData" : true,
    "FullReplace" : true,
    "Zip" : true,
    "Metadata" : true,
    "S3Region": "us-east-1",
    "S3SecretKey": "azxasdasfcvbn",
    "S3AccessKey": "sxcvbnmu",
    "SubCubes": {
        "ECB:EXR(1.0)" : {
            "SubCube1" : {
                "Include" : {
                    "FREQ": ["A", "M"],
                    "REF_AREA": ["UK"]
                }
            }
        },
        "WB:POVERTY(1.0)": { }
    },
    "Structures": {
        "Codelist": ["ECB,EXR,1.0"],
        "HierarchicalCodelist": ["ECB", "BIS"],
        "all": ["SDMX"]
    }
}
Note: the properties file supports the same information as the command line arguments, in addition it supports the ability to define subcubes of data, and support for additional structures.
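Because the properties file must be strict JSON, a missing comma or stray bracket will prevent it from loading. One quick sanity check before running the compiler is to parse the file with any JSON parser; this snippet is illustrative and not part of the compiler:

```python
import json

# Illustrative check (not part of the compiler): the properties file must be
# strict JSON, so any syntax error is caught when the file is parsed.
config_text = """
{
    "Ledger": "s3:mybucket",
    "TgtDir": "/home/compiler/target",
    "AllData": true,
    "SubCubes": {
        "ECB:EXR(1.0)": {
            "SubCube1": {"Include": {"FREQ": ["A", "M"], "REF_AREA": ["UK"]}}
        },
        "WB:POVERTY(1.0)": {}
    }
}
"""
config = json.loads(config_text)  # raises json.JSONDecodeError on a syntax error
print(sorted(config["SubCubes"]))  # ['ECB:EXR(1.0)', 'WB:POVERTY(1.0)']
```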
Sub Cubes
The SubCubes section of the properties file can be used to define Dimension filters for each Dataflow. If the df command line argument is used to define a single Dataflow to be exported, and the compiler is also given a properties file containing sub-cube definitions for multiple Dataflows, the compiler will honour the df argument and only export data for that single Dataflow; it will, however, use the sub-cube filters if they are present in the properties file.
A sub-cube with no filters, as shown in the WB:POVERTY example above, will result in the full dataset being exported.
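The Include-filter semantics described above can be sketched as follows (illustrative Python, not the compiler's implementation): a series is kept when every filtered Dimension's value appears in its include list, and an empty filter keeps everything.

```python
# Illustrative sketch of sub-cube Include filtering (not the compiler's code).
# A series key is kept if, for every Dimension in the filter, its value is in
# the allowed list; an empty filter matches all series.
def matches_subcube(series_key: dict, include: dict) -> bool:
    return all(series_key.get(dim) in allowed for dim, allowed in include.items())

include = {"FREQ": ["A", "M"], "REF_AREA": ["UK"]}
print(matches_subcube({"FREQ": "A", "REF_AREA": "UK"}, include))  # True
print(matches_subcube({"FREQ": "Q", "REF_AREA": "UK"}, include))  # False
print(matches_subcube({"FREQ": "Q"}, {}))                         # True (no filters)
```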
When the compiler is compiling data for all Dataflows (AllData : true, or -df “all” as a command line argument) it will still use the sub-cube definitions, if they exist, to filter the Dataflow contents.

Structures
The Structures section of the properties file defines which structural metadata should be included in the outputs. Note that when outputting data for a Dataflow, the Dataflow and all its descendants (DSD, Codelist, Concept Scheme, Agency Scheme) are automatically included in the generated structural metadata and do not need to be explicitly specified. The Structures section provides the means to include additional structures that are not directly related to the Dataflow; if exporting structures only, this section must be present. The arguments are the structure type (the same as the path parameter on the REST API, e.g. Codelist). Each structure type can then take an array of structure filters in the format AgencyId,Id,Version, where each argument is optional; the absence of an argument means all. The keyword all can be used as a structure type to indicate all structures, and it can also take the filters for agency, id, and version.
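The AgencyId,Id,Version filter format can be made concrete with a small parser (hypothetical code, shown only to illustrate the optional-argument rule; None stands for the “all” wildcard):

```python
# Hypothetical parser for the AgencyId,Id,Version structure filter format.
# Each part is optional; a missing or empty part means "all" (None here).
def parse_structure_filter(filter_str: str):
    parts = (filter_str.split(",") + [None, None, None])[:3]
    return tuple(p if p else None for p in parts)

print(parse_structure_filter("ECB,EXR,1.0"))  # ('ECB', 'EXR', '1.0')
print(parse_structure_filter("ECB"))          # ('ECB', None, None)
print(parse_structure_filter("ECB,,1.0"))     # ('ECB', None, '1.0')
```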
Note: Whenever a structure is included in the export, all the descendants of that structure will be included automatically. For example, if a Hierarchical Codelist is included in the export, the related Codelists and Agency Scheme(s) will also be included, without having to be explicitly mentioned.