Command Line Interface

With its Command Line Interface (CLI), Hackolade truly supports an agile development approach leveraging the flexibility of NoSQL dynamic schemas.  Some Hackolade customers use this capability nightly in a batch process to identify attributes and structures that may have appeared in the data since the last run.  During a nightly run of the reverse-engineering function, a much larger dataset can be queried as a basis for document sampling, hence making schema inference more precise. Such capability can be useful in a data governance context to properly document the semantics and publish a thorough data dictionary for end users.  It can also be used in the context of compliance with privacy laws to make sure that the company does not store data that it is not supposed to store.  There are many more examples of how to use this functionality.

 

To understand how to set the different CLI arguments, it is often helpful to first run the equivalent function in the GUI application.

 

It is easiest to run the CLI from a terminal program, starting in the directory where the Hackolade executable is installed:

  • Windows: 
    • cmd command line: all commands below should be preceded by start /wait hackolade
    • PowerShell: all commands below should be preceded by Start-Process -wait hackolade
  • Mac: start Terminal; all commands below should be preceded by /Applications/Hackolade.app/Contents/MacOS/Hackolade
  • Linux: terminal emulator or common shell programs: all commands below should be preceded by ./Hackolade or path-to-where-hackolade-was-unzipped/Hackolade
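
For example, a genDoc run (described below) for a hypothetical model file mymodel located in the current directory would be launched as follows on each platform; for PowerShell, -ArgumentList is one way to pass the command and its arguments as a single string:

  • Windows cmd:  start /wait hackolade genDoc --format=HTML --model=mymodel --doc=mymodel
  • Windows PowerShell:  Start-Process -wait hackolade -ArgumentList "genDoc --format=HTML --model=mymodel --doc=mymodel"
  • Mac:  /Applications/Hackolade.app/Contents/MacOS/Hackolade genDoc --format=HTML --model=mymodel --doc=mymodel
  • Linux:  ./Hackolade genDoc --format=HTML --model=mymodel --doc=mymodel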

 

To take full advantage of the capability, you may run the CLI from scheduled batch files.  
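
As a minimal sketch, a nightly Windows batch file (the connection name, paths, and model names below are hypothetical) could chain a reverse-engineering run with a comparison against the previous day's baseline:

@echo off
rem re-sample the instance, then compare against yesterday's baseline (illustrative values)
start /wait C:\PROGRA~1\Hackolade\hackolade revEng --target=MONGODB --connectName=prod --model=C:\models\today.json --samplingMethod=abs --samplingValue=1000
start /wait C:\PROGRA~1\Hackolade\hackolade compMod --model1=C:\models\baseline.json --model2=C:\models\today.json --deltaModel=C:\models\delta.json --logLevel=1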

 

Note: a properly licensed copy of Hackolade is required to run the CLI.  The CLI functionality is only available in the Professional and Trial editions of Hackolade.

 

Below is the current list of Hackolade CLI commands.  Additional commands may be added at a later date.

 

  • genDoc: Generate documentation for a Hackolade model, in HTML, Markdown, or PDF format
  • revEng: Reverse-engineer a database instance or script file to infer the schema of the selected collections/tables
  • revEngJSON: Reverse-engineer JSON Schema or JSON documents
  • revEngYAML: Reverse-engineer YAML or YAML Schema files
  • revEngDDL: Reverse-engineer RDBMS data definition language files with .sql extensions
  • revEngXSD: Reverse-engineer XSD files with .xsd extensions
  • compMod: Compare two Hackolade models to detect differences, and optionally merge them
  • forwEng: Forward-engineer structures created with the application to dynamically generate the schema of the selected entities, or forward-engineer JSON Schema or a sample JSON data document
  • forwEngAPI: Forward-engineer a Swagger or OpenAPI model from the merge of a source data model and a user-defined template
  • forwEngDataDictionary: Publish a Hackolade data model to a Data Dictionary instance
  • obfusc: Copy a model and garble sensitive properties: business name, technical name, description, comments, enumeration.  Use it if you need to send a model for troubleshooting but don't want to disclose sensitive aspects of the model.
  • help: Display commands and their arguments
  • performance: Record timestamps of application startup steps for performance troubleshooting
  • version: Display the application version

 

 

Usage:     hackolade command [--arguments]

 

genDoc

The genDoc command lets you generate a documentation file, in HTML, Markdown, or PDF format, for a specified existing Hackolade model.

 

Usage:    hackolade genDoc [--arguments]

 

Arguments:

  • --format=<HTML | Markdown | PDF> (required): Specify format, currently either HTML, MARKDOWN, or PDF
  • --model=<file>* (required): Full path and file name for the source Hackolade model for which documentation is being generated.  Extension .json is optional
  • --doc=<file>* (required): Full path and file name for the documentation.  Extension .html, .md, or .pdf is optional
  • --logo=<file>* (optional): Full path and file name for a custom logo file.  If omitted, the Hackolade logo is used
  • --displayName=<business | technical> (optional): Define whether to display business or technical names [values: "business", "technical"] [default: business]
  • --diagram=<true | false> (optional): Include the model diagram [default: true]
  • --diagramViews="[<ERDV1>,<ERDV2>,…]" (optional): Specify an array of diagram views to be included.  The value is a string surrounded by double quotes (").  ER Diagram View names are represented as an array surrounded by square brackets ([]) and are separated by a comma (,).  [default: all]
  • --sepContDiag=<true | false> (optional): Include separate container diagrams (database, region, buckets, keyspaces...) [default: true]
  • --entityDiagrams=<true | false> (optional): Include entity hierarchical schema diagrams [default: true]
  • --attribTree=<all | complex | none> (optional): Include the individual hierarchical schema view for all attributes [all], for complex attributes only [complex], or for no attributes [none] [default: all]
  • --properties=<all | notNull | none> (optional): Include all field properties [all], only field properties with information [notNull], or no field properties [none] [default: all]
  • --relationships=<true | false> (optional): Include relationships [default: true]
  • --JSONSchema=<true | false> (optional): Include JSON Schema code [default: true]
  • --JSONData=<true | false> (optional): Include a JSON data sample [default: true]
  • --entities="<containerName>: [<entity1>,<entity2>,…]" (optional): Specify container(s) to include in the documentation [default: all] and an array of entities [default: all].  MongoDB: container = database, entity = collection; DynamoDB: container = not applicable, entity = table; Couchbase: container = bucket, entity = document kind; Cosmos DB: container = collection, entity = document type; plain JSON: container = group, entity = document.  The value is a string surrounded by double quotes (").  Entities are represented as an array surrounded by square brackets ([]) and are separated by a comma (,).  The entities array is separated from the container name by a colon (:).  Containers are separated by semi-colons (;).
  • --views="[<view1>,<view2>,...]" (optional): In MongoDB only, specify view(s) to include in the documentation [default: all].  The value is a string surrounded by double quotes (").  Views are represented as an array surrounded by square brackets ([]) and are separated by a comma (,).
  • --openDoc=<true | false> (optional): Specify whether to open the generated document upon completion of generation [default: false]
  • --logLevel=<1 | 2 | 3 | 4> (optional): 1 = no spinner, no info, no error messages; 2 = no spinner, no info; 3 = no spinner; 4 = full output [default: 4]

*: If the path and/or file name contains blanks, the value must be surrounded by double quotes (").  The path can be omitted if the file is in the local directory.

 

Simple example: using all default options:  

start /wait hackolade genDoc --format=HTML --model=C:\Users\Public\Bitbucket\hackolade\schemas\MongoDB\noaa.json \

--doc=C:\Users\Public\Documents\Hackolade\Documentation\noaa.html

 

More complex example: 

start /wait C:\PROGRA~1\Hackolade\hackolade genDoc --format=HTML --model=C:\Users\Public\Bitbucket\hackolade\schemas\Couchbase\travel-sample.json \

--doc=C:\Users\Public\Documents\Hackolade\Documentation\travel-sample.html --logo="C:\Users\Public\Documents\Hackolade\Documentation\couchbase logo.png"
--attribTree=complex --JSONSchema=false --JSONData=false --properties=notNull --entities="travel-sample: airport, airline, route"
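
A Markdown variant of the same documentation, displaying technical names and omitting the model diagram (illustrative; all options are described in the table above):

start /wait C:\PROGRA~1\Hackolade\hackolade genDoc --format=MARKDOWN --model=C:\Users\Public\Bitbucket\hackolade\schemas\Couchbase\travel-sample.json \
--doc=C:\Users\Public\Documents\Hackolade\Documentation\travel-sample.md --displayName=technical --diagram=false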

 

Note: If the path contains spaces, Windows generates an error message when running the CLI from a directory other than the one where the Hackolade executable is installed, even when using quotes, e.g. "C:\Program Files\Hackolade\hackolade".  The workaround is to use the 8.3 short path instead, e.g. C:\PROGRA~1\Hackolade\hackolade (PROGRA~1 being the 8.3 short name of the Program Files directory).

 

revEng

The revEng command lets you trigger the reverse-engineering of a database instance, as described in this page, or of a script file.

 

Usage:    hackolade revEng [--arguments]

 

Arguments:

  • --target=<target> (required): Native target for the model: JSON, MONGODB, DYNAMODB, COUCHBASE, or plugin target: Avro, CASSANDRA, COSMOSDB-SQL, COSMOSDB-MONGO, ELASTICSEARCH, EventBridge, Glue, HBase, HIVE, JOI, MSSQLServer, NEO4J, OPENAPI, PARQUET, ScyllaDB, Snowflake, SWAGGER, Synapse, TinkerPop, etc.
  • --connectName=<connection> (required if reverse-engineering from an instance): Name of connection settings saved in the Hackolade instance where the CLI is invoked.  Or use --connectFile instead.
  • --connectFile=<file>* (optional): Full file path of a connection config file (not needed when a connection name is specified).  The simplest way to create a connection file is to create a connection in the GUI application, then export the connection settings to file, encrypted or not.
  • --files="<file1>;<file2>..." (required if reverse-engineering from files): List of full file paths of target scripts (MongoDB validator, Cassandra CQL, Hive HQL, Avro schema, Neo4j Cypher, Swagger documentation, etc.) from which the reverse-engineering process has to be performed.  Accepts paths divided by a semicolon (schema1.avsc;schema2.avsc) when reverse-engineering from Avro/Parquet while combining schemas into one.
  • --model=<file>* (required): Full path and file name for the target Hackolade model into which the reverse-engineering result is written.  Extension .json is optional
  • --samplingMethod=<abs | rel> (optional): Specify the sampling method, either an absolute number of documents or a relative percentage [default: abs]
  • --samplingValue=<x> (optional): If samplingMethod=abs: positive integer between 1 and 100000 [default: 1000].  If samplingMethod=rel: positive integer between 1 and 100 [default: 1].  Warning: for obvious performance and response time reasons, be reasonable with these parameters when used on large entities!  What's the point of sampling 10% of 1B+ identical documents?...
  • --fieldOrder=<keep | alpha> (optional): Specify whether to preserve the order of fields in the sampled documents or rearrange them in alphabetical order [default: keep]
  • --inferRelationships=<true | false> (optional): For MongoDB only, optionally specify whether to attempt relationship inference based on the ObjectID data type.
  • --query="{<aggregation pipeline expression>}" (optional): For MongoDB only, specify query criteria for sampling [default: "{}"]
  • --sort="{<aggregation pipeline expression>}" (optional): For MongoDB only, specify sort criteria for sampling [default: "{}"]
  • --update=<true | false> (optional): Specify whether to update an existing model.  If false, the existing model is overwritten [default: false]
  • --conflictResolution=<keepBoth | replace | merge | cancel> (optional): Specify the conflict resolution strategy for containers and entities [values: "keepBoth", "replace", "merge", "cancel"] [default: keepBoth]
  • --selectedObjects="<containerName>: [<entity1>,<entity2>,…]" (optional): Specify container(s) [default: all] and an array of entities [default: all] to include in the reverse-engineering.  MongoDB: container = database, entity = collection; DynamoDB: container = not applicable, entity = table; Couchbase: container = bucket, entity = document kind; Cosmos DB: container = collection, entity = document type; Elasticsearch: container = index, entity = type; HBase: container = namespace, entity = table.  The value is a string surrounded by double quotes (").  Entities are represented as an array surrounded by square brackets ([]) and are separated by a comma (,).  The entities array is separated from the container name by a colon (:).  Containers are separated by semi-colons (;).
  • --documentKinds="<bucketName>: <docKindField>" (optional): In Couchbase only, for each bucket, specify the field used to distinguish the different objects stored in the same bucket.
  • --entityVersion=<version number> (optional): In EventBridge only, specify the schema version [default: latest]
  • --includeEmpty=<true | false> (optional): In MongoDB only, specify whether to include empty collections [default: false]
  • --includeSystem=<true | false> (optional): In MongoDB only, specify whether to include system collections [default: false]
  • --includeViews=<true | false> (optional): In MongoDB only, specify whether to include views [default: true]
  • --selectedDB=<database name> (optional): In Cosmos DB, specify the name of the database to be reverse-engineered [default: the first database in the list]
  • --pagination=<0 | 1 | ...> (optional): In Couchbase, specify the number of documents per page for pagination.  If the value is 0, pagination is disabled [default: 0]
  • --detectPattern=<true | false> (optional): Specify whether to automatically convert pattern fields [default: false]
  • --combineSchemas=<true | false> (optional): For Avro and Parquet only.  Specify whether to combine the schemas of multiple files, or keep one schema per reverse-engineered file [default: false]
  • --namingConventions=<business | technical> (optional): If application parameters are set to enable Naming Conventions, specify whether to reverse-engineer source attributes as business names or as technical names.  Conversions will be applied according to your Naming Conventions parameters [default: business if Naming Conventions are disabled, or technical if Naming Conventions are enabled in application Tools > Options]
  • --distribution=<true | false> (optional): Specify whether to perform orthogonal distribution of model entities on the ERD [default: true]
  • --logLevel=<1 | 2 | 3 | 4> (optional): 1 = no spinner, no info, no error messages; 2 = no spinner, no info; 3 = no spinner; 4 = full output [default: 4]

*: If the path and/or file name contains blanks, the value must be surrounded by double quotes (").  The path can be omitted if the file is in the local directory.

 

MongoDB example:   

hackolade revEng --connectName=local --target=MONGODB --samplingMethod=abs --samplingValue=1000 \

--model=C:\Users\Public\Bitbucket\hackolade\schemas\MongoDB\yelp.json --includeEmpty=true --selectedObjects="yelp" --fieldOrder=keep 

 

DynamoDB example:   

"C:\Program Files\Hackolade\hackolade" revEng --connectName=local --target=DYNAMODB --samplingMethod=abs --samplingValue=1000 \

--model=C:\Users\Public\Bitbucket\hackolade\schemas\DynamoDB\entertainment.json --includeEmpty=false --selectedObjects="[Movies, Music]" --fieldOrder=alpha 

 

Couchbase example:   

"C:\Program Files\Hackolade\hackolade" revEng --connectName=local --target=COUCHBASE --samplingMethod=abs --samplingValue=2500 \

--model=C:\Users\Public\Bitbucket\hackolade\schemas\Couchbase\travel-sample.json --includeEmpty=false --selectedObjects="travel-sample: [airline, airport]" 
--documentKinds="travel-sample: type" --fieldOrder=keep

 

Cosmos DB example:

start /wait hackolade reveng --target=COSMOSDB-DOC --connectName=azure --model=travel.json --selectedObjects="travel" --documentKinds="travel:type"
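
The connection can also be supplied through an exported connection settings file instead of a saved connection name; the file path and relative sampling values below are illustrative:

start /wait hackolade revEng --target=MONGODB --connectFile="C:\connections\prod-mongo.json" --model=yelp --samplingMethod=rel --samplingValue=5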

 

Note: If the path contains spaces, Windows generates an error message when running the CLI from a directory other than the one where the Hackolade executable is installed, even when using quotes, e.g. "C:\Program Files\Hackolade\hackolade".  The workaround is to use the 8.3 short path instead, e.g. C:\PROGRA~1\Hackolade\hackolade (PROGRA~1 being the 8.3 short name of the Program Files directory).

 

revEngJSON

The revEngJSON command lets you trigger the reverse-engineering of a JSON document or JSON Schema file, as described in this page.

 

Usage:    hackolade revEngJSON [--arguments]

 

Arguments:

  • --target=<target> (required): Native target for the model: JSON, MONGODB, DYNAMODB, COUCHBASE, or plugin target: ArangoDB, Avro, CASSANDRA, COSMOSDB-SQL, COSMOSDB-MONGO, ELASTICSEARCH, EventBridge, Glue, HBase, HIVE, JOI, MSSQLServer, NEO4J, OPENAPI, PARQUET, ScyllaDB, Snowflake, SWAGGER, Synapse, TinkerPop
  • --files="<file1>;<file2>..." (required): Specify the directory path and file names to be reverse-engineered
  • --model=<file>* (required): Full path and file name for the target Hackolade model into which the reverse-engineering result is written.  Extension .json is optional
  • --entityHandling=<ERD | definitions> (optional): Specify whether the file must be reverse-engineered as entities in the ERD or as model definitions [default: ERD]
  • --container=<containerName> (optional): Specify a container name into which reverse-engineered entities will be inserted [default: ""]
  • --normalization=<true | false> (optional): Only applicable to RDBMS.  Specify whether complex data types should be normalized into separate entities [default: true]
  • --conflictResolution=<keepBoth | replace | merge | cancel> (optional): Specify the conflict resolution strategy for containers and entities [default: keepBoth]
  • --ndjson=<true | false> (optional): Specify whether the file contains NDJSON, to leverage sampling options [default: false]
  • --namingConventions=<business | technical> (optional): If application parameters are set to enable Naming Conventions, specify whether to reverse-engineer source attributes as business names or as technical names.  Conversions will be applied according to your Naming Conventions parameters [default: business if Naming Conventions are disabled, or technical if Naming Conventions are enabled in application Tools > Options]
  • --logLevel=<1 | 2 | 3 | 4> (optional): 1 = no spinner, no info, no error messages; 2 = no spinner, no info; 3 = no spinner; 4 = full output [default: 4]

*: If the path and/or file name contains blanks, the value must be surrounded by double quotes (").  The path can be omitted if the file is in the local directory.
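
Example (file and model names are hypothetical):

start /wait hackolade revEngJSON --target=MONGODB --files="C:\data\customer.json;C:\data\order.json" --model=C:\models\shop.json --container=shop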

 

revEngYAML

The revEngYAML command lets you trigger the reverse-engineering of a YAML document file, as described in this page.

 

Usage:    hackolade revEngYAML [--arguments]

 

Arguments:

  • --target=<target> (required): Native target for the model: JSON, MONGODB, DYNAMODB, COUCHBASE, or plugin target: ArangoDB, Avro, CASSANDRA, COSMOSDB-SQL, COSMOSDB-MONGO, ELASTICSEARCH, EventBridge, Glue, HBase, HIVE, JOI, MSSQLServer, NEO4J, OPENAPI, PARQUET, ScyllaDB, Snowflake, SWAGGER, Synapse, TinkerPop
  • --files="<file1>;<file2>..." (required): Specify the directory path and file names to be reverse-engineered
  • --model=<file>* (required): Full path and file name for the target Hackolade model into which the reverse-engineering result is written.  Extension .json is optional
  • --entityHandling=<ERD | definitions> (optional): Specify whether the file must be reverse-engineered as entities in the ERD or as model definitions [default: ERD]
  • --container=<containerName> (optional): Specify a container name into which reverse-engineered entities will be inserted [default: ""]
  • --normalization=<true | false> (optional): Only applicable to RDBMS.  Specify whether complex data types should be normalized into separate entities [default: true]
  • --conflictResolution=<keepBoth | replace | merge | cancel> (optional): Specify the conflict resolution strategy for containers and entities [default: keepBoth]
  • --ndjson=<true | false> (optional): Specify whether the file contains NDJSON, to leverage sampling options [default: false]
  • --namingConventions=<business | technical> (optional): If application parameters are set to enable Naming Conventions, specify whether to reverse-engineer source attributes as business names or as technical names.  Conversions will be applied according to your Naming Conventions parameters [default: business if Naming Conventions are disabled, or technical if Naming Conventions are enabled in application Tools > Options]
  • --logLevel=<1 | 2 | 3 | 4> (optional): 1 = no spinner, no info, no error messages; 2 = no spinner, no info; 3 = no spinner; 4 = full output [default: 4]

*: If the path and/or file name contains blanks, the value must be surrounded by double quotes (").  The path can be omitted if the file is in the local directory.
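
Example (file and model names are hypothetical):

start /wait hackolade revEngYAML --target=OPENAPI --files="C:\api\petstore.yaml" --model=C:\models\petstore.json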

 

revEngDDL

The revEngDDL command lets you trigger the reverse-engineering of a Data Definition Language (DDL) file from a database instance, as described in this page.

 

Usage:    hackolade revEngDDL [--arguments]

 

Arguments:

  • --target=<target> (required): Native target for the model: MONGODB, DYNAMODB, COUCHBASE, or plugin target: ArangoDB, Avro, CASSANDRA, COSMOSDB-SQL, COSMOSDB-MONGO, ELASTICSEARCH, EventBridge, Glue, HBase, HIVE, JOI, MSSQLServer, NEO4J, OPENAPI, PARQUET, ScyllaDB, Snowflake, SWAGGER, Synapse, TinkerPop
  • --file=<file>* (required): Specify the directory path and file name where the schema file to be reverse-engineered is located (file type must be compatible with the selected target)
  • --model=<file>* (required): Full path and file name for the target Hackolade model into which the reverse-engineering result is written.  Extension .json is optional
  • --entityHandling=<ERD | definitions> (optional): Specify whether the file must be reverse-engineered as entities in the ERD or as model definitions [default: ERD]
  • --container=<containerName> (optional): Specify a container name into which reverse-engineered entities will be inserted [default: ""]
  • --database=<oracle | mysql | mssqlserver | db2 | postgres | informix | snowflake | teradata> (required): Name of the database dialect of the DDL script
  • --update=<true | false> (optional): Specify whether to update an existing model.  If false, the existing model is overwritten [default: false]
  • --conflictResolution=<keepBoth | replace | merge | cancel> (optional): Specify the conflict resolution strategy for containers and entities [default: keepBoth]
  • --namingConventions=<business | technical> (optional): If application parameters are set to enable Naming Conventions, specify whether to reverse-engineer source attributes as business names or as technical names.  Conversions will be applied according to your Naming Conventions parameters [default: business if Naming Conventions are disabled, or technical if Naming Conventions are enabled in application Tools > Options]
  • --logLevel=<1 | 2 | 3 | 4> (optional): 1 = no spinner, no info, no error messages; 2 = no spinner, no info; 3 = no spinner; 4 = full output [default: 4]

*: If the path and/or file name contains blanks, the value must be surrounded by double quotes (").  The path can be omitted if the file is in the local directory.
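
Example (file and model names are hypothetical; target and database values are taken from the lists above):

start /wait hackolade revEngDDL --target=MSSQLServer --database=mssqlserver --file="C:\ddl\sales_schema.sql" --model=C:\models\sales.json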

 

revEngXSD

The revEngXSD command lets you trigger the reverse-engineering of an XML Schema Definition (XSD) file.

 

Usage:    hackolade revEngXSD [--arguments]

 

Arguments:

  • --target=<target> (required): Native target for the model: MONGODB, DYNAMODB, COUCHBASE, or plugin target: ArangoDB, Avro, CASSANDRA, COSMOSDB-SQL, COSMOSDB-MONGO, ELASTICSEARCH, EventBridge, Glue, HBase, HIVE, JOI, MSSQLServer, NEO4J, OPENAPI, PARQUET, ScyllaDB, Snowflake, SWAGGER, Synapse, TinkerPop
  • --file=<file>* (required): Specify the directory path and file name where the schema file to be reverse-engineered is located (file type must be compatible with the selected target)
  • --model=<file>* (required): Full path and file name for the target Hackolade model into which the reverse-engineering result is written.  Extension .json is optional
  • --entityHandling=<ERD | definitions> (optional): Specify whether the file must be reverse-engineered as entities in the ERD or as model definitions [default: ERD]
  • --container=<containerName> (optional): Specify a container name into which reverse-engineered entities will be inserted [default: ""]
  • --conflictResolution=<keepBoth | replace | merge | cancel> (optional): Specify the conflict resolution strategy for containers and entities [default: keepBoth]
  • --namingConventions=<business | technical> (optional): If application parameters are set to enable Naming Conventions, specify whether to reverse-engineer source attributes as business names or as technical names.  Conversions will be applied according to your Naming Conventions parameters [default: business if Naming Conventions are disabled, or technical if Naming Conventions are enabled in application Tools > Options]
  • --logLevel=<1 | 2 | 3 | 4> (optional): 1 = no spinner, no info, no error messages; 2 = no spinner, no info; 3 = no spinner; 4 = full output [default: 4]

*: If the path and/or file name contains blanks, the value must be surrounded by double quotes (").  The path can be omitted if the file is in the local directory.
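
Example (file and model names are hypothetical):

start /wait hackolade revEngXSD --target=MONGODB --file="C:\xsd\customer.xsd" --model=C:\models\customer.json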

 

compMod

The compMod command lets you compare two (2) Hackolade models and derive a delta model file with additions, deletions, and modifications of fields and structures.

 

When merging into a targetModel, you should keep in mind that:

- merging is an "all or nothing" proposition

- handling of modifications is impacted by the order in which model1 and model2 are assigned, with model2 values being kept in the merged model

 

Usage:    hackolade compMod [--arguments]

 

Arguments:

  • --model1=<file>* (required): Full path and file name for the baseline Hackolade model with which the comparison will be performed.  Extension .json is optional
  • --model2=<file>* (required): Full path and file name for the comparison Hackolade model.  Both models need to be of the same DB target.  Extension .json is optional
  • --deltaModel=<file>* (required): Full path and file name for the differences resulting from the model comparison.  The resulting file is a Hackolade model.  Extension .json is optional
  • --ignoreGUIDs=<true | false> (optional): Specify whether to include GUIDs in the comparison [default: true]
  • --ignoreExtraProperties=<true | false> (optional): Specify whether to include in the comparison the Hackolade properties that cannot typically be derived from the data, such as descriptions, comments, samples, defaults, and foreign key-related info [default: true]
  • --ignoreOrder=<true | false> (optional): Specify whether to detect changes in the order of fields [default: true]
  • --targetModel=<file>* (optional): Full path and file name for the Hackolade model resulting from the merge of model1 and model2.  If specified, a new Hackolade model is created with a merge of the 2 original models.  Extension .json is optional
  • --mergedeletesasdeactivated=<true | false> (optional): Specify whether to deactivate deleted objects [default: false]
  • --gui=<true | false> (optional): Specify whether to open Model Compare & Merge in the GUI [default: false]
  • --logLevel=<1 | 2 | 3 | 4> (optional): 1 = no spinner, no info, no error messages; 2 = no spinner, no info; 3 = no spinner; 4 = full output [default: 4]

*: If the path and/or file name contains blanks, the value must be surrounded by double quotes (").  The path can be omitted if the file is in the local directory.

 

Note: unless GUIDs are used for the comparison, a change in a field name will result in an addition plus a deletion.  The same applies to a change in an entity name.

 

Example:

C:\PROGRA~1\Hackolade\hackolade compMod --model1=yesterdaysModel --model2=todaysModel --deltaModel=todaysDeltaModel --ignoreGUIDs=true \
--ignoreExtraProperties=true --ignoreOrder=true
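
To also merge the two models into a new model, as described above, add the --targetModel argument (model names are hypothetical):

C:\PROGRA~1\Hackolade\hackolade compMod --model1=yesterdaysModel --model2=todaysModel --deltaModel=todaysDeltaModel --targetModel=mergedModel --mergedeletesasdeactivated=true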

 

Note: If the path contains spaces, Windows generates an error message when running the CLI from a directory other than the one where the Hackolade executable is installed, even when using quotes, e.g. "C:\Program Files\Hackolade\hackolade".  The workaround is to use the 8.3 short path instead, e.g. C:\PROGRA~1\Hackolade\hackolade (PROGRA~1 being the 8.3 short name of the Program Files directory).

 

Structure of the resulting delta model file:

A delta model file may contain any combination of multiple additions, deletions, and modifications.  Deeply nested fields are referenced through their structure.  To merge newly added fields, open your baseline model in Hackolade, then either copy/paste from the delta model, or reference the new field via an external reference to the delta model, then convert the reference into attributes.

 

forwEng

The forwEng command forward-engineers structures created with the application to dynamically generate the schema of the selected entities.

 

Usage:    hackolade forwEng [--arguments]

 

Arguments:

  • --model=<file>* (required): Full path and file name for the Hackolade model to forward-engineer.  Extension .json is optional
  • --path=<file>* (required): Specify the directory path where the forward-engineered files will be created
  • --outputType=<jsonschema | jsondata | yamldata | script | schemaregistry> (required): Specify the type of output (JSON Schema, sample JSON data, sample YAML data, script, or schema registry) [default: jsonschema].  This allows output of the script corresponding to the target of the specified model, e.g. CQL for Cassandra, HQL for Hive, etc.  It also allows the publication of Avro and JSON schemas to schema registry instances
  • --jsonSchemaCompliance=<standard | full | extended> (optional): Specify JSON Schema compliance: standard for only JSON Schema keywords and data types, full for additional custom properties, or extended for target-specific data types and internal properties [default: standard]
  • --jsonschemaversion=<draft-04 | draft-06 | draft-07 | 2019-09> (optional): Specify the JSON Schema specification version [values: draft-04, draft-06, draft-07, 2019-09] [default: draft-04]
  • --format=<format> (optional): Output target-specific schema format.  MongoDB: ["shell", "mongoose", "js", "$jsonSchema"]; Couchbase: ["ottoman", "n1ql"]; Avro: ["avroSchema", "schemaRegistry"]; Glue: ["awsCLI", "HiveQL"]; OpenAPI: ["json", "yaml"]; Swagger: ["json", "yaml"]
  • --connectName=<connection> (required if forward-engineering to an instance): Name of connection settings saved in the Hackolade instance where the CLI is invoked.  Or use --connectFile instead.
  • --connectFile=<file>* (optional): Full file path of a connection config file (not needed when a connection name is specified).  The simplest way to create a connection file is to create a connection in the GUI application, then export the connection settings to file, encrypted or not.
  • --selectedObjects="<containerName>: [<entity1>,<entity2>,…]" (optional): Specify an array of entities from the model to include in the result [default: all]
  • --scriptType=<create | update> (optional): For Cassandra, if outputType=script, specify the type of forward-engineering script [default: create]
  • --resolvDefs=<true | false> (deprecated, optional): If outputType=jsonschema, specify whether to output resolved definitions (true) or referenced definitions (false) [default: true]
  • --defsStrategy=<resolved | referenced | internal> (optional): If outputType=jsonschema, specify whether to output resolved, referenced, or internal definitions [values: "resolved", "referenced", "internal"] [default: resolved]
  • --updateExtDefs=<true | false> (optional): When referencing external definitions, update the current model to ensure the latest changes are included [default: false]
  • --insertSampleData=<true | false> (optional): Include sample data in the output if the output type supports it [default: false]
  • --minify=<true | false> (optional): For JSON document/schema and Avro Schema, minify the output instead of the default beautified format [default: false]
  • --batchScriptSeparator=<; | GO> (optional): For SQL Server, Azure SQL, and Synapse, specify if you wish a separator other than the default semicolon ";", i.e. the "GO" separator [choices: ";", "GO"] [default: ";"]
  • --validateSchema=<true | false> (optional): If the output script supports validation (JSON Schema, Avro Schema, Swagger, OpenAPI, ...), run the respective validator and generate an error message if validation fails [default: false]
  • --ignoredoc=<true | false> (optional): Ignore description and comment properties, which may be too wordy for this purpose [default: false]
  • --definitions=<true | false> (optional): For JSON Schema only, specify whether to generate the JSON Schema of model definitions [default: false]
  • --level=<entity | container | model> (optional): Specify the forward-engineering level [values: "entity", "container", "model"]
  • --skipUndefinedLevel=<true | false> (optional): For JSON document/schema, skip the extraneous folder level when the container is undefined [default: false]
  • --structuredpath=<true | false> (optional): Use a structured path for naming a model folder [default: true]
  • --logLevel=<1 | 2 | 3 | 4> (optional): 1 = no spinner, no info, no error messages; 2 = no spinner, no info; 3 = no spinner; 4 = full output [default: 4]

*: If the path and/or file name contains blanks, the value must be surrounded by double quotes (").  The path can be omitted if the file is in the local directory.

 

Example:

C:\PROGRA~1\Hackolade\hackolade forwEng --model=masterdata.json --path=masterdata.cql  
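
A JSON Schema output variant, with referenced definitions and a specific specification version (output path is hypothetical; all options are described in the table above):

C:\PROGRA~1\Hackolade\hackolade forwEng --model=masterdata.json --path=C:\output --outputType=jsonschema --jsonschemaversion=draft-07 --defsStrategy=referenced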

 

Note: If the path contains spaces, Windows generates an error message when running the CLI from a directory other than the one where the Hackolade executable is installed, even when using quotes, e.g. "C:\Program Files\Hackolade\hackolade".  The workaround is to use the 8.3 short path instead, e.g. C:\PROGRA~1\Hackolade\hackolade (PROGRA~1 being the 8.3 short name of the Program Files directory).

 

 

forwEngAPI

The forwEngAPI command lets you generate a Swagger or OpenAPI model from the merge of a source data model and a user-defined template.

 

Usage:    hackolade forwEngAPI [--arguments]

 

Arguments:

  • --sourcemodel=<file>* (required): Full path and file name for the Hackolade model to serve as a basis for the API generation.  Extension .json is optional
  • --selectedObjects="<containerName>: [<entity1>,<entity2>,…]" (optional): Specify an array of entities from the source model to include in the result [default: all]
  • --APItemplate=<file>* (required): Full path and file name for the template to be used during API generation to create the specified resources for each selected entity of the source model.  The template can be a Hackolade model, or a Swagger or OpenAPI documentation file in either JSON or YAML.
  • --targetModelFormat=<Swagger | OpenAPI> (optional): Specify the format of the target model ["Swagger" or "OpenAPI"] [default: OpenAPI]
  • --targetmodel=<file>* (required): Full path and file name for the generated target Hackolade model.  Extension .json is optional
  • --APIdocFile=<file>* (optional): If specified, the corresponding documentation file will be generated in the chosen model format.  Specify the full path and file name for the generated documentation file.
  • --logLevel=<1 | 2 | 3 | 4> (optional): 1 = no spinner, no info, no error messages; 2 = no spinner, no info; 3 = no spinner; 4 = full output [default: 4]

*: If the path and/or file name contains blanks, the value must be surrounded by double quotes (").  The path can be omitted if the file is in the local directory.

 

Example:

C:\PROGRA~1\Hackolade\hackolade forwEngAPI --sourcemodel=masterdata --APItemplate=OAS_generation --targetmodel=masterdataAPI  --APIdocFile=masterdata_OpenAPI 

 

Note: If the path contains spaces, Windows generates an error message when running the CLI from a directory other than the one where the Hackolade executable is installed, even when using quotes, e.g. "C:\Program Files\Hackolade\hackolade".  The workaround is to use the 8.3 short path instead, e.g. C:\PROGRA~1\Hackolade\hackolade (PROGRA~1 being the 8.3 short name of the Program Files directory).

 

forwEngDataDictionary

The forwEngDataDictionary command lets you publish Hackolade data models to a Data Dictionary instance (currently Collibra only).

 

Usage:    hackolade forwEngDataDictionary [--arguments]

 

Arguments:

  • --model=<file>* (required): Full path and file name for the Hackolade model to be published to the Data Dictionary instance.  Extension .json is optional
  • --connectName=<connection> (required if publishing to an instance): Name of connection settings saved in the Hackolade instance where the CLI is invoked.  Or use --connectFile instead.
  • --connectFile=<file>* (optional): Full file path of a connection config file (not needed when a connection name is specified).  The simplest way to create a connection file is to create a connection in the GUI application, then export the connection settings to file, encrypted or not.
  • --selectedObjects="<containerName>: [<entity1>,<entity2>,…]" (optional): Specify an array of entities from the model to include in the result [default: all]
  • --targetResource=<resource> (required): Name of a target resource (domain) in the Data Dictionary instance
  • --forceconfig=<true | false> (optional): Specify whether to create the necessary config in the Data Dictionary instance
  • --logLevel=<1 | 2 | 3 | 4> (optional): 1 = no spinner, no info, no error messages; 2 = no spinner, no info; 3 = no spinner; 4 = full output [default: 4]

*: If the path and/or file name contains blanks, the value must be surrounded by double quotes (").  The path can be omitted if the file is in the local directory.

 

Example:

C:\PROGRA~1\Hackolade\hackolade forwEngDataDictionary --model=yel --connectName=Collibra_instance --targetResource=Yelp

 

Note: If the path contains spaces, Windows generates an error message when running the CLI from a directory other than the one where the Hackolade executable is installed, even when using quotes, e.g. "C:\Program Files\Hackolade\hackolade".  The workaround is to use the 8.3 short path instead, e.g. C:\PROGRA~1\Hackolade\hackolade (PROGRA~1 being the 8.3 short name of the Program Files directory).

 

obfusc 

The obfusc command lets you garble sensitive properties: business name, technical name, description, comments, enumeration.  Use if you need to send a model for troubleshooting but don't want to disclose sensitive aspects of the model.

 

Usage:    hackolade obfusc [--arguments]

 

Arguments:

  • --sourcemodel=<file>* (required): Full path and file name for the Hackolade model to be obfuscated.  Extension .json is optional
  • --targetmodel=<file>* (required): Full path and file name for the obfuscated Hackolade model.  Extension .json is optional
  • --logLevel=<1 | 2 | 3 | 4> (optional): 1 = no spinner, no info, no error messages; 2 = no spinner, no info; 3 = no spinner; 4 = full output [default: 4]

*: If the path and/or file name contains blanks, the value must be surrounded by double quotes (").  The path can be omitted if the file is in the local directory.

 

Example:

C:\PROGRA~1\Hackolade\hackolade obfusc --sourcemodel=masterdata --targetmodel=garbeledmasterdata

 

Note: If the path contains spaces, Windows generates an error message when running the CLI from a directory other than the one where the Hackolade executable is installed, even when using quotes, e.g. "C:\Program Files\Hackolade\hackolade".  The workaround is to use the 8.3 short path instead, e.g. C:\PROGRA~1\Hackolade\hackolade (PROGRA~1 being the 8.3 short name of the Program Files directory).

 

help

The --help argument displays the list of commands and their arguments.

 

Usage:    start /wait hackolade --help

 

performance

The --performance argument records timestamps of application startup steps for performance troubleshooting.  The log can be found in the file HackoladePerformance.log in the directory C:\Users\%username%\AppData\Roaming\HackoladeLogs (Windows) or the folder Users/$USER/Documents/HackoladeLogs (Mac/Linux).

 

Usage:    start /wait hackolade --performance

 

version

The --version argument displays the version number of Hackolade being invoked.

 

Usage:    start /wait hackolade --version