Confluent Schema Registry
The Confluent Schema Registry is a central repository with a RESTful interface for developers to define standard schemas and register applications to enable compatibility. Schema Registry is available as a software component of Confluent Platform or as a managed component of Confluent Cloud. Confluent Schema Validation provides a direct interface between the Kafka broker and Schema Registry to validate and enforce schemas programmatically. Schema Validation can be configured at the Kafka topic level.
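As an illustration, the sketch below registers an Avro schema through the Schema Registry REST API. The registry URL and the "orders-value" subject name are assumptions made for this example; adjust them to your deployment.

```python
import json
import requests

# Assumed values for this example -- adjust to your deployment.
REGISTRY_URL = "http://localhost:8081"
SUBJECT = "orders-value"   # hypothetical subject tied to an "orders" topic

# A minimal Avro record schema for the subject.
avro_schema = {
    "type": "record",
    "name": "Order",
    "fields": [
        {"name": "id", "type": "string"},
        {"name": "amount", "type": "double"},
    ],
}

# POST /subjects/{subject}/versions registers a new schema version and
# returns the globally unique schema id assigned by the registry.
resp = requests.post(
    f"{REGISTRY_URL}/subjects/{SUBJECT}/versions",
    headers={"Content-Type": "application/vnd.schemaregistry.v1+json"},
    data=json.dumps({"schema": json.dumps(avro_schema), "schemaType": "AVRO"}),
)
resp.raise_for_status()
print("Registered schema id:", resp.json()["id"])
```

On Confluent Server, broker-side Schema Validation can then be enabled per topic with the confluent.value.schema.validation=true (and, if needed, confluent.key.schema.validation=true) topic configuration.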
A key requirement of event streaming is broad compatibility between the applications connecting to Kafka. In large organizations, trying to ensure data compatibility can be difficult and ultimately ineffective, so schemas should be treated as “contracts” between producers and consumers. Kafka itself cannot prevent data that does not conform to an agreed format from being published. As a Kafka deployment scales and the number of connected systems and applications grows, so does the risk and uncertainty regarding data quality.
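One way such a “contract” is enforced is by setting a compatibility level on a subject, so the registry rejects new schema versions that would break existing consumers. The sketch below reuses the hypothetical registry URL and subject from the registration example above.

```python
import json
import requests

# Same assumed registry URL and subject as the registration example above.
REGISTRY_URL = "http://localhost:8081"
SUBJECT = "orders-value"

# PUT /config/{subject} sets the compatibility level for the subject; with
# BACKWARD compatibility, a new schema version is rejected unless consumers
# using the previous version can still read the data.
resp = requests.put(
    f"{REGISTRY_URL}/config/{SUBJECT}",
    headers={"Content-Type": "application/vnd.schemaregistry.v1+json"},
    data=json.dumps({"compatibility": "BACKWARD"}),
)
resp.raise_for_status()
print("Compatibility level:", resp.json()["compatibility"])
```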
Hackolade supports the maintenance of Avro schemas, JSON Schemas, and Protobuf schemas in Confluent Schema Registry. Unlike many other Hackolade targets, this does not require a dedicated Schema Registry plugin: it is done respectively through the Avro and ProtoBuf plugins, or the JSON native target.
In effect, Hackolade provides a graphical interface for the design and maintenance of the Avro, JSON, and Protobuf schemas stored in Confluent Schema Registry.

For details on how to define Avro schemas in Hackolade, please consult this Avro schema page.