Confluent Schema Registry provides a serving layer for your metadata. It provides a RESTful interface for storing and retrieving your Avro®, JSON Schema, and Protobuf schemas. It stores a versioned history of all schemas based on a specified subject name strategy, provides multiple compatibility settings, and allows schemas to evolve according to the configured compatibility settings. It also provides serializers that plug into Apache Kafka® clients and handle schema storage and retrieval for Kafka messages sent in any of the supported formats.
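To make schema lookup possible on the consumer side, these serializers prepend a small header to each message: a magic byte (0) followed by the 4-byte big-endian schema ID, with the serialized payload after that. A minimal sketch of decoding that header in Python (the function name is illustrative, not part of any Confluent client library):

```python
import struct

def parse_wire_format(message: bytes):
    """Split a Confluent-framed Kafka message into (schema_id, payload).

    Wire format: 1 magic byte (0x00), a 4-byte big-endian schema ID,
    then the serialized data itself.
    """
    if len(message) < 5:
        raise ValueError("message too short to contain a wire-format header")
    magic, schema_id = struct.unpack(">bI", message[:5])
    if magic != 0:
        raise ValueError(f"unknown magic byte: {magic}")
    return schema_id, message[5:]

# Example: a message framed with schema ID 1 and payload b"hello"
schema_id, payload = parse_wire_format(b"\x00\x00\x00\x00\x01hello")
print(schema_id, payload)  # 1 b'hello'
```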
This README includes the following sections:
- Documentation
- Quickstart API Usage examples
- Installation
- Deployment
- Development
- OpenAPI Spec
- Contribute
- License
Here are a few links to Schema Registry pages in the Confluent Documentation.
- Installing and Configuring Schema Registry
- Schema Management overview
- Schema Registry Tutorial
- Schema Registry API reference
- Serializers, Deserializers for supported schema types
- Kafka Clients
- Schema Registry on Confluent Cloud
The following assumes you have Kafka and an instance of Schema Registry running with the default settings. These examples, and more, are also available at API Usage examples on docs.confluent.io.
# Register a new version of a schema under the subject "Kafka-key"
$ curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" \
--data '{"schema": "{\"type\": \"string\"}"}' \
http://localhost:8081/subjects/Kafka-key/versions
{"id":1}
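Note that the request body embeds the schema as an escaped JSON string inside the "schema" field. Rather than hand-escaping the quotes as in the curl example above, a client can build the body with two rounds of JSON encoding; a small Python sketch (the helper name is illustrative):

```python
import json

def register_payload(schema: dict) -> str:
    """Build the request body for POST /subjects/<subject>/versions.

    The schema itself must be a JSON *string* inside the body, so it is
    JSON-encoded once on its own and once again as part of the envelope.
    """
    return json.dumps({"schema": json.dumps(schema)})

print(register_payload({"type": "string"}))
# {"schema": "{\"type\": \"string\"}"}
```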
# Register a new version of a schema under the subject "Kafka-value"
$ curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" \
--data '{"schema": "{\"type\": \"string\"}"}' \
http://localhost:8081/subjects/Kafka-value/versions
{"id":1}
# List all subjects
$ curl -X GET http://localhost:8081/subjects
["Kafka-value","Kafka-key"]
# List all schema versions registered under the subject "Kafka-value"
$ curl -X GET http://localhost:8081/subjects/Kafka-value/versions
[1]
# Fetch a schema by globally unique id 1
$ curl -X GET http://localhost:8081/schemas/ids/1
{"schema":"\"string\""}
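Both registrations above returned the same ID because Schema Registry assigns IDs by schema content: registering an identical schema under a different subject reuses the existing global ID rather than minting a new one. Conceptually it behaves like the toy model below (a simplification; the real registry canonicalizes schemas before comparing, rather than matching raw strings):

```python
class ToySchemaRegistry:
    """Toy model of content-addressed schema IDs shared across subjects."""

    def __init__(self):
        self._ids = {}        # schema text -> global ID
        self._versions = {}   # subject -> list of global IDs

    def register(self, subject: str, schema: str) -> int:
        # An identical schema gets the same global ID,
        # no matter which subject it is registered under.
        schema_id = self._ids.setdefault(schema, len(self._ids) + 1)
        subject_ids = self._versions.setdefault(subject, [])
        if schema_id not in subject_ids:
            subject_ids.append(schema_id)
        return schema_id

reg = ToySchemaRegistry()
print(reg.register("Kafka-key", '"string"'))    # 1
print(reg.register("Kafka-value", '"string"'))  # 1 -- same content, same ID
```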
# Fetch version 1 of the schema registered under subject "Kafka-value"
$ curl -X GET http://localhost:8081/subjects/Kafka-value/versions/1
{"subject":"Kafka-value","version":1,"id":1,"schema":"\"string\""}
# Fetch the most recently registered schema under subject "Kafka-value"
$ curl -X GET http://localhost:8081/subjects/Kafka-value/versions/latest
{"subject":"Kafka-value","version":1,"id":1,"schema":"\"string\""}
# Delete version 3 of the schema registered under subject "Kafka-value"
$ curl -X DELETE http://localhost:8081/subjects/Kafka-value/versions/3
3
# Delete all versions of the schema registered under subject "Kafka-value"
$ curl -X DELETE http://localhost:8081/subjects/Kafka-value
[1, 2, 3, 4, 5]
# Check whether a schema has been registered under subject "Kafka-key"
$ curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" \
--data '{"schema": "{\"type\": \"string\"}"}' \
http://localhost:8081/subjects/Kafka-key
{"subject":"Kafka-key","version":1,"id":1,"schema":"\"string\""}
# Test compatibility of a schema with the latest schema under subject "Kafka-value"
$ curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" \
--data '{"schema": "{\"type\": \"string\"}"}' \
http://localhost:8081/compatibility/subjects/Kafka-value/versions/latest
{"is_compatible":true}
# Get top level config
$ curl -X GET http://localhost:8081/config
{"compatibilityLevel":"BACKWARD"}
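BACKWARD compatibility means a consumer using the new (latest) schema can read data written with the previous schema; for Avro, the classic example is that you may add a field as long as it has a default. A conceptual sketch of that resolution rule in Python (a simplification of Avro's actual schema-resolution algorithm, with illustrative names):

```python
def read_with_reader_schema(record: dict, reader_fields: list) -> dict:
    """Resolve a record written with an old schema against a newer reader
    schema: known fields are taken from the record, and fields the writer
    did not know about fall back to their defaults (simplified Avro rule)."""
    out = {}
    for field in reader_fields:
        if field["name"] in record:
            out[field["name"]] = record[field["name"]]
        elif "default" in field:
            out[field["name"]] = field["default"]
        else:
            raise ValueError(f"no value and no default for {field['name']}")
    return out

# Old data has only "id"; the new reader schema added "email" with a default,
# so the new consumer can still read old records: BACKWARD compatible.
new_fields = [{"name": "id", "type": "long"},
              {"name": "email", "type": "string", "default": ""}]
print(read_with_reader_schema({"id": 7}, new_fields))
# {'id': 7, 'email': ''}
```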
# Update compatibility requirements globally
$ curl -X PUT -H "Content-Type: application/vnd.schemaregistry.v1+json" \
--data '{"compatibility": "NONE"}' \
http://localhost:8081/config
{"compatibility":"NONE"}
# Update compatibility requirements under the subject "Kafka-value"
$ curl -X PUT -H "Content-Type: application/vnd.schemaregistry.v1+json" \
--data '{"compatibility": "BACKWARD"}' \
http://localhost:8081/config/Kafka-value
{"compatibility":"BACKWARD"}
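A subject-level setting takes precedence over the global (top-level) one: the registry resolves the effective compatibility level by checking for a subject override first and falling back to the global default. As a sketch (the function and variable names are illustrative):

```python
def effective_compatibility(subject: str,
                            subject_overrides: dict,
                            global_level: str = "BACKWARD") -> str:
    """Return the compatibility level enforced for a subject: its own
    override if one was set, otherwise the global default."""
    return subject_overrides.get(subject, global_level)

overrides = {"Kafka-value": "BACKWARD"}       # set via PUT /config/Kafka-value
print(effective_compatibility("Kafka-value", overrides, "NONE"))  # BACKWARD
print(effective_compatibility("Kafka-key", overrides, "NONE"))    # NONE
```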
You can download prebuilt versions of Schema Registry as part of the Confluent Platform. To install from source, follow the instructions in the Development section.
The REST interface to Schema Registry includes a built-in Jetty server. The wrapper scripts bin/schema-registry-start and bin/schema-registry-stop are the recommended way to start and stop the service.
To build a development version, you may need development versions of common and rest-utils. After installing these, you can build Schema Registry with Maven.
This project uses the Google Java code style to keep code clean and consistent.
To build:
mvn compile
To run the unit and integration tests:
mvn test
To run an instance of Schema Registry against a local Kafka cluster (using the default configuration included with Kafka):
mvn exec:java -pl :kafka-schema-registry -Dexec.args="config/schema-registry.properties"
To create a packaged version, optionally skipping the tests:
mvn package [-DskipTests]
It produces:
- Schema registry in
package-schema-registry/target/kafka-schema-registry-package-$VERSION-package
- Serde tools for avro/json/protobuf in
package-kafka-serde-tools/target/kafka-serde-tools-package-$VERSION-package
Each of the produced packages contains a directory layout similar to the packaged binary versions.
You can also produce a standalone fat JAR of Schema Registry using the standalone profile:
mvn package -P standalone [-DskipTests]
This generates package-schema-registry/target/kafka-schema-registry-package-$VERSION-standalone.jar, which includes all the dependencies as well.
OpenAPI (formerly known as Swagger) specifications are built automatically by the swagger-maven-plugin during the compile phase.
Thanks for helping us to make Schema Registry even better!
- Source Code: https://github.com/confluentinc/schema-registry
- Issue Tracker: https://github.com/confluentinc/schema-registry/issues
The project is licensed under the Confluent Community License, except for the client-* and avro-* libs, which are under the Apache 2.0 license.
See LICENSE file in each subfolder for detailed license agreement.