
Getting Started with Schemas and Schema Registries


Implementing schemas over your data is essential for any enduring event streaming system, particularly ones that share data between different microservices or teams. Schemas enforce the implied contract between applications that produce your data and downstream applications that consume your data.

Schema Registry 101 is an introductory course during which you will learn how to use schemas and a schema registry to establish this contract. 

The key concepts of Schema Registry 

In the first course module, you will learn how the schema registry provides what you need to keep client applications in sync with the data changes in your organization or business.

This module is followed by a hands-on exercise during which you will learn how to configure applications to connect with a Kafka cluster, Schema Registry, and ksqlDB in Confluent Cloud. This will prepare you for the hands-on exercises that follow several course modules.
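A typical client configuration for Confluent Cloud combines Kafka connection settings with Schema Registry credentials. The sketch below uses real property names but placeholder endpoints and keys, which you would replace with values from your own Confluent Cloud environment:

```properties
# Kafka cluster connection (placeholder endpoint and API key/secret)
bootstrap.servers=pkc-xxxxx.us-east-1.aws.confluent.cloud:9092
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username='<CLUSTER_API_KEY>' password='<CLUSTER_API_SECRET>';

# Schema Registry connection (placeholder endpoint and API key/secret)
schema.registry.url=https://psrc-xxxxx.us-east-1.aws.confluent.cloud
basic.auth.credentials.source=USER_INFO
basic.auth.user.info=<SR_API_KEY>:<SR_API_SECRET>
```

The same Schema Registry properties are shared by producers, consumers, and ksqlDB clients, which is what keeps all of them resolving schemas from the same registry.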

The schema workflow

In this module you will learn the schema workflow: writing schema files, adding them to a project, and using tools such as Maven and Gradle to generate the model objects that schemas represent and to register and update the schemas in a schema registry.

In the hands-on exercise that follows, you will build, configure, and register Protobuf and Avro schemas. During the exercise you will:

  1. Examine the settings in a Gradle configuration file

  2. Configure Protobuf and Avro schema definitions

  3. Generate model objects from the schema definitions using Gradle

  4. Register the schemas in Confluent Cloud Schema Registry
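The Gradle side of this workflow can be sketched as follows. This is a minimal, hypothetical build.gradle (assumed plugin and library versions, not the exercise's exact file) that wires in code generation for both formats:

```groovy
// Minimal sketch: the Avro plugin generates classes from .avsc files under
// src/main/avro; the protobuf plugin compiles .proto files under src/main/proto.
plugins {
    id 'java'
    id 'com.github.davidmc24.gradle.plugin.avro' version '1.9.1'
    id 'com.google.protobuf' version '0.9.4'
}

repositories {
    mavenCentral()
    maven { url 'https://packages.confluent.io/maven/' }
}

dependencies {
    implementation 'org.apache.avro:avro:1.11.3'
    implementation 'com.google.protobuf:protobuf-java:3.25.1'
}

protobuf {
    protoc { artifact = 'com.google.protobuf:protoc:3.25.1' }
}
```

With a configuration like this, running `./gradlew build` generates the model classes before compiling the rest of the project, so application code can reference them directly.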

Schema formats

In the schema formats module, you will learn about Protobuf and Avro schema definition formats and how to work with generated objects that are built from each.
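To make the comparison concrete, here is the same hypothetical record expressed in both formats. First as an Avro schema definition (an `.avsc` JSON file):

```json
{
  "type": "record",
  "namespace": "io.example",
  "name": "Purchase",
  "fields": [
    {"name": "item", "type": "string"},
    {"name": "total_cost", "type": "double"},
    {"name": "customer_id", "type": "string"}
  ]
}
```

And as a Protobuf definition (a `.proto` file):

```protobuf
syntax = "proto3";
package io.example;

message Purchase {
  string item = 1;
  double total_cost = 2;
  string customer_id = 3;
}
```

Note the structural difference: Avro fields are matched by name during schema resolution, while Protobuf fields carry explicit numeric tags that identify them on the wire.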

Managing schemas

In the managing schemas module, you’ll learn that schema management largely revolves around registering schemas in the schema registry, and you will learn several methods for doing so. You will also see how schema IDs are assigned automatically when schemas are registered, and how version numbers are assigned as schemas evolve and the resulting new schema versions are registered. The module also shows how to view and retrieve schemas from the schema registry.
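Underneath these registration methods sits the Schema Registry REST API. As a sketch, assuming a local registry at `localhost:8081` and a hypothetical subject name, registration and retrieval look like this:

```
# Register an Avro schema under the subject "purchases-value".
# The response contains the automatically assigned schema ID, e.g. {"id":1}.
curl -s -X POST \
  -H "Content-Type: application/vnd.schemaregistry.v1+json" \
  --data '{"schemaType": "AVRO", "schema": "{\"type\":\"record\",\"name\":\"Purchase\",\"fields\":[{\"name\":\"item\",\"type\":\"string\"}]}"}' \
  http://localhost:8081/subjects/purchases-value/versions

# Retrieve the latest registered version of that subject.
curl -s http://localhost:8081/subjects/purchases-value/versions/latest
```

Build-tool plugins and serializers that auto-register schemas are, in effect, wrappers around these same endpoints.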

Integrate Schema Registry with clients

In this module you will take what you’ve learned so far about schemas and schema registry and put it into action—working with client applications. You will start with the Confluent CLI and the console Kafka producer and consumer clients that ship with Schema Registry. You will then learn how to integrate KafkaProducer and KafkaConsumer clients as well as ksqlDB.
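As a taste of the console tooling, the Avro console producer that ships with Schema Registry can register a schema and produce records in one step. The topic name and schema below are illustrative, and the command assumes a local cluster and registry:

```
kafka-avro-console-producer \
  --topic purchases \
  --bootstrap-server localhost:9092 \
  --property schema.registry.url=http://localhost:8081 \
  --property value.schema='{"type":"record","name":"Purchase","fields":[{"name":"item","type":"string"}]}'
```

In a KafkaProducer or KafkaConsumer application the integration point is similar: you configure a schema-aware serializer such as `io.confluent.kafka.serializers.KafkaAvroSerializer` along with the `schema.registry.url` property.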

In the hands-on exercise that follows you will practice what you just learned.

Schema subjects

In this module you will learn about the concept of the schema subject, the different strategies for subject naming, and how to apply them. You will also learn how the schema subject name is used for compatibility checks as well as schema versioning.

Testing schema compatibility

In the final course module you will learn about schema compatibility, the compatibility modes you have at your disposal when using Confluent Schema Registry, and how to make use of them. You will also learn how the Confluent Schema Registry verifies schema compatibility based upon the compatibility modes that you assign to each schema subject. These checks establish guardrails that guide you to safely update schemas and allow you to keep your clients operational as they are updated with the changes.

In the hands-on exercise that follows, you will evolve Protobuf and Avro schemas that you created in prior exercises. You will verify the compatibility of the evolved schemas, identify the cause when a compatibility check fails, make the required correction and verify the resulting successful compatibility check.
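As an illustration of what such a check enforces (using a hypothetical schema, not the exercise's exact files): under BACKWARD compatibility, a consumer using the new schema must be able to read data written with the old one, so any field added to an Avro record needs a default value:

```json
{
  "type": "record",
  "name": "Purchase",
  "fields": [
    {"name": "item", "type": "string"},
    {"name": "total_cost", "type": "double"},
    {"name": "member_discount", "type": "double", "default": 0.0}
  ]
}
```

Omitting the `"default": 0.0` would fail the compatibility check, because records written with the previous schema contain no value for the new field and the consumer would have nothing to fall back on.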

Next steps

Learn more about Confluent Schema Registry by taking the full course on Confluent Developer.

Start the Course

Dave Shook is a senior curriculum developer at Confluent. He previously worked as an instructor for Confluent and as a curriculum developer and instructor at CA Technologies. Most recently, Dave collaborated with Jun Rao in writing the Apache Kafka Internal Architecture course. In his spare time, Dave enjoys many outdoor activities including hiking, cycling, and kayaking, as well as spending time with his grandchildren.
