Imagine you are developing a streaming application that consumes data from a topic that has a defined schema. You are finally ready to test the application and realize you need a way to produce a sample schematized message. What do you do?
Today, there are a few ways to approach this scenario. You might scramble to spin up a lightweight producer with the right schema information to produce sample messages. Or, you might use the Confluent CLI and try to find the right file or ID for the schema you need. However, each of these methods has its drawbacks, mainly the time it takes to set them up from scratch. At Confluent, we’ve learned through internal and external reviews that many developers use the Confluent Cloud Console’s message-producing functionality to create sample messages; however, the console lacked the ability to produce schematized messages directly from the UI.
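For context, here is roughly what the “lightweight producer” route involves. The sketch below uses the Java client with the Confluent Avro serializer; the endpoints, credentials, the purchase_orders topic, and the schema fields are placeholders for illustration only, not something prescribed by the console feature described in this post.

```java
import java.util.Properties;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import io.confluent.kafka.serializers.KafkaAvroSerializer;

public class SampleMessageProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder endpoints and credentials -- replace with your cluster's values.
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "<bootstrap-server>");
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
            "org.apache.kafka.common.security.plain.PlainLoginModule required "
            + "username=\"<api-key>\" password=\"<api-secret>\";");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class.getName());
        // Schema Registry connection so the serializer can look up or register the schema.
        props.put("schema.registry.url", "<schema-registry-url>");
        props.put("basic.auth.credentials.source", "USER_INFO");
        props.put("basic.auth.user.info", "<sr-api-key>:<sr-api-secret>");

        // Hypothetical Avro schema matching the purchase_orders-value example used later in this post.
        Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"PurchaseOrder\",\"fields\":["
            + "{\"name\":\"orderid\",\"type\":\"int\"},"
            + "{\"name\":\"item\",\"type\":\"string\"},"
            + "{\"name\":\"amount\",\"type\":\"double\"}]}");

        // Build a single sample record that conforms to the schema.
        GenericRecord value = new GenericData.Record(schema);
        value.put("orderid", 59);
        value.put("item", "keyboard");
        value.put("amount", 39.99);

        // Produce one test message with no key and an Avro-serialized value.
        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("purchase_orders", null, value));
            producer.flush();
        }
    }
}
```

That is a lot of ceremony for a handful of test messages, which is exactly the overhead the new console workflow removes.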
To make application testing for topics with schemas easier, we are excited to announce that Confluent Cloud Console can now produce messages that are serialized with schemas. Developers can send sample messages by simply selecting their schema and entering a compliant message in the input field. Furthermore, the console can validate messages ad hoc to ensure compliant messages are produced to the topic, avoiding issues for downstream services. Let’s take a closer look.
After logging in to Confluent Cloud Console, navigate to <your cluster> -> Topics -> <your topic> -> Messages. This screen shows the most recent messages that are in your topic. To produce a new message, click on the Actions button and select Produce new message.
In our new interface, you can now individually enter headers, the key, and the value. For the key and value, a dropdown lets you select the schema subject or produce without a schema. The functionality only supports subjects that use TopicNameStrategy; however, it does support different schema contexts. Note that if no schema is available for the topic, the dropdown is locked to Produce without schema. In this example, the key will be produced without a schema, and the value with the schema labeled Default - purchase_orders-value. In the label, “Default” refers to the default schema context and “purchase_orders-value” is the schema subject name. If the topic has more than one context, other options will appear for those contexts. At any time, clicking the View schema link opens the definition of the selected schema in a new tab.
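To make the example concrete, a subject like purchase_orders-value could hold an Avro schema along these lines (a hypothetical definition; the schema registered in your environment will differ):

```json
{
  "type": "record",
  "name": "PurchaseOrder",
  "fields": [
    { "name": "orderid", "type": "int" },
    { "name": "item",    "type": "string" },
    { "name": "amount",  "type": "double" }
  ]
}
```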
While entering the key or value, you can click the Validate button at any time to check the message against the chosen schema. The schema selected in this example expects the “orderid” field to be an integer. If the value is changed to the string “59” instead of the integer, clicking Validate surfaces a validation error. Note that the exact output of the validation will vary depending on whether you use the Avro, Protobuf, or JSON Schema format. The Produce button also performs a validation check before actually producing the message, so you can produce a message without manually clicking Validate every time.
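To illustrate with the hypothetical purchase_orders-value schema sketched above, a value like the following would be flagged because “orderid” is a string:

```json
{ "orderid": "59", "item": "keyboard", "amount": 39.99 }
```

Changing “59” back to the integer 59 yields a value the schema accepts:

```json
{ "orderid": 59, "item": "keyboard", "amount": 39.99 }
```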
Once the message is corrected, the Validate button returns a valid confirmation, and the Produce button successfully produces the message. After a short delay, you should see the newly produced message in the message browser.
For developers looking for an easier way to produce messages with schemas, Confluent Cloud Console now has you covered. You no longer have to spin up custom code or craft a long command with many flags to produce a few simple test messages. Furthermore, the validation feature ensures consistency during your testing and prevents incorrect messages from polluting your topic. Visit Confluent Cloud today to get started!