A good command line interface is essential for developer productivity. If you look at any of the major cloud providers, they all offer a robust CLI that enables you to be highly productive. The key benefits of a CLI include:
Increased productivity: You can execute workflows faster than you could in a UI, and you can chain commands together to accomplish complex tasks quickly.
Automation with scripting: Developers can script CLI commands to automate repetitive tasks for even greater productivity gains.
Flexibility: With a CLI, you have complete flexibility in performing any task since you can specify many parameters simultaneously.
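As a sketch of what chaining looks like in practice, here is a hypothetical two-step workflow: create a cluster, capture its ID from the JSON output (`-o json` is a real Confluent CLI flag, but the `.id` field and the jq usage are assumptions), and immediately mint an API key for it.

```shell
# Hypothetical chaining sketch: the output of one command (the cluster ID)
# becomes the input of the next (API key creation). Requires jq.
create_cluster_with_key() {
  local id
  id=$(confluent kafka cluster create my-cluster \
        --cloud aws --region us-east-1 -o json | jq -r '.id')
  confluent api-key create --resource "$id"
}
```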
Confluent offers a powerful CLI that lets you quickly create and manage Apache Kafka® clusters and Apache Flink® compute pools, along with all the operations associated with both.
The mark of great CLI design is that each command does one thing only, and there is exactly one command for any given task. In that respect, the CLI is a collection of building blocks you can assemble for more complex situations.
For example, consider deleting API keys. There is a command to delete a single API key, but to clean up all API keys you would need to list them and then remove each one individually. Automation with scripting quickly solves this: a user could write a script that uses the CLI to list the API keys and then executes the delete command for each one. While this approach works, it has a couple of drawbacks.
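Concretely, such a cleanup script might be sketched like this (the JSON field name and the --force flag are assumptions; adjust them to the actual CLI output):

```shell
# Sketch of the scripted cleanup: list every API key as JSON, extract
# each key, and delete it. Requires jq.
delete_all_api_keys() {
  confluent api-key list -o json |
    jq -r '.[].key' |
    while read -r key; do
      confluent api-key delete "$key" --force
    done
}
```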
First, an automated script exists outside the CLI and involves a context switch to locate and execute it. Second, for any problem you encounter, you can rest assured others on your team will face the same issue. You could share your script, but that’s outside the CLI. A better solution is to put custom commands directly in the CLI. By providing the ability to use custom commands, they are immediately discoverable to all users of the CLI.
The solution for allowing custom commands is plugins. Plugins can use existing CLI commands as building blocks to produce more complex behavior and provide users with an elegant means to execute dynamic workflows within the CLI.
The Confluent CLI now offers a framework for extending its functionality with plugins. What's a plugin? Plugins are standalone executable files whose names begin with confluent- and can be written in any supported language, which currently are Go, Python, and Bash. Because of this, plugins open the door to everything from simple scripts to very complex CLI workflows, conveniently available with one command. For help with writing your first plugin, see the documentation.
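To make the convention concrete, here is a minimal, hypothetical Bash plugin (the file name and greeting are illustrative):

```shell
#!/usr/bin/env bash
# File: confluent-hello -- a minimal, hypothetical Bash plugin.
# Make it executable (chmod +x confluent-hello) and place it on your
# $PATH; the CLI will then surface it as `confluent hello`.
greeting="Hello from a Confluent CLI plugin!"
echo "$greeting"
```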
To install a plugin, place your executable file (its name starting with confluent-) on your $PATH. Running confluent plugin list searches your $PATH for plugin executables and lists them. Confluent also maintains a repository of publicly available plugins: running confluent plugin search lists all plugins available for installation from the Confluent CLI plugin repository. Then, to install a plugin, run confluent plugin install <PLUGIN NAME>.
The Confluent plugin repo also contains instructions for contributing a plugin.
We’ve discussed the benefits of using plugins for developing a complex CLI workflow, but let’s look at a concrete example. Confluent Cloud for Apache Flink, currently available in preview, provides a cloud-native experience for Flink, which means you can focus entirely on your business logic, encapsulated in SQL statements.
To use Flink in Confluent Cloud from the CLI, here’s a summary of the steps you’ll take:
Create a Flink compute pool
Specify the number of Confluent Flink units
Enable Schema Registry in the environment containing the Flink compute pool
Create a Kafka cluster, which is a Flink database—Flink can query and join data that are in different clusters/databases
Get the API key and secret for the cluster
Specify the database to use for your Flink queries
Create topics, which Flink exposes as tables; note that creating a table in Flink likewise creates a topic and associated schema
Start a Flink shell session
Optionally create one or more Confluent datagen connectors that will provide sample data for you to experiment with in Flink SQL
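The core of that workflow can be sketched as individual CLI commands. The resource names, cloud/region values, and the placeholder cluster ID below are illustrative; see `confluent flink --help` for the exact options.

```shell
# Rough sketch of the manual Flink setup steps described above.
flink_manual_setup() {
  confluent flink compute-pool create my-pool \
    --cloud aws --region us-east-1 --max-cfu 5   # size the pool in CFUs
  confluent kafka cluster create my-flink-db \
    --cloud aws --region us-east-1               # the cluster is a Flink "database"
  local cluster_id="lkc-123456"                  # placeholder: your cluster's ID
  confluent api-key create --resource "$cluster_id"
  confluent flink shell                          # interactive Flink SQL session
}
```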
While each step is simple enough, you must execute a series of commands to get going, sometimes feeding the output of one command into another (e.g., API keys). But the confluent-flink-quickstart plugin seamlessly handles all of these steps for you with a single command.
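That single invocation might look something like the following. The flag names here are assumptions based on the manual steps above; run `confluent flink quickstart --help` to confirm the actual options.

```shell
# One-command Flink setup via the quickstart plugin (flags are assumptions).
flink_quickstart() {
  confluent flink quickstart \
    --name my-flink-env \
    --cloud aws \
    --region us-east-1 \
    --max-cfu 5
}
```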
In summary, the Confluent CLI is a powerful tool that enables Kafka and Flink developers and administrators to accomplish their tasks quickly, and its plugin framework lets them extend the CLI's functionality and share those productivity gains.
GitHub: Stream Processing with Confluent Cloud for Apache Flink
Documentation: Install the Confluent CLI
Documentation: Getting Started with Confluent CLI
GitHub: Confluent CLI plugins