In modern IT architectures, where events must be processed and distributed in real time, more and more companies are turning to Apache Kafka — or its commercial variant, Confluent Kafka.
Example: A global corporation wants to distribute status information from one system to multiple microservices almost instantly — including validation of data formats via Schema Registry. The twist: data should be processed and sent directly through Automic Automation.
What sounds simple in theory comes with several challenges in practice — especially when integrating with Confluent Cloud.
What is Kafka?
Kafka was originally developed at LinkedIn and donated to the Apache Software Foundation in 2012. It is a distributed platform for real-time event streaming and messaging.
The original developers later founded Confluent, a commercial provider that builds on Kafka and adds a variety of features, including:
- Easy cloud deployment
- Intuitive web interface
- Extended authentication (e.g., OAuth via Microsoft Entra ID)
- Schema Registry for AVRO serialization
These features make Confluent Kafka particularly attractive for regulated and distributed environments.
Requirements
- Kafka variant: Confluent Kafka
- Authentication
  - Topic: OAuth via Microsoft Entra ID (ServicePrincipal)
  - Schema Registry: BasicAuth (Confluent Cloud)
- Schema Registry: AVRO (mandatory)
Why Not the Native Automic Automation Kafka Agent?
The native Kafka agent of Automic Automation does not currently support the advanced features of Confluent Cloud, particularly Schema Registry with AVRO.
The solution: A custom-built Kafka Producer based on Python, fully controlled by Automic Automation.

Interface Architecture

The final integration was implemented entirely via the Automic Automation Unix Agent.
The process follows these steps:
- Install the Confluent Kafka libraries (including dependencies) on the Automic Automation Unix Agent.
- Install the required Python modules on the Automic Automation Unix Agent.
- Generate the Python producer script generically.
- Create an Automic Automation job that generates the Python script and executes it.
Installing the Confluent Kafka Library

Installation of the Confluent Kafka libraries and dependencies on the Unix Agent:
yum install -y python3 python3-pip python3-devel gcc make cyrus-sasl-gssapi librdkafka-devel
Installing Python Modules

Next, the necessary Python modules must also be installed on the Automic Automation agent.
A dedicated job plan was created in Automic Automation to automatically install them across all active agents of the relevant host group.
Note: Python modules can be installed in the user space of the executing account — typically required when installation is automated via an Automic Automation job plan.
- Generate a requirements file, e.g. /tmp/kafka_python_requirements.txt:
cat <<'EOF' > /tmp/kafka_python_requirements.txt
confluent-kafka==2.4.0
fastavro
pydantic
pydantic_avro
EOF
- Install the modules:
  - General installation:
pip3 install -r /tmp/kafka_python_requirements.txt
  - Installation within the current user space:
pip3 install -r /tmp/kafka_python_requirements.txt --user
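Because a job plan rolls the installation out across all agents of a host group, it can help to verify afterwards that every module is actually importable. A minimal, stdlib-only check could look like this (the commented module list reflects the requirements file above; note that the pip package confluent-kafka imports as confluent_kafka):

```python
import importlib.util


def missing_modules(names):
    """Return the subset of module names that cannot be imported."""
    return [n for n in names if importlib.util.find_spec(n) is None]


# On the agent, check the import names of the installed packages:
#   missing_modules(["confluent_kafka", "fastavro", "pydantic", "pydantic_avro"])
# An empty result means the installation succeeded.
print(missing_modules(["json", "no_such_module_xyz"]))
```

Such a check can run as a post-step of the same Automic Automation job plan, failing the plan early on agents where the installation did not take effect.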
Python Script Confluent Kafka Producer

Here a generic Python script is generated by Automic Automation – including all parameters.
Important exception: passwords are never written into the generated script; they are passed as parameters only at execution time.
Note: An example Producer for Confluent Kafka can be found below.
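The full producer is attached; purely as an illustration, the two configuration blocks such a script needs could be assembled as below. All endpoints, IDs, and the schema are placeholders, and the property names follow the librdkafka/confluent-kafka conventions (sasl.oauthbearer.* for OIDC client credentials, basic.auth.user.info for the Schema Registry):

```python
import json


def kafka_conf(bootstrap, client_id, client_secret, token_url, scope):
    """Producer config for OAuth (OIDC client credentials) against Confluent Cloud.
    All values are placeholders; the secret arrives as a runtime parameter."""
    return {
        "bootstrap.servers": bootstrap,
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "OAUTHBEARER",
        "sasl.oauthbearer.method": "oidc",
        "sasl.oauthbearer.client.id": client_id,
        "sasl.oauthbearer.client.secret": client_secret,
        "sasl.oauthbearer.token.endpoint.url": token_url,  # Entra ID token endpoint
        "sasl.oauthbearer.scope": scope,
    }


def schema_registry_conf(url, api_key, api_secret):
    """Schema Registry config with BasicAuth (Confluent Cloud API key/secret)."""
    return {"url": url, "basic.auth.user.info": f"{api_key}:{api_secret}"}


# Example AVRO value schema (placeholder fields):
STATUS_SCHEMA = json.dumps({
    "type": "record",
    "name": "StatusEvent",
    "fields": [
        {"name": "system", "type": "string"},
        {"name": "status", "type": "string"},
    ],
})

# These dicts would then be handed to confluent_kafka.Producer(...) and
# confluent_kafka.schema_registry.SchemaRegistryClient(...), together with an
# AvroSerializer built from STATUS_SCHEMA, to produce validated records.
```

The attached script wires these pieces together; the sketch only shows which parameters Automic Automation has to inject when generating it.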
AA Job

An Automic Automation job generates the individual Python producer, including dynamic transfer of all parameters.
Sensitive data (e.g. passwords) is managed in protected Automic Automation objects and passed securely at runtime.
The script is executed via the “UC4 JOBMELDER” mechanism, which ensures that sensitive passwords remain protected.
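How a secret reaches the script at runtime without ever being written into the generated file can be sketched with stdlib argument parsing. The flag names here are illustrative, not the attached script's actual interface:

```python
import argparse


def parse_args(argv=None):
    """Read runtime parameters; secrets never appear in the generated script.
    Flag names are illustrative examples, not the attached script's interface."""
    p = argparse.ArgumentParser(description="Confluent Kafka producer (sketch)")
    p.add_argument("--topic", required=True)
    p.add_argument("--client-secret", required=True,
                   help="injected at runtime from a protected Automic object")
    return p.parse_args(argv)


args = parse_args(["--topic", "status.events", "--client-secret", "dummy"])
print(args.topic)
```

At runtime, the Automic Automation job resolves the secret from its protected object and appends it to the command line it builds, so the value exists only for the duration of the execution.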
Download Confluent Kafka Producer – Python
An example producer script for Confluent Kafka is provided in the attachment.
Usage is described in the included README.
Producer Details:
- Authentication Kafka Topic: OAuth via Microsoft Entra ID
- Authentication Schema Registry: Access with BasicAuth Confluent Cloud
- Schema Registry: Definition of the data structure via AVRO schema
Conclusion
This solution shows how tailored integration between commercial platforms like Confluent Kafka and established automation tools like Automic Automation is possible — even when the native agent reaches its functional limits.
By combining Python with modern authentication, we built a robust, secure, and extensible interface that’s both technically sound and operationally safe.
About the Author
Jonathan Koch is Managing Consultant at setis and was recognized by Broadcom as a Knight for Automic Automation. He brings many years of experience in delivering complex automation projects, with a strong focus on the banking industry.

