Mock Kafka Message Generator
Building reliable Apache Kafka consumers and event-driven microservices starts with realistic test data. This mock Kafka message generator produces complete, schema-valid Kafka message objects — including topic name, partition, offset, message key, headers, and a typed JSON value payload — so you can test consumer logic without spinning up a real broker. Pick a domain-appropriate event type like user.created, order.placed, or payment.processed and get back properly structured messages in seconds.

Each generated message mirrors what a real Kafka consumer receives from a broker: an integer partition assignment, a monotonically increasing offset, a UUID-based record key for partition routing, and standard headers such as content-type and correlation-id. The JSON payload is tailored to the selected event type, so a payment.processed message carries amount, currency, and status fields rather than generic placeholder values.

These fake Kafka messages slot directly into unit tests for Kafka Streams topologies, Apache Flink jobs, and Spring Kafka listener methods. Paste a generated record into a MockConsumer, a TestInputTopic, or a WireMock stub and your consumer under test sees data that behaves like production traffic. You can also pipe batches into kafka-console-producer to drive integration tests against a local or containerised broker.

The generator supports common event domains out of the box, covering e-commerce, identity, and payment workflows. Adjust the topic name to match your actual topic naming convention, choose the event type closest to your schema, and set the count to generate as many records as your test scenario requires. The result is copy-paste-ready JSON that saves the tedious manual work of crafting realistic Kafka payloads by hand.
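To make the structure concrete, here is a sketch of what one generated record looks like. The field names and values below are illustrative of the shape described above, not a fixed schema:

```python
import json
import uuid

# Illustrative shape of one generated Kafka record (assumed field names).
message = {
    "topic": "order-service.events",
    "partition": 2,
    "offset": 1042,
    "key": str(uuid.uuid4()),  # UUID-based record key for partition routing
    "headers": {
        "content-type": "application/json",
        "correlation-id": str(uuid.uuid4()),
    },
    # Payload tailored to the selected event type:
    "value": {
        "eventType": "payment.processed",
        "amount": 49.99,
        "currency": "EUR",
        "status": "COMPLETED",
    },
}

print(json.dumps(message, indent=2))
```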
How to Use
- Enter your Kafka topic name in the Topic Name field to match your actual topic, e.g. order-service.events.
- Select the Event Type that matches the domain you are testing — user.created, order.placed, or payment.processed.
- Set the How Many counter to the number of records your test scenario requires, then click Generate.
- Copy the generated JSON array and paste it into your unit test, MockConsumer setup, or kafka-console-producer input file.
- Adjust individual field values in the copied payload to cover edge cases like null amounts or missing user IDs.
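The last step — tweaking a copied payload for edge cases — takes only a couple of lines. A minimal sketch, assuming a payment.processed payload with an amount field:

```python
import json

# A copied payment.processed payload (field names are illustrative).
raw = '{"value": {"amount": 49.99, "currency": "EUR", "status": "COMPLETED"}}'
record = json.loads(raw)

# Null the amount to exercise the consumer's missing-amount branch.
record["value"]["amount"] = None

edge_case = json.dumps(record)
```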
Use Cases
- Seeding a Kafka Streams TestInputTopic with realistic domain events
- Writing unit tests for Spring Kafka @KafkaListener consumer methods
- Simulating dead-letter queue payloads for error-handling logic tests
- Populating a MockConsumer when testing offset commit and rebalance behaviour
- Generating order.placed events to stress-test an inventory update service
- Creating payment.processed messages to validate idempotency in billing consumers
- Mocking Kafka records for Flink DataStream source testing
- Demonstrating Kafka message structure during team onboarding or documentation
Tips
- Match the topic name exactly to your environment's naming convention so generated messages need no editing before use in tests.
- Generate a batch of 10 or more, then cherry-pick specific records to cover both happy-path and failure scenarios in the same test suite.
- The message key is a UUID by default — replace it with a fixed value when testing partition-ordered processing to ensure all test records land on the same partition.
- Copy the headers object into a Spring Kafka MessageHeaders or a Flink KafkaRecordSerializationSchema to test header-aware consumer branches.
- Combine payment.processed and user.created events in a single test to verify that your consumer correctly routes multiple event types from the same topic.
- When testing offset-commit logic, edit the offset field in consecutive generated records to be sequential integers starting from a non-zero value, simulating a mid-stream consumer restart.
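The offset-renumbering tip can be scripted rather than done by hand. A minimal sketch, assuming each generated record carries an integer offset field:

```python
# Rewrite the offsets of a generated batch to be sequential from a
# non-zero start, simulating a consumer resuming mid-stream.
def renumber_offsets(records, start):
    for i, record in enumerate(records):
        record["offset"] = start + i
    return records

batch = [{"offset": 0}, {"offset": 0}, {"offset": 0}]
renumber_offsets(batch, 5000)
# batch offsets are now 5000, 5001, 5002
```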
FAQ
What fields does a Kafka message actually contain?
A Kafka record has a key, a value (the payload), headers, and broker-assigned metadata: topic name, partition number, and offset. The key is used for partition routing; the value carries the event data, usually serialised as JSON or Avro. Headers hold cross-cutting concerns like content-type, trace IDs, or schema version.
How does a Kafka partition key affect message ordering?
Kafka guarantees ordering only within a single partition. When a producer sends messages with the same key — say, a user UUID — they always land on the same partition, keeping all events for that user in sequence. Without a key, messages are spread across partitions (round-robin in older clients, sticky batching since Kafka 2.4) and ordering is lost across them.
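The routing invariant can be sketched in a few lines. Kafka's default partitioner hashes the key bytes with murmur2; the stand-in hash below is not that algorithm, it only demonstrates the property that matters for ordering — the same key always maps to the same partition:

```python
# Sketch of key-based partition routing (stand-in hash, NOT Kafka's murmur2).
def partition_for(key: bytes, num_partitions: int) -> int:
    h = 0
    for b in key:
        h = (h * 31 + b) & 0x7FFFFFFF
    return h % num_partitions

user_key = b"a1b2c3d4-user-uuid"
p1 = partition_for(user_key, 6)
p2 = partition_for(user_key, 6)
assert p1 == p2  # same key always routes to the same partition
```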
How do I use these generated messages in a JUnit Kafka Streams test?
Copy the generated JSON value into a TopologyTestDriver test using TestInputTopic.pipeInput(). Use the generated key as the record key and the headers map to populate a Headers object. The TestInputTopic accepts these directly, so you can drive your topology without a real broker.
Can I use these mock messages with kafka-console-producer?
Yes. Extract the key and value fields from the generated JSON, then pass them to kafka-console-producer --property parse.key=true --property key.separator=: — format each line as key:value. For headers, use a producer client such as kafka-python or confluent-kafka, both of which expose a headers parameter per message.
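Producing those key:value lines from generated records can be scripted. A minimal sketch, assuming each record has key and value fields as described above:

```python
import json

# Turn generated records into key:value lines suitable for
# kafka-console-producer --property parse.key=true --property key.separator=:
records = [
    {"key": "k-1", "value": {"eventType": "order.placed", "orderId": "o-42"}},
    {"key": "k-2", "value": {"eventType": "order.placed", "orderId": "o-43"}},
]

lines = [
    f'{r["key"]}:{json.dumps(r["value"], separators=(",", ":"))}'
    for r in records
]

print("\n".join(lines))
```

Write the lines to a file and redirect it into kafka-console-producer to replay the batch against a local broker.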
What is a Kafka offset and why does it matter for testing?
The offset is a sequential integer assigned by the broker to each message within a partition. Consumers commit offsets to track which messages they have processed. In tests, realistic offset values let you verify that your consumer correctly handles offset commits, seeks, and replay scenarios without hitting a real cluster.
How do I simulate a dead-letter queue scenario with mock messages?
Generate a batch of messages using an event type that your consumer is known to fail on — for example, a malformed payment.processed payload. Feed those into your consumer's input topic in tests, then assert that the consumer publishes them to the DLQ topic with an error-reason header attached.
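The consumer-side assertion can be sketched without a broker. Everything below is hypothetical — the DLQ topic name, the header key, and the handle function stand in for your consumer's real error-handling path:

```python
import json

DLQ_TOPIC = "payments.dlq"  # hypothetical DLQ topic name

def handle(record):
    """Consume a payment.processed record; route malformed payloads
    to the DLQ with an error-reason header attached."""
    try:
        payload = json.loads(record["value"])
        if "amount" not in payload:
            raise ValueError("missing amount")
        return ("processed", None)
    except (ValueError, json.JSONDecodeError) as exc:
        dlq_record = dict(record)
        dlq_record["headers"] = {**record.get("headers", {}),
                                 "error-reason": str(exc)}
        return (DLQ_TOPIC, dlq_record)

# A malformed payment.processed payload (value is not valid JSON).
bad = {"value": "{not json", "headers": {"content-type": "application/json"}}
topic, routed = handle(bad)
assert topic == "payments.dlq"
assert "error-reason" in routed["headers"]
```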
Do the generated payloads match real schemas used in production?
The payloads are realistic approximations of common event shapes — they include the fields you would expect for each event type — but they are not tied to any specific schema registry or Avro IDL. Treat them as a starting point: copy the generated JSON and adjust field names and types to match your actual Avro or JSON Schema definition.
How many messages should I generate for a load test?
For unit tests, three to ten records is usually enough to cover happy path and edge cases. For load or throughput tests, generate a large batch here, write them to a file, then replay the file with kafka-console-producer or a custom producer loop. Kafka can sustain millions of messages per second, so the bottleneck is usually your consumer, not the producer.