@mackwic
Created April 28, 2017 08:06
[root@kafka-a-01 /]# /opt/kafka_current/bin/kafka-console-consumer.sh
The console consumer is a tool that reads data from Kafka and outputs it to standard output.
--blacklist <blacklist>
        Blacklist of topics to exclude from consumption.
--bootstrap-server <server to connect to>
        REQUIRED (unless old consumer is used): The server to connect to.
--consumer-property <consumer_prop>
        A mechanism to pass user-defined properties in the form key=value to the consumer.
--consumer.config <config file>
        Consumer config properties file. Note that [consumer-property] takes precedence over this config.
--csv-reporter-enabled
        If set, the CSV metrics reporter will be enabled.
--delete-consumer-offsets
        If specified, the consumer path in zookeeper is deleted when starting up.
--enable-systest-events
        Log lifecycle events of the consumer in addition to logging consumed messages. (This is specific to system tests.)
--formatter <class>
        The name of a class to use for formatting kafka messages for display. (default: kafka.tools.DefaultMessageFormatter)
--from-beginning
        If the consumer does not already have an established offset to consume from, start with the earliest message present in the log rather than the latest message.
--key-deserializer <deserializer for key>
--max-messages <Integer: num_messages>
        The maximum number of messages to consume before exiting. If not set, consumption is continual.
--metrics-dir <metrics directory>
        If csv-reporter-enabled is set, and this parameter is set, the CSV metrics will be output here.
--new-consumer
        Use the new consumer implementation. This is the default.
--offset <consume offset>
        The offset id to consume from (a non-negative number), or 'earliest' which means from the beginning, or 'latest' which means from the end. (default: latest)
--partition <Integer: partition>
        The partition to consume from.
--property <prop>
        The properties to initialize the message formatter.
--skip-message-on-error
        If there is an error when processing a message, skip it instead of halting.
--timeout-ms <Integer: timeout_ms>
        If specified, exit if no message is available for consumption for the specified interval.
--topic <topic>
        The topic id to consume on.
--value-deserializer <deserializer for values>
--whitelist <whitelist>
        Whitelist of topics to include for consumption.
--zookeeper <urls>
        REQUIRED (only when using old consumer): The connection string for the zookeeper connection in the form host:port. Multiple URLs can be given to allow fail-over.
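For reference, a minimal invocation could look like the following sketch. The broker address, ZooKeeper address, topic, and group names are placeholders rather than values taken from this cluster; every flag is taken from the help text above.

# New consumer: read a topic from the earliest available offset and stop after 10 messages
/opt/kafka_current/bin/kafka-console-consumer.sh \
    --bootstrap-server kafka-a-01:9092 \
    --topic my-topic \
    --from-beginning \
    --max-messages 10

# Same, but passing a consumer config property (here a consumer group id) on the command line
/opt/kafka_current/bin/kafka-console-consumer.sh \
    --bootstrap-server kafka-a-01:9092 \
    --topic my-topic \
    --consumer-property group.id=my-group

# Old consumer: connect through ZooKeeper instead of a broker bootstrap address
/opt/kafka_current/bin/kafka-console-consumer.sh \
    --zookeeper zk-01:2181 \
    --topic my-topic \
    --from-beginning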
[root@kafka-a-01 /]# /opt/kafka_current/bin/kafka-simple-consumer-shell.sh
A low-level tool for fetching data directly from a particular replica.
--broker-list <hostname:port,...,hostname:port>
        REQUIRED: The list of hostname and port of the server to connect to.
--clientId <clientId>
        The ID of this client. (default: SimpleConsumerShell)
--fetchsize <Integer: fetchsize>
        The fetch size of each request. (default: 1048576)
--formatter <class>
        The name of a class to use for formatting kafka messages for display. (default: kafka.tools.DefaultMessageFormatter)
--max-messages <Integer: max-messages>
        The number of messages to consume. (default: 2147483647)
--max-wait-ms <Integer: ms>
        The max amount of time each fetch request waits. (default: 1000)
--no-wait-at-logend
        If set, when the simple consumer reaches the end of the log, it will stop rather than wait for newly produced messages.
--offset <Long: consume offset>
        The offset id to consume from; -2 means from the beginning, -1 means from the end. (default: -2)
--partition <Integer: partition>
        The partition to consume from. (default: 0)
--print-offsets
        Print the offsets returned by the iterator.
--property <prop>
--replica <Integer: replica id>
        The replica id to consume from; -1 means the leader broker. (default: -1)
--skip-message-on-error
        If there is an error when processing a message, skip it instead of halting.
--topic <topic>
        REQUIRED: The topic to consume from.
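As a sketch, assuming the same placeholder broker and topic names as above, the tool can be pointed at a single partition and replica; all flags come from the help text above.

# Fetch the first 10 messages of partition 0 from the leader (-2 = start from the beginning),
# print their offsets, and exit once the end of the log is reached
/opt/kafka_current/bin/kafka-simple-consumer-shell.sh \
    --broker-list kafka-a-01:9092 \
    --topic my-topic \
    --partition 0 \
    --offset -2 \
    --max-messages 10 \
    --print-offsets \
    --no-wait-at-logend

# Same fetch, but read directly from a specific follower replica (replica id 2 is a placeholder)
/opt/kafka_current/bin/kafka-simple-consumer-shell.sh \
    --broker-list kafka-a-01:9092 \
    --topic my-topic \
    --partition 0 \
    --replica 2 \
    --max-messages 10 \
    --no-wait-at-logend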