I’ve since written a full blog post on this particular Single Message Transform. It can be found at https://rmoff.net/2020/12/17/twelve-days-of-smt-day-8-timestampconverter/
"_comment": "Use SMT to cast op_ts and current_ts to timestamp datatype (TimestampConverter is Kafka >=0.11 / Confluent Platform >=3.3). Format from https://docs.oracle.com/javase/8/docs/api/java/text/SimpleDateFormat.html",
"transforms.convert_op_ts.format": "yyyy-MM-dd HH:mm:ss.SSSSSS",
"transforms.convert_current_ts.format": "yyyy-MM-dd HH:mm:ss.SSSSSS"
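For context, the format lines above belong to a TimestampConverter stanza. Here is a minimal sketch of what the two full transforms might look like; the field names op_ts and current_ts are assumptions carried over from the transform aliases, not taken from a verified working config:

```json
"transforms": "convert_op_ts,convert_current_ts",
"transforms.convert_op_ts.type": "org.apache.kafka.connect.transforms.TimestampConverter$Value",
"transforms.convert_op_ts.target.type": "Timestamp",
"transforms.convert_op_ts.field": "op_ts",
"transforms.convert_op_ts.format": "yyyy-MM-dd HH:mm:ss.SSSSSS",
"transforms.convert_current_ts.type": "org.apache.kafka.connect.transforms.TimestampConverter$Value",
"transforms.convert_current_ts.target.type": "Timestamp",
"transforms.convert_current_ts.field": "current_ts",
"transforms.convert_current_ts.format": "yyyy-MM-dd HH:mm:ss.SSSSSS"
```

Each alias listed in "transforms" needs its own .type, .target.type, .field, and (for string conversions) .format property.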
Very helpful indeed!!
I am struggling with one thing concerning my Elasticsearch connector. The "transforms.convert_op_ts.target.type" setting actually works with "string" and "unix", but not with the "Date" and "Timestamp" types. Here is my sink config:
I am looking to integrate it with Grafana and as you know the date field should be of "Date" type. I am getting the following error:
Any idea on how to address this issue?
Thanks for your help
Hello, I'm generating data using an Avro schema, and this is the field I'm using to generate timestamps:
But then I get this as a result: 3414461-02-18 00:36:47.000234. Can anyone please advise on what to do?
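A year that far out of range usually means a value has been interpreted at the wrong scale rather than the data itself being wrong. One pitfall worth checking (a sketch of the general trap, not a diagnosis of this exact schema): TimestampConverter's format strings use java.text.SimpleDateFormat, where "S" counts whole milliseconds rather than being a fractional-second digit, so a six-digit fraction parsed with "SSSSSS" is read as that many milliseconds and silently shifts the instant:

```java
import java.text.SimpleDateFormat;
import java.util.Date;

public class FractionPitfall {
    public static void main(String[] args) throws Exception {
        // In SimpleDateFormat, "S" is a count of milliseconds, not a
        // fractional-second field. Parsing a six-digit fraction with SSSSSS
        // reads 123456 as 123,456 ms (~2 minutes), shifting the timestamp.
        SimpleDateFormat withMicros = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSSSSS");
        SimpleDateFormat plain = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");

        Date parsed = withMicros.parse("2020-12-17 10:00:00.123456");
        Date base = plain.parse("2020-12-17 10:00:00");

        // The "fraction" has landed as whole milliseconds:
        System.out.println(parsed.getTime() - base.getTime()); // prints 123456
    }
}
```

The same scale confusion applies to epoch values: a timestamp stored in microseconds but treated as milliseconds lands millennia in the future.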
Were you able to correct this? I am hitting a wall this time.
I converted a timestamp successfully, but the time zone of the date isn't correct. It is always UTC, even though my server isn't in that time zone. My server is set up for America/Sao_Paulo, and the date created by the Kafka Connect transform is in UTC.
I'm using an Elasticsearch sink connector. I couldn't find the "db.timezone" parameter in its docs; I think db.timezone is exclusive to the JDBC connectors.
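That reading matches the Confluent docs: db.timezone is a JDBC connector setting, not an Elasticsearch sink one. For readers who land here with a JDBC sink, a minimal sketch of where it goes (the connection URL and topic name are placeholder assumptions):

```json
{
  "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
  "connection.url": "jdbc:postgresql://localhost:5432/demo",
  "topics": "orders",
  "db.timezone": "America/Sao_Paulo"
}
```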