[INFO] --- maven-dependency-plugin:2.1:tree (default-cli) @ elasticpipe ---
[INFO] artifact joda-time:joda-time: checking for updates from company-nexus
[INFO] com.company:elasticpipe:jar:0.1.0-SNAPSHOT
[INFO] +- org.scala-lang:scala-library:jar:2.11.11:compile
[INFO] +- org.scalatest:scalatest_2.11:jar:3.0.1:test
[INFO] +- org.scalamock:scalamock-scalatest-support_2.11:jar:3.6.0:test
[INFO] |  \- org.scalamock:scalamock-core_2.11:jar:3.6.0:test
[INFO] +- org.apache.flink:flink-scala_2.11:jar:1.3.2:compile
[INFO] |  +- org.apache.flink:flink-core:jar:1.3.2:compile
[INFO] |  |  +- org.apache.flink:flink-annotations:jar:1.3.2:compile
[ip-10-81-45-83] [2017-10-20 05:56:29,534] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Map -> Sink: flink.parquet_processor (1/1) (3cfc5187bb7f26aa64afd95b8cd8b51d) switched from RUNNING to FAILED.
java.lang.Exception: Could not perform checkpoint 1 for operator Map -> Sink: flink.parquet_processor (1/1).
    at org.apache.flink.streaming.runtime.tasks.StreamTask.triggerCheckpointOnBarrier(StreamTask.java:552)
    at org.apache.flink.streaming.runtime.io.BarrierBuffer.notifyCheckpoint(BarrierBuffer.java:378)
    at org.apache.flink.streaming.runtime.io.BarrierBuffer.processBarrier(BarrierBuffer.java:228)
    at org.apache.flink.streaming.runtime.io.BarrierBuffer.getNextNonBlocked(BarrierBuffer.java:183)
    at org.apache.flink.streaming.runtime.io.StreamInputProcessor.processInput(StreamInputProcessor.java:213)
    at org.apache.flink.streaming.runtime.tasks.OneInputStreamTask.run(OneInputStreamTask.java:69)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:263)
#
# A fatal error has been detected by the Java Runtime Environment:
#
# SIGSEGV (0xb) at pc=0x000000010d46440c, pid=25232, tid=0x0000000000003903
#
# JRE version: Java(TM) SE Runtime Environment (8.0_131-b11) (build 1.8.0_131-b11)
# Java VM: Java HotSpot(TM) 64-Bit Server VM (25.131-b11 mixed mode bsd-amd64 compressed oops)
# Problematic frame:
# V  [libjvm.dylib+0x46440c]
#
Caused by: java.lang.Exception: Could not complete snapshot 1 for operator Map -> Sink: flink.processor (1/1).
    at org.apache.flink.streaming.api.operators.AbstractStreamOperator.snapshotState(AbstractStreamOperator.java:407)
    at org.apache.flink.streaming.runtime.tasks.StreamTask$CheckpointingOperation.checkpointStreamOperator(StreamTask.java:1163)
    at org.apache.flink.streaming.runtime.tasks.StreamTask$CheckpointingOperation.executeCheckpointing(StreamTask.java:1095)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.checkpointState(StreamTask.java:654)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.performCheckpoint(StreamTask.java:590)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.triggerCheckpointOnBarrier(StreamTask.java:543)
    ... 8 more
Caused by: com.esotericsoftware.kryo.KryoException: java.util.ConcurrentModificationException
Serialization trace:
INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for Sink: flink.parquet_processor (1/1) (e209dbb6088ac82044fb97a3046fef45).
INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task Sink: flink.parquet_processor (1/1) (e209dbb6088ac82044fb97a3046fef45) [FAILED]
INFO org.apache.flink.yarn.YarnTaskManager - Un-registering task and sending final execution state FAILED to JobManager for task Sink: flink.parquet_processor (e209dbb6088ac82044fb97a3046fef45)
INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Sink: flink.parquet_processor (1/1) (e209dbb6088ac82044fb97a3046fef45) switched from RUNNING to FAILED.
org.apache.hadoop.fs.FileAlreadyExistsException: /tmp/tmp_file.parquet for client <client_id> already exists
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:2563)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesys
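The FileAlreadyExistsException above is triggered because the sink reuses a fixed temp path (/tmp/tmp_file.parquet) across restarts: after a failure, the restarted task tries to create a file the previous attempt left behind. One common mitigation, sketched here as an assumption rather than the pipeline's actual fix, is to derive a unique temp name per attempt so a restart never collides with a leftover file:

```java
import java.util.UUID;

public class TempNames {
    // Hypothetical helper: append a random suffix so each (re)start of the
    // sink writes a fresh temp file instead of colliding with a leftover one.
    // The leftover files still need periodic cleanup, but the job no longer fails.
    public static String uniqueTempPath(String base) {
        return base + "." + UUID.randomUUID() + ".inprogress";
    }
}
```

A restarted task would then call `TempNames.uniqueTempPath("/tmp/tmp_file.parquet")` and get a path distinct from any earlier attempt's.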
class FileArchievar()
  extends RichSinkFunction[Message]
  with CheckpointedFunction
  with CheckpointedRestoring[ArrayBuffer[Message]] {

  private var checkpointedMessages: ListState[Message] = _
  private val bufferedMessages = ListBuffer[Message]()
  private val pendingFiles = ListBuffer[String]()

  @throws[IOException]
  override def invoke(message: Message): Unit = {
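The KryoException: java.util.ConcurrentModificationException seen in the snapshot stack trace is the classic symptom of this design: `invoke` keeps appending to a mutable buffer while the checkpoint serializes it. The race can be reproduced with nothing but the JDK's fail-fast iterators; a minimal sketch (plain Java, no Flink involved, class and method names are illustrative):

```java
import java.util.ArrayList;
import java.util.ConcurrentModificationException;
import java.util.Iterator;
import java.util.List;

public class ConcurrentModDemo {
    // Structurally modifying a list after creating an iterator makes the next
    // iterator call fail fast -- the same race occurs when the checkpoint
    // thread serializes the buffer while invoke() keeps appending to it.
    public static boolean mutateWhileIterating() {
        List<String> buffer = new ArrayList<>();
        buffer.add("msg-1");
        Iterator<String> it = buffer.iterator();
        buffer.add("msg-2"); // mutation "during" iteration
        try {
            it.next();
            return false; // not reached: the iterator detects the change
        } catch (ConcurrentModificationException e) {
            return true;
        }
    }
}
```

The usual remedies are to confine all buffer access to the operator thread or to snapshot a copy of the buffer rather than the live collection.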
DEBUG [2017-02-06 17:55:10,878] org.apache.kafka.clients.consumer.internals.ConsumerCoordinator: Committed offset 2 for partition topic2-0
DEBUG [2017-02-06 17:55:10,878] org.apache.kafka.clients.consumer.internals.ConsumerCoordinator: Committed offset 207100160 for partition topic1-0
TRACE [2017-02-06 17:55:10,878] org.apache.kafka.clients.consumer.internals.Fetcher: Returning fetched records at offset 207100160 for assigned partition topic1-0 and update position to 207100163
TRACE [2017-02-06 17:55:10,878] org.apache.kafka.clients.consumer.internals.Fetcher: Added fetch request for partition topic1-0 at offset 207100163
TRACE [2017-02-06 17:55:10,878] org.apache.kafka.clients.consumer.internals.Fetcher: Added fetch request for partition topic2-0 at offset 2
DEBUG [2017-02-06 17:55:10,985] org.apache.kafka.clients.consumer.internals.AbstractCoordinator: Received successful heartbeat response.
DEBUG [2017-02-06 17:55:11,483] org.apache.kafka.clients.consumer.internals.AbstractCoordinator: Received successful
[2017-02-06 17:55:48,694] TRACE {kafka-network-thread-175127531-PLAINTEXT-1} Completed request:Name: OffsetRequest; Version: 0; CorrelationId: 8; ClientId: group1; ReplicaId: -1; RequestInfo: [topic1,0] -> PartitionOffsetRequestInfo(-2,1),[topic2,0] -> PartitionOffsetRequestInfo(-2,1) from connection 1.1.1.1:9092-1.1.1.2:51047;totalTime:1,requestQueueTime:0,localTime:0,remoteTime:0,responseQueueTime:0,sendTime:1,securityProtocol:PLAINTEXT,principal:User:ANONYMOUS (kafka.request.logger)
[2017-02-06 17:55:50,591] TRACE {kafka-network-thread-175127531-PLAINTEXT-1} Completed request:{api_key=12,api_version=0,correlation_id=192917,client_id=consumer-1} -- {group_id=group2,group_generation_id=22,member_id=consumer-1-434d7d74-b749-4501-81aa-b889965dba16} from connection 1.1.1.1:9092-1.1.1.3:37362;totalTime:2,requestQueueTime:0,localTime:0,remoteTime:0,responseQueueTime:1,sendTime:1,securityProtocol:PLAINTEXT,principal:User:ANONYMOUS (kafka.request.logger)
[2017-02-06 17:55:50,614] TRACE {kafka-network-thread-1751275
enable.auto.commit: "false"
auto.offset.reset: "earliest"
session.timeout.ms: "120000"
request.timeout.ms: "120001"
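These consumer settings disable auto-commit (so offsets are committed only when the application decides to), start from the earliest offset when no committed offset exists, and keep request.timeout.ms just above session.timeout.ms, since the Kafka consumer requires the request timeout to exceed the session timeout. A sketch of assembling them as a java.util.Properties object (the property keys are real Kafka consumer settings; the helper class itself is illustrative):

```java
import java.util.Properties;

public class ConsumerProps {
    // Build the consumer settings shown above. The Kafka consumer expects
    // request.timeout.ms to be strictly greater than session.timeout.ms,
    // which is why the two values differ by one millisecond.
    public static Properties build() {
        Properties p = new Properties();
        p.setProperty("enable.auto.commit", "false");
        p.setProperty("auto.offset.reset", "earliest");
        p.setProperty("session.timeout.ms", "120000");
        p.setProperty("request.timeout.ms", "120001");
        long session = Long.parseLong(p.getProperty("session.timeout.ms"));
        long request = Long.parseLong(p.getProperty("request.timeout.ms"));
        if (request <= session) {
            throw new IllegalStateException(
                "request.timeout.ms must exceed session.timeout.ms");
        }
        return p;
    }
}
```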
DEBUG [2017-02-03 17:05:17,971] org.apache.kafka.clients.consumer.internals.ConsumerCoordinator: Committed offset abc for partition topic1-partition0
DEBUG [2017-02-03 17:05:17,971] org.apache.kafka.clients.consumer.internals.ConsumerCoordinator: Committed offset abc1 for partition topic2-partition0
DEBUG [2017-02-03 17:05:18,112] org.apache.kafka.clients.consumer.internals.AbstractCoordinator: Received successful heartbeat response.
DEBUG [2017-02-03 17:05:19,828] org.apache.kafka.clients.consumer.internals.AbstractCoordinator: Received successful heartbeat response.
DEBUG [2017-02-03 17:05:20,902] org.apache.kafka.clients.consumer.internals.AbstractCoordinator: Received successful heartbeat response.
DEBUG [2017-02-03 17:05:22,860] org.apache.kafka.clients.consumer.internals.AbstractCoordinator: Received successful heartbeat response.
DEBUG [2017-02-03 17:05:24,112] org.apache.kafka.clients.consumer.internals.AbstractCoordinator: Received successful heartbeat response.
DEBUG [2017-02-03 17:05:25,884] org.ap