Kafka PubSub Connector: Jetty ALPN/NPN has not been properly configured

Christos Hadjinikolis

I am using kafka_2.11-0.10.2.1 together with the Pub/Sub connector provided by Google here. All I want to do is push data from a Kafka topic to a PubSub topic using the standalone connector. I followed all the steps I was supposed to:

  1. Generated cps-kafka-connector.jar.
  2. Added the cps-sink-connector.properties file to Kafka's config directory; the file looks like this:
name=CPSConnector
connector.class=com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
tasks.max=10
topics=kafka_topic
cps.topic=pubsub_topic
cps.project=my_gcp_project_12345
  3. I made sure the string converter is enabled in connect-standalone.properties, since my intention is to send only string messages:
key.converter=org.apache.kafka.connect.storage.StringConverter 
value.converter=org.apache.kafka.connect.storage.StringConverter
  4. I created a topic kafka_topic and sent some messages to it, like so:
$ bin/kafka-console-producer.sh --broker-list localhost:9092 --topic kafka_topic
$ hello streams
$ kafka streams rock
  5. I ran the connector as follows:
$ bin/connect-standalone.sh config/connect-standalone.properties config/cps-sink-connector.properties

The intention was then to run:

$ gcloud beta pubsub subscriptions pull subscription_to_pubsub_topic

to collect these messages. However, the errors below occurred, and I cannot make sense of them.

They appear to be related to jetty-9.2.15.v20160210. Note:

  [2017-05-04 22:42:26,635] ERROR Commit of WorkerSinkTask{id=CPSConnector-0} offsets threw an unexpected exception:(org.apache.kafka.connect.runtime.WorkerSinkTask:204)
  java.lang.RuntimeException: java.util.concurrent.ExecutionException:       io.grpc.StatusRuntimeException: UNKNOWN

  Caused by: java.util.concurrent.ExecutionException: io.grpc.StatusRuntimeException: UNKNOWN ...
  Caused by: io.grpc.StatusRuntimeException: UNKNOWN ...
  Caused by: java.lang.IllegalArgumentException: Jetty ALPN/NPN has not been properly configured ...

Any ideas? How do I configure Jetty? I read an article here which states:

No standard Java release today has built-in support for ALPN (there is a tracking issue, so go vote for it!), so we need to use the Jetty-ALPN (or Jetty-NPN, if on Java < 8) bootclasspath extension for OpenJDK. To do so, add an Xbootclasspath JVM option referencing the path to the Jetty alpn-boot jar.

java -Xbootclasspath/p:/path/to/jetty/alpn/extension.jar ...

Note that you must use the release of the Jetty-ALPN jar that is specific to the version of Java you are running. However, you can use the JVM agent jetty-alpn-agent to automatically load the correct Jetty alpn-boot jar for the current Java version. To do so, instead of adding an Xbootclasspath option, add a javaagent JVM option referencing the path to the Jetty alpn-agent jar.

java -javaagent:/path/to/jetty-alpn-agent.jar ...

...but I am really not sure how to account for this in my setup. Any ideas? Part of the error log is shown below:

    ...
    name = CPSConnector
    tasks.max = 10
    transforms = null
    value.converter = null
   (org.apache.kafka.connect.runtime.ConnectorConfig:180)
  [2017-05-04 22:42:17,447] INFO TaskConfig values:
    task.class = class com.google.pubsub.kafka.sink.CloudPubSubSinkTask
   (org.apache.kafka.connect.runtime.TaskConfig:180)
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@6fda6170] Created with target pubsub.googleapis.com:443
  [2017-05-04 22:42:17,447] INFO Instantiated task CPSConnector-7 with version 0.10.2.1 of type com.google.pubsub.kafka.sink.CloudPubSubSinkTask (org.apache.kafka.connect.runtime.Worker:317)
  [2017-05-04 22:42:17,451] INFO ConsumerConfig values:
    auto.commit.interval.ms = 5000
    auto.offset.reset = earliest
    bootstrap.servers = [localhost:9092]
    check.crcs = true
    client.id =
    connections.max.idle.ms = 540000
    enable.auto.commit = false
    exclude.internal.topics = true
    fetch.max.bytes = 52428800
    fetch.max.wait.ms = 500
    fetch.min.bytes = 1
    group.id = connect-CPSConnector
    heartbeat.interval.ms = 3000
    interceptor.classes = null
    key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
    max.partition.fetch.bytes = 1048576
    max.poll.interval.ms = 300000
    max.poll.records = 500
    metadata.max.age.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
    receive.buffer.bytes = 65536
    reconnect.backoff.ms = 50
    request.timeout.ms = 305000
    retry.backoff.ms = 100
    sasl.jaas.config = null
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.mechanism = GSSAPI
    security.protocol = PLAINTEXT
    send.buffer.bytes = 131072
    session.timeout.ms = 10000
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    ssl.endpoint.identification.algorithm = null
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLS
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
   (org.apache.kafka.clients.consumer.ConsumerConfig:180)
  [2017-05-04 22:42:17,461] INFO Kafka version : 0.10.2.1 (org.apache.kafka.common.utils.AppInfoParser:83)
  [2017-05-04 22:42:17,461] INFO Kafka commitId : e89bffd6b2eff799 (org.apache.kafka.common.utils.AppInfoParser:84)
  [2017-05-04 22:42:17,462] INFO Creating task CPSConnector-8 (org.apache.kafka.connect.runtime.Worker:305)
  [2017-05-04 22:42:17,463] INFO ConnectorConfig values:
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    key.converter = null
    name = CPSConnector
    tasks.max = 10
    transforms = null
    value.converter = null
   (org.apache.kafka.connect.runtime.ConnectorConfig:180)
  [2017-05-04 22:42:17,463] INFO TaskConfig values:
    task.class = class com.google.pubsub.kafka.sink.CloudPubSubSinkTask
   (org.apache.kafka.connect.runtime.TaskConfig:180)
  [2017-05-04 22:42:17,463] INFO Instantiated task CPSConnector-8 with version 0.10.2.1 of type com.google.pubsub.kafka.sink.CloudPubSubSinkTask (org.apache.kafka.connect.runtime.Worker:317)
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@74e87b55] Created with target pubsub.googleapis.com:443
  [2017-05-04 22:42:17,465] INFO ConsumerConfig values:
    auto.commit.interval.ms = 5000
    auto.offset.reset = earliest
    bootstrap.servers = [localhost:9092]
    check.crcs = true
    client.id =
    connections.max.idle.ms = 540000
    enable.auto.commit = false
    exclude.internal.topics = true
    fetch.max.bytes = 52428800
    fetch.max.wait.ms = 500
    fetch.min.bytes = 1
    group.id = connect-CPSConnector
    heartbeat.interval.ms = 3000
    interceptor.classes = null
    key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
    max.partition.fetch.bytes = 1048576
    max.poll.interval.ms = 300000
    max.poll.records = 500
    metadata.max.age.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
    receive.buffer.bytes = 65536
    reconnect.backoff.ms = 50
    request.timeout.ms = 305000
    retry.backoff.ms = 100
    sasl.jaas.config = null
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.mechanism = GSSAPI
    security.protocol = PLAINTEXT
    send.buffer.bytes = 131072
    session.timeout.ms = 10000
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    ssl.endpoint.identification.algorithm = null
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLS
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
   (org.apache.kafka.clients.consumer.ConsumerConfig:180)
  [2017-05-04 22:42:17,469] INFO Kafka version : 0.10.2.1 (org.apache.kafka.common.utils.AppInfoParser:83)
  [2017-05-04 22:42:17,472] INFO Kafka commitId : e89bffd6b2eff799 (org.apache.kafka.common.utils.AppInfoParser:84)
  [2017-05-04 22:42:17,478] INFO Creating task CPSConnector-9 (org.apache.kafka.connect.runtime.Worker:305)
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@15768b04] Created with target pubsub.googleapis.com:443
  [2017-05-04 22:42:17,484] INFO ConnectorConfig values:
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    key.converter = null
    name = CPSConnector
    tasks.max = 10
    transforms = null
    value.converter = null
   (org.apache.kafka.connect.runtime.ConnectorConfig:180)
  [2017-05-04 22:42:17,486] INFO TaskConfig values:
    task.class = class com.google.pubsub.kafka.sink.CloudPubSubSinkTask
   (org.apache.kafka.connect.runtime.TaskConfig:180)
  [2017-05-04 22:42:17,486] INFO Instantiated task CPSConnector-9 with version 0.10.2.1 of type com.google.pubsub.kafka.sink.CloudPubSubSinkTask (org.apache.kafka.connect.runtime.Worker:317)
  [2017-05-04 22:42:17,486] INFO ConsumerConfig values:
    auto.commit.interval.ms = 5000
    auto.offset.reset = earliest
    bootstrap.servers = [localhost:9092]
    check.crcs = true
    client.id =
    connections.max.idle.ms = 540000
    enable.auto.commit = false
    exclude.internal.topics = true
    fetch.max.bytes = 52428800
    fetch.max.wait.ms = 500
    fetch.min.bytes = 1
    group.id = connect-CPSConnector
    heartbeat.interval.ms = 3000
    interceptor.classes = null
    key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
    max.partition.fetch.bytes = 1048576
    max.poll.interval.ms = 300000
    max.poll.records = 500
    metadata.max.age.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
    receive.buffer.bytes = 65536
    reconnect.backoff.ms = 50
    request.timeout.ms = 305000
    retry.backoff.ms = 100
    sasl.jaas.config = null
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.mechanism = GSSAPI
    security.protocol = PLAINTEXT
    send.buffer.bytes = 131072
    session.timeout.ms = 10000
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    ssl.endpoint.identification.algorithm = null
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLS
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
   (org.apache.kafka.clients.consumer.ConsumerConfig:180)
  [2017-05-04 22:42:17,492] INFO Kafka version : 0.10.2.1 (org.apache.kafka.common.utils.AppInfoParser:83)
  [2017-05-04 22:42:17,493] INFO Kafka commitId : e89bffd6b2eff799 (org.apache.kafka.common.utils.AppInfoParser:84)
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@2aa0b78d] Created with target pubsub.googleapis.com:443
  [2017-05-04 22:42:17,496] INFO Created connector CPSConnector (org.apache.kafka.connect.cli.ConnectStandalone:90)
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@1bb167da] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@41380fb7] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@7ecb1a4d] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@4db17054] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@3ca410f4] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@611d92fb] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@7a2f9bb8] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@24675ca9] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@7e101025] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@65220a00] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@79a69904] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@4dcf0b0d] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@6440f37f] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@179dd4b2] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@67a3fdd] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@5a8c6de1] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@7b83e31f] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@3408c640] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@6b9cd402] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@3790445] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@2a583bab] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@68d47f55] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@3b0906db] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@4ef86c1f] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@793abbe9] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@2c734def] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@202098c0] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@3b129183] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@2058dcfd] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@c977767] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@1c4dbc55] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@48952135] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@31b59feb] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@21c700c] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@35a9958a] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@5611534b] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@5b844502] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@3af9d26e] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@3d18fe46] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@6910904f] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@3aa392ae] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@5f8e90f5] Created with target pubsub.googleapis.com:443
  May 04, 2017 10:42:17 PM io.grpc.internal.ManagedChannelImpl <init>
  INFO: [ManagedChannelImpl@5e5c05fe] Created with target pubsub.googleapis.com:443
  ...
  Caused by: io.grpc.StatusRuntimeException: UNKNOWN
    at io.grpc.Status.asRuntimeException(Status.java:545)
    at io.grpc.stub.ClientCalls$UnaryStreamToFuture.onClose(ClientCalls.java:417)
    at io.grpc.ClientInterceptors$CheckedForwardingClientCall.start(ClientInterceptors.java:203)
    at io.grpc.stub.ClientCalls.startCall(ClientCalls.java:248)
    at io.grpc.stub.ClientCalls.asyncUnaryRequestCall(ClientCalls.java:227)
    at io.grpc.stub.ClientCalls.futureUnaryCall(ClientCalls.java:186)
    at com.google.pubsub.v1.PublisherGrpc$PublisherFutureStub.publish(PublisherGrpc.java:480)
    at com.google.pubsub.kafka.sink.CloudPubSubGRPCPublisher.publish(CloudPubSubGRPCPublisher.java:44)
    at com.google.pubsub.kafka.sink.CloudPubSubRoundRobinPublisher.publish(CloudPubSubRoundRobinPublisher.java:43)
    at com.google.pubsub.kafka.sink.CloudPubSubSinkTask.publishMessagesForPartition(CloudPubSubSinkTask.java:321)
    at com.google.pubsub.kafka.sink.CloudPubSubSinkTask.flush(CloudPubSubSinkTask.java:265)
    ... 22 more
  Caused by: java.lang.IllegalArgumentException: Jetty ALPN/NPN has not been properly configured.
    at io.grpc.netty.GrpcSslContexts.selectApplicationProtocolConfig(GrpcSslContexts.java:153)
    at io.grpc.netty.GrpcSslContexts.configure(GrpcSslContexts.java:130)
    at io.grpc.netty.GrpcSslContexts.configure(GrpcSslContexts.java:119)
    at io.grpc.netty.GrpcSslContexts.forClient(GrpcSslContexts.java:90)
    at io.grpc.netty.NettyChannelBuilder.createProtocolNegotiator(NettyChannelBuilder.java:263)
    at io.grpc.netty.NettyChannelBuilder$NettyTransportFactory.newClientTransport(NettyChannelBuilder.java:322)
    at io.grpc.internal.CallCredentialsApplyingTransportFactory.newClientTransport(CallCredentialsApplyingTransportFactory.java:62)
    at io.grpc.internal.TransportSet.startNewTransport(TransportSet.java:199)
    at io.grpc.internal.TransportSet.obtainActiveTransport(TransportSet.java:179)
    at io.grpc.internal.ManagedChannelImpl$3.getTransport(ManagedChannelImpl.java:476)
    at io.grpc.internal.ManagedChannelImpl$3.getTransport(ManagedChannelImpl.java:432)
    at io.grpc.DummyLoadBalancerFactory$DummyLoadBalancer.pickTransport(DummyLoadBalancerFactory.java:105)
    at io.grpc.internal.ManagedChannelImpl$1.get(ManagedChannelImpl.java:149)
    at io.grpc.internal.ClientCallImpl.start(ClientCallImpl.java:201)
    at io.grpc.auth.ClientAuthInterceptor$1.checkedStart(ClientAuthInterceptor.java:104)
    at io.grpc.ClientInterceptors$CheckedForwardingClientCall.start(ClientInterceptors.java:195)
    ... 30 more
Christos Hadjinikolis

The problem was related to the pom.xml file (here: https://github.com/GoogleCloudPlatform/pubsub/blob/master/kafka-connector/pom.xml), which by default is set up for linux-x86_64. I replaced linux-x86_64 with ${os.detected.classifier} and everything worked. Specifically, I replaced:

<dependency>
 <groupId>io.netty</groupId>
 <artifactId>netty-tcnative-boringssl-static</artifactId>
 <version>1.1.33.Fork14</version>
 <classifier>linux-x86_64</classifier>
</dependency>

with:

<dependency>
 <groupId>io.netty</groupId>
 <artifactId>netty-tcnative-boringssl-static</artifactId>
 <version>1.1.33.Fork14</version>
 <classifier>${os.detected.classifier}</classifier>
</dependency>
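For the ${os.detected.classifier} property to resolve at build time, Maven needs the os-maven-plugin build extension, which detects the host OS and architecture (e.g. osx-x86_64, linux-x86_64). The connector's pom.xml should already declare it; a sketch of what that declaration looks like (the version shown is illustrative — check the actual pom):

<build>
 <extensions>
  <extension>
   <groupId>kr.motd.maven</groupId>
   <artifactId>os-maven-plugin</artifactId>
   <version>1.4.1.Final</version>
  </extension>
 </extensions>
</build>

This way the netty-tcnative-boringssl-static artifact matching the machine that builds the jar is pulled in, instead of the Linux-only one.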

GoogleCloudPlatform/pubsub/kafka-connector/pom.xml
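As an aside, the jetty-alpn-agent approach quoted in the question can in principle be applied without rebuilding the connector: connect-standalone.sh delegates to kafka-run-class.sh, which passes the KAFKA_OPTS environment variable through to the JVM. A hedged sketch, assuming the agent jar has been downloaded (the path is a placeholder):

# Hypothetical path to the downloaded agent jar — adjust to your system.
export KAFKA_OPTS="-javaagent:/path/to/jetty-alpn-agent.jar"
bin/connect-standalone.sh config/connect-standalone.properties config/cps-sink-connector.properties

The agent then loads the alpn-boot jar matching the running Java version, which avoids pinning a version-specific Xbootclasspath entry.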
