Kafka SASL_PLAINTEXT problem #333

Open
mikolajmorawski opened this issue Jan 25, 2018 · 14 comments

@mikolajmorawski

Hi,
I am trying to configure Burrow against a Kafka cluster with SASL. I am using the wurstmeister Kafka image with the following configuration:

  kafka:
    image: wurstmeister/kafka:1.0.0
    network_mode: host
    restart: on-failure
    environment:
        KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: SASL_PLAINTEXT:SASL_PLAINTEXT,OUTSIDE:PLAINTEXT
        KAFKA_ADVERTISED_PROTOCOL_NAME: OUTSIDE
        KAFKA_PROTOCOL_NAME: SASL_PLAINTEXT
        KAFKA_ADVERTISED_PORT: "9094"
        KAFKA_ZOOKEEPER_CONNECT: "183.155.2.197:2181"
        KAFKA_INTER_BROKER_PROTOCOL: PLAIN
        KAFKA_SASL_MECHANISM_INTER_BROKER_PROTOCOL: PLAIN
        KAFKA_SASL_ENABLED_MECHANISMS: PLAIN
        KAFKA_AUTHORIZER_CLASS_NAME: kafka.security.auth.SimpleAclAuthorizer
        KAFKA_SUPER_USERS: "User:admin"
        KAFKA_OPTS: "-Djava.security.auth.login.config=jaas.conf"

jaas.conf:

KafkaServer {
  org.apache.kafka.common.security.plain.PlainLoginModule required
  username="admin"
  password="adminpass"
};
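For reference, a broker-side assumption (not something from this thread): with the PLAIN mechanism, the server JAAS section usually also has to enumerate the accepted client credentials via user_<name> entries; the username/password pair above only sets the broker's own login. A sketch using the credentials from the compose file above:

```
KafkaServer {
  org.apache.kafka.common.security.plain.PlainLoginModule required
  username="admin"
  password="adminpass"
  // Accepted client logins: user_<name>="<password>"
  user_admin="adminpass";
};
```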

And Burrow with the following configuration:


[general]
access-control-allow-origin="*"

[logging]
level="debug"

[zookeeper]
servers=["183.155.2.197:2181"]

[client-profile.kafka-profile]
kafka-version="1.0.0"
client-id="burrow-client"
sasl="mysasl"

[sasl.mysasl]
username="admin"
password="adminpass"

[cluster.my_cluster]
class-name="kafka"
client-profile="kafka-profile"
servers=["183.155.2.197:9094"]
topic-refresh=120
offset-refresh=10

[consumer.consumer_kafka]
class-name="kafka"
cluster="my_cluster"
servers=["183.155.2.197:9094"]
client-profile="kafka-profile"
start-latest=false
offsets-topic="__consumer_offsets"
group-blacklist="^(console-consumer-|python-kafka-consumer-).*$"

[consumer.consumer_zk]
class-name="kafka_zk"
cluster="my_cluster"
servers=["183.155.2.197:2181"]
zookeeper-timeout=30
group-blacklist="^(console-consumer-|python-kafka-consumer-).*$"

[httpserver.default]
address=":8002"

When I turn off SASL_PLAINTEXT on Kafka, Burrow starts successfully and connects to the Kafka brokers. With this configuration I get the following error during the Burrow-to-Kafka connection:


[2018-01-25 11:17:57,887] ERROR Closing socket for 183.155.2.197:9094-183.155.2.197:51406-107 because of error (kafka.network.Processor)
org.apache.kafka.common.errors.InvalidRequestException: Error parsing request header. Our best guess of the apiKey is: 97
Caused by: org.apache.kafka.common.protocol.types.SchemaException: Error reading field 'client_id': Error reading string of length 25709, only 9 bytes available
	at org.apache.kafka.common.protocol.types.Schema.read(Schema.java:77)
	at org.apache.kafka.common.requests.RequestHeader.parse(RequestHeader.java:121)
	at kafka.network.Processor.$anonfun$processCompletedReceives$1(SocketServer.scala:549)
	at kafka.network.Processor.$anonfun$processCompletedReceives$1$adapted(SocketServer.scala:545)
	at scala.collection.Iterator.foreach(Iterator.scala:929)
	at scala.collection.Iterator.foreach$(Iterator.scala:929)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1417)
	at scala.collection.IterableLike.foreach(IterableLike.scala:71)
	at scala.collection.IterableLike.foreach$(IterableLike.scala:70)
	at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
	at kafka.network.Processor.processCompletedReceives(SocketServer.scala:545)
	at kafka.network.Processor.run(SocketServer.scala:453)
	at java.lang.Thread.run(Thread.java:748)
[2018-01-25 11:17:57,887] ERROR Exception while processing request from 183.155.2.197:9094-183.155.2.197:51406-107 (kafka.network.Processor)
org.apache.kafka.common.errors.InvalidRequestException: Error parsing request header. Our best guess of the apiKey is: 97
Caused by: org.apache.kafka.common.protocol.types.SchemaException: Error reading field 'client_id': Error reading string of length 25709, only 9 bytes available
	at org.apache.kafka.common.protocol.types.Schema.read(Schema.java:77)
	at org.apache.kafka.common.requests.RequestHeader.parse(RequestHeader.java:121)
	at kafka.network.Processor.$anonfun$processCompletedReceives$1(SocketServer.scala:549)
	at kafka.network.Processor.$anonfun$processCompletedReceives$1$adapted(SocketServer.scala:545)
	at scala.collection.Iterator.foreach(Iterator.scala:929)
	at scala.collection.Iterator.foreach$(Iterator.scala:929)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1417)
	at scala.collection.IterableLike.foreach(IterableLike.scala:71)
	at scala.collection.IterableLike.foreach$(IterableLike.scala:70)
	at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
	at kafka.network.Processor.processCompletedReceives(SocketServer.scala:545)
	at kafka.network.Processor.run(SocketServer.scala:453)
	at java.lang.Thread.run(Thread.java:748)
[2018-01-25 11:20:37,060] INFO [GroupMetadataManager brokerId=1001] Removed 0 expired offsets in 0 milliseconds. (kafka.coordinator.group.GroupMetadataManager)
@toddpalino
Contributor

We just pass SASL configs to Sarama, so this is a little tricky to debug. But can you try one thing for me? Change the sasl profile definition to the following:

[sasl.mysasl]
username="admin"
password="adminpass"
handshake-first=true

I suspect we might have a bad default value for handshake-first, but I don't have a SASL environment right now to test in.

@toddpalino toddpalino added the bug label Jan 31, 2018
@mikolajmorawski
Author

Thanks for your help; after this change it started working correctly. I had tried every configuration I could think of, but hadn't thought of adding handshake-first=true to override the default value, so it was indeed a problem with the wrong default.

The next step was connecting Burrow to a SASL_SSL-secured Kafka. The documentation notes that the certfile and keyfile fields are required, but that is not true when you just want to connect to an SSL-secured Kafka. The only thing I had to do was set this:

[tls.mytls]
noverify=true

I also found a mistake in the documentation: the field is shown there as “no-verify”, but the correct name is “noverify”, as in your code.
The problem was that this was hard to debug; Burrow only shows this debug message:

“Cannot start Kafka client for cluster local: kafka: client has run out of available brokers to talk to (Is your cluster reachable?)”

That message could be related to almost any problem, but I already knew the last thing not working was TLS, so fixing the field name solved it.
Also, the other TLS fields are not required for an SSL connection to Kafka. Maybe they are required when you create the HTTP server with TLS. It would be nice to split these two cases and describe them separately.
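For the SASL_SSL case described above, the working setup can be sketched like this (a sketch assembled from this thread, not official documentation; profile names are illustrative, and noverify=true disables certificate verification, so it is only appropriate for testing):

```toml
[tls.mytls]
# certfile/keyfile are only needed for client-certificate auth
# (or for the HTTP server), not for a plain SSL client connection.
noverify=true

[sasl.mysasl]
username="admin"
password="adminpass"
handshake-first=true

[client-profile.kafka-profile]
kafka-version="1.0.0"
client-id="burrow-client"
tls="mytls"
sasl="mysasl"
```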

@toddpalino
Contributor

Yes, that's correct. certfile and keyfile are required for the HTTP server, and they are also needed in the Kafka client config if you are using client certificates for authentication. We'll need to update the docs as well.

As for the value of handshake-first, thank you for trying that out and confirming it was the problem. A PR is needed now to make sure that handshake-first has a default of true set in the code before it is read from the config and used.

@akamalov

akamalov commented Apr 5, 2018

@mikolajmorawski and @toddpalino: does this mean that Burrow does not support Kafka when the brokers are configured with SASL_PLAINTEXT? I am also running SASL_PLAINTEXT in my environment and am getting the "kafka: client has run out of available brokers to talk to (Is your cluster reachable?)" error. I tried setting noverify=true, but it doesn't help. @mikolajmorawski, have you found a workaround? If so, could you share your Burrow config?

Thanks again,

Alex

@akamalov

@mikolajmorawski and @toddpalino, can I ask whether the current release of Burrow supports SASL_PLAINTEXT? Thanks so much in advance for replying.

@toddpalino
Contributor

I haven't explicitly tested it against SASL_PLAINTEXT, so it would depend on the underlying Sarama client support. There was a bug, as noted, in the default value of handshake-first. That hasn't been resolved yet (as noted, we need a PR for it; I haven't done it myself because I'm not using it at present). So I would make sure that config is explicitly set to true in the SASL profile section of the config.
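Putting the thread's findings together, a minimal SASL_PLAINTEXT setup would look something like the sketch below (server address and profile names are illustrative; the key line is the explicit handshake-first=true, since the in-code default is wrong at the time of writing):

```toml
[sasl.mysasl]
username="admin"
password="adminpass"
# Work around the bad default until it is fixed in code:
handshake-first=true

[client-profile.kafka-profile]
kafka-version="1.0.0"
client-id="burrow-client"
sasl="mysasl"

[cluster.my_cluster]
class-name="kafka"
client-profile="kafka-profile"
servers=["broker:9094"]
```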

@akamalov

akamalov commented May 2, 2018

Thanks for the reply, @toddpalino. Yes, the config is explicitly set to true:

handshake-first=true

...but it is failing nevertheless :(

Thanks again..

@pboado

pboado commented May 24, 2018

Hi @toddpalino

I'm trying to go down a similar route: SASL_PLAINTEXT + keytab. How can I tell Burrow to use this configuration? I've run kinit and then started Burrow, but as soon as I define a SASL profile, startup fails:

{"level":"info","ts":1527166609.0603826,"msg":"starting","type":"module","coordinator":"cluster","class":"kafka","name":"dev"}
{"level":"error","ts":1527166609.0604134,"msg":"failed to start client","type":"module","coordinator":"cluster","class":"kafka","name":"dev","error":"kafka: invalid configuration (Net.SASL.User must not be empty when SASL is enabled)"}

If I don't define one, then I get:

{"level":"error","ts":1527167116.8251395,"msg":"failed to start client","type":"module","coordinator":"cluster","class":"kafka","name":"dev","error":"kafka: client has run out of available brokers to talk to (Is your cluster reachable?)"}

Am I missing something? I don't see any other reference to this setup in the configuration notes.

Thanks!

@jrevillard

@toddpalino, I'm also interested in keytab authentication. Did you find a solution?

Best,
Jerome

@rja1

rja1 commented Nov 14, 2018

I never saw an update here. So, can Burrow authenticate using SASL_PLAINTEXT + keytab?

Thanks much!

@myloginid

myloginid commented Nov 20, 2018

Same question here: can we do SASL_SSL + Kerberos?

@tsrikanth06

I cannot get SASL + SCRAM to work with the latest Burrow. Sarama added SCRAM support in 1.22.1, and the latest Burrow does use that version. I get the same error: '"failed to start client","type":"module","coordinator":"cluster","class":"kafka","name":"local","error":"kafka: client has run out of available brokers to talk to (Is your cluster reachable?)"'

Can anyone let me know what configuration I need for it to work?

@jbresciani

@tsrikanth06 Does this help at all?

#526 (comment)

@smaley07

> Didn't ever see an update here. So, can Burrow auth using SASL_PLAINTEXT + Keytab ?
>
> Thanks much!

Hello @rja1
Did you find a way to achieve this ?
