docs(ingest): update links to Kafka docs (#2834)
hsheth2 authored Jul 6, 2021
1 parent 3d9f4ec commit 288d17f
Showing 3 changed files with 7 additions and 5 deletions.
metadata-ingestion/README.md (6 changes: 4 additions & 2 deletions)
@@ -157,7 +157,7 @@ source:
       schema_registry_config: {} # passed to https://docs.confluent.io/platform/current/clients/confluent-kafka-python/html/index.html#confluent_kafka.schema_registry.SchemaRegistryClient
 ```
-For a full example with a number of security options, see this [example recipe](./examples/recipes/secured_kafka_to_console.yml).
+For a full example with a number of security options, see this [example recipe](./examples/recipes/secured_kafka.yml).
 ### MySQL Metadata `mysql`

@@ -895,11 +895,13 @@ sink:
   config:
     connection:
       bootstrap: "localhost:9092"
-      producer_config: {} # passed to https://docs.confluent.io/platform/current/clients/confluent-kafka-python/index.html#serializingproducer
+      producer_config: {} # passed to https://docs.confluent.io/platform/current/clients/confluent-kafka-python/html/index.html#serializingproducer
       schema_registry_url: "http://localhost:8081"
       schema_registry_config: {} # passed to https://docs.confluent.io/platform/current/clients/confluent-kafka-python/html/index.html#confluent_kafka.schema_registry.SchemaRegistryClient
 ```

+For a full example with a number of security options, see this [example recipe](./examples/recipes/secured_kafka.yml).
+
 ### Console `console`

 Simply prints each metadata event to stdout. Useful for experimentation and debugging purposes.
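The `producer_config` and `schema_registry_config` blocks in the recipe above are handed to the confluent-kafka clients referenced in the linked docs. As a rough sketch of that mapping (not DataHub's actual sink code; the security keys and credentials below are placeholder librdkafka / Schema Registry client options), it amounts to something like:

```python
# Illustrative only: how the recipe's `connection` block plausibly maps onto the
# confluent-kafka objects linked above. The security keys shown are standard
# librdkafka / Schema Registry client options with placeholder values.
from confluent_kafka import SerializingProducer
from confluent_kafka.schema_registry import SchemaRegistryClient

producer_config = {
    # contents of the recipe's `producer_config: {}` are passed through as-is
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "PLAIN",
    "sasl.username": "datahub",
    "sasl.password": "datahub-password",
}
schema_registry_config = {
    # contents of the recipe's `schema_registry_config: {}` are passed through as-is
    "basic.auth.user.info": "datahub:datahub-password",
}

producer = SerializingProducer(
    {"bootstrap.servers": "localhost:9092", **producer_config}
)
schema_registry_client = SchemaRegistryClient(
    {"url": "http://localhost:8081", **schema_registry_config}
)
```

The secured_kafka.yml recipe linked above expresses the same kind of security options directly in recipe YAML.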
metadata-ingestion/src/datahub/configuration/kafka.py (6 changes: 3 additions & 3 deletions)
@@ -14,7 +14,7 @@ class _KafkaConnectionConfig(ConfigModel):

     # Extra schema registry config.
     # These options will be passed into Kafka's SchemaRegistryClient.
-    # See https://docs.confluent.io/platform/current/clients/confluent-kafka-python/index.html?highlight=schema%20registry#schemaregistryclient.
+    # See https://docs.confluent.io/platform/current/clients/confluent-kafka-python/html/index.html?#schemaregistryclient
     schema_registry_config: dict = Field(default_factory=dict)

     @validator("bootstrap")
@@ -43,7 +43,7 @@ class KafkaConsumerConnectionConfig(_KafkaConnectionConfig):

     # Extra consumer config.
     # These options will be passed into Kafka's DeserializingConsumer.
-    # See https://docs.confluent.io/platform/current/clients/confluent-kafka-python/index.html#deserializingconsumer
+    # See https://docs.confluent.io/platform/current/clients/confluent-kafka-python/html/index.html#deserializingconsumer
     # and https://github.com/edenhill/librdkafka/blob/master/CONFIGURATION.md.
     consumer_config: dict = Field(default_factory=dict)

@@ -53,6 +53,6 @@ class KafkaProducerConnectionConfig(_KafkaConnectionConfig):

     # Extra producer config.
     # These options will be passed into Kafka's SerializingProducer.
-    # See https://docs.confluent.io/platform/current/clients/confluent-kafka-python/index.html#serializingproducer
+    # See https://docs.confluent.io/platform/current/clients/confluent-kafka-python/html/index.html#serializingproducer
     # and https://github.com/edenhill/librdkafka/blob/master/CONFIGURATION.md.
     producer_config: dict = Field(default_factory=dict)
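On the consumer side the same pattern applies: whatever is placed in `consumer_config` ends up in the DeserializingConsumer configuration alongside `bootstrap`. A minimal usage sketch follows, assuming `ConfigModel` is a regular Pydantic model (so `parse_obj` is available); it is not the actual ingestion source code, and the group id is hypothetical.

```python
# Hypothetical sketch, not DataHub's kafka source implementation.
from confluent_kafka import DeserializingConsumer
from datahub.configuration.kafka import KafkaConsumerConnectionConfig

connection = KafkaConsumerConnectionConfig.parse_obj(
    {
        "bootstrap": "localhost:9092",
        "schema_registry_url": "http://localhost:8081",
        # Any librdkafka key from CONFIGURATION.md may appear here; it is
        # passed through to the consumer untouched.
        "consumer_config": {"security.protocol": "SSL"},
    }
)

consumer = DeserializingConsumer(
    {
        "bootstrap.servers": connection.bootstrap,
        "group.id": "datahub-ingestion-example",  # hypothetical group id
        **connection.consumer_config,
    }
)
```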
