# Review Kafka auto-configuration and support for defining additional beans #40174
Spring Boot 2.7.x is no longer supported. Please upgrade to 3.1.x or later and let us know if the problem remains. If it does, we can take a further look.
Thanks for the fast reply! I tried it in other services already bumped to Spring Boot 3.x, and the same thing happened.
Thank you. I think #19221 is relevant here. Reading the discussion there, I am reminded that the current behaviour is intentional: the intent here is to auto-configure a default `ConsumerFactory`. I think this may have to be fixed by updating Spring Kafka's documentation. Before we go down that route, flagging for a team meeting in case there's anything that I've overlooked and a fix in Boot is actually possible.
We discussed this and concluded that there's nothing we can do about this in the short- or medium-term. In the longer term, we'd like to reconsider the current behavior as part of investigating service bindings and support for auto-configuring multiple beans (#15732, #22403). In the meantime, the documentation for Spring Kafka should be updated. I've opened spring-projects/spring-kafka#3242 for that.
I checked the issue you linked, and it makes sense for the use case mentioned there. With the upcoming change in the Spring Kafka documentation, the third workaround from the description becomes the actual solution for this particular problem. Thanks for the help!
Documentation added to Spring Kafka via spring-projects/spring-kafka#3243.
### Summary

Over-defining the `ConsumerFactory` instance in our configuration, as stated in the `spring-kafka` docs, won't take effect; the default instance is created and used instead. With a minor modification to the bean definition the intended behavior can be achieved, but it isn't straightforward. To work the way the documentation states, some `spring-boot-autoconfigure` changes are required, or, if that is not possible, the `spring-kafka` documentation should be modified accordingly.

### Details
I'm working with Spring Boot-based microservices connected by Kafka. The message values are in JSON format, in the serialized form of a common DTO, let's call it `Document`. We deserialize them with a custom deserializer based on the `spring-kafka` `JsonDeserializer` for the `Document` type, configured programmatically by defining our own `ConsumerFactory` and applying our custom deserializer to it.
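A minimal sketch of the kind of bean definition we use, following the `spring-kafka` docs (the `KafkaConsumerConfig` class name and the property values are illustrative):

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.support.serializer.JsonDeserializer;

@Configuration
public class KafkaConsumerConfig {

    // Declared with concrete type arguments, as shown in the spring-kafka docs.
    // This is exactly the form that the Boot auto-configuration does not pick up.
    @Bean
    public ConsumerFactory<String, Document> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // illustrative
        return new DefaultKafkaConsumerFactory<>(props,
                new StringDeserializer(), new JsonDeserializer<>(Document.class));
    }
}
```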
### Expected behavior

Based on the `spring-kafka` documentation, we should be able to override the default factory this way, with our custom instance using the custom deserializer.

### What happens instead
The service gets runtime errors about failed deserialization instead, as it deserializes the message with the default deserializer into a `byte[]` and then tries to cast it into a `Document`. On the other hand, if the property-based config is provided, the service runs without issues.
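For reference, the property-based configuration that does work looks roughly like this (`com.example.Document` stands in for the real DTO class):

```properties
spring.kafka.consumer.value-deserializer=org.springframework.kafka.support.serializer.JsonDeserializer
spring.kafka.consumer.properties.spring.json.value.default.type=com.example.Document
```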
Based on these symptoms I'd say that it is not our `ConsumerFactory` instance that is used when creating the `Consumer`, but the default one. The default picks up the property-based configuration, but if the property is not provided it uses the default deserializer. (I verified this by putting a breakpoint into the `KafkaListenerContainerFactory` creation in `KafkaAnnotationDrivenConfiguration`; the injected factory is in fact null.) Our programmatically configured `@Bean` should override the default `ConsumerFactory`, so we should be able to deserialize the messages without setting the property.

### Reason
I saw that in `KafkaAnnotationDrivenConfiguration` the `ConsumerFactory` is injected like this:
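(Paraphrased from the `spring-boot-autoconfigure` 2.7.x sources; the exact code may differ slightly.)

```java
@Bean
@ConditionalOnMissingBean(name = "kafkaListenerContainerFactory")
ConcurrentKafkaListenerContainerFactory<?, ?> kafkaListenerContainerFactory(
        ConcurrentKafkaListenerContainerFactoryConfigurer configurer,
        ObjectProvider<ConsumerFactory<Object, Object>> kafkaConsumerFactory) {
    ConcurrentKafkaListenerContainerFactory<Object, Object> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    // Falls back to a DefaultKafkaConsumerFactory built from the application
    // properties when no ConsumerFactory<Object, Object> bean is available.
    configurer.configure(factory, kafkaConsumerFactory.getIfAvailable(
            () -> new DefaultKafkaConsumerFactory<>(this.properties.buildConsumerProperties())));
    return factory;
}
```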
The problem I found is that if we define a `ConsumerFactory` instance the way it is stated in the `spring-kafka` docs, it won't be injected here, because a `ConsumerFactory<String, Document>` is not an instance of `ConsumerFactory<Object, Object>`.

### Workarounds
1. Use the property-based configuration instead of the `@Bean` definition; then it works as expected.
2. Define the whole `ConcurrentKafkaListenerContainerFactory` in our configuration.
3. Define the `ConsumerFactory` with wildcards, like this:
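A minimal sketch of that wildcard variant, reusing the illustrative setup from the earlier example:

```java
// Only the declared return type changes; the factory itself is built exactly
// as before, with the concrete deserializers.
@Bean
public ConsumerFactory<?, ?> consumerFactory() {
    Map<String, Object> props = new HashMap<>();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // illustrative
    return new DefaultKafkaConsumerFactory<>(props,
            new StringDeserializer(), new JsonDeserializer<>(Document.class));
}
```

With the wildcard declaration, Spring's generic type matching accepts the bean at the `ConsumerFactory<Object, Object>` injection point, so our custom deserializer is actually used.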
### Proposed solution

IMO, using wildcards when injecting the `ConsumerFactory` into the `kafkaListenerContainerFactory` would make the auto-configuration comply with the `spring-kafka` documentation: the `ConsumerFactory` could then be defined with concrete type arguments instead of wildcards. If that is not possible for some reason I'm not aware of, then the `spring-kafka` docs should be modified to show the correct usage in the example.
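Concretely, the injection point could use wildcards, something like this (a sketch, not a tested patch; the unchecked cast is mine):

```java
@Bean
@ConditionalOnMissingBean(name = "kafkaListenerContainerFactory")
ConcurrentKafkaListenerContainerFactory<?, ?> kafkaListenerContainerFactory(
        ConcurrentKafkaListenerContainerFactoryConfigurer configurer,
        ObjectProvider<ConsumerFactory<?, ?>> kafkaConsumerFactory) {
    ConcurrentKafkaListenerContainerFactory<Object, Object> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    // A user-defined ConsumerFactory<String, Document> now matches the
    // wildcard injection point; narrow it for the configurer.
    @SuppressWarnings("unchecked")
    ConsumerFactory<Object, Object> consumerFactory = (ConsumerFactory<Object, Object>) kafkaConsumerFactory
            .getIfAvailable(() -> new DefaultKafkaConsumerFactory<>(this.properties.buildConsumerProperties()));
    configurer.configure(factory, consumerFactory);
    return factory;
}
```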
### Versions

- spring-boot-autoconfigure:2.7.10
- spring-kafka:2.8.11
This specific part of the `spring-boot-autoconfigure` implementation is the same in the latest version too, as far as I could see, so I think the same issue would persist after a version upgrade as well, but I didn't check that.