Fix typos across multiple documents (#3538)
bky373 authored Oct 7, 2024
1 parent 3155c6b commit 367f40f
Showing 8 changed files with 11 additions and 11 deletions.
@@ -906,7 +906,7 @@ Also, a `StringOrBytesSerializer` is now available; it can serialize `byte[]`, `
See xref:kafka/serdes.adoc#messaging-message-conversion[Spring Messaging Message Conversion] for more information.

The `JsonSerializer`, `JsonDeserializer` and `JsonSerde` now have fluent APIs to make programmatic configuration simpler.
-See the javadocs, xref:kafka/serdes.adoc[Serialization, Deserialization, and Message Conversion], and xref:streams.adoc#serde[Streams JSON Serialization and Deserialization] for more informaion.
+See the javadocs, xref:kafka/serdes.adoc[Serialization, Deserialization, and Message Conversion], and xref:streams.adoc#serde[Streams JSON Serialization and Deserialization] for more information.

[[cb-2-2-and-2-3-replyingkafkatemplate]]
=== ReplyingKafkaTemplate
@@ -29,6 +29,6 @@ public ApplicationRunner runner(KafkaTemplate<String, Object> template, KafkaLis
}
----

-As the code above shows, the application uses the `KafkaListenerEndpointRegistry` to gain access to the message listener container and then calling the `enforceRebalnce` API on it.
+As the code above shows, the application uses the `KafkaListenerEndpointRegistry` to gain access to the message listener container and then calling the `enforceRebalance` API on it.
When calling the `enforceRebalance` on the listener container, it delegates the call to the underlying Kafka consumer.
-The Kafka consumer will trigger a rebalance as part of the next `poll()` operation.
+The Kafka consumer will trigger a rebalance as part of the next `poll()` operation.
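The pattern this hunk describes can be sketched as follows. This is a hedged illustration only: the bean wiring and the listener `id` (`"myListener"`) are assumptions for the example, not part of the commit.

```java
import org.springframework.kafka.config.KafkaListenerEndpointRegistry;
import org.springframework.kafka.listener.MessageListenerContainer;
import org.springframework.stereotype.Component;

@Component
public class RebalanceTrigger {

    private final KafkaListenerEndpointRegistry registry;

    public RebalanceTrigger(KafkaListenerEndpointRegistry registry) {
        this.registry = registry;
    }

    public void triggerRebalance() {
        // "myListener" is a hypothetical @KafkaListener id
        MessageListenerContainer container = registry.getListenerContainer("myListener");
        // Delegates to the underlying Kafka consumer; the rebalance
        // itself happens as part of the consumer's next poll()
        container.enforceRebalance();
    }
}
```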
@@ -256,7 +256,7 @@ public KafkaListenerContainerFactory<?> batchFactory() {
}
----

-NOTE: Starting with version 2.8, you can override the factory's `batchListener` propery using the `batch` property on the `@KafkaListener` annotation.
+NOTE: Starting with version 2.8, you can override the factory's `batchListener` property using the `batch` property on the `@KafkaListener` annotation.
This, together with the changes to xref:kafka/annotation-error-handling.adoc#error-handlers[Container Error Handlers] allows the same factory to be used for both record and batch listeners.

NOTE: Starting with version 2.9.6, the container factory has separate setters for the `recordMessageConverter` and `batchMessageConverter` properties.
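A minimal sketch of the per-listener `batch` override described in the first note above; the listener `id` and topic name are made-up for the example.

```java
import java.util.List;
import org.springframework.kafka.annotation.KafkaListener;

public class BatchOverrideListener {

    // Overrides the factory's batchListener property for this listener only;
    // "batchOverride" and "myTopic" are illustrative names.
    @KafkaListener(id = "batchOverride", topics = "myTopic", batch = "true")
    public void listen(List<String> messages) {
        messages.forEach(System.out::println);
    }
}
```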
@@ -404,7 +404,7 @@ public class Listener {
}
----

-If, in the unlikely event that you have an actual bean called `__listener`, you can change the expression token byusing the `beanRef` attribute.
+If, in the unlikely event that you have an actual bean called `__listener`, you can change the expression token by using the `beanRef` attribute.
The following example shows how to do so:

[source, java]
@@ -52,7 +52,7 @@ containerProperties.setConsumerRebalanceListener(new ConsumerAwareRebalanceListe
----

IMPORTANT: Starting with version 2.4, a new method `onPartitionsLost()` has been added (similar to a method with the same name in `ConsumerRebalanceLister`).
-The default implementation on `ConsumerRebalanceLister` simply calls `onPartionsRevoked`.
+The default implementation on `ConsumerRebalanceLister` simply calls `onPartitionsRevoked`.
The default implementation on `ConsumerAwareRebalanceListener` does nothing.
When supplying the listener container with a custom listener (of either type), it is important that your implementation does not call `onPartitionsRevoked` from `onPartitionsLost`.
If you implement `ConsumerRebalanceListener` you should override the default method.
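A hedged sketch of a custom `ConsumerAwareRebalanceListener` that respects the constraint in this hunk (`onPartitionsLost` must not call `onPartitionsRevoked`); the method bodies are illustrative assumptions.

```java
import java.util.Collection;
import org.apache.kafka.clients.consumer.Consumer;
import org.apache.kafka.common.TopicPartition;
import org.springframework.kafka.listener.ConsumerAwareRebalanceListener;

public class MyRebalanceListener implements ConsumerAwareRebalanceListener {

    @Override
    public void onPartitionsRevokedBeforeCommit(Consumer<?, ?> consumer,
            Collection<TopicPartition> partitions) {
        // Clean up per-partition state before the container commits offsets
    }

    @Override
    public void onPartitionsLost(Consumer<?, ?> consumer,
            Collection<TopicPartition> partitions) {
        // Partitions were lost, not revoked: discard local state only.
        // Deliberately does NOT call onPartitionsRevoked, per the note above.
    }
}
```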
@@ -216,7 +216,7 @@ Example:
public class MyListener extends AbstractConsumerSeekAware {
@KafkaListener(...)
-    void listn(...) {
+    void listen(...) {
...
}
}
@@ -349,7 +349,7 @@ public ProducerFactory<Integer, Object> producerFactory(Map<String, Object> conf
----

Starting with version 2.8.3, you can configure the serializer to check if the map key is assignable from the target object, useful when a delegate serializer can serialize sub classes.
-In this case, if there are amiguous matches, an ordered `Map`, such as a `LinkedHashMap` should be provided.
+In this case, if there are ambiguous matches, an ordered `Map`, such as a `LinkedHashMap` should be provided.
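To see why an insertion-ordered map disambiguates assignable matches, here is a self-contained sketch. The lookup logic is an illustrative stand-in for first-assignable-match delegate selection, not the library's actual implementation.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class OrderedDelegateLookup {

    // Stand-in for choosing a delegate serializer by type: with assignability
    // checks, both String and CharSequence match a String payload, so an
    // insertion-ordered map makes the outcome deterministic.
    static String delegateFor(Object payload, Map<Class<?>, String> delegates) {
        for (Map.Entry<Class<?>, String> e : delegates.entrySet()) {
            if (e.getKey().isAssignableFrom(payload.getClass())) {
                return e.getValue(); // first assignable entry wins
            }
        }
        return null;
    }

    public static void main(String[] args) {
        Map<Class<?>, String> delegates = new LinkedHashMap<>();
        delegates.put(String.class, "stringSerializer");       // most specific first
        delegates.put(CharSequence.class, "charSeqSerializer"); // broader fallback
        System.out.println(delegateFor("hello", delegates));
    }
}
```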

[[by-topic]]
=== By Topic
@@ -336,7 +336,7 @@ Of course, the `recoverer()` bean can be your own implementation of `ConsumerRec
Starting with version 3.2, Spring for Apache Kafka provides basic facilities required for interactive queries in Kafka Streams.
Interactive queries are useful in stateful Kafka Streams applications since they provide a way to constantly query the stateful stores in the application.
Thus, if an application wants to materialize the current view of the system under consideration, interactive queries provide a way to do that.
-To learn more about interacive queries, see this https://kafka.apache.org/36/documentation/streams/developer-guide/interactive-queries.html[article].
+To learn more about interactive queries, see this https://kafka.apache.org/36/documentation/streams/developer-guide/interactive-queries.html[article].
The support in Spring for Apache Kafka is centered around an API called `KafkaStreamsInteractiveQueryService` which is a facade around interactive queries APIs in Kafka Streams library.
An application can create an instance of this service as a bean and then later on use it to retrieve the state store by its name.

@@ -376,7 +376,7 @@ Here is the type signature from the API.
public <T> T retrieveQueryableStore(String storeName, QueryableStoreType<T> storeType)
----

-When calling this method, the user can specifially ask for the proper state store type, as we have done in the above example.
+When calling this method, the user can specifically ask for the proper state store type, as we have done in the above example.
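A hedged usage sketch of `retrieveQueryableStore`: the bean wiring, the store name `"app-store"`, and the key/value types are assumptions for the example.

```java
import org.apache.kafka.streams.state.QueryableStoreTypes;
import org.apache.kafka.streams.state.ReadOnlyKeyValueStore;
import org.springframework.kafka.streams.KafkaStreamsInteractiveQueryService;

public class StoreQuery {

    private final KafkaStreamsInteractiveQueryService queryService;

    public StoreQuery(KafkaStreamsInteractiveQueryService queryService) {
        this.queryService = queryService;
    }

    public Long countFor(String key) {
        // "app-store" is a hypothetical state store name; the store type
        // parameter tells the service which kind of read-only view to return
        ReadOnlyKeyValueStore<String, Long> store = queryService
                .retrieveQueryableStore("app-store", QueryableStoreTypes.keyValueStore());
        return store.get(key);
    }
}
```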

=== Retrying State Store Retrieval

@@ -166,7 +166,7 @@ public void sendToKafka(String in) {
[[tip-json]]
== Customizing the JsonSerializer and JsonDeserializer

-The serializer and deserializer support a number of cusomizations using properties, see xref:kafka/serdes.adoc#json-serde[JSON] for more information.
+The serializer and deserializer support a number of customizations using properties, see xref:kafka/serdes.adoc#json-serde[JSON] for more information.
The `kafka-clients` code, not Spring, instantiates these objects, unless you inject them directly into the consumer and producer factories.
If you wish to configure the (de)serializer using properties, but wish to use, say, a custom `ObjectMapper`, simply create a subclass and pass the custom mapper into the `super` constructor. For example:
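A sketch of the subclass-and-`super`-constructor approach this paragraph describes; the particular mapper customization shown is an assumption, not from the docs.

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializationFeature;
import org.springframework.kafka.support.JacksonUtils;
import org.springframework.kafka.support.serializer.JsonSerializer;

public class MyJsonSerializer extends JsonSerializer<Object> {

    public MyJsonSerializer() {
        // Passing the custom mapper to the super constructor means the
        // customization applies even when kafka-clients instantiates this
        // class via its no-arg constructor
        super(customMapper());
    }

    private static ObjectMapper customMapper() {
        ObjectMapper mapper = JacksonUtils.enhancedObjectMapper();
        // Illustrative customization:
        mapper.disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);
        return mapper;
    }
}
```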

