Release notes for 1.0-M1 (#633)
* Prepare release notes
* Review links in docs and Scaladoc
* Add authors
ennru authored Nov 6, 2018
1 parent 59a1053 commit 86fea26
Showing 14 changed files with 95 additions and 17 deletions.
1 change: 1 addition & 0 deletions build.sbt
@@ -229,6 +229,7 @@ lazy val docs = project
"confluent.version" -> confluentAvroSerializerVersion,
"extref.akka-docs.base_url" -> s"https://doc.akka.io/docs/akka/$akkaVersion/%s",
"extref.kafka-docs.base_url" -> s"https://kafka.apache.org/$kafkaVersionForDocs/documentation/%s",
"extref.java-docs.base_url" -> "https://docs.oracle.com/en/java/javase/11/%s",
"scaladoc.scala.base_url" -> s"https://www.scala-lang.org/api/current/",
"scaladoc.akka.base_url" -> s"https://doc.akka.io/api/akka/$akkaVersion",
"scaladoc.akka.kafka.base_url" -> s"https://doc.akka.io/api/akka-stream-kafka/${version.value}/",
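
The new `extref.java-docs.base_url` property carries a `%s` placeholder. A Paradox `@extref` link such as `@extref[Flow](java-docs:docs/api/java.base/java/util/concurrent/Flow.html)` is resolved by substituting the part after the `java-docs:` prefix into that placeholder — a minimal sketch of the substitution, not Paradox's actual implementation:

```scala
object ExtrefResolution extends App {
  // Base URL as configured in build.sbt above; %s is the placeholder.
  val baseUrl = "https://docs.oracle.com/en/java/javase/11/%s"
  // The part of an @extref link after the "java-docs:" scheme prefix.
  val linkPath = "docs/api/java.base/java/util/concurrent/Flow.html"
  // => https://docs.oracle.com/en/java/javase/11/docs/api/java.base/java/util/concurrent/Flow.html
  println(baseUrl.format(linkPath))
}
```
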
2 changes: 1 addition & 1 deletion core/src/main/scala/akka/kafka/CommitterSettings.scala
@@ -49,7 +49,7 @@ object CommitterSettings {

/**
* Settings for committer. See `akka.kafka.committer` section in
* reference.conf. Note that the [[CommitterSettings companion]] object provides
* reference.conf. Note that the [[akka.kafka.CommitterSettings$ companion]] object provides
* `apply` and `create` functions for convenient construction of the settings, together with
* the `with` methods.
*/
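
The corrected link points at the companion object's factories. A minimal sketch of the construction pattern this Scaladoc describes, assuming the 1.0-M1 settings API:

```scala
import akka.actor.ActorSystem
import akka.kafka.CommitterSettings
import scala.concurrent.duration._

object CommitterSettingsSketch extends App {
  implicit val system: ActorSystem = ActorSystem("example")

  // `apply` reads the `akka.kafka.committer` section of reference.conf;
  // the `with` methods then override individual values.
  val committerSettings = CommitterSettings(system)
    .withMaxBatch(1000L)
    .withMaxInterval(10.seconds)
    .withParallelism(1)

  println(committerSettings)
  system.terminate()
}
```
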
2 changes: 1 addition & 1 deletion core/src/main/scala/akka/kafka/ConsumerSettings.scala
@@ -180,7 +180,7 @@ object ConsumerSettings {

/**
* Settings for consumers. See `akka.kafka.consumer` section in
* `reference.conf`. Note that the [[ConsumerSettings companion]] object provides
 * `reference.conf`. Note that the [[akka.kafka.ConsumerSettings$ companion]] object provides
* `apply` and `create` functions for convenient construction of the settings, together with
* the `with` methods.
*
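
A minimal sketch of the `apply`/`with` pattern this Scaladoc describes (bootstrap servers and group id are placeholders):

```scala
import akka.actor.ActorSystem
import akka.kafka.ConsumerSettings
import org.apache.kafka.clients.consumer.ConsumerConfig
import org.apache.kafka.common.serialization.StringDeserializer

object ConsumerSettingsSketch extends App {
  implicit val system: ActorSystem = ActorSystem("example")

  // `apply` reads the `akka.kafka.consumer` section of reference.conf;
  // the `with` methods override individual Kafka client properties.
  val consumerSettings = ConsumerSettings(system, new StringDeserializer, new StringDeserializer)
    .withBootstrapServers("localhost:9092")
    .withGroupId("group1")
    .withProperty(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest")

  println(consumerSettings)
  system.terminate()
}
```
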
2 changes: 1 addition & 1 deletion core/src/main/scala/akka/kafka/KafkaConsumerActor.scala
@@ -31,7 +31,7 @@ object KafkaConsumerActor {

/**
* Creates Props for the Kafka Consumer Actor with a reference back to the owner of it
* which will be signalled with [[akka.actor.Status.Failure(exception)]], in case the
* which will be signalled with [[akka.actor.Status.Failure Failure(exception)]], in case the
* Kafka client instance can't be created.
*/
def props[K, V](owner: ActorRef, settings: ConsumerSettings[K, V]): Props =
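
A minimal sketch of the owner contract described above, using the `props(owner, settings)` overload from this diff; the owner actor itself is hypothetical:

```scala
import akka.actor.{Actor, ActorRef, Status}
import akka.kafka.{ConsumerSettings, KafkaConsumerActor}

class ConsumerOwner(settings: ConsumerSettings[String, String]) extends Actor {
  // Pass `self` as owner so creation failures are signalled back here.
  val consumer: ActorRef = context.actorOf(KafkaConsumerActor.props(self, settings))

  def receive: Receive = {
    case Status.Failure(exception) =>
      // The Kafka client instance could not be created.
      context.stop(self)
  }
}
```
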
2 changes: 1 addition & 1 deletion core/src/main/scala/akka/kafka/ProducerSettings.scala
@@ -154,7 +154,7 @@ object ProducerSettings {

/**
* Settings for producers. See `akka.kafka.producer` section in
* reference.conf. Note that the [[ProducerSettings companion]] object provides
* reference.conf. Note that the [[akka.kafka.ProducerSettings$ companion]] object provides
* `apply` and `create` functions for convenient construction of the settings, together with
* the `with` methods.
*
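
And the matching sketch for producers (placeholder bootstrap servers):

```scala
import akka.actor.ActorSystem
import akka.kafka.ProducerSettings
import org.apache.kafka.common.serialization.StringSerializer

object ProducerSettingsSketch extends App {
  implicit val system: ActorSystem = ActorSystem("example")

  // `apply` reads the `akka.kafka.producer` section of reference.conf.
  val producerSettings = ProducerSettings(system, new StringSerializer, new StringSerializer)
    .withBootstrapServers("localhost:9092")

  println(producerSettings)
  system.terminate()
}
```
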
8 changes: 8 additions & 0 deletions docs/src/main/paradox/consumer.md
@@ -273,3 +273,11 @@ Scala

Java
: @@ snip [snip](/tests/src/test/java/docs/javadsl/ConsumerExample.java) { #shutdownCommitableSource }


@@@ index

* [subscription](subscription.md)
* [metadata](consumer-metadata.md)

@@@
2 changes: 1 addition & 1 deletion docs/src/main/paradox/debugging.md
@@ -17,7 +17,7 @@ The Kafka client library used by the Alpakka Kafka connector uses SLF4J, as well
version2=1.2.3
}

To enable Akka SLF4J logging, configure Akka in `application.conf` as below. Refer to the [Akka documentation](https://doc.akka.io/docs/akka/current/logging.html#slf4j) for details.
To enable Akka SLF4J logging, configure Akka in `application.conf` as below. Refer to the @extref[Akka documentation](akka-docs:logging.html#slf4j) for details.

```hocon
akka {
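
The `application.conf` block is truncated in this diff view. As a self-contained sketch, the standard Akka SLF4J settings can equally be supplied programmatically (assuming `akka-slf4j` and an SLF4J backend such as Logback are on the classpath):

```scala
import akka.actor.ActorSystem
import com.typesafe.config.ConfigFactory

object Slf4jLoggingSketch extends App {
  // Equivalent to the application.conf stanza referred to above.
  val config = ConfigFactory.parseString(
    """
      |akka {
      |  loggers = ["akka.event.slf4j.Slf4jLogger"]
      |  loglevel = "DEBUG"
      |  logging-filter = "akka.event.slf4j.Slf4jLoggingFilter"
      |}
    """.stripMargin).withFallback(ConfigFactory.load())

  val system = ActorSystem("example", config)
  system.log.debug("SLF4J-backed logging is active")
  system.terminate()
}
```
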
4 changes: 2 additions & 2 deletions docs/src/main/paradox/errorhandling.md
@@ -2,7 +2,7 @@

## Failing consumer

When a consumer fails to read from Kafka due to connection problems, it throws a @javadoc[WakeupException](org.apache.kafka.common.errors.WakeupException) which is handled internally with retries. Refer to consumer configuration [settings](consumer.html#settings) for details on `wakeup-timeout` and `max-wakeups` if you're interested in tweaking the retry handling parameters.
When a consumer fails to read from Kafka due to connection problems, it throws a @javadoc[WakeupException](org.apache.kafka.common.errors.WakeupException) which is handled internally with retries. Refer to consumer configuration @ref[settings](consumer.md#settings) for details on `wakeup-timeout` and `max-wakeups` if you're interested in tweaking the retry handling parameters.
When the currently configured number of `max-wakeups` is reached, the source stage will fail with an exception and stop.
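
A minimal sketch of tuning those two settings in code rather than configuration; `withWakeupTimeout` and `withMaxWakeups` are assumed to mirror the `wakeup-timeout` and `max-wakeups` keys in the settings API of this release:

```scala
import akka.actor.ActorSystem
import akka.kafka.ConsumerSettings
import org.apache.kafka.common.serialization.StringDeserializer
import scala.concurrent.duration._

object WakeupTuningSketch extends App {
  implicit val system: ActorSystem = ActorSystem("example")

  val consumerSettings = ConsumerSettings(system, new StringDeserializer, new StringDeserializer)
    .withBootstrapServers("localhost:9092")
    .withGroupId("group1")
    // Abandon a poll attempt after 3 seconds ...
    .withWakeupTimeout(3.seconds)
    // ... and give up (failing the source stage) after 10 wakeups.
    .withMaxWakeups(10)

  println(consumerSettings)
  system.terminate()
}
```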

## Failing producer
@@ -25,6 +25,6 @@ When a stream fails, library internals will handle all underlying resources.
@@@note { title=(de)serialization }

If a failure while reading from Kafka is caused by other reasons, like **deserialization problems**, the stage fails immediately. If you expect such cases, consider
consuming raw byte arrays and deserializing in a subsequent `map` stage where you can use supervision to skip failed elements. See also the ["At least once"](atleastonce.html) page for more suggestions.
consuming raw byte arrays and deserializing in a subsequent `map` stage where you can use supervision to skip failed elements; a sketch follows this note. See also the @ref:[Serialization](serialization.md) and @ref:["At least once"](atleastonce.md) pages for more suggestions.

@@@
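
A minimal sketch of the note's suggestion, assuming placeholder topic names and a trivial string-to-int payload; elements whose `map`-stage deserialization throws are skipped by the resuming supervision strategy:

```scala
import java.nio.charset.StandardCharsets

import akka.actor.ActorSystem
import akka.kafka.scaladsl.Consumer
import akka.kafka.{ConsumerSettings, Subscriptions}
import akka.stream.scaladsl.Sink
import akka.stream.{ActorAttributes, ActorMaterializer, Supervision}
import org.apache.kafka.common.serialization.ByteArrayDeserializer

object ResumingDeserializationSketch extends App {
  implicit val system: ActorSystem = ActorSystem("example")
  implicit val materializer: ActorMaterializer = ActorMaterializer()

  val settings = ConsumerSettings(system, new ByteArrayDeserializer, new ByteArrayDeserializer)
    .withBootstrapServers("localhost:9092")
    .withGroupId("group1")

  Consumer
    .plainSource(settings, Subscriptions.topics("topic1"))
    // Deserialize in a map stage instead of in the Kafka deserializer ...
    .map(record => new String(record.value, StandardCharsets.UTF_8).toInt)
    // ... so a parse failure only drops the element instead of failing the stage.
    .withAttributes(ActorAttributes.supervisionStrategy(Supervision.resumingDecider))
    .runWith(Sink.foreach(println))
}
```
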
13 changes: 9 additions & 4 deletions docs/src/main/paradox/home.md
@@ -1,19 +1,19 @@
# Overview

The [Alpakka project](https://developer.lightbend.com/docs/alpakka/current/) is an open source initiative to implement stream-aware and reactive integration pipelines for Java and Scala. It is built on top of [Akka Streams](https://doc.akka.io/docs/akka/current/stream/index.html), and has been designed from the ground up to understand streaming natively and provide a DSL for reactive and stream-oriented programming, with built-in support for backpressure. Akka Streams is a [Reactive Streams](https://www.reactive-streams.org/) and JDK 9+ [java.util.concurrent.Flow](https://docs.oracle.com/javase/10/docs/api/java/util/concurrent/Flow.html)-compliant implementation and therefore [fully interoperable](https://doc.akka.io/docs/akka/current/general/stream/stream-design.html#interoperation-with-other-reactive-streams-implementations) with other implementations.
The [Alpakka project](https://developer.lightbend.com/docs/alpakka/current/) is an open source initiative to implement stream-aware and reactive integration pipelines for Java and Scala. It is built on top of @extref[Akka Streams](akka-docs:stream/index.html), and has been designed from the ground up to understand streaming natively and provide a DSL for reactive and stream-oriented programming, with built-in support for backpressure. Akka Streams is a [Reactive Streams](https://www.reactive-streams.org/) and JDK 9+ @extref[java.util.concurrent.Flow](java-docs:docs/api/java.base/java/util/concurrent/Flow.html)-compliant implementation and therefore @extref[fully interoperable](akka-docs:general/stream/stream-design.html#interoperation-with-other-reactive-streams-implementations) with other implementations.

This **Alpakka Kafka connector** lets you connect [Apache Kafka](https://kafka.apache.org/) to Akka Streams. It was formerly known as **Akka Streams Kafka** and even **Reactive Kafka**.

## Versions

The examples in this documentation use

* Alpakka Kafka connector $project.version$ ([Github](https://github.com/akka/reactive-kafka), [API docs](https://doc.akka.io/api/akka-stream-kafka/current/#package))
* Alpakka Kafka connector $project.version$ ([Github](https://github.com/akka/alpakka-kafka), [API docs](https://doc.akka.io/api/akka-stream-kafka/current/akka/kafka/index.html))
* Scala $scala.binary.version$ (also available for Scala 2.11)
* Akka Streams $akka.version$ (@extref[Docs](akka-docs:stream/index.html), [Github](https://github.com/akka/akka))
* Apache Kafka client $kafka.version$ (@extref[Docs](kafka-docs:index.html), [Github](https://github.com/apache/kafka))

Release notes are found at [Github releases](https://github.com/akka/reactive-kafka/releases).
Release notes are found at @ref:[Release Notes](release-notes/index.md).

If you want to try out a connector that has not yet been released, give @ref[snapshots](snapshots.md) a spin; they are published after every commit on master.

@@ -22,7 +22,7 @@ If you want to try out a connector that has not yet been released, give @ref[snapshots](snapshots.md) a spin; they are published after every commit on master.

|Kafka | Akka version | Alpakka Kafka Connector
|-------|--------------|-------------------------
|2.0.x | 2.5.x | [release 1.0-M1](https://github.com/akka/reactive-kafka/releases)
|2.0.x | 2.5.x | @ref:[release 1.0-M1](release-notes/1.0-M1.md)
|1.1.x | 2.5.x | [release 0.20+](https://github.com/akka/reactive-kafka/releases)
|1.0.x | 2.5.x | [release 0.20+](https://github.com/akka/reactive-kafka/releases)
|0.11.x | 2.5.x | [release 0.19](https://github.com/akka/reactive-kafka/milestone/19?closed=1)
@@ -59,3 +59,8 @@ Please feel free to contribute to Alpakka and the Alpakka Kafka connector by rep
We want Akka and Alpakka to thrive in a welcoming and open atmosphere and expect all contributors to respect our [code of conduct](https://github.com/akka/reactive-kafka/blob/master/CODE_OF_CONDUCT.md).


@@@ index

* [release notes](release-notes/index.md)

@@@
4 changes: 1 addition & 3 deletions docs/src/main/paradox/index.md
@@ -1,6 +1,6 @@
# Alpakka Kafka Documentation

The [Alpakka project](https://developer.lightbend.com/docs/alpakka/current/) is an open source initiative to implement stream-aware and reactive integration pipelines for Java and Scala. It is built on top of [Akka Streams](https://doc.akka.io/docs/akka/current/stream/index.html), and has been designed from the ground up to understand streaming natively and provide a DSL for reactive and stream-oriented programming, with built-in support for backpressure. Akka Streams is a [Reactive Streams](https://www.reactive-streams.org/) and JDK 9+ [java.util.concurrent.Flow](https://docs.oracle.com/javase/10/docs/api/java/util/concurrent/Flow.html)-compliant implementation and therefore [fully interoperable](https://doc.akka.io/docs/akka/current/general/stream/stream-design.html#interoperation-with-other-reactive-streams-implementations) with other implementations.
The [Alpakka project](https://developer.lightbend.com/docs/alpakka/current/) is an open source initiative to implement stream-aware and reactive integration pipelines for Java and Scala. It is built on top of @extref[Akka Streams](akka-docs:stream/index.html), and has been designed from the ground up to understand streaming natively and provide a DSL for reactive and stream-oriented programming, with built-in support for backpressure. Akka Streams is a [Reactive Streams](https://www.reactive-streams.org/) and JDK 9+ @extref[java.util.concurrent.Flow](java-docs:docs/api/java.base/java/util/concurrent/Flow.html)-compliant implementation and therefore @extref[fully interoperable](akka-docs:general/stream/stream-design.html#interoperation-with-other-reactive-streams-implementations) with other implementations.

This **Alpakka Kafka connector** lets you connect [Apache Kafka](https://kafka.apache.org/) to Akka Streams. It was formerly known as **Akka Streams Kafka** and even **Reactive Kafka**.

@@ -11,8 +11,6 @@ This **Alpakka Kafka connector** lets you connect [Apache Kafka](https://kafka.apache.org/) to Akka Streams.
* [overview](home.md)
* [Producer](producer.md)
* [Consumer](consumer.md)
* [Subscription](subscription.md)
* [Consumer Metadata](consumer-metadata.md)
* [Error Handling](errorhandling.md)
* [At-Least-Once Delivery](atleastonce.md)
* [Transactions](transactions.md)
57 changes: 57 additions & 0 deletions docs/src/main/paradox/release-notes/1.0-M1.md
@@ -0,0 +1,57 @@
# Alpakka Kafka 1.0-M1

## First milestone release for Alpakka Kafka 1.0

On this [Road to Alpakka 1.0](https://akka.io/blog/news/2018/08/30/alpakka-towards-1.0) we may introduce non-compatible API changes - **but did not in Alpakka Kafka 1.0-M1 vs. 0.22**. From version 1.0 Alpakka will stay [binary-compatible](https://doc.akka.io/docs/akka/current/common/binary-compatibility-rules.html#binary-compatibility-rules) between minor releases.

That said, Alpakka Kafka will start to make use of the `@ApiMayChange` annotation to keep the door open for API changes, so that new feature APIs can evolve more rapidly than other parts of Alpakka Kafka.


## Highlights in this release

* Upgraded the Kafka client to version 2.0.0 [#544](https://github.com/akka/alpakka-kafka/pull/544) by [@fr3akX](https://github.com/fr3akX) (with huge improvements to come in [#614](https://github.com/akka/alpakka-kafka/pull/614) by an enormous effort of [@zaharidichev](https://github.com/zaharidichev))

* New `Committer.sink` for standardised committing [#622](https://github.com/akka/alpakka-kafka/pull/622) by [@rtimush](https://github.com/rtimush) (see the sketch after this list)

* Commit with metadata [#563](https://github.com/akka/alpakka-kafka/pull/563) and [#579](https://github.com/akka/alpakka-kafka/pull/579) by [@johnclara](https://github.com/johnclara)

* Factored out `akka.kafka.testkit` for internal and external use: see @ref:[Testing](../testing.md)

* Support for merging commit batches [#584](https://github.com/akka/alpakka-kafka/pull/584) by [@rtimush](https://github.com/rtimush)

* Reduced risk of message loss for partitioned sources [#589](https://github.com/akka/alpakka-kafka/pull/589)

* Expose Kafka errors to stream [#617](https://github.com/akka/alpakka-kafka/pull/617)

* Java APIs for all settings classes [#616](https://github.com/akka/alpakka-kafka/pull/616)

* Much more comprehensive tests
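
Of these highlights, `Committer.sink` is the easiest to show in isolation. A minimal sketch of plugging it under a committable source (bootstrap servers, group id and topic are placeholders):

```scala
import akka.actor.ActorSystem
import akka.kafka.scaladsl.{Committer, Consumer}
import akka.kafka.{CommitterSettings, ConsumerSettings, Subscriptions}
import akka.stream.ActorMaterializer
import org.apache.kafka.common.serialization.StringDeserializer

object CommitterSinkSketch extends App {
  implicit val system: ActorSystem = ActorSystem("example")
  implicit val materializer: ActorMaterializer = ActorMaterializer()

  val consumerSettings = ConsumerSettings(system, new StringDeserializer, new StringDeserializer)
    .withBootstrapServers("localhost:9092")
    .withGroupId("group1")

  val control = Consumer
    .committableSource(consumerSettings, Subscriptions.topics("topic1"))
    .map { msg =>
      // Business logic would go here before handing the offset to the sink.
      msg.committableOffset
    }
    .to(Committer.sink(CommitterSettings(system)))
    .run()
}
```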


## Improved documentation

* Documented @ref:[subscriptions](../subscription.md)

* Documented use of @ref:[serialization](../serialization.md)


Everything done in this release is [in the milestone](https://github.com/akka/alpakka-kafka/issues?q=milestone%3A1.0-M1).


## General information

This release is compiled and tested against [Akka 2.5](https://doc.akka.io/docs/akka/current/) and Scala 2.11 and 2.12.

This release was made possible by new and earlier contributors:

```
commits added removed
84 7657 4762 Enno
29 1543 693 Martynas Mickevičius
2 345 17 Roman Timushev
2 279 32 John Clara
1 24 18 Philippus Baalman
1 18 3 fr3akX
1 3 2 Sherif Ebady
1 2 1 Piotr Gabara
```
9 changes: 9 additions & 0 deletions docs/src/main/paradox/release-notes/index.md
@@ -0,0 +1,9 @@
# Release Notes

@@toc { depth=2 }

@@@ index

* [1.0-M1](1.0-M1.md)

@@@
4 changes: 2 additions & 2 deletions docs/src/main/paradox/serialization.md
@@ -77,7 +77,7 @@ Gradle
```


## Producer
### Producer

To create serializers that use the Schema Registry, its URL needs to be provided as the configuration `AbstractKafkaAvroSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG` to the serializer; that serializer is then used in the `ProducerSettings` (see the sketch at the end of this section).

@@ -89,7 +89,7 @@ Java



## Consumer
### Consumer

To create deserializers that use the Schema Registry, its URL needs to be provided as the configuration `AbstractKafkaAvroSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG` to the deserializer; that deserializer is then used in the `ConsumerSettings`.

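
A minimal sketch of the producer side of the pattern both paragraphs describe; the registry URL is a placeholder, and the cast reflects that `KafkaAvroSerializer` is untyped (`false` configures it as a value serializer):

```scala
import akka.actor.ActorSystem
import akka.kafka.ProducerSettings
import io.confluent.kafka.serializers.{AbstractKafkaAvroSerDeConfig, KafkaAvroSerializer}
import org.apache.avro.specific.SpecificRecord
import org.apache.kafka.common.serialization.{Serializer, StringSerializer}

import scala.collection.JavaConverters._

object SchemaRegistrySketch extends App {
  implicit val system: ActorSystem = ActorSystem("example")

  // Registry URL for the Confluent serializer; a placeholder value here.
  val serDeConfig = Map[String, Any](
    AbstractKafkaAvroSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG -> "http://localhost:8081")

  val kafkaAvroSerializer = new KafkaAvroSerializer()
  kafkaAvroSerializer.configure(serDeConfig.asJava, false)
  val valueSerializer = kafkaAvroSerializer.asInstanceOf[Serializer[SpecificRecord]]

  val producerSettings =
    ProducerSettings(system, new StringSerializer, valueSerializer)
      .withBootstrapServers("localhost:9092")

  println(producerSettings)
  system.terminate()
}
```
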
2 changes: 1 addition & 1 deletion project/plugins.sbt
@@ -7,7 +7,7 @@ addSbtPlugin("se.marcuslonnberg" % "sbt-docker" % "1.5.0")
addSbtPlugin("com.lightbend.paradox" % "sbt-paradox" % "0.4.2")
addSbtPlugin("com.lightbend.akka" % "sbt-paradox-akka" % "0.12")
addSbtPlugin("com.lightbend" % "sbt-whitesource" % "0.1.13")
addSbtPlugin("com.thoughtworks.sbt-api-mappings" % "sbt-api-mappings" % "2.1.0")
addSbtPlugin("com.thoughtworks.sbt-api-mappings" % "sbt-api-mappings" % "3.0.0")
addSbtPlugin("org.foundweekends" % "sbt-bintray" % "0.5.4")
addSbtPlugin("com.typesafe" % "sbt-mima-plugin" % "0.3.0")
addSbtPlugin("lt.dvim.paradox" % "sbt-paradox-local" % "0.2")
