05 Sep 2016
The Apache Flink community released another bugfix release of the Apache Flink 1.1 series.
We recommend all users upgrade to Flink 1.1.2. To do so, simply update the Flink version in your project's pom.xml dependencies:
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-java</artifactId>
<version>1.1.2</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_2.10</artifactId>
<version>1.1.2</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients_2.10</artifactId>
<version>1.1.2</version>
</dependency>
You can find the binaries on the updated Downloads page.
Release Notes - Flink - Version 1.1.2

Bug

- [FLINK-4236] - Flink Dashboard stops showing list of uploaded jars if main method cannot be looked up
- [FLINK-4309] - Potential null pointer dereference in DelegatingConfiguration#keySet()
- [FLINK-4334] - Shaded Hadoop1 jar not fully excluded in Quickstart
- [FLINK-4341] - Kinesis connector does not emit maximum watermark properly
- [FLINK-4402] - Wrong metrics parameter names in documentation
- [FLINK-4409] - class conflict between jsr305-1.3.9.jar and flink-shaded-hadoop2-1.1.1.jar
- [FLINK-4411] - [py] Chained dual input children are not properly propagated
- [FLINK-4412] - [py] Chaining does not properly handle broadcast variables
- [FLINK-4425] - "Out Of Memory" during savepoint deserialization
- [FLINK-4454] - Lookups for JobManager address in config
- [FLINK-4480] - Incorrect link to elastic.co in documentation
- [FLINK-4486] - JobManager not fully running when yarn-session.sh finishes
- [FLINK-4488] - Prevent cluster shutdown after job execution for non-detached jobs
- [FLINK-4514] - ExpiredIteratorException in Kinesis Consumer on long catch-ups to head of stream
- [FLINK-4526] - ApplicationClient: remove redundant proxy messages

Improvement

- [FLINK-3866] - StringArraySerializer claims type is immutable; shouldn't
- [FLINK-3899] - Document window processing with Reduce/FoldFunction + WindowFunction
- [FLINK-4302] - Add JavaDocs to MetricConfig
- [FLINK-4495] - Running multiple jobs on yarn (without yarn-session)