Building Flink from Source (Windows Environment)
Published: 2019-06-12


Preface

I recently started tinkering with Flink. Before digging into the code, compiling it is the first step.

Build Environment

Windows 7, JDK 8, Maven 3

Build Steps

Start with the official documentation at https://ci.apache.org/projects/flink/flink-docs-release-1.6/start/building.html, reproduced below:

Building Flink from Source

This page covers how to build Flink 1.6.1 from sources.

In order to build Flink you need the source code. Either download the source of a release or clone the git repository.

In addition you need Maven 3 and a JDK (Java Development Kit). Flink requires at least Java 8 to build.
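As a quick sanity check before building, you can verify both requirements from a Windows command prompt (standard JDK and Maven commands; the rem lines are just comments):

rem Should report Java 1.8 or higher
java -version
rem Should report Maven 3.x (ideally 3.2.5, per the shading note above)
mvn --version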

NOTE: Maven 3.3.x can build Flink, but will not properly shade away certain dependencies. Maven 3.2.5 creates the libraries properly. To build unit tests use Java 8u51 or above to prevent failures in unit tests that use the PowerMock runner.

To clone from git, enter:

git clone https://github.com/apache/flink
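The clone gives you the master branch; to build the 1.6.1 sources this page describes, check out the matching release tag first (Flink tags releases as release-x.y.z; list the tags to confirm the exact name):

cd flink
rem List available release tags, then check out the one to build
git tag
git checkout release-1.6.1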

The simplest way of building Flink is by running:

mvn clean install -DskipTests

This instructs Maven (mvn) to first remove all existing builds (clean) and then create a new Flink binary (install).

To speed up the build you can skip tests, QA plugins, and JavaDocs:

mvn clean install -DskipTests -Dfast
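If you only need the distribution module, Maven's standard -pl (project list) and -am (also make dependencies) options can narrow the build. This is plain Maven usage rather than a Flink-documented mode, and since flink-dist depends on most modules the savings may be modest:

mvn clean install -DskipTests -Dfast -pl flink-dist -am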

The default build adds a Flink-specific JAR for Hadoop 2, to allow using Flink with HDFS and YARN.

Dependency Shading

Flink shades some of the libraries it uses, in order to avoid version clashes with user programs that use different versions of these libraries. Among the shaded libraries are Google Guava, Asm, Apache Curator, Apache HTTP Components, Netty, and others.
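To see shading in action, you can list the distribution jar's contents and look for relocated packages; the jar path below is an assumption based on the 1.6.x naming scheme, and org.apache.flink.shaded is the relocation prefix used in the Flink codebase:

jar tf flink-dist\target\flink-dist_2.11-1.6.1.jar | findstr org.apache.flink.shaded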

The dependency shading mechanism was recently changed in Maven and requires users to build Flink slightly differently, depending on their Maven version:

Maven 3.0.x, 3.1.x, and 3.2.x: It is sufficient to call mvn clean install -DskipTests in the root directory of the Flink code base.

Maven 3.3.x: The build has to be done in two steps: first in the base directory, then in the distribution project:

mvn clean install -DskipTests
cd flink-dist
mvn clean install

Note: To check your Maven version, run mvn --version.

Hadoop Versions

Info Most users do not need to do this manually. The Flink download page contains binary packages for common Hadoop versions.

Flink has dependencies to HDFS and YARN, which are both dependencies from Apache Hadoop. There exist many different versions of Hadoop (from both the upstream project and the different Hadoop distributions). If you are using a wrong combination of versions, exceptions can occur.

Hadoop is only supported from version 2.4.0 upwards. You can also specify a specific Hadoop version to build against:

mvn clean install -DskipTests -Dhadoop.version=2.6.1

Vendor-specific Versions (specifying a Hadoop distribution vendor)

To build Flink against a vendor specific Hadoop version, issue the following command:

mvn clean install -DskipTests -Pvendor-repos -Dhadoop.version=2.6.1-cdh5.0.0

The -Pvendor-repos flag activates a Maven build profile that includes the repositories of popular Hadoop vendors such as Cloudera, Hortonworks, or MapR.

The official docs show a CDH vendor version; here is an example for an HDP (Hortonworks) vendor version:

 

mvn clean install -DskipTests -Pvendor-repos -Dhadoop.version=2.7.3.2.6.1.114-2
Detailed version strings can be found at http://repo.hortonworks.com/content/repositories/releases/org/apache/hadoop/hadoop-common/
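If you would rather not rely on -Pvendor-repos, the same repository can be declared as a profile in your own Maven settings.xml. A minimal sketch follows; the profile id hdp-repo is arbitrary, and you would activate it with -Phdp-repo:

<profile>
  <id>hdp-repo</id>
  <repositories>
    <repository>
      <id>hortonworks-releases</id>
      <url>http://repo.hortonworks.com/content/repositories/releases/</url>
    </repository>
  </repositories>
</profile>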

 

Scala Versions

Info Users that purely use the Java APIs and libraries can ignore this section.

Flink has APIs, libraries, and runtime modules written in Scala. Users of the Scala API and libraries may have to match the Scala version of Flink with the Scala version of their projects (because Scala is not strictly backwards compatible).

Flink currently builds only with Scala version 2.11.

We are working on supporting Scala 2.12, but certain breaking changes in Scala 2.12 make this a more involved effort. Please check the Flink issue tracker for updates.

Encrypted File Systems

If your home directory is encrypted you might encounter a java.io.IOException: File name too long exception. Some encrypted file systems, like encfs used by Ubuntu, do not allow long filenames, which is the cause of this error.

The workaround is to add:

-Xmax-classfile-name 128

in the compiler configuration of the pom.xml file of the module causing the error. For example, if the error appears in the flink-yarn module, the above option should be added under the <configuration> tag of the scala-maven-plugin.
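Put together, the relevant fragment of that module's pom.xml would look roughly like this; a sketch only, assuming the module already declares scala-maven-plugin (its groupId is net.alchim31.maven), with version and other configuration omitted:

<plugin>
  <groupId>net.alchim31.maven</groupId>
  <artifactId>scala-maven-plugin</artifactId>
  <configuration>
    <args>
      <!-- Keep generated class file names short enough for encrypted file systems -->
      <arg>-Xmax-classfile-name</arg>
      <arg>128</arg>
    </args>
  </configuration>
</plugin>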

Build Output

 

 

The build output will be located under flink\flink-dist\target\flink-1.6.0-bin\flink-1.6.0.
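To sanity-check the build, you can try starting a local cluster directly from that directory; the .bat script below shipped with Flink releases of this era on Windows, but confirm it exists in bin\ first:

cd flink\flink-dist\target\flink-1.6.0-bin\flink-1.6.0
rem Start a local cluster, then open http://localhost:8081 for the dashboard
bin\start-cluster.bat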

 

Reposted from: https://www.cnblogs.com/felixzh/p/9685529.html
