Reading the Flink Job Submission Source (Part 2): The Entry Class CliFrontend


Continuing from the previous part, today we read the entry class of the Flink 1.9.0 job submission source code: org.apache.flink.client.cli.CliFrontend.
Previous post: Reading the Flink 1.9.0 Job Submission Source (Part 1): the flink script
Next post: Reading the Flink 1.9.0 Job Submission Source (Part 3): Job submission via run()

1. Entry class logic

Starting from main(), let's go straight to the code:

  /** Submits the job based on the arguments. */
  public static void main(final String[] args) {
    // 1. Log basic environment information
    EnvironmentInformation.logEnvironmentInfo(LOG, "Command Line Client", args);

    // 2. Find the configuration directory, e.g. .../flink-1.9.0/conf
    final String configurationDirectory = getConfigurationDirectoryFromEnv();

    // 3. Load the global configuration from flink-conf.yaml into a Configuration object
    final Configuration configuration =
        GlobalConfiguration.loadConfiguration(configurationDirectory);

    // 4. Load the custom command lines used to parse command-line arguments
    final List<CustomCommandLine<?>> customCommandLines =
        loadCustomCommandLines(configuration, configurationDirectory);

    try {
      // 5. Create the CliFrontend object
      final CliFrontend cli = new CliFrontend(configuration, customCommandLines);
      // 6. Load the SecurityConfiguration class, Flink's global security configuration
      SecurityUtils.install(new SecurityConfiguration(cli.configuration));
      // 7. Match the command-line arguments and run the program:
      //    CliFrontend#parseParameters parses the arguments, runs the requested action,
      //    and returns the program's exit code
      int retCode = SecurityUtils.getInstalledContext().runSecured(() -> cli.parseParameters(args));
      // 8. Exit with the returned code, shutting down the client
      System.exit(retCode);
    } catch (Throwable t) {
      final Throwable strippedThrowable =
          ExceptionUtils.stripException(t, UndeclaredThrowableException.class);
      LOG.error("Fatal error while running command line interface.", strippedThrowable);
      strippedThrowable.printStackTrace();
      System.exit(31);
    }
  }

The main function receives the command-line arguments and performs the following steps:
1. Log basic environment information.
2. Call getConfigurationDirectoryFromEnv to locate the Flink configuration directory from the FLINK_CONF_DIR environment variable.
3. Call GlobalConfiguration.loadConfiguration to load and parse flink-conf.yaml into a Configuration object.
4. Call loadCustomCommandLines to load the custom command lines (a YARN-mode command line and the default command line).
5. Initialize the CliFrontend object.
6. Call SecurityUtils.install to set up the security modules.
7. Match the command-line arguments in a switch/case, execute the corresponding action and callback, and obtain a return code. This is the main logic.
8. Exit the client with the returned status code.

2. Detailed analysis

2.1 Logging basic environment information

	/**
	 * Logs information about the environment, like code revision, current user, Java version,
	 * and JVM parameters.
	 *
	 * @param log The logger to log the information to.
	 * @param componentName The component name to mention in the log.
	 * @param commandLineArgs The arguments accompanying the starting the component.
	 */
	public static void logEnvironmentInfo(Logger log, String componentName, String[] commandLineArgs) {
		if (log.isInfoEnabled()) {
			// The final git commit id and date of the code
			RevisionInformation rev = getRevisionInformation();
			// Code version
			String version = getVersion();
			// JVM version, obtained via the JDK's ManagementFactory
			String jvmVersion = getJvmVersion();
			// JVM startup options, also obtained via ManagementFactory
			String[] options = getJvmStartupOptionsArray();
			// JAVA_HOME directory
			String javaHome = System.getenv("JAVA_HOME");
			// Maximum JVM heap size, in MiB
			long maxHeapMegabytes = getMaxJvmHeapMemory() >>> 20;

			// Log the basic information
			log.info("--------------------------------------------------------------------------------");
			log.info(" Starting " + componentName + " (Version: " + version + ", "
					+ "Rev:" + rev.commitId + ", " + "Date:" + rev.commitDate + ")");
			log.info(" OS current user: " + System.getProperty("user.name"));
			log.info(" Current Hadoop/Kerberos user: " + getHadoopUser());
			log.info(" JVM: " + jvmVersion);
			log.info(" Maximum heap size: " + maxHeapMegabytes + " MiBytes");
			log.info(" JAVA_HOME: " + (javaHome == null ? "(not set)" : javaHome));
			// Hadoop version information
			String hadoopVersionString = getHadoopVersionString();
			if (hadoopVersionString != null) {
				log.info(" Hadoop version: " + hadoopVersionString);
			} else {
				log.info(" No Hadoop Dependency available");
			}
			// Log the JVM startup options
			if (options.length == 0) {
				log.info(" JVM Options: (none)");
			}
			else {
				log.info(" JVM Options:");
				for (String s: options) {
					log.info("    " + s);
				}
			}
			// Program startup arguments
			if (commandLineArgs == null || commandLineArgs.length == 0) {
				log.info(" Program Arguments: (none)");
			}
			else {
				log.info(" Program Arguments:");
				for (String s: commandLineArgs) {
					log.info("    " + s);
				}
			}

			log.info(" Classpath: " + System.getProperty("java.class.path"));

			log.info("--------------------------------------------------------------------------------");
		}
	}
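A side note on the `>>> 20` above: an unsigned right shift by 20 bits divides by 2^20, converting bytes into MiB. A minimal sketch of the same conversion:

```java
/**
 * Demonstrates the bytes-to-MiB conversion used by logEnvironmentInfo
 * (getMaxJvmHeapMemory() >>> 20), here applied to this JVM's own max heap.
 */
public class HeapSizeDemo {
    public static void main(String[] args) {
        long maxHeapBytes = Runtime.getRuntime().maxMemory();
        long maxHeapMegabytes = maxHeapBytes >>> 20; // divide by 2^20
        System.out.println("Maximum heap size: " + maxHeapMegabytes + " MiBytes");
    }
}
```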

2.2 Locating the Flink configuration directory

getConfigurationDirectoryFromEnv locates the Flink configuration directory from the FLINK_CONF_DIR environment variable:

  public static String getConfigurationDirectoryFromEnv() {
    // Read the FLINK_CONF_DIR environment variable, which points to Flink's configuration directory.
    // The flink submission script calls config.sh, which determines the configuration directory
    // and exports it as an environment variable.
    // If the variable is set, verify that the directory exists and return its path.
    String location = System.getenv(ConfigConstants.ENV_FLINK_CONF_DIR);

    if (location != null) {
      if (new File(location).exists()) {
        return location;
      } else {
        throw new RuntimeException(
            "The configuration directory '"
                + location
                + "', specified in the '"
                + ConfigConstants.ENV_FLINK_CONF_DIR
                + "' environment variable, does not exist.");
      }
    } else if (new File(CONFIG_DIRECTORY_FALLBACK_1).exists()) {
      location = CONFIG_DIRECTORY_FALLBACK_1;
    } else if (new File(CONFIG_DIRECTORY_FALLBACK_2).exists()) {
      location = CONFIG_DIRECTORY_FALLBACK_2;
    } else {
      throw new RuntimeException(
          "The configuration directory was not specified. "
              + "Please specify the directory containing the configuration file through the '"
              + ConfigConstants.ENV_FLINK_CONF_DIR
              + "' environment variable.");
    }
    return location;
  }
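The resolution order above can be condensed into a small helper (a hypothetical stand-in, not Flink's code): first the environment variable, then two hard-coded fallback directories, otherwise fail.

```java
import java.io.File;

/**
 * Hypothetical helper mirroring getConfigurationDirectoryFromEnv's resolution
 * order: 1) the FLINK_CONF_DIR value if set (must exist), 2) two fallback
 * directories, 3) throw if nothing is found.
 */
public class ConfDirResolver {

    static String resolve(String envValue, String fallback1, String fallback2) {
        if (envValue != null) {
            if (new File(envValue).exists()) {
                return envValue;
            }
            // A set-but-invalid FLINK_CONF_DIR is an error, not a fallback case
            throw new RuntimeException(
                "FLINK_CONF_DIR points to a non-existing directory: " + envValue);
        }
        if (new File(fallback1).exists()) {
            return fallback1;
        }
        if (new File(fallback2).exists()) {
            return fallback2;
        }
        throw new RuntimeException("No configuration directory could be found.");
    }

    public static void main(String[] args) {
        // Use the system temp dir as a stand-in for an existing conf dir
        String tmp = System.getProperty("java.io.tmpdir");
        System.out.println(resolve(tmp, "/no/such/dir1", "/no/such/dir2"));
        System.out.println(resolve(null, tmp, "/no/such/dir2"));
    }
}
```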

2.3 Loading and parsing flink-conf.yaml

GlobalConfiguration.loadConfiguration loads the settings from flink-conf.yaml and parses them into a Configuration object:

  /**
   * Loads the configuration files from the specified directory.
   *
   * <p>YAML files are supported as configuration files.
   *
   * @param configDir the directory which contains the configuration files
   */
  public static Configuration loadConfiguration(final String configDir) {
    return loadConfiguration(configDir, null);
  }

This in turn calls the two-argument loadConfiguration overload:

  /**
   * Loads the configuration files from the specified directory. If the dynamic properties
   * configuration is not null, then it is added to the loaded configuration.
   *
   * @param configDir directory to load the configuration from
   * @param dynamicProperties configuration file containing the dynamic properties. Null if none.
   * @return The configuration loaded from the given configuration directory
   */
  public static Configuration loadConfiguration(
      final String configDir, @Nullable final Configuration dynamicProperties) {

    if (configDir == null) {
      throw new IllegalArgumentException(
          "Given configuration directory is null, cannot load configuration");
    }

    final File confDirFile = new File(configDir);
    if (!(confDirFile.exists())) {
      throw new IllegalConfigurationException(
          "The given configuration directory name '"
              + configDir
              + "' ("
              + confDirFile.getAbsolutePath()
              + ") does not describe an existing directory.");
    }
    // 1. The configuration directory exists; locate Flink's configuration file flink-conf.yaml
    // get Flink yaml configuration file
    final File yamlConfigFile = new File(confDirFile, FLINK_CONF_FILENAME);

    if (!yamlConfigFile.exists()) {
      throw new IllegalConfigurationException(
          "The Flink config file '"
              + yamlConfigFile
              + "' ("
              + confDirFile.getAbsolutePath()
              + ") does not exist.");
    }
    // 2. [Core logic] With the file located, call loadYAMLResource to parse the YAML file
    //    and return a Configuration backed by key/value pairs
    Configuration configuration = loadYAMLResource(yamlConfigFile);

    if (dynamicProperties != null) {
      configuration.addAll(dynamicProperties);
    }

    return enrichWithEnvironmentVariables(configuration);
  }
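For intuition, loadYAMLResource is essentially a line-based "key: value" parser rather than a full YAML parser. The sketch below is a simplified illustration under that assumption (the real method also logs malformed lines and handles quoting details):

```java
import java.util.HashMap;
import java.util.Map;

/**
 * Simplified illustration of flink-conf.yaml parsing: each non-comment line
 * is split on the first ": " into a key and a value; '#' starts a comment.
 */
public class SimpleFlinkConfParser {

    static Map<String, String> parse(String[] lines) {
        Map<String, String> config = new HashMap<>();
        for (String line : lines) {
            // Strip trailing comments; skip blank/comment-only lines
            String conf = line.split("#", 2)[0].trim();
            if (conf.isEmpty()) {
                continue;
            }
            String[] kv = conf.split(": ", 2);
            if (kv.length == 2) {
                config.put(kv[0].trim(), kv[1].trim());
            }
            // Lines without a ": " separator are ignored in this sketch
        }
        return config;
    }

    public static void main(String[] args) {
        String[] lines = {
            "jobmanager.rpc.address: localhost",
            "# a comment",
            "parallelism.default: 1"
        };
        System.out.println(parse(lines));
    }
}
```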

2.4 Loading the custom command lines

loadCustomCommandLines loads the custom command lines: a YARN-mode command line and the default command line. The logic:

/**
  * Loads the custom command lines.
  * @param configuration the configuration
  * @param configurationDirectory the configuration directory
  * @return the list of custom command lines
  */
public static List<CustomCommandLine<?>> loadCustomCommandLines(Configuration configuration, String configurationDirectory) {
    // 1. Initialize a command-line container with capacity 2.
    List<CustomCommandLine<?>> customCommandLines = new ArrayList<>(2);
    //	Command line interface of the YARN session, with a special initialization here
    //	to prefix all options with y/yarn.
    //	Tips: DefaultCLI must be added at last, because getActiveCustomCommandLine(..) will get the
    //	      active CustomCommandLine in order and DefaultCLI isActive always return true.
    // 2. The YARN session command-line interface; all of its options are prefixed with y/yarn.
    final String flinkYarnSessionCLI = "org.apache.flink.yarn.cli.FlinkYarnSessionCli";
    try {
        // 3. Add the YARN-mode command line
        customCommandLines.add(
            loadCustomCommandLine(flinkYarnSessionCLI,
                                  configuration,
                                  configurationDirectory,
                                  "y",
                                  "yarn"));
    } catch (NoClassDefFoundError | Exception e) {
        LOG.warn("Could not load CLI class {}.", flinkYarnSessionCLI, e);
    }

    // 4. Add the default command line
    customCommandLines.add(new DefaultCLI(configuration));

    return customCommandLines;
}
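The ordering hint in the comment above ("DefaultCLI must be added at last") can be shown with a minimal, self-contained illustration (hypothetical types, not Flink's): the frontend later walks the list in order and picks the first command line whose isActive(...) returns true, and the default command line is always active, so anything added after it could never be selected.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

/** Why the always-active default command line must come last in the list. */
public class CommandLineOrdering {

    interface Cli {
        boolean isActive(String[] args);
        String name();
    }

    /** Stand-in for the YARN CLI: active only when a yarn-specific flag is present. */
    static Cli yarnCli() {
        return new Cli() {
            public boolean isActive(String[] args) {
                return Arrays.asList(args).contains("-yid");
            }
            public String name() { return "yarn"; }
        };
    }

    /** Stand-in for DefaultCLI: always active. */
    static Cli defaultCli() {
        return new Cli() {
            public boolean isActive(String[] args) { return true; }
            public String name() { return "default"; }
        };
    }

    /** Mirrors getActiveCustomCommandLine: first active entry wins. */
    static Cli firstActive(List<Cli> clis, String[] args) {
        for (Cli cli : clis) {
            if (cli.isActive(args)) {
                return cli;
            }
        }
        throw new IllegalStateException("No active command line");
    }

    public static void main(String[] args) {
        List<Cli> clis = new ArrayList<>();
        clis.add(yarnCli());    // specific command lines first
        clis.add(defaultCli()); // the always-active default goes last
        System.out.println(firstActive(clis, new String[]{"-yid", "app_1"}).name()); // yarn
        System.out.println(firstActive(clis, new String[]{"run"}).name());           // default
    }
}
```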

Next we look at how the YARN-mode command line and the default command line are added. In the class hierarchy, both the YARN-mode client and the default client implement the CustomCommandLine interface (the class diagram is not reproduced here).

Adding the YARN-mode command line

/**
  * Builds a command line via reflection.
  * @param className the fully qualified name of the class to load
  * @param params the constructor arguments
  */
private static CustomCommandLine<?> loadCustomCommandLine(String className, Object... params) throws IllegalAccessException, InvocationTargetException, InstantiationException, ClassNotFoundException, NoSuchMethodException {

    // 1. Load the class from the classpath; it must implement the CustomCommandLine interface
    Class<? extends CustomCommandLine> customCliClass =
        Class.forName(className).asSubclass(CustomCommandLine.class);

    // 2. Derive the parameter Class types from the argument values
    Class<?>[] types = new Class<?>[params.length];
    for (int i = 0; i < params.length; i++) {
        Preconditions.checkNotNull(params[i], "Parameters for custom command-lines may not be null.");
        types[i] = params[i].getClass();
    }
    // 3. Look up the matching constructor of org.apache.flink.yarn.cli.FlinkYarnSessionCli
    Constructor<? extends CustomCommandLine> constructor = customCliClass.getConstructor(types);

    // 4. Instantiate by invoking that constructor of org.apache.flink.yarn.cli.FlinkYarnSessionCli
    return constructor.newInstance(params);
}
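The same reflection pattern can be demonstrated in a self-contained way, with a local stand-in class instead of the real FlinkYarnSessionCli: derive the parameter types from the argument values, look up the matching constructor, and instantiate it.

```java
import java.lang.reflect.Constructor;

/** Reflection-based instantiation, as used by loadCustomCommandLine. */
public class ReflectiveFactory {

    /** Stand-in for a CLI class that is loaded by name from the classpath. */
    public static class DemoCli {
        public final String shortPrefix;
        public final String longPrefix;
        public DemoCli(String shortPrefix, String longPrefix) {
            this.shortPrefix = shortPrefix;
            this.longPrefix = longPrefix;
        }
    }

    static Object instantiate(String className, Object... params) throws Exception {
        Class<?> clazz = Class.forName(className);
        Class<?>[] types = new Class<?>[params.length];
        for (int i = 0; i < params.length; i++) {
            // Note: getClass() yields the exact runtime type, so getConstructor
            // only finds constructors whose parameter types match exactly.
            types[i] = params[i].getClass();
        }
        Constructor<?> ctor = clazz.getConstructor(types);
        return ctor.newInstance(params);
    }

    public static void main(String[] args) throws Exception {
        DemoCli cli = (DemoCli) instantiate(DemoCli.class.getName(), "y", "yarn");
        System.out.println(cli.shortPrefix + " / " + cli.longPrefix);
    }
}
```

Because the types are taken from the runtime classes of the arguments, passing a subclass instance would fail the constructor lookup; Flink's call sites pass exactly-typed arguments.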

So the object is instantiated through a constructor of FlinkYarnSessionCli. Which constructor of org.apache.flink.yarn.cli.FlinkYarnSessionCli gets invoked is determined by the constructor parameter types:

public FlinkYarnSessionCli(
	Configuration configuration,
	String configurationDirectory,
	String shortPrefix,
	String longPrefix) throws FlinkException {
	  this(configuration, configurationDirectory, shortPrefix, longPrefix, true);
}

This delegates to this(configuration, configurationDirectory, shortPrefix, longPrefix, true):

/**
 * Initializes a FlinkYarnSessionCli.
 * @param configuration the global configuration
 * @param configurationDirectory the global configuration directory
 * @param shortPrefix the short prefix for command-line options
 * @param longPrefix the long prefix for command-line options
 * @param acceptInteractiveInput whether to accept interactive input
 * @throws FlinkException
 */
public FlinkYarnSessionCli(
    Configuration configuration,
    String configurationDirectory,
    String shortPrefix,
    String longPrefix,
    boolean acceptInteractiveInput) throws FlinkException {
    // 1. Initialize fields
    super(configuration);
    this.configurationDirectory = Preconditions.checkNotNull(configurationDirectory);
    this.acceptInteractiveInput = acceptInteractiveInput;

    // 2. Create the command-line options
    query = new Option(shortPrefix + "q", longPrefix + "query", false, "Display available YARN resources (memory, cores)");
    applicationId = new Option(shortPrefix + "id", longPrefix + "applicationId", true, "Attach to running YARN session");
    queue = new Option(shortPrefix + "qu", longPrefix + "queue", true, "Specify YARN queue.");
    shipPath = new Option(shortPrefix + "t", longPrefix + "ship", true, "Ship files in the specified directory (t for transfer)");
    flinkJar = new Option(shortPrefix + "j", longPrefix + "jar", true, "Path to Flink jar file");
    jmMemory = new Option(shortPrefix + "jm", longPrefix + "jobManagerMemory", true, "Memory for JobManager Container with optional unit (default: MB)");
    tmMemory = new Option(shortPrefix + "tm", longPrefix + "taskManagerMemory", true, "Memory per TaskManager Container with optional unit (default: MB)");
    container = new Option(shortPrefix + "n", longPrefix + "container", true, "Number of YARN container to allocate (=Number of Task Managers)");
    slots = new Option(shortPrefix + "s", longPrefix + "slots", true, "Number of slots per TaskManager");
    dynamicproperties = Option.builder(shortPrefix + "D")
        .argName("property=value")
        .numberOfArgs(2)
        .valueSeparator()
        .desc("use value for given property")
        .build();
    streaming = new Option(shortPrefix + "st", longPrefix + "streaming", false, "Start Flink in streaming mode");
    name = new Option(shortPrefix + "nm", longPrefix + "name", true, "Set a custom name for the application on YARN");
    zookeeperNamespace = new Option(shortPrefix + "z", longPrefix + "zookeeperNamespace", true, "Namespace to create the Zookeeper sub-paths for high availability mode");
    nodeLabel = new Option(shortPrefix + "nl", longPrefix + "nodeLabel", true, "Specify YARN node label for the YARN application");
    help = new Option(shortPrefix + "h", longPrefix + "help", false, "Help for the Yarn session CLI.");

    allOptions = new Options();
    allOptions.addOption(flinkJar);
    allOptions.addOption(jmMemory);
    allOptions.addOption(tmMemory);
    allOptions.addOption(container);
    allOptions.addOption(queue);
    allOptions.addOption(query);
    allOptions.addOption(shipPath);
    allOptions.addOption(slots);
    allOptions.addOption(dynamicproperties);
    allOptions.addOption(DETACHED_OPTION);
    allOptions.addOption(SHUTDOWN_IF_ATTACHED_OPTION);
    allOptions.addOption(YARN_DETACHED_OPTION);
    allOptions.addOption(streaming);
    allOptions.addOption(name);
    allOptions.addOption(applicationId);
    allOptions.addOption(zookeeperNamespace);
    allOptions.addOption(nodeLabel);
    allOptions.addOption(help);

    // 3. Determine the location of the default yarn properties file
    this.yarnPropertiesFileLocation = configuration.getString(YarnConfigOptions.PROPERTIES_FILE_LOCATION);
    final File yarnPropertiesLocation = getYarnPropertiesLocation(yarnPropertiesFileLocation);

    // 4. Parse the yarn properties
    yarnPropertiesFile = new Properties();

    if (yarnPropertiesLocation.exists()) {
        LOG.info("Found Yarn properties file under {}.", yarnPropertiesLocation.getAbsolutePath());

        try (InputStream is = new FileInputStream(yarnPropertiesLocation)) {
            yarnPropertiesFile.load(is);
        } catch (IOException ioe) {
            throw new FlinkException("Could not read the Yarn properties file " + yarnPropertiesLocation +
                                     ". Please delete the file at " + yarnPropertiesLocation.getAbsolutePath() + '.', ioe);
        }

        final String yarnApplicationIdString = yarnPropertiesFile.getProperty(YARN_APPLICATION_ID_KEY);

        if (yarnApplicationIdString == null) {
            throw new FlinkException("Yarn properties file found but doesn't contain a " +
                                     "Yarn application id. Please delete the file at " + yarnPropertiesLocation.getAbsolutePath());
        }

        try {
            // Try to convert the id string into an ApplicationId
            yarnApplicationIdFromYarnProperties = ConverterUtils.toApplicationId(yarnApplicationIdString);
        }
        catch (Exception e) {
            throw new FlinkException("YARN properties contains an invalid entry for " +
                                     "application id: " + yarnApplicationIdString + ". Please delete the file at " +
                                     yarnPropertiesLocation.getAbsolutePath(), e);
        }
    } else {
        yarnApplicationIdFromYarnProperties = null;
    }
    // 5. Initialize the yarn configuration
    this.yarnConfiguration = new YarnConfiguration();
}
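Step 4 above (reading the .yarn-properties file and extracting the application id) can be sketched in isolation. Note the property key "applicationID" below is an assumption standing in for Flink's YARN_APPLICATION_ID_KEY constant, and the real code also converts the string into an ApplicationId via ConverterUtils.toApplicationId:

```java
import java.io.StringReader;
import java.util.Properties;

/** Reading a yarn properties file and extracting the application id. */
public class YarnPropertiesDemo {

    static String readApplicationId(String fileContents) throws Exception {
        Properties props = new Properties();
        props.load(new StringReader(fileContents));
        String id = props.getProperty("applicationID"); // assumed key name
        if (id == null) {
            throw new IllegalStateException(
                "Yarn properties file found but doesn't contain a Yarn application id");
        }
        return id;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(readApplicationId("applicationID=application_1570000000000_0001\n"));
    }
}
```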

Adding the default-mode command line

public DefaultCLI(Configuration configuration) {
    super(configuration);
}

2.5 Initializing the CliFrontend object

The CliFrontend constructor:

public CliFrontend(
    Configuration configuration,
    List<CustomCommandLine<?>> customCommandLines) throws Exception {
    // 1. Check the constructor arguments for null and store them
    this.configuration = Preconditions.checkNotNull(configuration);
    this.customCommandLines = Preconditions.checkNotNull(customCommandLines);

    try {
        // 2. Initialize the file system
        FileSystem.initialize(
        configuration, PluginUtils.createPluginManagerFromRootFolder(configuration));
    } catch (IOException e) {
        throw new Exception("Error while setting the default " +
                            "filesystem scheme from configuration.", e);
    }
    // 3. Add options to the command-line options object:
    this.customCommandLineOptions = new Options();
    // iterate over the custom command lines and add their general and run options
    for (CustomCommandLine<?> customCommandLine : customCommandLines) {
        customCommandLine.addGeneralOptions(customCommandLineOptions);
        customCommandLine.addRunOptions(customCommandLineOptions);
    }

    // 4. Read the akka client timeout (akka.client.timeout) from the global configuration
    this.clientTimeout = AkkaUtils.getClientTimeout(this.configuration);
    // 5. Read the default parallelism from the global configuration
    this.defaultParallelism = configuration.getInteger(CoreOptions.DEFAULT_PARALLELISM);
}

2.6 Loading the security configuration

SecurityUtils.install sets up the security modules. The security mechanism is installed by calling:

SecurityUtils.install(new SecurityConfiguration(cli.configuration));

We first look at how the SecurityConfiguration object is initialized, then at the install logic of SecurityUtils.

SecurityConfiguration initialization

/**
  * Creates the security configuration from the global configuration.
  * @param flinkConf the global Flink configuration
  */
public SecurityConfiguration(Configuration flinkConf) {
    this(flinkConf, DEFAULT_MODULES);
}

where DEFAULT_MODULES is the default list of security modules:

// The default security modules
private static final List<SecurityModuleFactory> DEFAULT_MODULES = Collections.unmodifiableList(
    Arrays.asList(new HadoopModuleFactory(), new JaasModuleFactory(), new ZookeeperModuleFactory()));

Continuing:

/**
 * Creates the security configuration from the global configuration.
 * @param flinkConf the global Flink configuration
 * @param securityModuleFactories the security modules to apply
 */
public SecurityConfiguration(Configuration flinkConf,
        List<SecurityModuleFactory> securityModuleFactories) {
    // 1. Read some global settings
    this.isZkSaslDisable = flinkConf.getBoolean(SecurityOptions.ZOOKEEPER_SASL_DISABLE);
    this.keytab = flinkConf.getString(SecurityOptions.KERBEROS_LOGIN_KEYTAB);
    this.principal = flinkConf.getString(SecurityOptions.KERBEROS_LOGIN_PRINCIPAL);
    this.useTicketCache = flinkConf.getBoolean(SecurityOptions.KERBEROS_LOGIN_USETICKETCACHE);
    this.loginContextNames = parseList(flinkConf.getString(SecurityOptions.KERBEROS_LOGIN_CONTEXTS));
    this.zkServiceName = flinkConf.getString(SecurityOptions.ZOOKEEPER_SASL_SERVICE_NAME);
    this.zkLoginContextName = flinkConf.getString(SecurityOptions.ZOOKEEPER_SASL_LOGIN_CONTEXT_NAME);

    // 2. The security modules, here the default modules
    this.securityModuleFactories = Collections.unmodifiableList(securityModuleFactories);
    this.flinkConfig = checkNotNull(flinkConf);
    // 3. Validate
    validate();
}

Let's look further at the validate logic:

/**
  * Validation.
  */
private void validate() {
    if (!StringUtils.isBlank(keytab)) {
        // principal is required
        if (StringUtils.isBlank(principal)) {
            throw new IllegalConfigurationException("Kerberos login configuration is invalid; keytab requires a principal.");
        }

        // check the keytab is readable
        File keytabFile = new File(keytab);
        if (!keytabFile.exists() || !keytabFile.isFile() || !keytabFile.canRead()) {
            throw new IllegalConfigurationException("Kerberos login configuration is invalid; keytab is unreadable");
        }
    }
}

If security.kerberos.login.keytab is set in the global configuration (flink-conf.yaml), the validation checks that the file it points to exists, is a regular file, and is readable, and that a principal is configured as well. Some familiarity with Kerberos authentication helps here.
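The validation rules above in plain form (a hypothetical helper, not Flink's class): if a keytab path is configured, a principal must also be configured, and the keytab file must exist, be a regular file, and be readable.

```java
import java.io.File;

/** Kerberos login configuration checks mirroring SecurityConfiguration#validate. */
public class KerberosConfigCheck {

    static void validate(String keytab, String principal) {
        if (keytab != null && !keytab.trim().isEmpty()) {
            // principal is required whenever a keytab is configured
            if (principal == null || principal.trim().isEmpty()) {
                throw new IllegalArgumentException(
                    "Kerberos login configuration is invalid; keytab requires a principal.");
            }
            // the keytab file must be readable
            File keytabFile = new File(keytab);
            if (!keytabFile.exists() || !keytabFile.isFile() || !keytabFile.canRead()) {
                throw new IllegalArgumentException(
                    "Kerberos login configuration is invalid; keytab is unreadable: " + keytab);
            }
        }
    }

    public static void main(String[] args) throws Exception {
        validate(null, null); // no keytab configured: nothing to check
        File tmp = File.createTempFile("demo", ".keytab");
        tmp.deleteOnExit();
        validate(tmp.getAbsolutePath(), "flink/host@EXAMPLE.COM"); // passes
        System.out.println("validation passed");
    }
}
```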

The install logic of SecurityUtils

Once the SecurityConfiguration object is initialized, it is passed into SecurityUtils.install. The logic:

/**
  * Installs a process-wide security configuration.
  *
  * <p>Applies the configuration using the available security modules (i.e. Hadoop, JAAS).
  */
public static void install(SecurityConfiguration config) throws Exception {

    // Install the security modules.
    List<SecurityModule> modules = new ArrayList<>();
    try {
        // Iterate over the module factories and install each module.
        for (SecurityModuleFactory moduleFactory : config.getSecurityModuleFactories()) {
            SecurityModule module = moduleFactory.createModule(config);
            // can be null if a SecurityModule is not supported in the current environment
            if (module != null) {
                module.install();
                modules.add(module);
            }
        }
    }
    catch (Exception ex) {
        throw new Exception("unable to establish the security context", ex);
    }
    installedModules = modules;

    // First check if we have Hadoop in the ClassPath. If not, we simply don't do anything.
    try {
        Class.forName(
            "org.apache.hadoop.security.UserGroupInformation",
            false,
            SecurityUtils.class.getClassLoader());

        // install a security context
        // use the Hadoop login user as the subject of the installed security context
        if (!(installedContext instanceof NoOpSecurityContext)) {
            LOG.warn("overriding previous security context");
        }
        UserGroupInformation loginUser = UserGroupInformation.getLoginUser();
        installedContext = new HadoopSecurityContext(loginUser);
    } catch (ClassNotFoundException e) {
        LOG.info("Cannot install HadoopSecurityContext because Hadoop cannot be found in the Classpath.");
    } catch (LinkageError e) {
        LOG.error("Cannot install HadoopSecurityContext.", e);
    }
}

The security modules installed here cover the Java Authentication and Authorization Service (JAAS), Hadoop's UserGroupInformation (UGI), and ZooKeeper's security setup.

2.7 Executing the requested action and returning the status code

The command-line arguments are matched in a switch/case, the corresponding action and callback are executed, and a status code is returned.

int retCode = SecurityUtils.getInstalledContext()
					.runSecured(() -> cli.parseParameters(args));

runSecured():

/**
 * A security context that may be required to run a Callable.
 */
public interface SecurityContext {

	<T> T runSecured(Callable<T> securedCallable) throws Exception;

}

The actual work happens in cli.parseParameters(args):

	/**
	 * Parses the command-line arguments and starts the requested action.
	 *
	 * @param args the command-line arguments of the client
	 * @return the return status code of the program
	 */
	 public int parseParameters(String[] args) {

    // check for action
    if (args.length < 1) {
      CliFrontendParser.printHelp(customCommandLines);
      System.out.println("Please specify an action.");
      return 1;
    }

    // get the action to execute, e.g. run, list, cancel
    String action = args[0];

    // remove the action from the parameters
    final String[] params = Arrays.copyOfRange(args, 1, args.length);

    try {
      // do action
      switch (action) {
        case ACTION_RUN:
          run(params);
          return 0;
        case ACTION_LIST:
          list(params);
          return 0;
        case ACTION_INFO:
          info(params);
          return 0;
        case ACTION_CANCEL:
          cancel(params);
          return 0;
        case ACTION_STOP:
          stop(params);
          return 0;
        case ACTION_SAVEPOINT:
          savepoint(params);
          return 0;
        case "-h":
        case "--help":
          CliFrontendParser.printHelp(customCommandLines);
          return 0;
        case "-v":
        case "--version":
          String version = EnvironmentInformation.getVersion();
          String commitID = EnvironmentInformation.getRevisionInformation().commitId;
          System.out.print("Version: " + version);
          System.out.println(
              commitID.equals(EnvironmentInformation.UNKNOWN) ? "" : ", Commit ID: " + commitID);
          return 0;
        default:
          System.out.printf("\"%s\" is not a valid action.\n", action);
          System.out.println();
          System.out.println(
              "Valid actions are \"run\", \"list\", \"info\", \"savepoint\", \"stop\", or \"cancel\".");
          System.out.println();
          System.out.println(
              "Specify the version option (-v or --version) to print Flink version.");
          System.out.println();
          System.out.println("Specify the help option (-h or --help) to get help on the command.");
          return 1;
      }
    } catch (CliArgsException ce) {
      return handleArgException(ce);
    } catch (ProgramParametrizationException ppe) {
      return handleParametrizationException(ppe);
    } catch (ProgramMissingJobException pmje) {
      return handleMissingJobException();
    } catch (Exception e) {
      return handleError(e);
    }
  }
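The dispatch boils down to: the first argument selects the action, the remaining arguments are forwarded, and unknown actions yield a non-zero code. A stand-in sketch (the real methods of course run the corresponding action rather than just returning):

```java
/** Minimal stand-in for CliFrontend#parseParameters's action dispatch. */
public class ActionDispatch {

    static int dispatch(String[] args) {
        if (args.length < 1) {
            return 1; // no action given
        }
        String action = args[0];
        // the real code forwards Arrays.copyOfRange(args, 1, args.length) to the action
        switch (action) {
            case "run":
            case "list":
            case "info":
            case "cancel":
            case "stop":
            case "savepoint":
                return 0; // would call run(params), list(params), ...
            case "-h":
            case "--help":
            case "-v":
            case "--version":
                return 0; // would print help / version
            default:
                return 1; // not a valid action
        }
    }

    public static void main(String[] args) {
        System.out.println(dispatch(new String[]{"run", "job.jar"})); // 0
        System.out.println(dispatch(new String[]{"frobnicate"}));     // 1
        System.out.println(dispatch(new String[]{}));                 // 1
    }
}
```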

The most important piece is the logic behind ./flink run. It is long enough to deserve its own post:
Reading the Flink 1.9.0 Job Submission Source (Part 3): Job submission via run()

2.8 Exiting with the returned status code

After the selected action finishes, main() receives the return code and calls System.exit(retCode) to shut down the client process.


Reference blog: https://blog.csdn.net/hxcaifly/article/details/87864154
