Reading the Flink Job Submission Source Code (Part 2): The Entry Class CliFrontend


Continuing from the previous part, this post reads through the entry class of the Flink 1.9.0 job submission source code: org.apache.flink.client.cli.CliFrontend.
Previous post: Reading the Flink 1.9.0 Job Submission Source Code (Part 1): the flink script
Next post: Reading the Flink 1.9.0 Job Submission Source Code (Part 3): Job submission via run()

1. Entry class logic

Starting from main(), straight to the code:

  /** Submits the job based on the arguments. */
  public static void main(final String[] args) {
    // 1. print basic environment information
    EnvironmentInformation.logEnvironmentInfo(LOG, "Command Line Client", args);

    // 2. find the configuration directory, e.g. .../flink-1.9.0/conf
    final String configurationDirectory = getConfigurationDirectoryFromEnv();

    // 3. load the global configuration from flink-conf.yaml into a Configuration object
    final Configuration configuration =
        GlobalConfiguration.loadConfiguration(configurationDirectory);

    // 4. load the custom command lines
    final List<CustomCommandLine<?>> customCommandLines =
        loadCustomCommandLines(configuration, configurationDirectory);

    try {
      // 5. create the CliFrontend object
      final CliFrontend cli = new CliFrontend(configuration, customCommandLines);
      // 6. load the SecurityConfiguration, Flink's global security configuration
      SecurityUtils.install(new SecurityConfiguration(cli.configuration));
      // 7. run the program: CliFrontend#parseParameters parses the command-line
      //    arguments, dispatches to the matching action, and returns a status code
      int retCode = SecurityUtils.getInstalledContext().runSecured(() -> cli.parseParameters(args));
      // 8. exit with the returned status code
      System.exit(retCode);
    } catch (Throwable t) {
      final Throwable strippedThrowable =
          ExceptionUtils.stripException(t, UndeclaredThrowableException.class);
      LOG.error("Fatal error while running command line interface.", strippedThrowable);
      strippedThrowable.printStackTrace();
      System.exit(31);
    }
  }

The main method receives the command-line arguments and performs the following steps:
1. Print basic environment information.
2. Call getConfigurationDirectoryFromEnv to resolve the Flink configuration directory from the FLINK_CONF_DIR environment variable.
3. Call GlobalConfiguration.loadConfiguration to load the settings in flink-conf.yaml and parse them into a Configuration object.
4. Call loadCustomCommandLines to load the custom command lines (two kinds: the YARN-mode command line and the default command line).
5. Create the CliFrontend object.
6. Call SecurityUtils.install to load the security modules.
7. Match the command-line arguments in a switch statement, execute the corresponding action and callbacks, and return a status code. This is the main logic.
8. Exit with the returned status code, terminating the submission client.

2. Detailed analysis

2.1 Printing basic environment information

/**
 * Logs information about the environment, like code revision, current user, Java version,
 * and JVM parameters.
 *
 * @param log The logger to log the information to.
 * @param componentName The component name to mention in the log.
 * @param commandLineArgs The arguments accompanying the start of the component.
 */
public static void logEnvironmentInfo(Logger log, String componentName, String[] commandLineArgs) {
    if (log.isInfoEnabled()) {
        // the final git commit id and date of the build
        RevisionInformation rev = getRevisionInformation();
        // the code version
        String version = getVersion();
        // the JVM version, obtained via the JDK's ManagementFactory
        String jvmVersion = getJvmVersion();
        // the JVM startup options, also obtained via ManagementFactory
        String[] options = getJvmStartupOptionsArray();
        // the JAVA_HOME directory
        String javaHome = System.getenv("JAVA_HOME");
        // the maximum JVM heap size, in MiB
        long maxHeapMegabytes = getMaxJvmHeapMemory() >>> 20;

        // print the basic information
        log.info("--------------------------------------------------------------------------------");
        log.info(" Starting " + componentName + " (Version: " + version + ", "
                + "Rev:" + rev.commitId + ", " + "Date:" + rev.commitDate + ")");
        log.info(" OS current user: " + System.getProperty("user.name"));
        log.info(" Current Hadoop/Kerberos user: " + getHadoopUser());
        log.info(" JVM: " + jvmVersion);
        log.info(" Maximum heap size: " + maxHeapMegabytes + " MiBytes");
        log.info(" JAVA_HOME: " + (javaHome == null ? "(not set)" : javaHome));
        // the Hadoop version information
        String hadoopVersionString = getHadoopVersionString();
        if (hadoopVersionString != null) {
            log.info(" Hadoop version: " + hadoopVersionString);
        } else {
            log.info(" No Hadoop Dependency available");
        }
        // print the JVM options
        if (options.length == 0) {
            log.info(" JVM Options: (none)");
        }
        else {
            log.info(" JVM Options:");
            for (String s: options) {
                log.info("    " + s);
            }
        }
        // the program's startup arguments
        if (commandLineArgs == null || commandLineArgs.length == 0) {
            log.info(" Program Arguments: (none)");
        }
        else {
            log.info(" Program Arguments:");
            for (String s: commandLineArgs) {
                log.info("    " + s);
            }
        }

        log.info(" Classpath: " + System.getProperty("java.class.path"));

        log.info("--------------------------------------------------------------------------------");
    }
}
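All the helpers used above wrap plain JDK facilities. A minimal, stand-alone sketch of where the same information comes from (the class name and output formatting here are mine, not Flink's):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.RuntimeMXBean;
import java.util.List;

public class JvmInfoDemo {
    public static void main(String[] args) {
        RuntimeMXBean bean = ManagementFactory.getRuntimeMXBean();
        // roughly what getJvmVersion() assembles
        String jvmVersion = bean.getVmName() + " - " + bean.getVmVendor()
                + " - " + bean.getSpecVersion() + '/' + bean.getVmVersion();
        // what getJvmStartupOptionsArray() is based on
        List<String> options = bean.getInputArguments();
        // the same ">>> 20" shift converts bytes to MiB
        long maxHeapMegabytes = Runtime.getRuntime().maxMemory() >>> 20;

        System.out.println("JVM: " + jvmVersion);
        System.out.println("JVM Options: " + options);
        System.out.println("Maximum heap size: " + maxHeapMegabytes + " MiBytes");
    }
}
```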

2.2 Resolving the Flink configuration directory

getConfigurationDirectoryFromEnv resolves the Flink configuration directory from the FLINK_CONF_DIR environment variable:

  public static String getConfigurationDirectoryFromEnv() {
    // Read the FLINK_CONF_DIR environment variable, which points at the Flink configuration directory.
    // The submission script sources config.sh, which determines this directory and exports it into the environment.
    // If the variable is set, check that the directory exists; if it does, return its path.
    String location = System.getenv(ConfigConstants.ENV_FLINK_CONF_DIR);

    if (location != null) {
      if (new File(location).exists()) {
        return location;
      } else {
        throw new RuntimeException(
            "The configuration directory '"
                + location
                + "', specified in the '"
                + ConfigConstants.ENV_FLINK_CONF_DIR
                + "' environment variable, does not exist.");
      }
    } else if (new File(CONFIG_DIRECTORY_FALLBACK_1).exists()) {
      location = CONFIG_DIRECTORY_FALLBACK_1;
    } else if (new File(CONFIG_DIRECTORY_FALLBACK_2).exists()) {
      location = CONFIG_DIRECTORY_FALLBACK_2;
    } else {
      throw new RuntimeException(
          "The configuration directory was not specified. "
              + "Please specify the directory containing the configuration file through the '"
              + ConfigConstants.ENV_FLINK_CONF_DIR
              + "' environment variable.");
    }
    return location;
  }
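The lookup order (environment variable first, then hard-coded fallback directories) can be condensed into a small stand-alone sketch; resolveConfDir and its signature are made up for illustration, the real method reads System.getenv and two fixed fallback paths:

```java
import java.io.File;

public class ConfDirLookupDemo {
    // Mirrors getConfigurationDirectoryFromEnv: prefer the environment
    // variable; if it is set but invalid, fail fast; otherwise try fallbacks.
    static String resolveConfDir(String envValue, String... fallbacks) {
        if (envValue != null) {
            if (new File(envValue).exists()) {
                return envValue;
            }
            throw new RuntimeException("The configuration directory '" + envValue
                    + "', specified in the environment variable, does not exist.");
        }
        for (String candidate : fallbacks) {
            if (new File(candidate).exists()) {
                return candidate;
            }
        }
        throw new RuntimeException("The configuration directory was not specified.");
    }

    public static void main(String[] args) {
        // "." always exists, so the fallback branch succeeds here
        String dir = resolveConfDir(null, "../conf", ".");
        System.out.println("Using configuration directory: " + dir);
    }
}
```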

2.3 Loading and parsing flink-conf.yaml

GlobalConfiguration.loadConfiguration loads the settings in flink-conf.yaml and parses them into a Configuration object:

  /**
   * Loads the configuration files from the specified directory.
   *
   * <p>YAML files are supported as configuration files.
   *
   * @param configDir the directory which contains the configuration files
   */
  public static Configuration loadConfiguration(final String configDir) {
    return loadConfiguration(configDir, null);
  }

This delegates to the two-argument loadConfiguration:

  /**
   * Loads the configuration files from the specified directory. If the dynamic properties
   * configuration is not null, then it is added to the loaded configuration.
   *
   * @param configDir directory to load the configuration from
   * @param dynamicProperties configuration file containing the dynamic properties. Null if none.
   * @return The configuration loaded from the given configuration directory
   */
  public static Configuration loadConfiguration(
      final String configDir, @Nullable final Configuration dynamicProperties) {

    if (configDir == null) {
      throw new IllegalArgumentException(
          "Given configuration directory is null, cannot load configuration");
    }

    final File confDirFile = new File(configDir);
    if (!(confDirFile.exists())) {
      throw new IllegalConfigurationException(
          "The given configuration directory name '"
              + configDir
              + "' ("
              + confDirFile.getAbsolutePath()
              + ") does not describe an existing directory.");
    }
    /** 1. The configuration directory has been validated; now locate the configuration file, i.e. flink-conf.yaml */
    // get Flink yaml configuration file
    final File yamlConfigFile = new File(confDirFile, FLINK_CONF_FILENAME);

    if (!yamlConfigFile.exists()) {
      throw new IllegalConfigurationException(
          "The Flink config file '"
              + yamlConfigFile
              + "' ("
              + confDirFile.getAbsolutePath()
              + ") does not exist.");
    }
    /** 2. [Core logic] With the file located, call loadYAMLResource to parse the YAML file and return a Configuration holding the key-value pairs */
    Configuration configuration = loadYAMLResource(yamlConfigFile);

    if (dynamicProperties != null) {
      configuration.addAll(dynamicProperties);
    }

    return enrichWithEnvironmentVariables(configuration);
  }
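loadYAMLResource (not shown here) is a simple line-based parser rather than a full YAML implementation. A rough, stand-alone approximation of its behavior, assuming one "key: value" pair per line and '#' comments (class and method names are mine, not Flink's):

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class FlinkConfParserSketch {
    // Approximates loadYAMLResource: one "key: value" pair per line,
    // '#' starts a comment, blank or malformed lines are skipped.
    static Map<String, String> parse(List<String> lines) {
        Map<String, String> config = new HashMap<>();
        for (String line : lines) {
            String noComment = line.split("#", 2)[0].trim();
            if (noComment.isEmpty()) {
                continue;
            }
            String[] kv = noComment.split(": ", 2);
            if (kv.length == 2 && !kv[0].trim().isEmpty()) {
                config.put(kv[0].trim(), kv[1].trim());
            }
        }
        return config;
    }

    public static void main(String[] args) {
        Map<String, String> conf = parse(Arrays.asList(
                "jobmanager.rpc.address: localhost",
                "# this is a comment",
                "parallelism.default: 4"));
        System.out.println(conf.get("jobmanager.rpc.address")); // localhost
        System.out.println(conf.get("parallelism.default"));    // 4
    }
}
```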

2.4 Loading the custom command lines

loadCustomCommandLines loads the custom command lines (two kinds: the YARN-mode command line and the default command line).
The logic in detail:

/**
 * Loads the custom command lines.
 * @param configuration the configuration
 * @param configurationDirectory the configuration directory
 * @return the list of loaded command lines
 */
public static List<CustomCommandLine<?>> loadCustomCommandLines(Configuration configuration, String configurationDirectory) {
    // 1. initialize a list with capacity 2 for the command lines
    List<CustomCommandLine<?>> customCommandLines = new ArrayList<>(2);

    //	Command line interface of the YARN session, with a special initialization here
    //	to prefix all options with y/yarn.
    //	Tips: DefaultCLI must be added at last, because getActiveCustomCommandLine(..) will get the
    //	      active CustomCommandLine in order and DefaultCLI isActive always return true.
    // 2. the command-line interface of the YARN session; all its options are prefixed with y/yarn
    final String flinkYarnSessionCLI = "org.apache.flink.yarn.cli.FlinkYarnSessionCli";
    try {
        // 3. add the YARN-mode command line
        customCommandLines.add(
            loadCustomCommandLine(flinkYarnSessionCLI,
                                  configuration,
                                  configurationDirectory,
                                  "y",
                                  "yarn"));
    } catch (NoClassDefFoundError | Exception e) {
        LOG.warn("Could not load CLI class {}.", flinkYarnSessionCLI, e);
    }

    // 4. add the default command line
    customCommandLines.add(new DefaultCLI(configuration));

    return customCommandLines;
}

The following sections look at how the YARN-mode command line and the default command line are added.
The class diagram relating the YARN-mode client and the default client is omitted here.

Adding the YARN-mode command line

/**
 * Constructs a command line via reflection.
 * @param className the fully qualified name of the class to load.
 * @param params the constructor arguments
 */
private static CustomCommandLine<?> loadCustomCommandLine(String className, Object... params) throws IllegalAccessException, InvocationTargetException, InstantiationException, ClassNotFoundException, NoSuchMethodException {

    // 1. load the class from the classpath; it must implement the CustomCommandLine interface
    Class<? extends CustomCommandLine> customCliClass =
        Class.forName(className).asSubclass(CustomCommandLine.class);

    // 2. derive the Class types from the arguments
    Class<?>[] types = new Class<?>[params.length];
    for (int i = 0; i < params.length; i++) {
        Preconditions.checkNotNull(params[i], "Parameters for custom command-lines may not be null.");
        types[i] = params[i].getClass();
    }
    // 3. look up the matching constructor of org.apache.flink.yarn.cli.FlinkYarnSessionCli
    Constructor<? extends CustomCommandLine> constructor = customCliClass.getConstructor(types);

    // 4. instantiate by invoking the FlinkYarnSessionCli constructor
    return constructor.newInstance(params);
}
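The pattern is plain JDK reflection. A self-contained sketch with a made-up Greeter interface standing in for CustomCommandLine and HelloGreeter standing in for FlinkYarnSessionCli:

```java
import java.lang.reflect.Constructor;

public class ReflectiveLoadDemo {
    public interface Greeter { String greet(); }

    public static class HelloGreeter implements Greeter {
        private final String name;
        public HelloGreeter(String name) { this.name = name; }
        public String greet() { return "Hello, " + name; }
    }

    static Greeter load(String className, Object... params) throws Exception {
        // 1. load the class; it must implement Greeter (cf. CustomCommandLine)
        Class<? extends Greeter> cls = Class.forName(className).asSubclass(Greeter.class);
        // 2. derive the parameter types from the actual arguments
        Class<?>[] types = new Class<?>[params.length];
        for (int i = 0; i < params.length; i++) {
            types[i] = params[i].getClass();
        }
        // 3. look up the matching constructor and 4. instantiate
        Constructor<? extends Greeter> ctor = cls.getConstructor(types);
        return ctor.newInstance(params);
    }

    public static void main(String[] args) throws Exception {
        Greeter g = load(HelloGreeter.class.getName(), "Flink");
        System.out.println(g.greet());
    }
}
```

Note that getConstructor(types) requires an exact match: because the types are derived via params[i].getClass(), a constructor declaring an interface or superclass parameter would not be found.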

As the code shows, the object is created through a FlinkYarnSessionCli constructor obtained via reflection. Which constructor of org.apache.flink.yarn.cli.FlinkYarnSessionCli gets invoked is determined by the argument types:

public FlinkYarnSessionCli(
    Configuration configuration,
    String configurationDirectory,
    String shortPrefix,
    String longPrefix) throws FlinkException {
    this(configuration, configurationDirectory, shortPrefix, longPrefix, true);
}

which delegates to this(configuration, configurationDirectory, shortPrefix, longPrefix, true):

/**
 * Initializes a FlinkYarnSessionCli.
 * @param configuration the global configuration
 * @param configurationDirectory the global configuration directory
 * @param shortPrefix the short prefix for command-line options
 * @param longPrefix the long prefix for command-line options
 * @param acceptInteractiveInput whether interactive input is accepted
 * @throws FlinkException
 */
public FlinkYarnSessionCli(
    Configuration configuration,
    String configurationDirectory,
    String shortPrefix,
    String longPrefix,
    boolean acceptInteractiveInput) throws FlinkException {
    // 1. initialize the fields
    super(configuration);
    this.configurationDirectory = Preconditions.checkNotNull(configurationDirectory);
    this.acceptInteractiveInput = acceptInteractiveInput;

    // 2. create the command-line options
    query = new Option(shortPrefix + "q", longPrefix + "query", false, "Display available YARN resources (memory, cores)");
    applicationId = new Option(shortPrefix + "id", longPrefix + "applicationId", true, "Attach to running YARN session");
    queue = new Option(shortPrefix + "qu", longPrefix + "queue", true, "Specify YARN queue.");
    shipPath = new Option(shortPrefix + "t", longPrefix + "ship", true, "Ship files in the specified directory (t for transfer)");
    flinkJar = new Option(shortPrefix + "j", longPrefix + "jar", true, "Path to Flink jar file");
    jmMemory = new Option(shortPrefix + "jm", longPrefix + "jobManagerMemory", true, "Memory for JobManager Container with optional unit (default: MB)");
    tmMemory = new Option(shortPrefix + "tm", longPrefix + "taskManagerMemory", true, "Memory per TaskManager Container with optional unit (default: MB)");
    container = new Option(shortPrefix + "n", longPrefix + "container", true, "Number of YARN container to allocate (=Number of Task Managers)");
    slots = new Option(shortPrefix + "s", longPrefix + "slots", true, "Number of slots per TaskManager");
    dynamicproperties = Option.builder(shortPrefix + "D")
        .argName("property=value")
        .numberOfArgs(2)
        .valueSeparator()
        .desc("use value for given property")
        .build();
    streaming = new Option(shortPrefix + "st", longPrefix + "streaming", false, "Start Flink in streaming mode");
    name = new Option(shortPrefix + "nm", longPrefix + "name", true, "Set a custom name for the application on YARN");
    zookeeperNamespace = new Option(shortPrefix + "z", longPrefix + "zookeeperNamespace", true, "Namespace to create the Zookeeper sub-paths for high availability mode");
    nodeLabel = new Option(shortPrefix + "nl", longPrefix + "nodeLabel", true, "Specify YARN node label for the YARN application");
    help = new Option(shortPrefix + "h", longPrefix + "help", false, "Help for the Yarn session CLI.");

    allOptions = new Options();
    allOptions.addOption(flinkJar);
    allOptions.addOption(jmMemory);
    allOptions.addOption(tmMemory);
    allOptions.addOption(container);
    allOptions.addOption(queue);
    allOptions.addOption(query);
    allOptions.addOption(shipPath);
    allOptions.addOption(slots);
    allOptions.addOption(dynamicproperties);
    allOptions.addOption(DETACHED_OPTION);
    allOptions.addOption(SHUTDOWN_IF_ATTACHED_OPTION);
    allOptions.addOption(YARN_DETACHED_OPTION);
    allOptions.addOption(streaming);
    allOptions.addOption(name);
    allOptions.addOption(applicationId);
    allOptions.addOption(zookeeperNamespace);
    allOptions.addOption(nodeLabel);
    allOptions.addOption(help);

    // 3. locate the default yarn properties file
    this.yarnPropertiesFileLocation = configuration.getString(YarnConfigOptions.PROPERTIES_FILE_LOCATION);
    final File yarnPropertiesLocation = getYarnPropertiesLocation(yarnPropertiesFileLocation);

    // 4. parse the yarn properties
    yarnPropertiesFile = new Properties();

    if (yarnPropertiesLocation.exists()) {
        LOG.info("Found Yarn properties file under {}.", yarnPropertiesLocation.getAbsolutePath());

        try (InputStream is = new FileInputStream(yarnPropertiesLocation)) {
            yarnPropertiesFile.load(is);
        } catch (IOException ioe) {
            throw new FlinkException("Could not read the Yarn properties file " + yarnPropertiesLocation +
                                     ". Please delete the file at " + yarnPropertiesLocation.getAbsolutePath() + '.', ioe);
        }

        final String yarnApplicationIdString = yarnPropertiesFile.getProperty(YARN_APPLICATION_ID_KEY);

        if (yarnApplicationIdString == null) {
            throw new FlinkException("Yarn properties file found but doesn't contain a " +
                                     "Yarn application id. Please delete the file at " + yarnPropertiesLocation.getAbsolutePath());
        }

        try {
            // try to convert the id string into an ApplicationId
            yarnApplicationIdFromYarnProperties = ConverterUtils.toApplicationId(yarnApplicationIdString);
        }
        catch (Exception e) {
            throw new FlinkException("YARN properties contains an invalid entry for " +
                                     "application id: " + yarnApplicationIdString + ". Please delete the file at " +
                                     yarnPropertiesLocation.getAbsolutePath(), e);
        }
    } else {
        yarnApplicationIdFromYarnProperties = null;
    }
    // 5. initialize the yarn configuration
    this.yarnConfiguration = new YarnConfiguration();
}
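Step 4 works because the .yarn-properties-&lt;user&gt; file is an ordinary java.util.Properties file. A minimal sketch of that read path with fabricated file content; the applicationID key mirrors what I believe YARN_APPLICATION_ID_KEY is in Flink 1.9, so treat it as an assumption:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.util.Properties;

public class YarnPropertiesSketch {
    public static void main(String[] args) throws IOException {
        // stands in for the FileInputStream over .yarn-properties-<user>
        String fileContent = "applicationID=application_1570000000000_0001\n"
                + "parallelism=4\n";
        Properties yarnPropertiesFile = new Properties();
        try (InputStream is = new ByteArrayInputStream(
                fileContent.getBytes(StandardCharsets.ISO_8859_1))) {
            yarnPropertiesFile.load(is);
        }
        String appId = yarnPropertiesFile.getProperty("applicationID");
        if (appId == null) {
            throw new IllegalStateException(
                    "Yarn properties file found but doesn't contain an application id.");
        }
        System.out.println("Attaching to " + appId);
    }
}
```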

Adding the default command line

public DefaultCLI(Configuration configuration) {
    super(configuration);
}

2.5 Initializing the CliFrontend object

The CliFrontend constructor:

public CliFrontend(
    Configuration configuration,
    List<CustomCommandLine<?>> customCommandLines) throws Exception {
    // 1. null-check and store the constructor arguments
    this.configuration = Preconditions.checkNotNull(configuration);
    this.customCommandLines = Preconditions.checkNotNull(customCommandLines);

    try {
        // 2. initialize the file system
        FileSystem.initialize(
        configuration, PluginUtils.createPluginManagerFromRootFolder(configuration));
    } catch (IOException e) {
        throw new Exception("Error while setting the default " +
                            "filesystem scheme from configuration.", e);
    }
    // 3. collect the options of all custom command lines:
    //    iterate over customCommandLines and add their general and run options
    this.customCommandLineOptions = new Options();
    for (CustomCommandLine<?> customCommandLine : customCommandLines) {
        customCommandLine.addGeneralOptions(customCommandLineOptions);
        customCommandLine.addRunOptions(customCommandLineOptions);
    }

    // 4. read the akka client timeout (akka.client.timeout) from the global configuration
    this.clientTimeout = AkkaUtils.getClientTimeout(this.configuration);
    // 5. read the default parallelism from the global configuration
    this.defaultParallelism = configuration.getInteger(CoreOptions.DEFAULT_PARALLELISM);
}

2.6 Loading the security configuration module

SecurityUtils.install loads the security configuration module. The security mechanism is installed through the call:

SecurityUtils.install(new SecurityConfiguration(cli.configuration));

Let's first look at how the SecurityConfiguration object is initialized, and then at the install logic of SecurityUtils.

Initializing SecurityConfiguration

/**
 * Creates the security configuration from the global configuration.
 * @param flinkConf the global Flink configuration
 */
public SecurityConfiguration(Configuration flinkConf) {
    this(flinkConf, DEFAULT_MODULES);
}

where DEFAULT_MODULES is the list of default security modules:

// the default security modules
private static final List<SecurityModuleFactory> DEFAULT_MODULES = Collections.unmodifiableList(
    Arrays.asList(new HadoopModuleFactory(), new JaasModuleFactory(), new ZookeeperModuleFactory()));

which delegates to:

/**
 * Creates the security configuration from the global configuration.
 * @param flinkConf the global Flink configuration
 * @param securityModuleFactories the security modules to apply.
 */
public SecurityConfiguration(Configuration flinkConf,
        List<SecurityModuleFactory> securityModuleFactories) {
    // 1. read the relevant global settings
    this.isZkSaslDisable = flinkConf.getBoolean(SecurityOptions.ZOOKEEPER_SASL_DISABLE);
    this.keytab = flinkConf.getString(SecurityOptions.KERBEROS_LOGIN_KEYTAB);
    this.principal = flinkConf.getString(SecurityOptions.KERBEROS_LOGIN_PRINCIPAL);
    this.useTicketCache = flinkConf.getBoolean(SecurityOptions.KERBEROS_LOGIN_USETICKETCACHE);
    this.loginContextNames = parseList(flinkConf.getString(SecurityOptions.KERBEROS_LOGIN_CONTEXTS));
    this.zkServiceName = flinkConf.getString(SecurityOptions.ZOOKEEPER_SASL_SERVICE_NAME);
    this.zkLoginContextName = flinkConf.getString(SecurityOptions.ZOOKEEPER_SASL_LOGIN_CONTEXT_NAME);

    // 2. the security modules, here the default ones
    this.securityModuleFactories = Collections.unmodifiableList(securityModuleFactories);
    this.flinkConfig = checkNotNull(flinkConf);
    // 3. validate
    validate();
}

A closer look at the validate logic:

/**
 * Validates the Kerberos login configuration.
 */
private void validate() {
    if (!StringUtils.isBlank(keytab)) {
        // principal is required
        if (StringUtils.isBlank(principal)) {
            throw new IllegalConfigurationException("Kerberos login configuration is invalid; keytab requires a principal.");
        }

        // check the keytab is readable
        File keytabFile = new File(keytab);
        if (!keytabFile.exists() || !keytabFile.isFile() || !keytabFile.canRead()) {
            throw new IllegalConfigurationException("Kerberos login configuration is invalid; keytab is unreadable");
        }
    }
}

If security.kerberos.login.keytab is set in the global configuration (flink-conf.yaml), then the keytab file it points to must exist and be readable, and a principal must also be configured. Some background knowledge of Kerberos authentication is helpful here.

The install logic of SecurityUtils

Once initialized, the SecurityConfiguration object is passed into the install method of SecurityUtils. The logic:

/**
 * Installs a process-wide security configuration.
 *
 * <p>Applies the configuration using the available security modules (i.e. Hadoop, JAAS).
 */
public static void install(SecurityConfiguration config) throws Exception {

    // install the security modules
    List<SecurityModule> modules = new ArrayList<>();
    try {
        // iterate over the factories and install each security module
        for (SecurityModuleFactory moduleFactory : config.getSecurityModuleFactories()) {
            SecurityModule module = moduleFactory.createModule(config);
            // can be null if a SecurityModule is not supported in the current environment
            if (module != null) {
                module.install();
                modules.add(module);
            }
        }
    }
    catch (Exception ex) {
        throw new Exception("unable to establish the security context", ex);
    }
    installedModules = modules;

    // First check if we have Hadoop in the ClassPath. If not, we simply don't do anything.
    try {
        Class.forName(
            "org.apache.hadoop.security.UserGroupInformation",
            false,
            SecurityUtils.class.getClassLoader());

        // install a security context
        // use the Hadoop login user as the subject of the installed security context
        if (!(installedContext instanceof NoOpSecurityContext)) {
            LOG.warn("overriding previous security context");
        }
        UserGroupInformation loginUser = UserGroupInformation.getLoginUser();
        installedContext = new HadoopSecurityContext(loginUser);
    } catch (ClassNotFoundException e) {
        LOG.info("Cannot install HadoopSecurityContext because Hadoop cannot be found in the Classpath.");
    } catch (LinkageError e) {
        LOG.error("Cannot install HadoopSecurityContext.", e);
    }
}

The security modules installed here cover the Java Authentication and Authorization Service (JAAS), Hadoop's UserGroupInformation (UGI), and the full security setup for ZooKeeper.

2.7 Executing the requested action and returning the status code

The command-line arguments are matched in a switch statement; the corresponding action and callbacks are executed, and a status code is returned.

int retCode = SecurityUtils.getInstalledContext()
					.runSecured(() -> cli.parseParameters(args));

runSecured() is declared on the SecurityContext interface:

/**
 * A security context that may be required to run a Callable.
 */
public interface SecurityContext {

	<T> T runSecured(Callable<T> securedCallable) throws Exception;

}
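Flink 1.9 ships two implementations: HadoopSecurityContext, which runs the callable as the Hadoop login user, and NoOpSecurityContext, which simply invokes it. A simplified stand-alone sketch of the pattern; the names mirror Flink's, but the code here is a reduced illustration:

```java
import java.util.concurrent.Callable;

public class SecurityContextSketch {
    interface SecurityContext {
        <T> T runSecured(Callable<T> securedCallable) throws Exception;
    }

    // Simplified stand-in for NoOpSecurityContext: no login user is
    // installed, the callable is simply invoked.
    static class NoOpSecurityContext implements SecurityContext {
        @Override
        public <T> T runSecured(Callable<T> securedCallable) throws Exception {
            return securedCallable.call();
        }
    }

    public static void main(String[] args) throws Exception {
        SecurityContext installedContext = new NoOpSecurityContext();
        // mirrors runSecured(() -> cli.parseParameters(args))
        int retCode = installedContext.runSecured(() -> 0);
        System.out.println("retCode = " + retCode);
    }
}
```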

The actual work happens in cli.parseParameters(args):

  /**
   * Parses the command line arguments and starts the requested action.
   *
   * @param args command line arguments of the client.
   * @return The return code of the program
   */
  public int parseParameters(String[] args) {

    // check for action
    if (args.length < 1) {
      CliFrontendParser.printHelp(customCommandLines);
      System.out.println("Please specify an action.");
      return 1;
    }

    // get action: extract the requested action, e.g. run, list, cancel
    String action = args[0];

    // remove the action from the parameters
    final String[] params = Arrays.copyOfRange(args, 1, args.length);

    try {
      // do action
      switch (action) {
        case ACTION_RUN:
          run(params);
          return 0;
        case ACTION_LIST:
          list(params);
          return 0;
        case ACTION_INFO:
          info(params);
          return 0;
        case ACTION_CANCEL:
          cancel(params);
          return 0;
        case ACTION_STOP:
          stop(params);
          return 0;
        case ACTION_SAVEPOINT:
          savepoint(params);
          return 0;
        case "-h":
        case "--help":
          CliFrontendParser.printHelp(customCommandLines);
          return 0;
        case "-v":
        case "--version":
          String version = EnvironmentInformation.getVersion();
          String commitID = EnvironmentInformation.getRevisionInformation().commitId;
          System.out.print("Version: " + version);
          System.out.println(
              commitID.equals(EnvironmentInformation.UNKNOWN) ? "" : ", Commit ID: " + commitID);
          return 0;
        default:
          System.out.printf("\"%s\" is not a valid action.\n", action);
          System.out.println();
          System.out.println(
              "Valid actions are \"run\", \"list\", \"info\", \"savepoint\", \"stop\", or \"cancel\".");
          System.out.println();
          System.out.println(
              "Specify the version option (-v or --version) to print Flink version.");
          System.out.println();
          System.out.println("Specify the help option (-h or --help) to get help on the command.");
          return 1;
      }
    } catch (CliArgsException ce) {
      return handleArgException(ce);
    } catch (ProgramParametrizationException ppe) {
      return handleParametrizationException(ppe);
    } catch (ProgramMissingJobException pmje) {
      return handleMissingJobException();
    } catch (Exception e) {
      return handleError(e);
    }
  }
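Stripped of the Flink-specific actions, the dispatch shape of parseParameters can be sketched as a toy stand-alone example (the action names and handlers here are placeholders):

```java
import java.util.Arrays;

public class ActionDispatchSketch {
    // Same shape as parseParameters: args[0] is the action, the rest are
    // forwarded to the action's handler; the return value is the status code.
    static int dispatch(String[] args) {
        if (args.length < 1) {
            System.out.println("Please specify an action.");
            return 1;
        }
        String action = args[0];
        String[] params = Arrays.copyOfRange(args, 1, args.length);
        switch (action) {
            case "run":
                System.out.println("running with " + params.length + " parameter(s)");
                return 0;
            case "list":
                System.out.println("listing jobs");
                return 0;
            default:
                System.out.printf("\"%s\" is not a valid action.%n", action);
                return 1;
        }
    }

    public static void main(String[] args) {
        int retCode = dispatch(new String[] {"run", "job.jar"});
        System.out.println("retCode = " + retCode);
    }
}
```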

The most important path is the job-execution logic, i.e. what happens for ./flink run. Since it is fairly long, it is covered in a separate post:
Reading the Flink 1.9.0 Job Submission Source Code (Part 3): Job submission via run()

2.8 Exiting with the returned status code

The status code returned by parseParameters is handed to System.exit, which terminates the submission client.


Reference: https://blog.csdn.net/hxcaifly/article/details/87864154
