Scenario 1: Running Spark from IntelliJ IDEA on Windows
Open the conf folder under the Spark installation directory (e.g. D:\soft\spark\conf), copy log4j.properties.template and rename the copy to log4j.properties, then change INFO to WARN on the log4j.rootCategory line near the top of the file, so that only WARN and ERROR messages are shown. Finally, put this log4j.properties under src/main/resources/ in your project, and you are done.
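For reference, the edit amounts to changing a single line of the copied file, roughly like this (the commented-out line shows the original template value):

```properties
# Before (as shipped in log4j.properties.template):
# log4j.rootCategory=INFO, console

# After: only WARN and ERROR messages reach the console
log4j.rootCategory=WARN, console
```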
Scenario 2: Running Spark on Linux
1. cd into $SPARK_HOME/conf and copy log4j.properties.template to log4j.properties:

```shell
$ cp log4j.properties.template log4j.properties
```

With the root category set to WARN, the edited log4j.properties looks like this:
```properties
# Set everything to be logged to the console
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Settings to quiet third party logs that are too verbose
log4j.logger.org.spark-project.jetty=WARN
log4j.logger.org.spark-project.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
log4j.logger.org.apache.parquet=ERROR
log4j.logger.parquet=ERROR

# SPARK-9183: Settings to avoid annoying messages when looking up nonexistent UDFs in SparkSQL with Hive support
log4j.logger.org.apache.hadoop.hive.metastore.RetryingHMSHandler=FATAL
log4j.logger.org.apache.hadoop.hive.ql.exec.FunctionRegistry=ERROR
```
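The per-logger lines above rely on the standard hierarchical-logger model: the root level (WARN) applies to every logger unless a named logger overrides it with a stricter or looser level of its own. The same semantics can be demonstrated with Python's standard logging module; the logger names here ("app", "parquet") are purely illustrative, not Spark code:

```python
import io
import logging

# One console-style handler on the root logger, like the ConsoleAppender above.
stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(logging.Formatter("%(levelname)s %(name)s: %(message)s"))
root = logging.getLogger()
root.addHandler(handler)

# Root at WARN: the analogue of log4j.rootCategory=WARN, console.
root.setLevel(logging.WARNING)

# Clamp one noisy subsystem further, like log4j.logger.parquet=ERROR.
logging.getLogger("parquet").setLevel(logging.ERROR)

logging.getLogger("app").info("hidden")        # below root WARN -> dropped
logging.getLogger("app").warning("shown")      # WARN -> kept
logging.getLogger("parquet").warning("hidden") # below this logger's ERROR -> dropped
logging.getLogger("parquet").error("shown")    # ERROR -> kept

print(stream.getvalue())
```

Loggers without an explicit level ("app") inherit the root's WARN, while "parquet" enforces its own ERROR threshold, which is exactly how the Hive and Parquet overrides in the properties file silence specific subsystems without touching the rest.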