
How to stop INFO messages from displaying on the Spark console?

  • I'd like to stop the various messages that keep appearing on the Spark shell.

    I tried to edit the log4j.properties file in order to stop these messages.

    Here are the contents of log4j.properties

    # Define the root logger with appender file
    log4j.rootCategory=WARN, console
    log4j.appender.console=org.apache.log4j.ConsoleAppender
    log4j.appender.console.target=System.err
    log4j.appender.console.layout=org.apache.log4j.PatternLayout
    log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
    
    # Settings to quiet third party logs that are too verbose
    log4j.logger.org.eclipse.jetty=WARN
    log4j.logger.org.eclipse.jetty.util.component.AbstractLifeCycle=ERROR
    log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
    log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO


    But the messages are still being displayed on the console.

    Here are some example messages

    15/01/05 15:11:45 INFO SparkEnv: Registering BlockManagerMaster
    15/01/05 15:11:45 INFO DiskBlockManager: Created local directory at /tmp/spark-local-20150105151145-b1ba
    15/01/05 15:11:45 INFO MemoryStore: MemoryStore started with capacity 0.0 B.
    15/01/05 15:11:45 INFO ConnectionManager: Bound socket to port 44728 with id = ConnectionManagerId(192.168.100.85,44728)
    15/01/05 15:11:45 INFO BlockManagerMaster: Trying to register BlockManager
    15/01/05 15:11:45 INFO BlockManagerMasterActor$BlockManagerInfo: Registering block manager 192.168.100.85:44728 with 0.0 B RAM
    15/01/05 15:11:45 INFO BlockManagerMaster: Registered BlockManager
    15/01/05 15:11:45 INFO HttpServer: Starting HTTP Server
    15/01/05 15:11:45 INFO HttpBroadcast: Broadcast server star


    How do I stop these?

      October 26, 2021 1:19 PM IST
    0
  • I just add this line at the top of all my PySpark scripts, just below the import statements.

    SparkSession.builder.getOrCreate().sparkContext.setLogLevel("ERROR")
    


    Example header of my PySpark scripts:

    from pyspark.sql import SparkSession, functions as fs
    SparkSession.builder.getOrCreate().sparkContext.setLogLevel("ERROR")
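
    If you are using the Scala spark-shell instead, the same call is available on the SparkContext the shell creates for you (a minimal sketch; it assumes the shell's built-in `spark` session, and the level only applies once the context exists, so the startup INFO lines will still appear):

    // Raise the log threshold for this application only.
    // Accepted levels: ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN.
    spark.sparkContext.setLogLevel("ERROR")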
      October 29, 2021 3:10 PM IST
    0
  • The answers above are correct, but they didn't quite solve my problem, as there was some additional information I needed.

    I had just set up Spark, so the log4j file still had the '.template' suffix and wasn't being read. I believe logging then defaults to the Spark core logging conf.

    So if you are like me and found that the answers above didn't help, you may also need to remove the '.template' suffix from your log4j conf file; after that, the above works perfectly!

    http://apache-spark-user-list.1001560.n3.nabble.com/disable-log4j-for-spark-shell-td11278.html
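
    Concretely, in a stock Spark download the template sits in the conf directory, so the fix is a one-line copy (paths assume a standard layout; adjust SPARK_HOME for your install):

    cp $SPARK_HOME/conf/log4j.properties.template $SPARK_HOME/conf/log4j.properties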

      December 6, 2021 2:02 PM IST
    0
  • You can disable the logs by setting their level to OFF, as follows:

    Logger.getLogger("org").setLevel(Level.OFF);
    Logger.getLogger("akka").setLevel(Level.OFF);


    or edit the log4j.properties file and set the log level to OFF by changing the following property:

    log4j.rootCategory=OFF, console
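
    If OFF feels too aggressive (it also hides genuine errors from your own job), a softer variant of the same log4j 1.x API quiets only Spark's own packages (a sketch; the logger name is just the package prefix you want to silence):

    import org.apache.log4j.{Level, Logger}

    // Silence Spark internals below WARN, but keep other logging intact.
    Logger.getLogger("org.apache.spark").setLevel(Level.WARN)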
    
      October 28, 2021 4:34 PM IST
    0