Kafka Source Code Analysis: Setting Up the Environment

Kafka is a widely used message broker deployed at many large companies. Starting today, I will work through the Kafka 0.10.0.1 source code in a series of posts. As the saying goes, to do good work one must first sharpen one's tools: before reading the Kafka source, the first step is to set up an environment for browsing it, and choosing the right tools makes the learning much more efficient. Here I use IntelliJ IDEA as the IDE for reading the source.

Environment Dependencies

jdk

Download JDK 1.8 and set JAVA_HOME:

java -version
java version "1.8.0_221"
Java(TM) SE Runtime Environment (build 1.8.0_221-b11)
Java HotSpot(TM) 64-Bit Server VM (build 25.221-b11, mixed mode)

gradle

Download Gradle (6.1.1 is used in the path below) and set GRADLE_HOME:

GRADLE_HOME=/Users/lidongmeng/software/gradle-6.1.1
PATH=$PATH:$GRADLE_HOME/bin

scala

Download Scala 2.11.8 and set SCALA_HOME:

SCALA_HOME=/Users/lidongmeng/software/scala-2.11.8
PATH=$PATH:$SCALA_HOME/bin

zookeeper

Download ZooKeeper 3.4.9 and set ZK_HOME:

ZK_HOME=/Users/lidongmeng/software/zookeeper-3.4.9
PATH=$PATH:$ZK_HOME/bin

kafka source code

Download the kafka-0.10.0.1 source code from http://kafka.apache.org/downloads

IDEA Scala plugin

Install the Scala plugin: IDEA -> Preferences -> Plugins -> search for Scala and install it.

Building the Source

Build

Generate the IDE project files with the command gradle idea; if you use Eclipse, run gradle eclipse instead:

> Task :idea
Generated IDEA project at file:///Users/lidongmeng/source_code_read/kafka-0.10.0.1-src/kafka-0.10.0.1-src.ipr

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.6.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 7m 16s
16 actionable tasks: 16 executed

Build Errors and Fixes

  1. ScalaBasePlugin fails to apply
// Problem
* Where:
Build file '/Users/lidongmeng/source_code_read/kafka-0.10.0.1-src/build.gradle' line: 233

* What went wrong:
A problem occurred evaluating root project 'kafka-0.10.0.1-src'.
> Failed to apply plugin [class 'org.gradle.api.plugins.scala.ScalaBasePlugin']
   > Could not create task ':core:compileScala'.
      > No such property: useAnt for class: org.gradle.api.tasks.scala.ScalaCompileOptions
          
// Fix: add the following lines near the top of build.gradle
ScalaCompileOptions.metaClass.daemonServer = true
ScalaCompileOptions.metaClass.fork = true
ScalaCompileOptions.metaClass.useAnt = false
ScalaCompileOptions.metaClass.useCompileDaemon = false
  2. The org.scoverage plugin fails
// Error
* Where:
Build file '/Users/lidongmeng/source_code_read/kafka-0.10.0.1-src/build.gradle' line: 376

* What went wrong:
A problem occurred evaluating root project 'kafka-0.10.0.1-src'.
> Failed to apply plugin [id 'org.scoverage']
   > Could not create an instance of type org.scoverage.ScoverageExtension.
      > You can't map a property that does not exist: propertyName=testClassesDir

// Fix: in the buildscript dependencies of build.gradle, change the gradle-scoverage version from 2.0.1 to 2.5.0
classpath 'org.scoverage:gradle-scoverage:2.5.0' // changed from 2.0.1

Run & Verify

Starting the broker

Before starting the broker, make sure a local ZooKeeper instance is running (for example, run zkServer.sh start from the ZooKeeper installation above).

  1. log4j.properties

    Copy config/log4j.properties into the core/src/main/scala/ directory and add the setting kafka.logs.dir=logs to it.

  2. server.properties

    In config/server.properties, change log.dirs (here log.dirs=xx/kafka-0.10.0.1-src/kafka-logs) to point to a directory of your choice.

  3. Run configuration
    Create an IntelliJ run configuration for the broker (the original screenshot is not reproduced here; see the sketch after this list).

  4. Run

    [2020-02-24 14:48:11,176] INFO Kafka version : unknown (org.apache.kafka.common.utils.AppInfoParser)
    [2020-02-24 14:48:11,176] INFO Kafka commitId : unknown (org.apache.kafka.common.utils.AppInfoParser)
    [2020-02-24 14:48:11,177] INFO [Kafka Server 0], started (kafka.server.KafkaServer)
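
The run configuration in step 3 typically points at the kafka.Kafka main class in the core module, with the path of the server.properties edited in step 2 as its only program argument (this is the same entry point that kafka-server-start.sh invokes). As a rough sketch of the same thing, the following launcher object could be dropped into the core module and run from the IDE; StartBrokerSketch is a hypothetical name of my own, not part of Kafka:

// Hypothetical launcher, for experimentation only; it simply delegates to kafka.Kafka,
// the same entry point the IDE run configuration targets.
object StartBrokerSketch {
  def main(args: Array[String]): Unit = {
    // Adjust this path to the server.properties edited in step 2.
    kafka.Kafka.main(Array("config/server.properties"))
  }
}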
    

Creating a topic

kafka_2.11-0.11.0.1/bin : ./kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic test
Created topic "test".
[2020-02-24 14:27:32,217] INFO Completed load of log test-0 with log end offset 0 (kafka.log.Log)
[2020-02-24 14:27:32,220] INFO Created log for partition [test,0] in /Users/lidongmeng/source_code_read/kafka-0.10.0.1-src/kafka-logs with properties {compression.type -> producer, message.format.version -> 0.10.0-IV1, file.delete.delay.ms -> 60000, max.message.bytes -> 1000012, message.timestamp.type -> CreateTime, min.insync.replicas -> 1, segment.jitter.ms -> 0, preallocate -> false, min.cleanable.dirty.ratio -> 0.5, index.interval.bytes -> 4096, unclean.leader.election.enable -> true, retention.bytes -> -1, delete.retention.ms -> 86400000, cleanup.policy -> delete, flush.ms -> 9223372036854775807, segment.ms -> 604800000, segment.bytes -> 1073741824, retention.ms -> 604800000, message.timestamp.difference.max.ms -> 9223372036854775807, segment.index.bytes -> 10485760, flush.messages -> 9223372036854775807}. (kafka.log.LogManager)
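
As an alternative to kafka-topics.sh, the same topic can be created programmatically through the broker's internal admin API. The sketch below is only an illustration: it assumes the 0.10.0.1 signatures of kafka.admin.AdminUtils and kafka.utils.ZkUtils and a ZooKeeper on localhost:2181, and mirrors the --partitions 1 --replication-factor 1 command above.

import kafka.admin.AdminUtils
import kafka.utils.ZkUtils

// Sketch only: creates the "test" topic the same way kafka-topics.sh --create does.
object CreateTopicSketch {
  def main(args: Array[String]): Unit = {
    // Connect to the ZooKeeper instance the broker is registered with
    // (session timeout 30s, connection timeout 30s, no ZK security).
    val zkUtils = ZkUtils("localhost:2181", 30000, 30000, false)
    try {
      // topic = "test", partitions = 1, replication factor = 1
      AdminUtils.createTopic(zkUtils, "test", 1, 1)
    } finally {
      zkUtils.close()
    }
  }
}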

Producer & consumer: sending and receiving messages

Producer:

kafka_2.11-0.11.0.1 : bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test
>afafdsafs
>this is an

Consumer:

kafka_2.11-0.11.0.1 : bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test
afafdsafs
this is an
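
The console tools above are the quickest check, but the same round trip can also be driven from code with the Java client API that ships in the source tree (org.apache.kafka.clients). A minimal Scala sketch, assuming the broker from the previous section is listening on localhost:9092 and the test topic exists:

import java.util.{Collections, Properties}

import org.apache.kafka.clients.consumer.KafkaConsumer
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

object ProduceConsumeSketch {
  def main(args: Array[String]): Unit = {
    // Producer: the equivalent of typing one line into kafka-console-producer.sh.
    val producerProps = new Properties()
    producerProps.put("bootstrap.servers", "localhost:9092")
    producerProps.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
    producerProps.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
    val producer = new KafkaProducer[String, String](producerProps)
    producer.send(new ProducerRecord[String, String]("test", "hello from the source build"))
    producer.close()

    // Consumer: the equivalent of kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test.
    val consumerProps = new Properties()
    consumerProps.put("bootstrap.servers", "localhost:9092")
    consumerProps.put("group.id", "source-read-demo")
    consumerProps.put("auto.offset.reset", "earliest")
    consumerProps.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
    consumerProps.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
    val consumer = new KafkaConsumer[String, String](consumerProps)
    consumer.subscribe(Collections.singletonList("test"))
    // Poll once and print whatever has been written to the topic so far.
    val it = consumer.poll(5000).iterator()
    while (it.hasNext) {
      val record = it.next()
      println(s"offset=${record.offset()} value=${record.value()}")
    }
    consumer.close()
  }
}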
