Kafka Source Code Analysis: Environment Setup

Kafka is a widely used message broker adopted across many companies. Starting today, I will publish a series of articles analyzing the Kafka 0.10.0.1 source code. As the saying goes, to do a good job one must first sharpen one's tools: before reading the Kafka source, the first step is to set up a source-reading environment. Choosing the right tools makes learning more efficient; here I use IntelliJ IDEA as the IDE for reading the source.

Environment Dependencies

JDK

Download JDK 1.8 and set JAVA_HOME:

java -version
java version "1.8.0_221"
Java(TM) SE Runtime Environment (build 1.8.0_221-b11)
Java HotSpot(TM) 64-Bit Server VM (build 25.221-b11, mixed mode)
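
Setting the variable can be sketched as below; the install path is only an example (macOS-style here, matching the author's machine) and will differ per platform:

```shell
# Point JAVA_HOME at a JDK 1.8 installation (example path) and add it to PATH
export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_221.jdk/Contents/Home
export PATH=$PATH:$JAVA_HOME/bin
```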

Gradle

Download Gradle and set GRADLE_HOME:

GRADLE_HOME=/Users/lidongmeng/software/gradle-6.1.1
PATH=$PATH:$GRADLE_HOME/bin

Scala

Download Scala 2.11.8 and set SCALA_HOME:

SCALA_HOME=/Users/lidongmeng/software/scala-2.11.8
PATH=$PATH:$SCALA_HOME/bin

ZooKeeper

Download ZooKeeper 3.4.9 and set ZK_HOME:

ZK_HOME=/Users/lidongmeng/software/zookeeper-3.4.9
PATH=$PATH:$ZK_HOME/bin
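
The broker needs a running ZooKeeper before it starts. A minimal sketch using the scripts shipped with ZooKeeper 3.4.9 (assumes copying the sample config to the default location `conf/zoo.cfg`):

```shell
# Create the config file zkServer.sh reads by default
cp $ZK_HOME/conf/zoo_sample.cfg $ZK_HOME/conf/zoo.cfg

# Start ZooKeeper on the default client port 2181, then check it
zkServer.sh start
zkServer.sh status   # should report "Mode: standalone"
```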

Kafka Source

Download the kafka-0.10.0.1 source code: http://kafka.apache.org/downloads

IDEA Scala Plugin

IDEA -> Preferences -> Plugins -> search for Scala and install it

Building the Source

Build

Build the project with the command gradle idea (for Eclipse, use gradle eclipse instead):

> Task :idea
Generated IDEA project at file:///Users/lidongmeng/source_code_read/kafka-0.10.0.1-src/kafka-0.10.0.1-src.ipr

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.6.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 7m 16s
16 actionable tasks: 16 executed

Build Errors and Fixes

  1. ScalaBasePlugin fails to apply
// Problem
* Where:
Build file '/Users/lidongmeng/source_code_read/kafka-0.10.0.1-src/build.gradle' line: 233

* What went wrong:
A problem occurred evaluating root project 'kafka-0.10.0.1-src'.
> Failed to apply plugin [class 'org.gradle.api.plugins.scala.ScalaBasePlugin']
   > Could not create task ':core:compileScala'.
      > No such property: useAnt for class: org.gradle.api.tasks.scala.ScalaCompileOptions
          
// Fix: add the following lines to build.gradle
ScalaCompileOptions.metaClass.daemonServer = true
ScalaCompileOptions.metaClass.fork = true
ScalaCompileOptions.metaClass.useAnt = false
ScalaCompileOptions.metaClass.useCompileDaemon = false
  2. org.scoverage plugin error
// Error stack
* Where:
Build file '/Users/lidongmeng/source_code_read/kafka-0.10.0.1-src/build.gradle' line: 376

* What went wrong:
A problem occurred evaluating root project 'kafka-0.10.0.1-src'.
> Failed to apply plugin [id 'org.scoverage']
   > Could not create an instance of type org.scoverage.ScoverageExtension.
      > You can't map a property that does not exist: propertyName=testClassesDir

// Fix: in the buildscript dependencies of build.gradle
classpath 'org.scoverage:gradle-scoverage:2.5.0' // change 2.0.1 to 2.5.0

Run & Verify

Starting the Server

  1. log4j file

    Copy the config/log4j.properties file into the core/src/main/scala/ directory, and add the setting kafka.logs.dir=logs to it.

  2. server.properties file

    In config/server.properties, change log.dirs=xx/kafka-0.10.0.1-src/kafka-logs to point at a directory of your choice.

  3. Run configuration

    In IDEA, create a run configuration for the broker (main class kafka.Kafka, with the path to config/server.properties as the program argument).

  4. Run

    [2020-02-24 14:48:11,176] INFO Kafka version : unknown (org.apache.kafka.common.utils.AppInfoParser)
    [2020-02-24 14:48:11,176] INFO Kafka commitId : unknown (org.apache.kafka.common.utils.AppInfoParser)
    [2020-02-24 14:48:11,177] INFO [Kafka Server 0], started (kafka.server.KafkaServer)
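
The file preparation in steps 1 and 2 can be sketched as shell commands, run from the source root; the log directory below is just an example:

```shell
# Step 1: put log4j.properties on the classpath and set kafka.logs.dir
cp config/log4j.properties core/src/main/scala/
echo "kafka.logs.dir=logs" >> core/src/main/scala/log4j.properties

# Step 2: point log.dirs in server.properties at a writable directory
# (BSD sed syntax as on macOS; on Linux, drop the '' after -i)
sed -i '' 's|^log.dirs=.*|log.dirs=/tmp/kafka-logs|' config/server.properties
```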
    

Creating a Topic

kafka_2.11-0.11.0.1/bin : ./kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic test
Created topic "test".
[2020-02-24 14:27:32,217] INFO Completed load of log test-0 with log end offset 0 (kafka.log.Log)
[2020-02-24 14:27:32,220] INFO Created log for partition [test,0] in /Users/lidongmeng/source_code_read/kafka-0.10.0.1-src/kafka-logs with properties {compression.type -> producer, message.format.version -> 0.10.0-IV1, file.delete.delay.ms -> 60000, max.message.bytes -> 1000012, message.timestamp.type -> CreateTime, min.insync.replicas -> 1, segment.jitter.ms -> 0, preallocate -> false, min.cleanable.dirty.ratio -> 0.5, index.interval.bytes -> 4096, unclean.leader.election.enable -> true, retention.bytes -> -1, delete.retention.ms -> 86400000, cleanup.policy -> delete, flush.ms -> 9223372036854775807, segment.ms -> 604800000, segment.bytes -> 1073741824, retention.ms -> 604800000, message.timestamp.difference.max.ms -> 9223372036854775807, segment.index.bytes -> 10485760, flush.messages -> 9223372036854775807}. (kafka.log.LogManager)
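
To double-check the topic after creation, the describe command can be run against the same ZooKeeper:

```shell
# Show the partition count, replication factor, and replica assignment
./kafka-topics.sh --describe --zookeeper localhost:2181 --topic test
```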

Sending and Receiving Messages with Producer & Consumer

Producer:

kafka_2.11-0.11.0.1 : bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test
>afafdsafs
>this is an

Consumer:

kafka_2.11-0.11.0.1 : bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test
afafdsafs
this is an
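
Note that without extra flags the console consumer only receives messages produced after it starts; to replay everything already in the topic, add --from-beginning:

```shell
# Re-read the topic from offset 0 instead of only new messages
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test --from-beginning
```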

References

  1. https://blog.csdn.net/qq1137623160/article/details/104357430
  2. https://blog.csdn.net/zlx510tsde/article/details/52688787
  3. 《Apache Kafka源码剖析》, Chapter 1
  4. http://kafka.apache.org/0101/documentation.html#quickstart