Building a unified wrapper around Kafka

Kafka is an excellent distributed publish-subscribe system, and with the Kafka Java API it is easy to publish or subscribe to messages.

//producer
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class ProducerApi {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "192.168.16.150:9092");
        props.put("acks", "all");
        props.put("retries", 0);
        props.put("batch.size", 16384);
        props.put("linger.ms", 1);
        props.put("buffer.memory", 33554432);
        props.put("key.serializer",
                  "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                  "org.apache.kafka.common.serialization.StringSerializer");

        Producer<String, String> producer = new KafkaProducer<>(props);
        for (int i = 0; i < 100; i++) {
            producer.send(new ProducerRecord<>(
                      "t1", Integer.toString(i), Integer.toString(i)));
        }
        producer.close();
    }
}
//consumer
import java.time.Duration;
import java.util.Arrays;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class ConsumerAOC {
    public static void main(String[] args) {
        final Properties props = new Properties();
        props.put("bootstrap.servers", "192.168.16.150:9092");
        props.put("group.id", "test");
        props.put("enable.auto.commit", "true");
        props.put("auto.commit.interval.ms", "1000");
        props.put("key.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");

        KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
        consumer.subscribe(Arrays.asList("t1"));
        while (true) {
            // poll(long) is deprecated; use the Duration overload
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(1000));
            for (ConsumerRecord<String, String> record : records)
                System.out.printf("offset = %d, key = %s, value = %s%n",
                                  record.offset(), record.key(), record.value());
        }
    }
}

In practice, however, a company typically has dozens or even hundreds of systems that all need the Kafka API, so Kafka is usually not consumed through the raw interfaces above. Instead, a unified wrapper is built, with two goals:

1. Centralize the Kafka configuration in config files or a configuration center.

2. Expose friendlier, easier-to-use interfaces.

Below I walk through one wrapping approach. Once the wrapper is in place, Kafka can be used like this:

//producer
public class SpringPublisherDemo {
    @Autowired
    StringPublisher stringPublisher;

    public void emit(int count) throws Exception{
        for (int i=0; i<count; i++) {
            stringPublisher.emit("test", "just a test " + i*11);
            Thread.sleep(200);
        }
    }
}
//consumer
@EventConfigLoader(consumer = StringConsumer.class)
public class SpringConsumerDemo implements EventListener {
    @Override
    public void onEvent(Event event) {
        try {
            List<Record> records = event.getRecords();
            for (Record record : records) {
                System.out.println("receive message <"+record.getKey()+", "+record.getValue()+">");
            }
        } catch (DeserializerException e) {
            // at minimum, log deserialization failures instead of silently swallowing them
            e.printStackTrace();
        }
    }
}

Used this way, the Kafka interface becomes remarkably elegant, especially on the consumer side: the user only has to implement the onEvent method with the business logic to run when messages arrive. The implementation is walked through step by step below.

I. Wrapping the configuration

xxx:
  event:
    kafka:
      bootstrap-servers: "localhost:9091,localhost:9092,localhost:9093"
    topic: "test01"
    publisher:
      key-serializer: "com.xxx.center.event.serializer.StringSerializer"
      val-serializer: "com.xxx.center.event.serializer.StringSerializer"
    subscriber:
      auto-commit: "enable"
      key-deserializer: "com.xxx.center.event.serializer.StringDeserializer"
      val-deserializer: "com.xxx.center.event.serializer.StringDeserializer"
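
To give a concrete picture of what the wrapper does with these keys, here is a minimal sketch that maps the wrapper-style names onto the native Kafka client keys. The helper class PublisherPropertiesMapper and the key names are assumptions based on the YAML fragment above, not part of the framework's published API:

```java
import java.util.Map;
import java.util.Properties;

public class PublisherPropertiesMapper {
    // Translates wrapper-style config keys (hypothetical names from the YAML
    // above) into the keys the native Kafka producer expects.
    public static Properties toKafkaProps(Map<String, String> conf) {
        Properties props = new Properties();
        props.put("bootstrap.servers", conf.get("bootstrap-servers"));
        props.put("key.serializer", conf.get("key-serializer"));
        props.put("value.serializer", conf.get("val-serializer"));
        return props;
    }
}
```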

1.1 Define a custom @PublisherConfiguration annotation to describe a Publisher interface

@Inherited
@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@Documented
public @interface PublisherConfiguration {
    String topic() default "";
    String partitioner() default "auto";
    String keySerializer() default "com.xxx.center.event.serializer.StringSerializer";
    String valSerializer() default "com.xxx.center.event.serializer.StringSerializer";
    String config() default "";
    int retries() default 3;
    int batchSize() default 16384;
    int lingerMs() default 1;
    long bufferMemory() default 33554432L;
    int serializerSize() default 512;
}

1.2 Define a custom @SubscriberConfiguration annotation to describe a Subscriber interface

@Inherited
@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@Documented
public @interface SubscriberConfiguration {
    String topic() default "";
    String group() default "";
    String keyDeserializer() default "com.xxx.center.event.serializer.StringDeserializer";
    String valDeserializer() default "com.xxx.center.event.serializer.StringDeserializer";
    String autoCommit() default "enable";
    String autoOffset() default "latest";
}

1.3 Define a custom @EventConfigLoader annotation to mark a consumer demo class

@Inherited
@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@Documented
@Configuration
public @interface EventConfigLoader {
    Class<?> consumer();        // the Consumer interface this listener binds to
}

1.4 Define the Publisher and Consumer interfaces. Although simple, this step is very important: these interfaces essentially fix the contract through which publishers and consumers communicate.

public interface Publisher<K, V> {
    Future<Context> emit(Record<K, V> record);
    Future<Context> emit(K key, V value);
}

public interface Consumer<K, V> {
    List<Record<K, V>> poll() throws DeserializerException;
    List<Record<K, V>> poll(Duration timeout) throws DeserializerException;
    void close();
}


@PublisherConfiguration(
        topic = "test002"
)
public interface StringPublisher extends Publisher<String, String> {
}

@SubscriberConfiguration(
        topic = "test002"
)
public interface StringConsumer extends Consumer<String, String> {
}
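
At runtime the framework reads these annotations back via plain reflection. The following self-contained miniature (a trimmed-down copy of the annotation and interface above for illustration only, not the framework's real classes) shows the idea:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Trimmed-down copy of @PublisherConfiguration, for illustration only.
@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@interface PublisherConfiguration {
    String topic() default "";
    int retries() default 3;
}

@PublisherConfiguration(topic = "test002")
interface StringPublisher {}

public class AnnotationReadDemo {
    // Reads the topic configured on a publisher interface, much as the proxy
    // factory's getPublisherProperties() would.
    public static String topicOf(Class<?> iface) {
        PublisherConfiguration conf = iface.getAnnotation(PublisherConfiguration.class);
        return conf == null ? "" : conf.topic();
    }
}
```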

II. Using cglib dynamic proxies to build bean factories for StringPublisher and StringConsumer, plus the corresponding delegate implementations

public class PublisherProxyFactory<T> implements FactoryBean<T>, MethodInterceptor {
    private Class<T> interfaceClass;
    private PublisherDelegate publisherDelegate;
    private Object object;

    public PublisherProxyFactory(Class<T> interfaceClass) {
        this.interfaceClass = interfaceClass;
    }

    @Override
    public Object intercept(Object object, Method method, Object[] args, MethodProxy proxy)throws Throwable{
        if("emit".equalsIgnoreCase(method.getName())) {
            return this.publisherDelegate.emit(object, method, args, proxy);
        } else {
            return proxy.invokeSuper(object, args);
        }
    }

    public Object createEventPublisher(Topic topic){
        if (object == null) {
            Enhancer en = new Enhancer();
            en.setSuperclass(interfaceClass);
            en.setCallback(this);

            // builds Properties from @PublisherConfiguration and the YAML config (implementation omitted)
            Properties properties = getPublisherProperties();
            this.publisherDelegate = new PublisherDelegateDefault(properties);
            object = en.create();
        }

        return object;
    }

    public Object createEventPublisher(){
        return this.createEventPublisher(null);
    }

    @Override
    public T getObject() throws Exception {
        return (T) createEventPublisher();
    }

    @Override
    public Class<?> getObjectType() {
        return interfaceClass;
    }

    @Override
    public boolean isSingleton() {
        return true;
    }
}
public class SubscriberProxyFactory<T> implements FactoryBean<T>, MethodInterceptor {
    private Class<T> interfaceClass;
    private SubscriberDelegate subscriberDelegate;
    private Object object;

    public SubscriberProxyFactory(Class<T> interfaceClass) {
        this.interfaceClass = interfaceClass;
    }

    public Object createEventSubscriber() {
        return createEventSubscriber(null);
    }

    public Object createEventSubscriber(Topic topic){
        try {
            if (object == null) {
                // build the delegate first, so a failure here never leaves a
                // proxy around with a null delegate
                Properties properties = getConsumerProperties();
                this.subscriberDelegate = new SubscriberDelegateDefault(properties);

                Enhancer en = new Enhancer();
                en.setSuperclass(interfaceClass);
                en.setCallback(this);
                object = en.create();
            }
        } catch (Exception e) {
            logger.error("Failed to create subscriber delegate: {}", e.getMessage());
        }

        return object;
    }

    @Override
    public T getObject() throws Exception {
        return (T) createEventSubscriber();
    }

    @Override
    public Class<?> getObjectType() {
        return interfaceClass;
    }

    @Override
    public boolean isSingleton() {
        return true;
    }

    @Override
    public Object intercept(Object object, Method method, Object[] args, MethodProxy proxy)throws Throwable{
        if ("run".equals(method.getName())) {
            this.subscriberDelegate.run(args);
            return 1;
        } else if ("poll".equals(method.getName())) {
            return this.subscriberDelegate.poll(args);
        } else {
            return proxy.invokeSuper(object, args);
        }
    }
}
public class PublisherDelegateDefault{
    private Properties properties;

    public PublisherDelegateDefault(Properties properties){
        this.properties = properties;
    }

    public KafkaProducer createKafkaProducer(Object object) throws IOException {
        return new KafkaProducer(properties);
    }

    // ... remaining delegate methods omitted
}
public interface Record<K, V> {
    long getPublishTime();
    K getKey();
    V getValue();
}

public abstract class AbstractRecord<K, V> implements Record<K, V> {
    private int partition;
    private long publishTime;
    private K key;
    private V value;

    public int getPartition() {
        return partition;
    }

    public void setPartition(int partition) {
        this.partition = partition;
    }

    public long getPublishTime() {
        return publishTime;
    }

    public void setPublishTime(long publishTime) {
        this.publishTime = publishTime;
    }

    public K getKey() {
        return key;
    }

    public void setKey(K key) {
        this.key = key;
    }

    public V getValue() {
        return value;
    }

    public void setValue(V value) {
        this.value = value;
    }
}

public class DefaultRecord extends AbstractRecord {
}


public class SubscriberDelegateDefault {
    // props is the framework's own config holder, a Properties subclass with
    // helper getters such as getCustomKeyDeserializer() (definition omitted)
    private Properties props;
    private KafkaConsumer consumer;

    public SubscriberDelegateDefault(Properties props) {
        this.props = props;
    }

    public void run() {
        consumer = new KafkaConsumer<String, Bytes>(props);
        consumer.subscribe(Arrays.asList((String) props.get("topic")));
    }

    public List<Record> poll() {
        List<Record> list = new ArrayList<>();
        ConsumerRecords<?, ?> records = consumer.poll(Duration.ofMillis(100));
        for (ConsumerRecord consumerRecord : records) {
            DefaultRecord record = new DefaultRecord();
            Object key = consumerRecord.key();
            Object value = consumerRecord.value();

            // actualTypeArguments holds the generic <K, V> types resolved by
            // reflection from the Consumer interface (resolution code omitted)
            Object deserializerKey = key == null ? null :
                        KafkaSerializerProxyFactory.deserialize(props.getCustomKeyDeserializer(),
                                (Bytes) key,
                                (Class<?>) actualTypeArguments[0]);
            Object deserializerValue = value == null ? null :
                        KafkaSerializerProxyFactory.deserialize(props.getCustomValDeserializer(),
                                (Bytes) value,
                                (Class<?>) actualTypeArguments[1]);

            record.setKey(deserializerKey);
            record.setValue(deserializerValue);
            record.setPartition(consumerRecord.partition());
            record.setPublishTime(consumerRecord.timestamp());
            list.add(record);
        }
        return list;
    }

    public void close() {
        consumer.close();
    }
}

III. Creating a reactor thread-pool model for the consumer to pull messages continuously

For the publisher, a demo program simply injects the corresponding bean and uses it. The consumer, however, is passive: a background thread must pull messages periodically. To improve performance, this should be a thread pool, and the pool is organized as a reactor model. Since the reactor model is fairly involved, it is not expanded here; only the most important code is shown.

    int subscribe() {
        List<Record> recordList = null;
        String className = consumer.getClass().getSimpleName();
        if (className.contains("$")) {
            className = className.substring(0, className.indexOf("$"));
        }

        logger.debug("Begin to poll records of consumer<{}>.", className);

        try {
            recordList = consumer.poll(Duration.ofSeconds(2));
        }catch (DeserializerException e) {
            logger.error("Exception when poll events: {}", e.getMessage());
            return 0;
        }

        if(recordList.isEmpty()){
            return 0;
        }

        logger.info("Receive {} records for consumer<{}>.", recordList.size(), className);

        DefaultContext context = new DefaultContext();
        context.setConsumer(consumer);

        DefaultEvent event = new DefaultEvent();
        event.setRecords(recordList);
        event.setContext(context);

        for (Listener listener : listeners) {
            listener.onEvent(event);    //這裏執行業務邏輯代碼
        }

        logger.debug("Completed dispatching records of consumer<{}>.", className);
        return recordList.size();
    }
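
The subscribe() method above is driven by a pool of worker threads. As a rough, self-contained sketch of that loop (the real AccepterReactor is more elaborate; this standalone version is an assumption about its overall shape, not the framework's code):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.function.IntSupplier;

// Stripped-down stand-in for AccepterReactor: each worker thread keeps
// calling a subscribe() function and backs off briefly when no records arrive.
public class ReactorSketch {
    private final ExecutorService pool;
    private volatile boolean running = true;

    public ReactorSketch(int poolSize) {
        this.pool = Executors.newFixedThreadPool(poolSize);
    }

    // subscribeFn plays the role of subscribe() above and returns the
    // number of records it handled.
    public void start(IntSupplier subscribeFn) {
        pool.submit(() -> {
            while (running) {
                int handled = subscribeFn.getAsInt();
                if (handled == 0) {
                    try { Thread.sleep(5); } catch (InterruptedException e) { return; }
                }
            }
        });
    }

    public void stop() throws InterruptedException {
        running = false;
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.SECONDS);
    }
}
```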

IV. Plugging a custom bean scanner into the Spring framework

This scanner must scan all the Publisher and Consumer interfaces and create beans for them via dynamic proxies.

public class ClassPathEventScanner extends ClassPathBeanDefinitionScanner {

    public ClassPathEventScanner(BeanDefinitionRegistry registry) {
        super(registry, true);
    }

    public void registerFilters(){
        addIncludeFilter(new AnnotationTypeFilter(PublisherConfiguration.class));
        addIncludeFilter(new AnnotationTypeFilter(SubscriberConfiguration.class));
        addIncludeFilter(new AnnotationTypeFilter(SenderConfiguration.class));

        addIncludeFilter(new TypeFilter(){
            @Override
            public boolean match(MetadataReader metadataReader, MetadataReaderFactory metadataReaderFactory) throws IOException {
                return Arrays.stream(metadataReader.getClassMetadata().getInterfaceNames()).anyMatch(s ->
                        Subscriber.class.getName().equals(s) ||
                        Sender.class.getName().equals(s) ||
                        Publisher.class.getName().equals(s));
            }
        });
    }

    @Override
    protected boolean isCandidateComponent(AnnotatedBeanDefinition beanDefinition) {
        return true;
    }

    @Override
    public Set<BeanDefinitionHolder> doScan(String... basePackages) {
        Set<BeanDefinitionHolder> beanDefinitions = super.doScan(basePackages);
        if (beanDefinitions.isEmpty()) {
            LOGGER.info("No Event interface was found in '" + Arrays.toString(basePackages) + "' package. Please check your configuration.");
        } else {
            processBeanDefinitions(beanDefinitions);
        }

        return beanDefinitions;
    }

    private void processBeanDefinitions(Set<BeanDefinitionHolder> beanDefinitions) {
        GenericBeanDefinition definition;
        HashMap<String, BeanDefinitionHolder> elBeanDefinitionHolders = new HashMap<String, BeanDefinitionHolder>();

        // scan subscribers and publishers first, because the event listeners depend on the subscribers
        for (BeanDefinitionHolder holder : beanDefinitions) {
            definition = (GenericBeanDefinition) holder.getBeanDefinition();
            String beanClassName = definition.getBeanClassName();
            try {
                Class clazz = Class.forName(beanClassName);
                for(Class interfaceClass : clazz.getInterfaces()){
                    if(interfaceClass.equals(Publisher.class)){
                        definition.getConstructorArgumentValues().addGenericArgumentValue(beanClassName);
                        definition.setBeanClass(PublisherProxyFactory.class);
                        definition.setLazyInit(false);
                        continue;
                    }else if(interfaceClass.equals(Subscriber.class)){
                        definition.getConstructorArgumentValues().addGenericArgumentValue(beanClassName);
                        definition.setBeanClass(SubscriberProxyFactory.class);
                        definition.setLazyInit(false);
                        continue;
                    }
                    else if (interfaceClass.equals(Sender.class)){
                        definition.getConstructorArgumentValues().addGenericArgumentValue(beanClassName);
                        definition.setBeanClass(SenderProxyFactory.class);
                        definition.setLazyInit(false);
                        continue;
                    }
                }
            } catch (ClassNotFoundException e) {
                LOGGER.debug("Could not find class for name: {}", beanClassName, e);
            }
        }
    }
}
public class EventScannerRegistrar implements BeanFactoryAware, ImportBeanDefinitionRegistrar, ResourceLoaderAware {

    private BeanFactory beanFactory;
    private ResourceLoader resourceLoader;

    public void setBeanFactory(BeanFactory beanFactory) throws BeansException {
        this.beanFactory = beanFactory;
    }

    public void setResourceLoader(ResourceLoader resourceLoader) {
        this.resourceLoader = resourceLoader;
    }


    public void registerBeanDefinitions(AnnotationMetadata importingClassMetadata, BeanDefinitionRegistry registry) {

        AnnotationAttributes annoAttrs = AnnotationAttributes.fromMap(importingClassMetadata.getAnnotationAttributes(EventScan.class.getName()));
        if(annoAttrs==null){
            annoAttrs = AnnotationAttributes.fromMap(importingClassMetadata.getAnnotationAttributes(ComponentScan.class.getName()));
        }

        ClassPathEventScanner scanner = new ClassPathEventScanner(registry);
        if (this.resourceLoader != null) {
            scanner.setResourceLoader(this.resourceLoader);
        }

        List<String> packages = AutoConfigurationPackages.get(this.beanFactory);

        if(annoAttrs!=null){
            for (String pkg : annoAttrs.getStringArray("value")) {
                if (StringUtils.hasText(pkg)) {
                    packages.add(pkg);
                }
            }
            for (String pkg : annoAttrs.getStringArray("basePackages")) {
                if (StringUtils.hasText(pkg)) {
                    packages.add(pkg);
                }
            }
            for (Class<?> clazz : annoAttrs.getClassArray("basePackageClasses")) {
                packages.add(ClassUtils.getPackageName(clazz));
            }
        }

        // use a distinctive banner format here so this log block stands out
        LOGGER.info("**************************************************");
        LOGGER.info("** Starting scan event publisher and subscriber **");
        LOGGER.info("**************************************************");
        scanner.registerFilters();
        scanner.doScan(StringUtils.toStringArray(packages));

    }
}

EventScannerRegistrar must be registered in spring.factories (so that it acts as a startup entry point): only when it is loaded at startup will all the producer and consumer beans be created in order.
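
In a Spring Boot project this usually goes through an auto-configuration entry. A minimal sketch of the file (the fully qualified names are assumptions; adjust them to the real packages):

```properties
# META-INF/spring.factories
org.springframework.boot.autoconfigure.EnableAutoConfiguration=\
  com.xxx.center.event.config.EventAutoConfiguration
```

where EventAutoConfiguration would be a small @Configuration class annotated with @EventScan (or directly @Import(EventScannerRegistrar.class)).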

V. EventStarter, the startup class for the consumer demo classes, which launches the reactor thread pool for the consumers

@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@Documented
@Import({EventScannerRegistrar.class, EventStarter.class})
public @interface EventScan {
    String[] value() default {};
    String[] basePackages() default {};
    Class<?>[] basePackageClasses() default {};
}

public class EventStarter implements ApplicationContextAware, CommandLineRunner {
    private static ApplicationContext appContext;

    @Override
    public void setApplicationContext(ApplicationContext context) {
        appContext = context;
    }

    @Override
    public void run(String...strings) {
        if (!checkEnable()) {
            logger.info("Event switch is disabled");
            return;
        }

        // iterate over all the injected consumer listeners
        String[] consumerDemoBeanDefinitionNames = appContext.getBeanNamesForAnnotation(EventConfigLoader.class);

        int poolSize = consumerDemoBeanDefinitionNames.length;

        if(poolSize == 0) {
            return;
        }

        AccepterReactor reactor = AccepterReactor.getInstance(poolSize);

        for (String name : consumerDemoBeanDefinitionNames) {
            Object eventListener = appContext.getBean(name);

            Class eventListenerClass = eventListener.getClass();
            EventConfigLoader anno = (EventConfigLoader)eventListenerClass.getAnnotation(EventConfigLoader.class);

            Consumer consumer = (Consumer) appContext.getBean(anno.consumer());
            reactor.put(consumer, consumer);
            reactor.add(consumer, (EventListener)eventListener);
            consumer.run(eventListenerClass.getName());
        }

        reactor.start();        // start the thread pool; from here on the system is running
    }
}

At this point, a basic Kafka wrapper framework has taken shape. Note that this article does not include the configuration-loading code, the serializer/deserializer code, the reactor thread-pool code, or many other implementation details.
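
As one example of those omitted pieces, the String (de)serializers referenced in the configuration (com.xxx.center.event.serializer.StringSerializer and friends) might look roughly like this. The Serializer/Deserializer interfaces below are assumptions sketched for illustration, not the framework's actual definitions:

```java
import java.nio.charset.StandardCharsets;

// Hypothetical miniatures of the framework's serializer SPI.
interface Serializer<T> {
    byte[] serialize(T data);
}

interface Deserializer<T> {
    T deserialize(byte[] data);
}

// One class implementing both directions for String payloads.
public class StringSerde implements Serializer<String>, Deserializer<String> {
    @Override
    public byte[] serialize(String data) {
        return data == null ? null : data.getBytes(StandardCharsets.UTF_8);
    }

    @Override
    public String deserialize(byte[] data) {
        return data == null ? null : new String(data, StandardCharsets.UTF_8);
    }
}
```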
