How goes the battle?
This post is another part of my Kafka weather station use case idea.
I want to show how I created an app with Spring Boot and Thymeleaf that displays a real-time dashboard with Sense Hat temperature data read from a Kafka topic. This is another blog about Java on the Raspberry Pi.
Picture 1: The highlighted part of the diagram that this post is focused on.
You can check my Kafka Producer blog and Kafka at the edge blog, where I explain the Sense Hat producer with Micronaut.
Picture 2: end-to-end flow
Idea
A Spring Boot web application that uses Thymeleaf to display a real-time dashboard with the temperature values I read from a Kafka topic. I use a WebSocket to update the data on the page and Spring Kafka to read the data from Kafka.
The data comes from a Raspberry Pi with the Sense Hat.
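To give an idea of how the pieces fit together, here is a minimal sketch of the consumer side: a listener that forwards every Kafka record to a STOMP destination, which the Thymeleaf page subscribes to over the WebSocket. The class name and the /topic/temperature destination are my own assumptions, and it presumes a standard @EnableWebSocketMessageBroker configuration; the real code is in the GitHub repo linked below.

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.messaging.simp.SimpMessagingTemplate;
import org.springframework.stereotype.Component;

// Sketch only: forwards each Kafka record to the dashboard over STOMP/WebSocket.
// Class name and destination are assumptions, not necessarily what is in the repo.
@Component
public class TemperatureListener {

    private final SimpMessagingTemplate messagingTemplate;

    public TemperatureListener(SimpMessagingTemplate messagingTemplate) {
        this.messagingTemplate = messagingTemplate;
    }

    @KafkaListener(topics = "temperature", groupId = "temp-groupid.group")
    public void listen(String temperature) {
        // Push the new reading to every browser subscribed to /topic/temperature
        messagingTemplate.convertAndSend("/topic/temperature", temperature);
    }
}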
Picture 3: I added a REST interface and a @PostConstruct method for producing messages
The program uses the Spring Kafka integration. I added a Kafka producer to publish the Sense Hat data, and a REST interface as well, so I can push some data for testing or when I don't have the Sense Hat running.
Code
First things first: configuration. There are two ways to configure our producer and consumer.
Option 1: Using application.properties or application.yml
Option 2: Java class with @Configuration
I’m using option 2 in this example.
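For reference, option 1 would look more or less like this in application.yml, using Spring Boot's own spring.kafka.* properties (the broker address is just an example, not something from this project):

# Rough equivalent of the Java config below, not used in this example
spring:
  kafka:
    bootstrap-servers: localhost:9092
    consumer:
      group-id: temp-groupid.group
      auto-offset-reset: latest
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.apache.kafka.common.serialization.StringDeserializer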
@EnableKafka
@Configuration
public class KafkaConsumerConfig {

    @Value("${kafka.bootstrapserver}")
    public String bootstrapServer;

    @Bean
    public Map<String, Object> consumerConfigs() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServer);
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "temp-groupid.group");
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "latest");
        return props;
    }

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        return new DefaultKafkaConsumerFactory<>(consumerConfigs());
    }

    @Bean
    public KafkaListenerContainerFactory<ConcurrentMessageListenerContainer<String, String>> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}
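The app also produces messages (for testing and for the Sense Hat readings, as explained next), so I pair this with a producer configuration that exposes a KafkaTemplate. A minimal sketch, with my own class name; check the repo for the actual version.

import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

// Sketch only: mirror of the consumer config for the producer side.
@Configuration
public class KafkaProducerConfig {

    @Value("${kafka.bootstrapserver}")
    public String bootstrapServer;

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServer);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return new DefaultKafkaProducerFactory<>(props);
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}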
The idea is to use this app just to consume, but I added a REST interface for producing a message and a @PostConstruct method in the service that produces Sense Hat data, so I can test even when I don't have the Sense Hat producer running.
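The REST interface can be as small as a controller that hands the posted value to the service. The path and class names below are assumptions for illustration; the TemperatureService with sendMessage() is sketched further down.

import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

// Sketch only: lets me POST a test temperature when the Sense Hat is not running.
@RestController
@RequestMapping("/api/temperature")
public class TemperatureController {

    private final TemperatureService service;

    public TemperatureController(TemperatureService service) {
        this.service = service;
    }

    @PostMapping
    public void publish(@RequestBody String temperature) {
        service.sendMessage(temperature);
    }
}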
I'm using this Java wrapper for the Raspberry Pi Sense Hat:
<dependency>
    <groupId>sensehat</groupId>
    <artifactId>sensehat</artifactId>
    <version>1.0.0</version>
    <scope>system</scope>
    <systemPath>${basedir}/lib/java-executor-1.0-SNAPSHOT.jar</systemPath>
</dependency>
@PostConstruct
public void init() {
    // The senseHat String field comes from the kafka.sensehat property and
    // enables or disables this loop
    if (senseHat.equals("true")) {
        SenseHat senseHat = new SenseHat();
        new Thread(() -> {
            while (true) {
                // Read the temperature and publish it every 5 seconds
                sendMessage(Float.toString(senseHat.environmentalSensor.getTemperature()));
                try {
                    Thread.sleep(5000);
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
            }
        }).start();
    }
}
And I have a kafka.sensehat=true property to enable or disable the Sense Hat producer.
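Putting those pieces together, sendMessage() is just a thin wrapper around KafkaTemplate, and the kafka.sensehat flag is injected with @Value. Again, this is a sketch under my own naming; the @PostConstruct method above lives in this same service.

import org.springframework.beans.factory.annotation.Value;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

// Sketch only: the service that owns sendMessage() and the @PostConstruct loop above.
@Service
public class TemperatureService {

    // kafka.sensehat=true enables the Sense Hat reading loop in init()
    @Value("${kafka.sensehat}")
    private String senseHat;

    private final KafkaTemplate<String, String> kafkaTemplate;

    public TemperatureService(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void sendMessage(String temperature) {
        // Publish the reading to the topic the dashboard listener consumes from
        kafkaTemplate.send("temperature", temperature);
    }
}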
You can get the full code on my GitHub.
Picture 4: The end result
Kafka
Start ZooKeeper:

./bin/zookeeper-server-start.sh ./config/zookeeper.properties

Start the Kafka broker:

./bin/kafka-server-start.sh ./config/server.properties

Create the temperature topic:

./bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic temperature
Links
http://www.igfasouza.com/blog/what-is-kafka/
http://www.igfasouza.com/blog/sense-hat/
http://www.igfasouza.com/blog/kafka-weather-station/
http://www.igfasouza.com/blog/websockets-vs-server-sent-events/