Kafka for Developers: Data Contracts Using Schema Registry


Bibliographic Details

Corporate Author: Packt Publishing (publisher)
Other Authors: Sundarraj, Dilip (presenter)
Format: Video
Language: English
Published: [Place of publication not identified] : Packt Publishing, [2023]
Edition: [First edition]
Subjects:
View at Biblioteca Universitat Ramon Llull: https://discovery.url.edu/permalink/34CSUC_URL/1im36ta/alma991009730932706719

Description
Summary: The course begins with an introduction that provides an overview of what to expect from it. We will cover the relationship between serialization and Kafka, and the benefits serialization provides to the overall Kafka architecture. You will gain an understanding of the different serialization formats and the support for schemas in AVRO, Protobuf, and Thrift. You will be introduced to AVRO and why AVRO is popular for working with Kafka and Schema Registry. Further in this course, we will set up Kafka locally and produce and consume messages using the Kafka Console Producer and Consumer. You will set up the base project for the greeting app, which you can use to generate Java classes from the greetings schema using the Gradle build tool. You will also learn how to set up the same base project using the Maven build tool. You will understand the different techniques for evolving a schema as business requirements change. In later sections, you will code and build a Spring Boot Kafka application that exchanges data in AVRO format and interacts with Schema Registry for data evolution. You will also build a RESTful service to publish events, receiving events through the REST interface and then publishing them to Kafka. By the end of this course, you will have a complete understanding of how to use AVRO as a data serialization format and of how data evolves using Schema Registry.

What You Will Learn:
- Understand the fundamentals of data serialization
- Understand the different serialization formats available
- Consume AVRO records using Kafka Consumer
- Publish AVRO records using Kafka Producer
- Enforce data contracts using Schema Registry
- Use Schema Registry to register the AVRO Schema

Audience: This course is suitable for experienced Java developers and for developers interested in learning AVRO and how to exchange data between applications using AVRO and Kafka. It is also suitable for developers who want to learn about Schema Registry and how it fits into Kafka, and for those interested in techniques for evolving data. Prior understanding of Java and experience building a Kafka Producer are required to take this course.
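To illustrate the kind of data contract the course centers on, here is a minimal sketch of an AVRO schema for a hypothetical greetings record (the namespace and field names are assumptions for illustration, not the course's actual `greetings` schema):

```json
{
  "namespace": "com.example.greetings",
  "type": "record",
  "name": "Greeting",
  "fields": [
    {"name": "message", "type": "string"},
    {"name": "createdAt",
     "type": {"type": "long", "logicalType": "timestamp-millis"},
     "doc": "Epoch milliseconds when the greeting was produced"}
  ]
}
```

A schema file like this is what build tooling (such as an Avro plugin for Gradle or Maven, as covered in the course) turns into a generated Java class, and what Schema Registry stores and checks for compatibility as the schema evolves.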
Notes: "Published in March 2023."
Physical Description: 1 online resource (1 video file (5 hr., 34 min.)) : sound, color
ISBN: 9781837633487