Don’t leave Apache Flink and Schema Registry alone
Thinking early about how to evolve our architecture can be tough, but it will be even harder and more complex if you decide to tackle this challenge at a later stage.
Perhaps, at first sight, redesigning the architecture to include a Schema Registry takes time you could invest in developing another feature, but believe me: as the system keeps growing, it becomes necessary to have a catalog of the events, as well as a well-established definition of each one.
Schema Registry?
> Schema Registry is a central repository with a RESTful interface for developers to define standard schemas and register applications to enable compatibility.
>
> — source: Confluent
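To make the "RESTful interface" part of that definition concrete, here is a minimal sketch of registering an Avro schema over plain HTTP. The registry URL (`http://schema-registry:8081`), the subject name (`orders-value`) and the `Order` schema are illustrative assumptions; the `POST /subjects/{subject}/versions` endpoint is the standard Schema Registry API.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RegisterSchema {
    public static void main(String[] args) throws Exception {
        // The Avro schema itself (a placeholder "Order" record for illustration).
        String avroSchema = "{\"type\":\"record\",\"name\":\"Order\",\"fields\":"
                          + "[{\"name\":\"id\",\"type\":\"string\"},"
                          + "{\"name\":\"amount\",\"type\":\"double\"}]}";

        // The registry expects the schema as an escaped JSON string inside the body.
        String body = "{\"schema\": \"" + avroSchema.replace("\"", "\\\"") + "\"}";

        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("http://schema-registry:8081/subjects/orders-value/versions"))
            .header("Content-Type", "application/vnd.schemaregistry.v1+json")
            .POST(HttpRequest.BodyPublishers.ofString(body))
            .build();

        // On success the registry answers with the global id of the schema, e.g. {"id":1}.
        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```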
Schema Registry solves the challenge of sharing schemas between applications, allowing:
- Simple data evolution: consumers and producers can use different schema versions and still remain compatible, depending on the compatibility strategy in use (backward or forward compatibility), as shown in the Flink consumer sketch after this list.
- A centralized repository/catalog: having a central component that manages the data models gives your architecture a data catalog that knows all the data types in the system.
- Schema enforcement: it ensures that the published data matches the defined schema and that its constraints are respected.
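To tie this back to Apache Flink, here is a minimal sketch of a Flink job that consumes Avro records whose schemas live in the registry. It assumes the Flink Kafka connector and the flink-avro-confluent-registry format are on the classpath, plus placeholder names: an `orders` topic, a registry at `http://schema-registry:8081`, and a simplified `Order` schema.

```java
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.formats.avro.registry.confluent.ConfluentRegistryAvroDeserializationSchema;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class OrdersConsumer {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Reader schema: the version this job understands; writer schemas from other
        // compatible versions are resolved through the registry.
        Schema readerSchema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Order\",\"fields\":["
          + "{\"name\":\"id\",\"type\":\"string\"},"
          + "{\"name\":\"amount\",\"type\":\"double\"}]}");

        KafkaSource<GenericRecord> source = KafkaSource.<GenericRecord>builder()
            .setBootstrapServers("kafka:9092")
            .setTopics("orders")
            .setGroupId("orders-consumer")
            .setStartingOffsets(OffsetsInitializer.earliest())
            // Deserializer that looks up the writer schema in the Schema Registry
            // and decodes each record into the reader schema above.
            .setValueOnlyDeserializer(
                ConfluentRegistryAvroDeserializationSchema.forGeneric(
                    readerSchema, "http://schema-registry:8081"))
            .build();

        env.fromSource(source, WatermarkStrategy.noWatermarks(), "orders-source")
           .print();

        env.execute("orders-with-schema-registry");
    }
}
```

The deserializer fetches the writer schema from the registry using the id embedded in each record and decodes it into the reader schema the job declares, which is what makes backward or forward evolution transparent to the consumer.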