How to convert from GenericRecord to SpecificRecord in Avro for compatible schemas

Is Avro SpecificRecord (i.e., a generated Java class) compatible with schema evolution? That is, if I have a source of Avro messages (in my case, Kafka) and I want to deserialize those messages to a SpecificRecord, is it safe to do so?

What I see:

  • adding a field to the end of the schema works fine - existing consumers can still deserialize to a SpecificRecord
  • adding a field to the middle does not - it breaks existing consumers

Even though the schemas are compatible, this is a problem.
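To make the failure mode concrete, here is a hypothetical pair of schema versions. A consumer that decodes v2 data positionally against its compiled-in v1 schema (which is effectively what index-based copying does) will read the bytes of "email" where it expects "name":

```json
{"type": "record", "name": "User", "fields": [
  {"name": "id",   "type": "long"},
  {"name": "name", "type": "string"}
]}
```

v2, with "email" inserted in the middle:

```json
{"type": "record", "name": "User", "fields": [
  {"name": "id",    "type": "long"},
  {"name": "email", "type": "string", "default": ""},
  {"name": "name",  "type": "string"}
]}
```

Avro's schema resolution handles this fine, but only when the decoder is given both the writer's schema (v2) and the reader's schema (v1); resolution matches fields by name, not position.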

If I can find the writer's schema (using, for example, a schema registry), I can deserialize to a GenericRecord, but there seems to be no way to map a GenericRecord to a SpecificRecord with a different schema.

 MySpecificType message = (MySpecificType) SpecificData.get().deepCopy(MySpecificType.SCHEMA$, genericMessage); 

deepCopy is mentioned in various places, but it copies fields by index, so it does not work when the schemas differ.

Is there a safe way to map between two Avro objects when you have both schemas and they are compatible? Even a way to map from GenericRecord to GenericRecord would do, as I could then use the deepCopy trick to finish the job.
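One approach that does go through Avro's name-based schema resolution is to re-encode the GenericRecord with its own (writer) schema and decode it again with the reader schema. This is a minimal sketch assuming Apache Avro 1.x on the classpath; the class name AvroConvert is mine, not part of the Avro API:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.EncoderFactory;

public final class AvroConvert {

    // Serialize the record with its writer schema, then deserialize with the
    // reader schema. Schema resolution matches fields by NAME and applies
    // defaults, so it survives fields added in the middle, unlike deepCopy.
    public static GenericRecord convert(GenericRecord record, Schema readerSchema)
            throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        new GenericDatumWriter<GenericRecord>(record.getSchema()).write(record, encoder);
        encoder.flush();

        BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(out.toByteArray(), null);
        GenericDatumReader<GenericRecord> reader =
                new GenericDatumReader<>(record.getSchema(), readerSchema);
        return reader.read(null, decoder);
    }
}
```

For the Generic-to-Specific case, the same shape should work with a SpecificDatumReader constructed as new SpecificDatumReader<MySpecificType>(record.getSchema(), MySpecificType.SCHEMA$), which returns the generated class directly.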

1 answer

Here are sample tests for deserializing to a specific data type. The key is the configuration in "specificDeserializerProps":

https://github.com/confluentinc/schema-registry/blob/master/avro-serializer/src/test/java/io/confluent/kafka/serializers/KafkaAvroSerializerTest.java

I added the following configuration and got a specific type as desired.

 HashMap<String, String> specificDeserializerProps = new HashMap<String, String>();
 specificDeserializerProps.put(KafkaAvroDeserializerConfig.SCHEMA_REGISTRY_URL_CONFIG, "bogus");
 specificDeserializerProps.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, "true");
 specificAvroDeserializer = new KafkaAvroDeserializer(schemaRegistry, specificDeserializerProps); 

Hope that helps

