I have a class that implements a custom Kryo serializer by extending com.esotericsoftware.kryo.Serializer and overriding its read() and write() methods (see example below). How can I register this custom serializer with Spark?
Here is a pseudo-code example of what I have:
```scala
import com.esotericsoftware.kryo.Kryo
import com.esotericsoftware.kryo.io.{Input, Output}

class A()

class CustomASerializer extends com.esotericsoftware.kryo.Serializer[A] {
  override def write(kryo: Kryo, output: Output, a: A): Unit = ???
  override def read(kryo: Kryo, input: Input, t: Class[A]): A = ???
}

val kryo: Kryo = ...
kryo.register(classOf[A], new CustomASerializer()) // I can register my serializer
```
Now in Spark:
```scala
import org.apache.spark.SparkConf

val sparkConf = new SparkConf()
sparkConf.registerKryoClasses(Array(classOf[A]))
```
Unfortunately, registerKryoClasses only registers the classes themselves; it gives me no way to attach my custom serializer. Is there a way to do this?
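One direction I have been looking at is Spark's KryoRegistrator hook, which lets you run arbitrary Kryo registration code. Here is a sketch of what I imagine it would look like; MyRegistrator is just a placeholder name of mine, and I have not verified this end to end:

```scala
import com.esotericsoftware.kryo.Kryo
import org.apache.spark.SparkConf
import org.apache.spark.serializer.KryoRegistrator

// Hypothetical registrator that attaches the custom serializer from above.
// Spark instantiates this class by name on each executor.
class MyRegistrator extends KryoRegistrator {
  override def registerClasses(kryo: Kryo): Unit = {
    kryo.register(classOf[A], new CustomASerializer())
  }
}

val sparkConf = new SparkConf()
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .set("spark.kryo.registrator", classOf[MyRegistrator].getName)
```

If this is the intended mechanism, the registrator class would presumably need a no-arg constructor and be on the classpath of every executor.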
scala apache-spark kryo
marios