I am using the Spark MultilayerPerceptronClassifier. It generates the "prediction" column in the "predictions" DataFrame. When I try to show that column, I get an error:
SparkException: Failed to execute user defined function($anonfun$1: (vector) => double) ... Caused by: java.lang.IllegalArgumentException: requirement failed: A & B Dimension mismatch!
Other columns, for example vector, display OK. Part of the predictions schema:
|-- vector: vector (nullable = true)
|-- prediction: double (nullable = true)
My code is:
import org.apache.spark.ml.Pipeline
import org.apache.spark.ml.classification.MultilayerPerceptronClassifier
import org.apache.spark.ml.feature.{StringIndexer, Word2Vec}
import spark.implicits._ // for the 'racist column symbol syntax

// racist is boolean, needs to be string:
val train2 = train.withColumn("racist", 'racist.cast("String"))
val test2 = test.withColumn("racist", 'racist.cast("String"))

val indexer = new StringIndexer().setInputCol("racist").setOutputCol("indexracist")

val word2Vec = new Word2Vec().setInputCol("lemma").setOutputCol("vector") //.setVectorSize(3).setMinCount(0)

val layers = Array[Int](4, 5, 2)

val mpc = new MultilayerPerceptronClassifier()
  .setLayers(layers)
  .setBlockSize(128)
  .setSeed(1234L)
  .setMaxIter(100)
  .setFeaturesCol("vector")
  .setLabelCol("indexracist")

val pipeline = new Pipeline().setStages(Array(indexer, word2Vec, mpc))

val model = pipeline.fit(train2)
val predictions = model.transform(test2)

predictions.select("prediction").show()
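For reference, here is a minimal sketch of the configuration I would expect to be dimensionally consistent, assuming the "A & B Dimension mismatch" relates to the layer sizes (Word2Vec's default vectorSize is 100, while layers(0) is 4, and the last layer should equal the number of label classes). The names word2VecSized and layersAligned are mine, not from the code above:

import org.apache.spark.ml.feature.Word2Vec

// Sketch: make the Word2Vec output dimension match the first MLPC layer.
// Alternatively, keep the default vectorSize (100) and use Array(100, 5, 2).
val word2VecSized = new Word2Vec()
  .setInputCol("lemma")
  .setOutputCol("vector")
  .setVectorSize(4)  // assumption: 4 chosen only to match layers(0)
  .setMinCount(0)

// Input layer = feature vector size, output layer = number of classes
// (indexracist has two values: true/false).
val layersAligned = Array[Int](4, 5, 2)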
EDIT: In the proposed similar question, the problem was

val layers = Array[Int](0, 0, 0, 0)

which is not the mistake here.
EDIT AGAIN: part-0 of the train and test data is saved in parquet format here.
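To reproduce with those files, I load them like this (a minimal sketch; the file names are assumptions, not the actual paths of the shared parts):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("mlpc-repro").getOrCreate()
val train = spark.read.parquet("train.parquet") // hypothetical path
val test  = spark.read.parquet("test.parquet")  // hypothetical path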