I am trying to store my data in Parquet in a nested layout, using a map-type column to hold complex objects as values.
Can anyone tell me whether filtering works on a map-type column or not? For example, my SQL query is:
`select measureMap['CR01'].tenorMap['1M'] from RiskFactor where businessDate='2016-03-14' and bookId='FI-UK'`
measureMap is a map whose key is a String and whose value is a user-defined type with two attributes: a String, and another map of String to Double pairs (the tenorMap referenced above).
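For reference, this is roughly the schema I am writing; field names other than measureMap and tenorMap are placeholders for my actual ones:

```scala
import org.apache.spark.sql.types._

// Each measureMap value is a struct holding a String attribute and a
// String -> Double map (the tenorMap used in the query above).
val measureValueType = StructType(Seq(
  StructField("label", StringType),
  StructField("tenorMap", MapType(StringType, DoubleType))
))

val riskFactorSchema = StructType(Seq(
  StructField("businessDate", StringType),
  StructField("bookId", StringType),
  StructField("measureMap", MapType(StringType, measureValueType))
))
```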
I want to know whether predicate pushdown will work on the map or not, i.e. if the map has 10 key-value pairs, will Spark load the whole map into memory and build the object model, or will it filter the data by key at the I/O level?
Also, is there any way to specify the key in the where clause, something like: where measureMap.key = 'CR01'?
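In other words, something along these lines is what I am hoping for; this is purely hypothetical syntax and I don't know if Spark SQL accepts a key predicate on a map column in this form (assuming the sqlContext available in the Spark shell):

```scala
// Hypothetical: I am not sure Spark SQL supports "measureMap.key" like this.
val result = sqlContext.sql(
  """select measureMap['CR01'].tenorMap['1M']
    |from RiskFactor
    |where businessDate = '2016-03-14'
    |  and bookId = 'FI-UK'
    |  and measureMap.key = 'CR01'
  """.stripMargin)
```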