Spark 1.6+
You can use the text input format to read a text file as a DataFrame:
read.df(sqlContext=sqlContext, source="text", path="README.md")
Spark <= 1.5
The short answer is no. SparkR 1.4 has been almost completely stripped of the low-level API, leaving only a limited subset of DataFrame operations. As you can read on the old SparkR webpage:
As of April 2015, SparkR has been officially merged into Apache Spark and will ship in an upcoming release (1.4). (...) Initial support for Spark in R will be focused on high-level operations instead of low-level ETL.
Probably the closest you can get is to load the text file using spark-csv:
> df <- read.df(sqlContext, "README.md", source = "com.databricks.spark.csv")
> showDF(limit(df, 5))
+--------------------+
| C0|
+--------------------+
| # Apache Spark|
|Spark is a fast a...|
|high-level APIs i...|
|supports general ...|
|rich set of highe...|
+--------------------+
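To illustrate, here is a hedged sketch of working with the loaded DataFrame through the public API. The column name C0 comes from the output above; contains is assumed to be available as a SparkR Column function in this version:

```r
# Sketch: keep only the lines that mention "Spark" and count them
# (df is the DataFrame created from README.md above)
sparkLines <- filter(df, contains(df$C0, "Spark"))
count(sparkLines)
```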
Typical RDD operations like map, flatMap, reduce, and filter are gone as well which, all things considered, is probably for the best.
The low-level API is still there underneath, so you can always do something like the snippet below, but I doubt it is a good idea. The SparkR developers most likely had a good reason to make it private. As the ::: man page states:
"It is typically a design mistake to use ::: in your code since the corresponding object has probably been kept internal for a good reason. Consider contacting the package maintainer if you feel the need to access the object for anything but mere inspection."
Even if you are willing to ignore good coding practices, it is most likely not worth the effort. The pre-1.4 low-level API is embarrassingly slow and clumsy, and without the goodness of the Catalyst optimizer the same most likely applies to the internal 1.4 API.
> rdd <- SparkR:::textFile(sc, 'README.md')
> counts <- SparkR:::map(rdd, nchar)
> SparkR:::take(counts, 3)
[[1]]
[1] 14
[[2]]
[1] 0
[[3]]
[1] 78
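For comparison, the same per-line character counts can be expressed with the public DataFrame API in Spark 1.6+, where the text source exposes a single string column named value. This is a sketch under that assumption; length here is the SparkR column function, not base R's:

```r
# Sketch: per-line character counts via the public DataFrame API (Spark 1.6+)
df <- read.df(sqlContext, source = "text", path = "README.md")
counts <- select(df, alias(length(df$value), "nchar"))
showDF(limit(counts, 3))
```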
Note that spark-csv, unlike textFile, ignores empty lines.