How to use the zip function in Spark with Java

hammad

I need to translate this piece of Scala code to Java:

   scala> List("a", "b", "c") zip (Stream from 1)
   res1: List[(String, Int)] = List((a,1), (b,2), (c,3))

How can I replace (Stream from 1) in Java? Help please.

PJ Fanning

If you want to pair each element with an index, use the RDD's zipWithIndex method.
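A minimal sketch of that suggestion, assuming a local Spark context (the class name ZipWithIndexExample and the local[*] master are illustrative, not from the thread). Note that zipWithIndex produces 0-based indices, so matching (Stream from 1) needs a shift by one:

```java
import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class ZipWithIndexExample {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("zipWithIndexExample")
                .setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        JavaRDD<String> letters = sc.parallelize(Arrays.asList("a", "b", "c"));

        // zipWithIndex pairs each element with its 0-based position in the RDD
        JavaPairRDD<String, Long> indexed = letters.zipWithIndex();

        // Shift the index by one to mimic Scala's (Stream from 1)
        JavaPairRDD<String, Long> fromOne = indexed.mapValues(i -> i + 1);

        System.out.println(fromOne.collect()); // e.g. [(a,1), (b,2), (c,3)]
        sc.stop();
    }
}
```

If you only have two RDDs of equal length and partitioning, RDD.zip(other) is the closer analogue of Scala's zip; zipWithIndex is just the convenient special case for indexing.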
