Solve Globant PySpark Interview Question | Apache Spark |

  • Published 7 Sep 2024

Comments • 8

  • @user-ic2ft8ed1r
    @user-ic2ft8ed1r 2 months ago +1

    from pyspark.sql.functions import from_json, col

    # parse the JSON string in "address" into a struct with street and city fields
    df1 = df.withColumn("Parsed_JSON", from_json(col("address"), "street string, city string"))
    # pull the struct fields out into top-level columns
    df1 = df1.withColumn("Street", col("Parsed_JSON").street).withColumn("city", col("Parsed_JSON").city)
    display(df1)

  • @sreenivasm9931
    @sreenivasm9931 9 months ago +2

    Globant 😂😂😂😂😂. One of the worst and most ruthless companies I have ever seen. They force their employees to resign without compensation.

  • @vaibhavverma1340
    @vaibhavverma1340 6 months ago +1

    from pyspark.sql.functions import col, json_tuple
    res = df.select(col("name"), json_tuple(col("address"), "street", "city")).toDF("name", "street", "city")
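
    A self-contained sketch of this json_tuple approach, with hypothetical sample
    rows shaped like the interview question (the SparkSession setup, data values,
    and column names are assumptions, not taken from the video):

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, json_tuple

    spark = SparkSession.builder.appName("json_tuple_demo").getOrCreate()

    # hypothetical sample data: a name plus a JSON string holding the address
    data = [("Alice", '{"street": "12 Main St", "city": "Pune"}'),
            ("Bob", '{"street": "34 Oak Ave", "city": "Mumbai"}')]
    df = spark.createDataFrame(data, ["name", "address"])

    # json_tuple extracts the requested keys from the JSON string in one pass,
    # producing generically named columns (c0, c1) that toDF then renames
    res = df.select(col("name"), json_tuple(col("address"), "street", "city")) \
            .toDF("name", "street", "city")
    res.show()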

  • @amanpirjade9
    @amanpirjade9 9 months ago +1

    Make a PySpark video series from beginner to advanced

    • @GeekCoders
      @GeekCoders  9 months ago

      Please check my Databricks and PySpark series

  • @cnureddy4078
    @cnureddy4078 9 months ago +1

    Col("parsed_json.*") Instead Col("parsed_json").*...this should work

    • @GeekCoders
      @GeekCoders  9 months ago

      Thanks for that. I was writing the same thing but didn't notice how I had written parsed_json. Thank you
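
    For reference, a minimal sketch of the fix this thread converges on, reusing
    the hypothetical df (a name column plus a JSON address string) from the
    sketch further up; the schema string is an assumption based on the
    discussion:

    from pyspark.sql.functions import from_json, col

    # parse the JSON string into a struct column with street and city fields
    df1 = df.withColumn("parsed_json", from_json(col("address"), "street string, city string"))

    # col("parsed_json.*") expands every field of the struct into its own column;
    # the plain string "parsed_json.*" works the same way inside select
    res = df1.select("name", col("parsed_json.*"))
    res.show()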