In Spark, iterate through each column and find the max length

I am new to Spark Scala, and I have the following situation: I have a table "TEST_TABLE" on the cluster (it can be a Hive table). I am converting it to a DataFrame as:

scala> val testDF = spark.sql("select * from TEST_TABLE limit 10")

Now the DF can be viewed as:

scala> testDF.show()

COL1|COL2|COL3
abc|abcd|abcdef
a|BCBDFG|qddfde
MN|1234B678|sd

I want an output like the one below:

COLUMN_NAME|MAX_LENGTH
       COL1|3
       COL2|8
       COL3|6

Is it feasible to do this in Spark Scala?

scala apache-spark
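
For reference, here is a minimal sketch of one way this can be done, assuming a spark-shell session (so spark is the active SparkSession) and that all columns are string-typed. A single agg call computes max(length(...)) for every column in one pass, and the resulting wide row is then pivoted into the desired (COLUMN_NAME, MAX_LENGTH) shape:

import org.apache.spark.sql.functions.{col, length, max}
import spark.implicits._

val testDF = spark.sql("select * from TEST_TABLE limit 10")

// One aggregate expression per column: max(length(column)).
// Assumes every column is a string (length() expects string/binary input).
val aggExprs = testDF.columns.map(c => max(length(col(c))).alias(c))

// A single aggregation job yields one wide row of max lengths.
val maxRow = testDF.agg(aggExprs.head, aggExprs.tail: _*).head()

// Pivot the wide row into (COLUMN_NAME, MAX_LENGTH) pairs.
val result = testDF.columns.zipWithIndex
  .map { case (c, i) => (c, maxRow.getInt(i)) }
  .toSeq
  .toDF("COLUMN_NAME", "MAX_LENGTH")

result.show()

For the sample data above, this prints COL1 -> 3, COL2 -> 8, COL3 -> 6. Building all the aggregate expressions into one agg keeps this to a single Spark job; looping over the columns and running a separate max query for each would scan the table once per column.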
