FAQs
What is the job title for this position?
The job title is Spark, Scala.
Where is the job location?
The job location is Hyderabad.
What is the experience range required for this position?
The experience range required is 4 to 16 years.
What is the walk-in date for this job?
The walk-in date for this job is 8th February 2025.
What skills are required for this job?
The required skills are Spark, Scala, and Python, along with knowledge of Airflow and Maven.
What are the primary responsibilities of this role?
The primary responsibilities include designing and developing scalable and efficient solutions using Spark, Scala, Python, and Airflow.
Is there a specific requirement for knowledge of Spark configuration?
Yes. At least one or two of the selected candidates should have strong Spark coding and configuration knowledge, particularly at the cluster configuration level.
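As a rough illustration of what "cluster configuration level" knowledge involves, the sketch below assembles typical spark-submit settings in plain Python. The property names are standard Spark configuration keys; the specific values are hypothetical examples, not requirements from this posting:

```python
# Illustrative cluster-level Spark settings; the values are hypothetical
# examples of the kind of tuning a candidate would be expected to understand.
spark_conf = {
    "spark.master": "yarn",                     # run on a YARN cluster
    "spark.executor.instances": "8",            # number of executors
    "spark.executor.cores": "4",                # cores per executor
    "spark.executor.memory": "8g",              # heap per executor
    "spark.sql.shuffle.partitions": "200",      # shuffle parallelism
    "spark.dynamicAllocation.enabled": "true",  # scale executors with load
}

# Render the settings as spark-submit command-line flags.
flags = " ".join(f"--conf {k}={v}" for k, v in spark_conf.items())
print(flags)
```

Being able to explain how choices like executor memory, core counts, and shuffle partitions interact is what distinguishes cluster-level knowledge from writing application code alone.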
How important is knowledge of Airflow for this position?
Knowledge of Airflow is important; at least one or two of the selected candidates should have medium to good working knowledge of it.
Do I need to have experience with Maven for this role?
Yes, a good understanding of Maven (including dependencies and project setups) is required for this role.
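For concreteness, an understanding of Maven dependencies and project setups typically means being comfortable with a `pom.xml` fragment like the one below. The artifacts shown are common Spark/Scala dependencies; the versions are illustrative examples only, not taken from the job posting:

```xml
<!-- Illustrative pom.xml dependency section; versions are examples only -->
<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>3.5.1</version>
    <scope>provided</scope>
  </dependency>
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.12.18</version>
  </dependency>
</dependencies>
```

Knowing why the Spark dependency is marked `provided` (the cluster supplies it at runtime) and how the Scala binary version suffix (`_2.12`) must match across artifacts is the kind of setup knowledge the role expects.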
Are both Scala and Python skills necessary for this position?
Yes. A balance of Scala and Python skills is needed; the ideal candidate is comfortable working in both languages.