FAQs
What is the primary role of a Big Data Solutions Architect at Databricks?
The primary role of a Big Data Solutions Architect is to work with clients on their big data challenges using the Databricks platform, delivering data engineering, data science, and cloud technology projects that help customers get the most value from their data.
What kind of projects will I be working on in this role?
You will work on a variety of impactful customer technical projects, including designing and building reference architectures, creating how-to guides, and productionizing customer use cases.
What programming languages should I be proficient in for this position?
You should be comfortable writing code in either Python or Scala.
What cloud ecosystems should I have experience with?
You should have a working knowledge of two or more common cloud ecosystems, such as AWS, Azure, or GCP, with expertise in at least one.
What experience is necessary regarding distributed computing?
Deep experience with distributed computing is required, particularly with Apache Spark™, including knowledge of Spark runtime internals.
Will I need to travel for this position?
Yes, travel to customers will be required approximately 30% of the time.
Is there a certification requirement for this role?
Yes, a Databricks Certification is required.
What skills are necessary for technical project delivery?
Experience managing scope and timelines for technical project delivery is necessary, along with strong documentation and whiteboarding skills.
How does Databricks support diversity and inclusion in the workplace?
Databricks is committed to fostering a diverse and inclusive culture, ensuring its hiring practices are inclusive and meet equal employment opportunity standards, without regard to protected characteristics.
What benefits does Databricks offer employees?
Databricks strives to provide comprehensive benefits and perks tailored to meet the needs of all employees. Specific details can be found on our benefits website.