Big Data Solutions Architect

Databricks

  • Job: Full-time, Senior Level
  • Categories: Data, IT & Cybersecurity
  • Location: Amsterdam (Remote)

AI generated summary

  • You should have strong data engineering skills, proficiency in Python/Scala, experience with AWS/Azure/GCP, Apache Spark expertise, CI/CD familiarity, MLOps knowledge, and strong client engagement skills.
  • You will design architectures, guide customers on big data projects, scope services, implement solutions, support operational issues, and collaborate with teams to ensure successful engagements.

Requirements

  • Proficient in data engineering, data platforms, and analytics with a strong track record of successful projects and in-depth knowledge of industry best practices
  • Comfortable writing code in either Python or Scala
  • Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one
  • Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals
  • Familiarity with CI/CD for production deployments
  • Working knowledge of MLOps
  • Experience designing and deploying performant end-to-end data architectures
  • Experience with technical project delivery, including managing scope and timelines
  • Documentation and white-boarding skills
  • Experience working with clients and managing conflicts
  • Ability to build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects
  • Travel to customers 30% of the time
  • Databricks Certification

Responsibilities

  • You will work on a variety of impactful customer technical projects, which may include designing and building reference architectures, creating how-to guides and productionalizing customer use cases
  • Work with engagement managers to scope a variety of professional services work with input from the customer
  • Guide strategic customers as they implement transformational big data projects and 3rd-party migrations, including the end-to-end design, build and deployment of industry-leading big data and AI applications
  • Consult on architecture and design; bootstrap or implement customer projects that lead to the customer's successful understanding, evaluation and adoption of Databricks
  • Provide an escalated level of support for customer operational issues
  • You will work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet the customer's needs
  • Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues

FAQs

What is the primary role of a Big Data Solutions Architect at Databricks?

The primary role of a Big Data Solutions Architect is to work with clients on their big data challenges using the Databricks platform, delivering data engineering, data science, and cloud technology projects that help customers get the most value out of their data.

What kind of projects will I be working on in this role?

You will work on a variety of impactful customer technical projects, including designing and building reference architectures, creating how-to guides, and productionalizing customer use cases.

What programming languages should I be proficient in for this position?

You should be comfortable writing code in either Python or Scala.

What cloud ecosystems should I have experience with?

You should have a working knowledge of two or more common cloud ecosystems, such as AWS, Azure, or GCP, with expertise in at least one.

What experience is necessary regarding distributed computing?

Deep experience with distributed computing, particularly with Apache Spark™ and knowledge of Spark runtime internals, is required.
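
For illustration, here is a minimal PySpark sketch of the kind of distributed aggregation job this experience implies. It is a sketch only: the bucket, dataset paths, and column names (event_ts, event_type) are hypothetical, and it assumes an existing local or cluster Spark environment.

    # Minimal PySpark sketch (hypothetical paths and columns): read an events
    # dataset, count events per day and type, and write the result back out.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("daily-event-counts").getOrCreate()

    # Hypothetical input: a Parquet dataset with event_type and event_ts columns.
    events = spark.read.parquet("s3://example-bucket/events/")

    daily_counts = (
        events
        .withColumn("event_date", F.to_date("event_ts"))
        .groupBy("event_date", "event_type")
        .count()
        .orderBy("event_date", "event_type")
    )

    # The write step is where runtime details such as shuffle partitioning and
    # output file sizes start to matter at scale.
    daily_counts.write.mode("overwrite").parquet("s3://example-bucket/daily_event_counts/")

    spark.stop()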

Will I need to travel for this position?

Yes, travel to customers will be required approximately 30% of the time.

Is there a certification requirement for this role?

Yes, a Databricks Certification is required.

What skills are necessary for technical project delivery?

Experience in managing scope and timelines for technical project delivery is necessary, along with documentation and white-boarding skills.

How does Databricks support diversity and inclusion in the workplace?

Databricks is committed to fostering a diverse and inclusive culture, ensuring hiring practices are inclusive and meet equal employment opportunity standards, without regard to various protected characteristics.

What benefits does Databricks offer employees?

Databricks strives to provide comprehensive benefits and perks tailored to meet the needs of all employees. Specific details can be found on our benefits website.

Industry: Technology
Employees: 1001-5000
Founded: 2013

Mission & Purpose

Databricks is the data and AI company. More than 9,000 organizations worldwide — including Comcast, Condé Nast, H&M, and over 40% of the Fortune 500 — rely on the Databricks Lakehouse Platform to unify their data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe. Founded by the original creators of Apache Spark™, Delta Lake and MLflow, Databricks is on a mission to help data teams solve the world's toughest problems.

Attention, Databricks applicants: due to reports of phishing, we're requesting that all Databricks applicants apply through our official Careers page at databricks.com/company/careers. All official communication from Databricks will come from email addresses ending with @databricks.com or @goodtime.io (our meeting tool).