Job title: Databricks Platform Engineer
Job type: B2B Contract
Length: 12 months
Role Location: Remote / Relocation to Abu Dhabi available
Client Industry: IT Consultancy / Financial Services

Role and Responsibilities:
• Collaborate with stakeholders during requirements clarification and sprint planning to ensure alignment with business objectives.
• Design and implement technical solutions on the Lakehouse platform (Databricks), including:
Prototyping new Databricks capabilities.
Exposing these capabilities to support the Data Products strategy and the Data & AI ecosystem.
• Integrate data platforms with enterprise tools, including:
Incident and monitoring systems (e.g., ServiceNow).
Identity management solutions.
Data observability tools (e.g., Dynatrace).
• Develop and maintain unit and integration tests to ensure quality and resilience.
• Support QA teams during acceptance testing.
• Act as a third-line engineer for production incidents, ensuring system stability and uptime.
• Collaborate with cloud and infrastructure teams to deliver secure, scalable, and reliable solutions.

Job Requirements:
• Expert knowledge of Databricks.
• Proficiency in PySpark for distributed computing.
• Proficiency in Python for library development.
• Advanced SQL skills for complex query optimisation (e.g., Oracle, MS SQL).
• Experience with Git for version control.
• Familiarity with monitoring tools (e.g., ServiceNow, Prometheus, Grafana).
• Knowledge of scheduling tools (e.g., Stonebranch, Control-M, Airflow).
• Proficiency in data quality frameworks (e.g., Great Expectations, ideally Monte Carlo).
Note: The primary focus is building platform capabilities rather than writing ETL pipelines.
• Agile Practices: Comfortable with sprint planning, stand-ups, and retrospectives.
• Collaboration Tools: Skilled in Azure DevOps for project management.
• Problem-Solving: Strong debugging and troubleshooting skills for complex data engineering issues.
• Communication: Exceptional written and verbal skills, able to explain technical concepts to non-technical stakeholders.