Listing republished automatically
This listing was synced from the external source Net Empregos to make searching easier on OnlyJobs. You can view the original listing at www.net-empregos.com.
Are you responsible for this offer? Contact our team to claim the company account and manage official listings directly on the platform.
Description
Join a team where Databricks is at the heart of the engineering strategy, and where your work powers large-scale financial platforms used across global markets.
Location: Remote (Portugal)
Contract: Direct contract with a global technology consulting leader
Our client is a world-renowned technology consulting firm.
This long-term program is part of a data and cloud modernization journey centred on Databricks, scalable architectures and high-quality data products.
You will join an international engineering environment dedicated to transforming complex data into reliable, governed and business-critical assets.
Responsibilities
- Build, optimize and maintain end-to-end Databricks pipelines, ensuring performance, reliability and scalability.
- Apply Medallion Architecture principles to design structured, maintainable and production-ready data layers.
- Develop distributed processing solutions with PySpark, aligned with best engineering practices.
- Design and orchestrate workflows using ADF and Airflow, improving automation and observability.
- Contribute to the evolution of Databricks Lakehouse standards, performance tuning and cluster optimization.
- Integrate CI/CD practices using Azure DevOps, with automated testing and version control.
- Collaborate closely with data scientists, architects and business stakeholders to operationalize data products and ensure alignment with financial-sector requirements.
- Support governance, security and documentation across the Databricks ecosystem.
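The Medallion Architecture mentioned above layers data as bronze (raw), silver (cleaned/validated) and gold (business-ready aggregates). As a rough illustration of that layering idea, here is a minimal, framework-free sketch in plain Python; in actual Databricks work these layers would be PySpark DataFrames backed by Delta tables, and the record fields (`trade_id`, `symbol`, `amount`) are hypothetical.

```python
# Hypothetical sketch of Medallion (bronze/silver/gold) layering.
# Plain dicts stand in for what would be PySpark DataFrames / Delta tables.

# Bronze: raw records as ingested, including duplicates and bad values.
bronze = [
    {"trade_id": 1, "symbol": "ABC", "amount": "100.5"},
    {"trade_id": 1, "symbol": "ABC", "amount": "100.5"},  # duplicate
    {"trade_id": 2, "symbol": "XYZ", "amount": "n/a"},    # unparseable amount
    {"trade_id": 3, "symbol": "ABC", "amount": "50.0"},
]

def to_silver(rows):
    """Silver layer: deduplicate on trade_id and cast amount, dropping bad rows."""
    seen, silver = set(), []
    for r in rows:
        if r["trade_id"] in seen:
            continue
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # quarantine/skip records that fail validation
        seen.add(r["trade_id"])
        silver.append({**r, "amount": amount})
    return silver

def to_gold(rows):
    """Gold layer: business-facing aggregate, total traded amount per symbol."""
    totals = {}
    for r in rows:
        totals[r["symbol"]] = totals.get(r["symbol"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'ABC': 150.5}
```

The key design point each layer makes explicit: raw data is never mutated in place; each layer is derived from the previous one, so failures can be reprocessed from bronze.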
Required Skills
- Strong, demonstrable experience with Databricks as the primary engineering platform.
- Solid understanding of Medallion Architecture and modern lakehouse principles.
- Proficiency in PySpark and distributed processing frameworks.
- Experience with Azure Data Factory, Airflow and workflow orchestration.
- Familiarity with Azure DevOps CI/CD, Git workflows and automated deployments.
- Comfortable working in Agile, collaborative and multicultural environments.
Nice to Have
- Advanced experience with Azure DevOps for pipeline automation.
- Background in data testing or test framework development.
- Exposure to investment banking or financial services data landscapes.
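The "data testing" background mentioned above usually means automated checks on pipeline output (schema presence, value constraints). Below is a minimal, hypothetical sketch of such a check in plain Python; real projects might express the same rules with a dedicated framework such as Great Expectations, and the column names are illustrative only.

```python
# Hypothetical data-quality check: validate a batch of records against
# a required schema and a simple value constraint.

REQUIRED_COLUMNS = {"trade_id", "symbol", "amount"}

def check_batch(rows):
    """Return a list of human-readable violations for a batch of records."""
    violations = []
    for i, row in enumerate(rows):
        missing = REQUIRED_COLUMNS - row.keys()
        if missing:
            violations.append(f"row {i}: missing columns {sorted(missing)}")
        elif row["amount"] < 0:
            violations.append(f"row {i}: negative amount {row['amount']}")
    return violations

batch = [
    {"trade_id": 1, "symbol": "ABC", "amount": 100.5},
    {"trade_id": 2, "symbol": "XYZ", "amount": -5.0},
    {"trade_id": 3, "symbol": "DEF"},  # missing "amount"
]
print(check_batch(batch))
```

Returning violations rather than raising on the first failure lets a pipeline report all problems in a batch at once, which fits the observability goals listed in the responsibilities.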
Ready to take the next step?
Apply now and join an international Databricks-driven project shaping the future of digital innovation in financial services.
Details
- Listing type: Job offer
- Schedule type: Full-time
- Category: Technology
- Status: —
- Location: Lisboa
- Start date: 08/01/2026