2026. 03. 29. | Full-time | Budapest | IQVIA | …star schemas, feature-ready datasets, semantic layers) supporting analytics, ML workflows, and AI agent operations. Implement automated data validation, schema checks, and pipeline testing to ensure high-quality data delivery across systems. Preferred: Contribute to data architectures supporting agent…
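The "automated data validation, schema checks" responsibility in the IQVIA posting can be illustrated with a minimal pure-Python sketch. All field names and types below are hypothetical examples, not from any posting; a production pipeline would typically use a dedicated validation framework.

```python
# Minimal schema-check sketch: validate incoming records against an
# expected schema before they enter a pipeline. Returns a list of
# violations so callers can quarantine or log bad records.

EXPECTED_SCHEMA = {"record_id": str, "event_date": str, "amount": float}

def validate_record(record: dict, schema: dict = EXPECTED_SCHEMA) -> list:
    """Return human-readable violations; an empty list means valid."""
    errors = []
    for field, expected_type in schema.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            errors.append(
                f"{field}: expected {expected_type.__name__}, "
                f"got {type(record[field]).__name__}"
            )
    for field in record:
        if field not in schema:
            errors.append(f"unexpected field: {field}")
    return errors

good = {"record_id": "r1", "event_date": "2026-03-29", "amount": 12.5}
bad = {"record_id": 42, "event_date": "2026-03-29"}  # wrong type, missing field

print(validate_record(good))  # []
print(validate_record(bad))   # two violations reported
```

In practice this gate would run per batch at ingestion, with failing records routed to a quarantine table rather than silently dropped.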
2026. 03. 29. | Full-time | Budapest | Sanofi | To improve people's lives. Main responsibilities: Analyze large datasets to uncover trends, patterns, and actionable insights that support strategic business decisions across US Brands. Identify and assess business needs that align with product strategy, ensuring a harmonized commercial data foundation…
2026. 03. 29. | Full-time | Budapest | Deutsche Telekom IT Solutions HU | …scalable data ingestion pipelines using Databricks and Azure services. Lakehouse Management: Structure and maintain Bronze and Silver Delta Lake datasets, providing "transformation-ready" data for analysts and downstream modeling. Python-based Development: Build reusable, production-ready Python frameworks…
2026. 03. 29. | Full-time | Debrecen | Deutsche Telekom IT Solutions HU | …scalable data ingestion pipelines using Databricks and Azure services. Lakehouse Management: Structure and maintain Bronze and Silver Delta Lake datasets, providing "transformation-ready" data for analysts and downstream modeling. Python-based Development: Build reusable, production-ready Python frameworks…
2026. 03. 29. | Full-time | Szeged | Deutsche Telekom IT Solutions HU | …scalable data ingestion pipelines using Databricks and Azure services. Lakehouse Management: Structure and maintain Bronze and Silver Delta Lake datasets, providing "transformation-ready" data for analysts and downstream modeling. Python-based Development: Build reusable, production-ready Python frameworks…
2026. 03. 29. | Full-time | Pécs | Deutsche Telekom IT Solutions HU | …scalable data ingestion pipelines using Databricks and Azure services. Lakehouse Management: Structure and maintain Bronze and Silver Delta Lake datasets, providing "transformation-ready" data for analysts and downstream modeling. Python-based Development: Build reusable, production-ready Python frameworks…
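The Bronze/Silver lakehouse layering the Deutsche Telekom postings describe can be sketched in pure Python. The real implementation would use Databricks, Spark DataFrames, and Delta Lake tables; this illustrative stand-in (with made-up field names) only shows the idea: Bronze holds raw rows as ingested, Silver holds deduplicated, typed, "transformation-ready" rows.

```python
# Medallion-architecture sketch: Bronze = raw ingested rows (possibly
# duplicated or malformed); Silver = cleaned, deduplicated, typed rows.

from datetime import date

bronze = [
    {"id": "1", "ts": "2026-03-29", "value": "10.5"},
    {"id": "1", "ts": "2026-03-29", "value": "10.5"},  # duplicate row
    {"id": "2", "ts": "not-a-date", "value": "3.0"},   # malformed timestamp
]

def to_silver(rows):
    """Deduplicate on (id, ts), parse types, drop malformed rows."""
    seen, silver = set(), []
    for row in rows:
        key = (row["id"], row["ts"])
        if key in seen:
            continue  # skip exact duplicate
        seen.add(key)
        try:
            silver.append({
                "id": int(row["id"]),
                "ts": date.fromisoformat(row["ts"]),
                "value": float(row["value"]),
            })
        except ValueError:
            pass  # a real pipeline would route this to a quarantine table
    return silver

print(to_silver(bronze))  # one clean, typed row survives
```

In Databricks the same pattern is typically expressed as a Bronze-to-Silver job that reads a raw Delta table, applies `dropDuplicates` and type casts, and writes a new Delta table consumed by analysts.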
2026. 03. 29. | Full-time | Budapest | IQVIA | Responsibilities (mandatory): Design, develop, and maintain scalable data pipelines and ETL processes supporting AI research and development. Design and maintain scalable data models (e.g., star schemas, feature-ready datasets, semantic layers) for analytics, ML training, and agent workflows. Collaborate with AI…