PGET Conversion
Tasks

Responsibilities:

  • Design, develop, and deploy complex data pipelines and workflows on Azure Databricks using Python, Scala, and SQL.
  • Implement and optimize data integration, transformation, and enrichment processes, focusing on performance, scalability, and cost-efficiency.
  • Oversee and manage financial operations related to data processing and cloud resource usage.
  • Implement strategies for cost optimization and budget management in Azure environments.
  • Develop and maintain the architecture of our data platform, ensuring it meets current and future business needs.
  • Provide architectural guidance and best practices for data processing and storage solutions.
  • Utilize Delta Lake to enable ACID transactions and efficient data management.
  • Work with Parquet and Delta formats to ensure optimal data storage and processing performance.
  • Design and implement data APIs and microservices to facilitate seamless data access and integration.
  • Partner with cross-functional teams to understand data requirements and develop effective technical solutions.
  • Document technical designs, data flows, and operational procedures.
  • Ensure data quality and reliability through automated testing and monitoring.
  • Implement and enforce security and compliance best practices for data handling in Azure environments.
  • Stay current with industry trends and advancements in data technologies and cloud services.
  • Propose and implement improvements to enhance data platform capabilities and efficiency.

Skill Requirements (M = Mandatory):

  • (M) Extensive experience as a data engineer, DevOps engineer, or similar role with a focus on data platforms.
  • (M) Advanced skills in Python and Scala; strong proficiency in SQL for data manipulation and querying.
  • (M) Hands-on experience with Azure Databricks for data processing.
  • (M) In-depth knowledge of Delta Lake, Parquet, and Delta formats.
  • (M) Proven experience in designing and maintaining data platform architecture.
  • (M) Experience in managing financial operations and cost optimization in cloud environments.
  • (M) Expertise in designing and implementing ETL/ELT processes and data pipelines.
  • Strong familiarity with Azure services and cloud resource management.
  • Proficiency with Git and CI/CD pipelines for automated deployments.
  • Excellent communication skills, with the ability to collaborate effectively with both technical and non-technical stakeholders.
  • Familiarity with Docker and Kubernetes for containerized deployments.

Qualifications
  • Bachelor’s degree in Computer Science, Engineering, or a related field (or equivalent work experience).
  • 4+ years of relevant experience in data engineering or DevOps roles with a focus on data platforms.
  • Relevant Azure certifications (e.g., Azure Data Engineer) and additional industry certifications are a plus.

Benefits
  • Discounts for Employees Possible
  • Health Benefits
  • Mobile Phone for Employees Possible
  • Meal Discounts
  • Company Retirement
  • Hybrid Work Possible
  • Mobility Offers
  • Events for Employees
  • Coaching
  • Flextime Possible
Contact
Mercedes-Benz Research and Development India Private Limited
Brigade Tech Gardens, Katha No. 119, 560037 Bengaluru