
Data Engineer

Location: Noida
Experience: 7+ Years

Job Description

We are looking for a seasoned Data Engineer with extensive experience in designing and implementing data pipelines using Medallion Architecture, Azure Databricks, and Snowflake. The ideal candidate will be responsible for building scalable ETL pipelines, optimizing data flows, and ensuring data quality for large-scale data platforms.

Key Responsibilities:

  • Design, develop, and optimize data pipelines in Azure Databricks following Medallion Architecture (Bronze, Silver, Gold layers).
  • Implement and maintain ETL pipelines using Azure Data Factory (ADF), Databricks, and Python/PySpark.
  • Leverage Snowflake for data warehousing, including schema design, data loading, and performance tuning.
  • Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver robust data solutions.
  • Develop data models and manage data lakes for structured and unstructured data.
  • Implement data governance and security best practices.
  • Monitor and troubleshoot data pipelines for performance and reliability.
  • Stay up-to-date with industry trends and best practices in data engineering and cloud technologies.

Minimum Qualifications

  • B.Tech/B.E. (Computer Science/IT/Electronics)
  • MCA
  • Diploma in computer science/software development with 3+ years of experience (compulsory)

Apply for this job

Send us your resume here:
hr@kiwitech.com