Quantiphi Analytics
Quantiphi - Senior Data Engineer
Job Location
India
Job Description
Experience Level: 3 years
Work Location: Bangalore / Mumbai / Trivandrum (India)

Key Responsibilities:
- Develop and manage data models, schemas, and pipelines in Snowflake.
- Build and implement data engineering pipelines and frameworks hands-on.
- Optimize the performance of the Snowflake data warehouse and troubleshoot issues.
- Work with stakeholders to understand data requirements and translate them into technical solutions.
- Engage in data migration and modernization efforts.
- Ensure data security and compliance with relevant regulations and standards.
- Provide guidance and best practices on Snowflake usage and data management.
- Collaborate with data engineers, analysts, and other IT professionals to integrate Snowflake with other systems.
- Demonstrate a strong understanding of data warehousing concepts, ETL processes, and data modeling.
- Reverse-engineer business logic and queries from SQL Server stored procedures, SSIS packages, and code to DBT on Snowflake.
- Apply proficiency in SQL and experience with other programming languages (e.g., Python, Java).
- Leverage familiarity with the Azure cloud platform and its integration with Snowflake.
- Bring excellent problem-solving skills and attention to detail.
- Communicate and collaborate effectively.

Required Skills & Experience:
- 3 years of experience in data engineering or a related field, with a strong track record of working with large-scale data processing systems.
- Expertise in ETL processes, data pipeline development, and data integration techniques.
- Proficiency with SQL for data querying and management.
- Strong hands-on experience with cloud data platforms (AWS, GCP, Azure) and services such as Redshift, BigQuery, Snowflake, or Azure Synapse.
- Experience with Big Data technologies (Hadoop, Spark, Kafka) and familiarity with distributed systems for processing large datasets.
- Expertise in programming languages such as Python, Java, or Scala for writing data processing scripts and developing scalable solutions.
- Experience with data warehousing concepts and tools, including data modeling, OLAP, and OLTP systems.
- Familiarity with containerization tools like Docker and orchestration platforms such as Kubernetes for deploying data pipelines.
- Data governance experience, including data privacy, compliance, and security best practices.
- Strong understanding of data quality principles and the ability to implement strategies for validation, cleansing, and error handling in data pipelines.
- Knowledge of CI/CD practices, version control tools (Git), and automation tools to streamline the development and deployment of data pipelines.
- Excellent communication and collaboration skills for interacting effectively with technical and non-technical stakeholders.
- Bachelor's degree in Computer Science, Engineering, or a related field; advanced degrees are a plus.

Preferred Skills:
- Experience with machine learning models and building pipelines that serve as inputs for ML and AI workloads.
- Familiarity with real-time and stream processing tools (e.g., Apache Kafka, Apache Flink).
- Experience with data visualization tools like Tableau, Looker, or Power BI for working with analysts and data scientists to create actionable insights.
- Certification in cloud platforms (AWS, Google Cloud, Azure) is a plus.

(ref:hirist.tech)
Location: India
Posted Date: 11/28/2024
Contact Information
Contact: Human Resources, Quantiphi Analytics