Risk Resources India
Data Engineer - ETL/Hadoop
Job Location
India
Job Description
ETL/Hadoop Role and Responsibilities:
- Design Extract, Transform, Load (ETL) workflows to develop large-scale data structures and pipelines that organize, collect, and standardize data
- Apply knowledge of Hadoop architecture and HDFS commands, and experience in designing and optimizing queries, to build data pipelines
- Integrate data from a variety of sources, ensuring adherence to data quality and accessibility standards
- Implement endpoints to read data from BigQuery and Hive
- Provide user-defined functions (UDFs) so clients can read data from Hive and BigQuery directly without external connector JARs
- Collaborate with the data science team to transform data and to integrate algorithms and models into automated processes, using Python or Hive
- Help optimize queries, and work to integrate business requirements into existing campaign reporting data workflows
- Design and implement a scalable, configurable, and self-learning marketing campaign platform

Candidate Profile:
- Bachelor's degree in Engineering/Computer Science or a related quantitative field
- Significant advanced experience with data analysis, reporting, and programming logic (Python, PySpark, Hive, SQL, shell scripting)
- Experience interacting with a decision-making business audience
- Analyzes the current information technology environment to identify and assess critical capabilities and recommend solutions
- Experiments with available tools and advises on new tools to determine optimal solutions given the requirements dictated by the model/use case
- Good software engineering fundamentals, strong problem-solving and critical-thinking ability

(ref:hirist.tech)
Location: India
Posted Date: 11/24/2024
Contact Information
Contact: Human Resources, Risk Resources India