Job Description
Senior Developer - Talend Developer (Data Warehousing & SQL) - Bangalore
Responsibilities
• Design & develop data flows, data models & data warehouses / data solutions
• Collaborate with report developers to source relevant data and build solutions to support the development of dashboards / reports
• Provide user support, including incident management, data issue resolution, maintenance of daily / weekly data refresh schedules, and on-call responsibilities to meet business SLAs
• Deliver required documentation for build and support responsibilities, including architecture and data flow diagrams
Education Required: Bachelor’s Degree in Computer Science/Information Systems/Mathematics/Sciences
Preferred: Master’s Degree in Computer Science/Information Systems
Required Skills / Qualifications
• Essential: experience in Data Warehousing with in-depth knowledge of SQL.
• Well versed in Relational and Dimensional Modelling techniques such as Star Schema, Snowflake Schema, and Fact and Dimension Tables.
• Experience building Extract, Transform and Load (ETL) processes for Data Warehousing using Talend.
• Extensive experience integrating heterogeneous data sources (Teradata, SQL Server, Oracle, flat files, Excel files) and loading data into Data Warehouses and Data Marts using Talend Studio. Capable of implementing various Change Data Capture (CDC) methods in the ETL pipeline.
• Experience integrating SAP or Salesforce as a source with Talend is an added advantage.
• Experience implementing an audit framework for Talend.
• Capable of creating reusable components / Joblets using Talend.
• Aware of best practices and standards for creating data pipelines.
• Experience with scheduling tools such as Control-M and Job Conductor (Talend Administration Center).
• Experience developing and deploying ETL solutions on-premises and/or cloud.
• Experience working with RDBMS and big data platforms, building data ingestion, data processing, and analytical pipelines (Teradata, Snowflake preferred).
• Good experience with big data technologies: Hadoop, HDFS, MapReduce, and the Hadoop ecosystem (Hive).
• Knowledge of cloud massively parallel processing (MPP) databases (Snowflake preferred).
• Experience with streaming data processing frameworks (Spark Streaming, Kafka Streams).
• Hands-on experience building data pipeline solutions on cloud data platforms (Azure / GCP).
• SDLC Methodology (Agile / Scrum / Iterative Development).
• Knowledge of Kafka, Python, or Java is an added advantage.
• Talend Data Integration Developer certification is an added advantage.