Job Description
Lead Consultant - Azure Engineer with Snowflake, Python, PySpark, Scala
With a start-up spirit and ~100,000 curious and courageous minds, we have the expertise to go deep with the world's biggest brands, and we have fun doing it. Now, we're calling all rule-breakers and risk-takers who see the world differently and are bold enough to reinvent it. Come, transform with us. Transformation happens here. Come, be a part of our exciting journey! Are you the one we are looking for?
Inviting applications for the role of Lead Consultant - Azure Engineer with Snowflake, Python, PySpark, and Scala
In this role, you will be responsible for designing and building modern data pipelines and data streams.
Responsibilities
- Expose data to end users using Python, PySpark, Snowflake, Azure API Apps, or other modern visualization platforms
- Analyze current business practices, processes and procedures and identify future opportunities for leveraging Microsoft Azure data & analytics PaaS services.
- Support the planning and implementation of data platform services including sizing, configuration, and needs assessment
- DevSecOps and CI/CD deployments
- Cloud migration methodologies and processes including tools like Azure Data Factory, Data Migration Service, Event Hub etc.
- Experience in data warehousing and the Snowflake computing platform
- Experience migrating data from on-premises data lakes (Hadoop, Netezza) to the cloud (Snowflake)
- Use a combination of Python and Snowflake's SnowSQL; write SQL queries against Snowflake.
- Experience working on different source data - RDBMS, Flat files, XML, JSON
- Expertise in SQL, especially within cloud-based data warehouses such as Snowflake and Azure Synapse
- Expertise with SnowSQL, advanced concepts (query performance tuning, Time Travel, etc.) and features/tools (data sharing, events, Snowpipe, etc.)
- Experience with end-to-end implementation of the Snowflake cloud data warehouse, or end-to-end data warehouse implementations on premises
- Proven analytical and problem-solving skills
- Strong understanding of ETL concepts and work experience with an ETL tool such as Informatica, DataStage, or Talend
- Ability to work independently and on multiple tasks/initiatives with multiple deadlines
- Effective oral, presentation, and written communication skills
- Data modelling and data integration
- Support/Troubleshooting
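As a flavor of the "different source data" work described above, the sketch below shows one common pattern: normalizing records arriving as flat files (CSV) or JSON into a single shape before loading them downstream. This is an illustrative example using only the Python standard library; field names and feeds are hypothetical, not part of this role's actual systems.

```python
import csv
import io
import json

def normalize_records(raw, fmt):
    """Normalize rows from a flat-file (CSV) or JSON source into a
    common list-of-dicts shape, ready for downstream loading."""
    if fmt == "csv":
        return list(csv.DictReader(io.StringIO(raw)))
    if fmt == "json":
        data = json.loads(raw)
        return data if isinstance(data, list) else [data]
    raise ValueError(f"unsupported source format: {fmt}")

# The same logical records arriving in two different formats
# (hypothetical feeds for illustration only):
csv_feed = "id,amount\n1,100\n2,250"
json_feed = '[{"id": "1", "amount": "100"}, {"id": "2", "amount": "250"}]'

# Both sources now reduce to identical records.
assert normalize_records(csv_feed, "csv") == normalize_records(json_feed, "json")
```

In a real pipeline, the normalized records would then be staged to ADLS or loaded into Snowflake; XML and RDBMS sources would get their own branches of the same normalizer.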
Minimum qualifications
- BE/MCA/B.Tech in Computer Science, Information Systems, Engineering, or a related field
- Snowflake certification / Azure certification preferred (good to have)
- Extensive experience on data analysis projects and expertise in writing complex SQL queries
- Good understanding of data-warehousing concepts (hands-on work on traditional data warehouse projects, not just familiarity with a database)
- Strong hands-on core Python scripting experience and specialization in data analysis in Python (NumPy, SciPy, Matplotlib, scikit-learn, pandas, etc.)
- Solid consulting or client service delivery experience on Snowflake and Azure
- Good experience in developing data ingestion, data processing and analytical pipelines for big data, relational databases, NoSQL and data warehouse solutions
- Extensive experience providing practical direction within the Azure Native services and Synapse Analytics
- Extensive hands-on experience implementing data migration and data processing using Azure services: ADLS, Azure Data Factory, Azure Functions, Synapse/DW, Azure SQL DB, Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, Databricks, Azure Data Catalog, Cosmos DB, ML Studio, AI/ML, etc.
- Good hands-on experience in Azure and big data technologies such as PowerShell, Python, SQL, ADLS/Blob, Spark/SparkSQL, Databricks, and Hive, and streaming technologies such as Kafka, Event Hub, and NiFi
- Experience in designing ETL and ELT using Azure Data Factory and SSIS technologies.
- Expertise in secure and compliant design for Azure Data Factory, including VNet isolation and securing data in motion from various sources to ADLS and to Snowflake
- Good interpersonal, problem-solving, and verbal communication skills
- Process-oriented awareness of Lean Six Sigma
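The "data analysis in Python" qualification above typically starts with quick profiling of pipeline behavior. A minimal sketch using only the standard-library `statistics` module is shown below; the load volumes are made-up numbers for illustration, and in practice this kind of check would be done at scale with pandas or NumPy.

```python
import statistics

# Hypothetical daily load volumes (rows ingested per pipeline run).
volumes = [1200, 1350, 1280, 5100, 1310]

mean = statistics.mean(volumes)
stdev = statistics.stdev(volumes)

# Flag runs more than one standard deviation above the mean --
# a quick sanity check before deeper profiling in pandas/NumPy.
outliers = [v for v in volumes if v > mean + stdev]
```

Here the fourth run (5,100 rows) stands out against the others and would warrant investigation before the data is trusted downstream.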
Preferred qualifications
- Experience in architecting large-scale data solutions, performing architectural assessments, crafting architectural options and analysis, and finalizing the preferred solution alternative in collaboration with IT and business stakeholders
- Extensive IT experience in numerous technologies spanning data modeling, data warehouse management, and database development for domains such as Banking, Capital Markets, and Insurance
- Solid understanding of Data Life Cycle including Profiling, Mining, Migration, Quality, Integration, Master Data Management and Metadata Management Services.
- Microsoft Certified Azure Data Engineer and Snowflake Certified Developer, with an in-depth understanding of cloud infrastructure and services such as Azure Data Factory, Azure Data Lake, Databricks, Azure Synapse, Snowflake, Azure Analysis Services, Azure SQL Server, Azure Blob Storage, Power BI, and Power Apps
- Ability to develop ETL pipelines in and out of the data warehouse using a combination of Python and Snowflake's SnowSQL
- Ability and desire to learn and adapt in a complex work environment
- Outstanding communication and critical thinking skills
- Good analytical and problem-solving skills, and the ability to balance team and client discussions.
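To illustrate the "combination of Python and Snowflake's SnowSQL" skill listed above, here is a minimal sketch of composing a Snowflake MERGE (upsert) statement in Python. All table and column names are hypothetical; in a real pipeline the generated statement would be run against Snowflake via the SnowSQL CLI or the Snowflake connector for Python, and identifiers would come from validated metadata rather than user input.

```python
def build_merge_sql(target, staging, key, columns):
    """Compose a Snowflake MERGE statement for an upsert from a
    staging table into a target table. Identifiers are assumed to be
    trusted/validated metadata, never raw user input."""
    on = f"t.{key} = s.{key}"
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in columns)
    cols = ", ".join([key, *columns])
    vals = ", ".join(f"s.{c}" for c in [key, *columns])
    return (
        f"MERGE INTO {target} t USING {staging} s ON {on} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({cols}) VALUES ({vals})"
    )

# Hypothetical dimension-table upsert from a staging table:
sql = build_merge_sql("dim_customer", "stg_customer", "customer_id", ["name", "email"])
```

Generating the statement separately from executing it keeps the SQL testable without a live Snowflake connection, which is one reason the Python-plus-SnowSQL combination appears so often in these pipelines.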