Snowflake Developer:
No. of positions: 1
Job Location: Bangalore/Hyderabad/Chennai
Job Mode: Hybrid
Notice Period: 0-15 days
· Snowflake data engineers will be responsible for designing and developing various artifacts to support data migration from legacy databases to the Snowflake cloud data warehouse.
· Solid experience in architecting, designing, and operationalizing large-scale data and analytics solutions on the Snowflake Cloud Data Warehouse is a must.
· Developing solutions using a combination of Python, Airflow, and Snowflake
· Developing scripts using SnowSQL, Unix shell, Python, etc. to extract, load, and transform (ELT) data
· Providing support for data warehouse issues such as data load failures
· Understanding data pipeline requirements and selecting the appropriate tools to meet them
· Testing and clearly documenting implementations, so others can easily understand the requirements, implementation, and test conditions.
· Minimum 2 years of experience developing fully operational, production-grade, large-scale data solutions on Snowflake Data Warehouse.
· 4 years of hands-on experience building productized data ingestion and processing pipelines using standard data pipeline tools (Spark, Python, ADF, etc.)
· 2 years of hands-on experience designing and implementing production-grade data warehousing solutions on large-scale data technologies such as SQL Server, Teradata, Oracle, or DB2
· Expertise in implementing role-based access controls on Snowflake.
· Ability to write stored procedures and complex queries in Snowflake
· Excellent presentation and communication skills, both written and verbal
· Ability to problem-solve and architect in an environment with ambiguous requirements
· 3+ years of experience working with both structured and unstructured data.
· Good knowledge and hands-on experience following best practices for security standards, reusability, auditing, performance, enterprise coding standards, etc.
· Experience developing and managing data warehouses at terabyte or petabyte scale.
· Experience with core competencies in data structures, REST/SOAP APIs, JSON, etc.
· Strong experience with massively parallel processing (MPP) and columnar databases.
· Strong understanding of the Azure pricing model for data processing architectures
· Expert in writing SQL.
· Hands-on experience integrating with BI tools such as Power BI and Tableau.
· Deep understanding of advanced data warehousing concepts and a track record of applying them on the job.
· Experience with common software engineering tools (e.g., Git, JIRA, Confluence, or similar)
· Ability to manage numerous requests concurrently and strategically, prioritizing when necessary.
· Dynamic team player.