Delict Technology Services Pvt. Ltd.

Azure Data Engineer

Posted On: 2022-09-23

Ref. No: CGI/CGI/1413

No. of Openings: 1

Work Permit: Not Applicable

Designation: Azure Data Engineer

Industries: IT/Computers - Software

Expiry Date: 2022-10-21

Skills: Non-SAP / Azure Data

Total Experience: 5 to 6 Years

Relevant Experience: 5 to 6 Years

Job Type: Permanent

Job Location: Bangalore/Hyderabad/Chennai

Notice Period: 1 to 10 Days


Job Description

Azure Data Engineer:

No. of positions: 1

Job Location: Bangalore/Hyderabad/Chennai

Job mode: Hybrid

Notice period: 0 to 15 days

·       Candidates with some (not necessarily all) of the following skills: Azure Databricks; integration with complex systems outside the Azure network and across different VNets.

·       Production support after deployment during the warranty period.

·       Experience implementing data streams using Event Hubs or other streaming integration tools.

·       5 years of work experience with ETL, business intelligence, and Azure data architectures.

·       5 years of experience in implementing pipelines/control flows using Azure Data Factory or Synapse Pipelines.

·       2+ years of experience working with Azure Databricks.

·       3+ years of hands-on experience integrating with Snowflake in Azure/AWS.

·       2-3 years of experience working with Spark SQL, Hive SQL, U-SQL, and Kusto Query Language (KQL).

·       2-3 years of experience working with ADLS, Cosmos DB, Cassandra, MongoDB, Azure Synapse, and Azure SQL Server.

·       5 years of experience creating frameworks for building data pipelines.

·       3+ years of experience working with structured and unstructured data.

·       Good knowledge of, and hands-on experience following, best practices for security standards, reusability, auditing, performance, enterprise coding standards, etc.

·       Experience developing and managing data warehouses on a terabyte or petabyte scale.

·       Experience with core competencies in data structures, REST/SOAP APIs, JSON, etc.

·       Strong experience in massively parallel processing & columnar databases.

·       Strong understanding of the Azure pricing model for data processing architectures.

·       Expert in writing SQL.

·       Hands-on experience integrating with BI tools such as Power BI and Tableau.

·       Deep understanding of advanced data warehousing concepts and track record of applying these concepts on the job.

·       Experience with common software engineering tools (e.g., Git, JIRA, Confluence, or similar)

·       Ability to manage numerous requests concurrently and strategically, prioritizing when necessary.

·       Good communication and presentation skills.

·       Dynamic team player.