
Snowflake Data Engineer

Dallas, TX
October 27, 2020

***** Direct Client Requirement *****

Title                    : Snowflake Data Engineer

Location                 : Dallas, TX

Rate                     : DOE / If your experience and skills match, call us immediately for submission

Duration                 : 12 Months

Interview Type           : Skype or Phone

Work Status              : Successful applicants must be legally authorized to work in the U.S.

Job Type                 : C2C, C2H, W2

Experience               : 12+ Years

Prefer W2                : USC/GC/H1B Transfer/OPT/CPT/H4 EAD and other US work authorizations are accepted

Job Description:

Snowflake Data Engineers will be responsible for architecting and implementing very large-scale data intelligence solutions around Snowflake Data Warehouse.
Solid experience and understanding of architecting, designing, and operationalizing large-scale data and analytics solutions on Snowflake Cloud Data Warehouse is a must.
Professional knowledge of AWS/Azure is required.
Developing ETL pipelines in and out of the data warehouse using a combination of Python and Snowflake's SnowSQL; writing SQL queries against Snowflake.
Developing scripts (Unix shell, Python, etc.) to extract, load, and transform data.
Providing production support for data warehouse issues such as data load problems and transformation/translation problems.
Translating BI and reporting requirements into database design and reporting design.
Understanding data transformation and translation requirements and which tools to leverage to get the job done.
Understanding data pipelines and modern ways of automating data pipelines using cloud-based tooling.
Testing and clearly documenting implementations so that others can easily understand the requirements, implementation, and test conditions.
Experience with Vertica.
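The extract/transform/load work described above can be illustrated with a minimal, self-contained Python sketch. The table name, column names, and cleaning rules here are hypothetical assumptions for illustration only; a production Snowflake pipeline would typically stage files and load them with SnowSQL's COPY INTO (or the snowflake-connector-python library) rather than generating row-by-row INSERT statements.

```python
import csv
import io

# Hypothetical ETL sketch: extract raw CSV rows, transform (clean)
# them, and render SnowSQL-style INSERT statements for a made-up
# ORDERS table. Real loads would use staged files plus COPY INTO.

def transform(row):
    """Trim whitespace, upper-case the customer, normalize amount to 2 decimals."""
    return {
        "order_id": row["order_id"].strip(),
        "customer": row["customer"].strip().upper(),
        "amount": f"{float(row['amount']):.2f}",
    }

def to_insert(row, table="ORDERS"):
    """Render one cleaned row as an INSERT statement (illustrative only)."""
    return (
        f"INSERT INTO {table} (ORDER_ID, CUSTOMER, AMOUNT) "
        f"VALUES ('{row['order_id']}', '{row['customer']}', {row['amount']});"
    )

def etl(csv_text):
    """Extract rows from CSV text, transform each, return SQL statements."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [to_insert(transform(r)) for r in reader]

raw = "order_id,customer,amount\n 101 , acme ,12.5\n102,globex,7\n"
for stmt in etl(raw):
    print(stmt)
```

The same transform/render split carries over to a real pipeline: keep the cleaning logic in small, testable functions, and swap the output layer for a staged COPY INTO when moving beyond a sketch.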

Basic Qualifications

Minimum 1 year of experience designing and implementing a fully operational, production-grade, large-scale data solution on Snowflake Data Warehouse.
3+ years of hands-on experience building productionized data ingestion and processing pipelines using Java, Spark, Scala, and Python.
2+ years of hands-on experience designing and implementing production-grade data warehousing solutions on large-scale data technologies such as Teradata, Oracle, or DB2.
Expertise in and excellent understanding of Snowflake internals and the integration of Snowflake with other data processing and reporting technologies.
Excellent presentation and communication skills, both written and verbal.
Ability to problem-solve and architect in an environment with unclear requirements.
Preferred Skills

Bachelor's degree in Computer Science, Engineering, or Technical Science, or 3 years of technical architecture and build experience with large-scale solutions.
Minimum 1 year of experience architecting large-scale data solutions, performing architectural assessments, crafting architectural options and analysis, and finalizing the preferred solution alternative while working with IT and business stakeholders.
Experience building data ingestion pipelines using Talend or Informatica.
Experience working with AWS, Azure, and Google data services.
Prior experience on client-facing projects.

****** Referral Bonus Available: Refer your friends or colleagues and get a referral bonus ******

Thanks
Srinivas/Siva
Srinivas@Sohanit.com/siva@sohanit.com
PH:402-241-9551/402-241-9606
Apply here, or send your resume to resumes@sohanit.com

Position Keywords: Java, Spark, Scala, Python, Snowflake Data Warehouse, AWS, Azure

Pay Rate: DOE / If your experience and skills match, call us immediately for submission

Job Duration: Long term

% Travel Required: None

Job Posted by: Consulting Services

Job ID: OOJ - 2449

Work Authorization: Successful applicants must be legally authorized to work in the U.S.
