AWS Cloud Data Engineer with PySpark

 Carlsbad, CA
August 11, 2020

**** Direct Client Requirement ****

Title: AWS Cloud Data Engineer with PySpark

Location: Carlsbad, CA

Rate: DOE / If your experience and skills match, call us immediately for submission

Duration: 12+ Months

Interview Type: Skype or Phone

Work Status:  Successful applicants must be legally authorized to work in the U.S.

Job Type: C2C, C2H, W2

Experience: 7 YEARS

Prefer W2: USC/GC/H1B Transfer/OPT/CPT/H4 EAD and other US work authorizations are accepted

Local candidates preferred

 

Must have strong experience in PySpark, Python and SQL

Job Details:

  • 7 to 9 years of working experience in data integration and pipeline development with data warehousing.
  • Experience with AWS Cloud data integration using Apache Spark, EMR, Glue, Kafka, Kinesis, and Lambda in S3, Redshift, RDS, and MongoDB/DynamoDB ecosystems.
  • Strong hands-on experience in Python development, especially with PySpark in an AWS Cloud environment.
  • Design, develop, test, deploy, maintain, and improve data integration pipelines (see the illustrative sketch after this list).
  • Experience with Python and common Python libraries.
  • Strong analytical database experience: writing complex queries, query optimization, debugging, user-defined functions, views, indexes, etc.
  • Strong experience with source control systems such as Git and Bitbucket, and with Jenkins build and continuous integration tools.
  • Experience with continuous deployment (CI/CD).
  • Databricks, Airflow, and Apache Spark experience is a plus.
  • Experience with databases (PostgreSQL, Redshift, MySQL, or similar).
  • Exposure to ETL tools, including Informatica and others.
  • BS/MS degree in CS, CE, or EE.
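
For illustration only (not part of the original posting): a minimal sketch of the kind of PySpark data-integration pipeline this role describes, reading raw JSON from S3, cleansing it, aggregating with Spark SQL, and writing partitioned Parquet back to S3. All bucket names, paths, and column names below are hypothetical placeholders.

```python
# Minimal illustrative PySpark pipeline sketch; assumes PySpark is installed
# and S3 access (hadoop-aws / credentials) is already configured.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("orders-daily-rollup")  # hypothetical job name
    .getOrCreate()
)

# Read raw JSON events landed in S3 (placeholder path).
orders = spark.read.json("s3://example-raw-bucket/orders/2020/08/")

# Basic cleansing: drop malformed rows and derive a date column.
clean = (
    orders
    .dropna(subset=["order_id", "order_ts"])
    .withColumn("order_date", F.to_date("order_ts"))
)

# Express the aggregation in SQL, since the role calls for both PySpark and SQL.
clean.createOrReplaceTempView("orders")
daily = spark.sql("""
    SELECT order_date,
           customer_id,
           COUNT(*)       AS order_count,
           SUM(total_usd) AS revenue_usd
    FROM orders
    GROUP BY order_date, customer_id
""")

# Write curated output as Parquet partitioned by date, ready to be loaded
# into Redshift (e.g. via COPY) or cataloged by a Glue crawler.
(
    daily.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-curated-bucket/orders_daily/")
)

spark.stop()
```

In practice a job like this would typically be submitted to EMR or run as a Glue job and scheduled via Airflow, with the same code structure.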

Thanks
Vinay/Siva
vinayp@Sohanit.com/siva@sohanit.com
PH:402-241-9613/402-241-9606
Apply here or send your resume to resumes@sohanit.com

Position Keywords: PySpark, Python, SQL, Apache Spark, EMR, Glue, Kafka, Kinesis, Lambda, S3, Redshift, RDS, MongoDB/DynamoDB

Pay Rate: DOE / If your experience and skills match, call us immediately for submission

Job Duration: 12 Months

% Travel Required: None

Job Posted by: Consulting Services

Job ID: OOJ - 2212

Work Authorization: Successful applicants must be legally authorized to work in the U.S.
