
Data Lake Developer

Minnesota (Remote)
April 26, 2022

***** Direct Client Requirement *****

Title: Data Lake Developer

Location: Minnesota (Remote)

Rate: DOE / If your experience and skills match, call us immediately for submission

Duration: 6 Months

Interview Type: Skype or Phone

Job Type: W2

Experience: 5 Years

Job Description:


  • Analyze and define data requirements
  • Establish an automated data extraction process for Sciforma, SWIFT, and other PPM tools (e.g., API or other preferred methods)
  • Develop data lake structure and populate with data
  • Develop reports/queries for advanced data analytics
  • Develop Power BI reports/data visualization using advanced queries
  • Provide knowledge transfer

Minimum Qualifications

  • Five years of experience in data lake development
  • Two engagements with two different entities in a role involving data lake development

Preferred Qualifications

  • Experience in data lake configuration, setup, and bringing in data from various data sources using ETL and APIs
  • Experience in Azure Cloud services and solutions
  • Experience working with enterprise data warehouse
  • Experience as an ETL/ELT Developer using various ETL/ELT tools such as Azure Synapse Pipelines, Azure Data Factory, and Apache Spark pools with Python scripts
  • Experience in Azure DevOps Services using Azure Git Repos, Azure Data Studio, Azure Analytics, data mapping, and deployment artifacts and release packages for test & production environments
  • Experience in building end-to-end scalable data solutions, from sourcing raw data and transforming it to producing analytics reports
  • Experience in Python (ETL and Data Visualization libraries)
  • Experience in Azure SQL databases across SQL DB, Managed Instance & Data Warehouse
  • Experience in Azure platform services such as blob storage, event hubs, monitoring services
  • Experience in creating data structures optimized for storage and various query patterns, for example Parquet
  • Experience in building secured Power BI reports, dashboards, paginated reports
  • Experience in working in an Agile SDLC methodology

Description of Project

  • MNIT (Minnesota IT Services) is seeking to build a data lake repository of project and portfolio management (PPM) data for the MNIT enterprise, enabling the organization to make data-driven decisions for its IT projects. For this project, MNIT is seeking a senior-level data lake developer. The data lake will serve as a single source of truth for both enterprise and MNIT partner agency reporting needs, and will also make data available for advanced analytics. The work will be done in phases; the pilot phase will focus on ingesting data from Sciforma and the statewide integrated financial tool (SWIFT) (project financials). The long-term goal is to capture data in the data lake from all PPM tools used across MNIT's partner agencies. Additional phases may be added via amendment to the work order.

***** Referral Bonus Available: Refer your friends or colleagues and get a referral bonus *****




PH: 470-410-8564 Ext. 116 / 470-410-3404 Ext. 106

Apply here, or send your resume to resumes@sohanit.com

Follow us on LinkedIn and Twitter for daily active requirements

LinkedIn: https://www.linkedin.com/company/sohanit-inc/

Twitter: https://twitter.com/SohanITInc1

Position Keywords: Swift,API,MS Power BI,Cloud Computing,Cloud,Azure,Apache Spark,Python,SQL,Database,Sciforma,GIT

Pay Rate: DOE / If your experience and skills match, call us immediately for submission

Job Duration: 6 Months

% Travel Required: None

Job Posted by: Consulting Services

Job ID: OOJ - 4576
