*** Direct Client Requirement ***
Title: Azure Cloud Data Engineer
Location: NW Washington, DC
Duration: Long Term
Interview Type: Webcam Interview
Work Status: Successful applicants must be legally authorized to work in the U.S.
Job Type: W2
Experience: 6 YEARS
- At least 5 years' experience driving impact in a similar capacity at companies building cutting-edge technology on Azure Cloud.
- Azure Data Factory Pipelines – Create new Azure Data Factory pipelines in cloud-based data warehousing systems such as Azure Synapse.
- Experience with Azure services: Azure Data Lake Gen 2, Data Factory, Data Flows, and Synapse (Azure SQL DW).
- Data Quality and Anomaly Detection – Improve existing tools (dbt cloud) to measure data quality through metrics and automatic alerting.
- Experience in working with Azure Databricks environment for data transformation.
- Experience working with unstructured data (text) and making it available for downstream NLP analysis.
- Data Modeling – Partner with data consumers to improve existing data models and model different facets of the business for analytic use cases; build star and snowflake schemas.
- Develop data pipelines to third-party APIs, both internal and external providers.
- Experience working with NoSQL databases like MongoDB and pulling data from those databases.
- Expertise building data pipelines on large, complex data sets using Spark or other open-source frameworks.
- Expertise in a scripting language like Python (or similar) and a query language like SQL
- Knowledge of scheduling, logging, monitoring, alert frameworks
- Experience with sources/targets: ADLS Gen2, Postgres DB, Azure SQL DW.
- Orchestration using Databricks Jupyter notebooks preferred.
- Experience working in large scale/distributed SQL, NoSQL (MongoDB) environments.
- Experience in modeling and implementing ETL / ELT on columnar MPP database technologies.
- Experience with Agile software development process.
- Proven experience deploying machine learning algorithms to production
- Demonstrated proficiency in writing high-quality and scalable code and integrating with git version control systems
- Experience leading successful data engineering projects and operationalizing machine learning algorithms
- Experience with streaming architectures, e.g. Kafka, Stream, PubSub.
Responsibilities:
- Build and deliver data pipelines connecting various enterprise data sources: RDBMS, NoSQL, and APIs.
- Clean and process the data for Machine Learning consumption.
- Load the ML predictions into Azure Postgres.
- Provide Business Intelligence (Power BI) and Data Warehousing (DW) solutions and support by leveraging project standards and leading analytics platforms.
- Evaluate and define functional requirements for BI and DW solutions
- Define and build data integration processes to be used across the organization
- Build conceptual and logical data models for stakeholders and management
- Analyze and validate data accuracy of report results
- Work directly with management to understand requirements, then propose and develop business solutions that enable effective decision-making and drive business objectives.
- Interpret data presented in models, charts, and tables, transforming it into a format that is useful to the business and aids effective decision-making.
- Use statistical practices to analyze current and historical data to make predictions and identify risks and opportunities, enabling better decisions on planned/future events.
- Understand the business problem and determine which aspects require optimization; articulate those aspects clearly and concisely.
- Partner with business analysts, application engineers, and data scientists, leveraging the appropriate tools, solutions, and/or processes as part of their data mining, profiling, blending, and analytical activities.
- Collaborate in establishing and evolving development, testing, and documentation standards, as well as related code reviews.
Apply here or send your resume to email@example.com
Position Keywords: AZURE CLOUD , AZURE SQL , AZURE , ELT , NOSQL , DATA TRANSFORMATION , MONGODB , DATA WAREHOUSING , TESTING , PYTHON , RDBMS , SNOWFLAKE , ETL
Pay Rate: DOE
Job Duration: 12 Months
% Travel Required: None
Job Posted by: Consulting Services
Job ID: OOJ - 2049