Under general supervision, combines strong technical skills with knowledge of database administration. Works on one or more projects of high complexity.
The Department of Early Care and Learning (DECAL) is seeking a highly skilled and proactive Data Engineer to join our dynamic team and support the modernization of our data estate. This role is integral to the migration away from legacy systems and to the development of scalable, secure, and efficient data solutions built on modern technologies, particularly Microsoft Fabric and Azure-based platforms. The successful candidate will contribute to data infrastructure design, data modeling, pipeline development, and visualization delivery to enable data-driven decision-making across the enterprise.
Work Location & Attendance Requirements:
• Must be physically located in metro Atlanta.
• On-site: Tuesday to Thursday, per manager’s discretion
• Mandatory in-person meetings:
o All Hands
o Enterprise Applications
o On-site meetings
o DECAL All Staff
• Work arrangements subject to management’s discretion
Experience Required: 5+ years
Key Responsibilities:
• Design, build, and maintain scalable ETL/ELT data pipelines using Microsoft Fabric and Azure Databricks.
• Implement medallion architecture (Bronze, Silver, Gold) to support the data lifecycle and data quality (a minimal illustrative sketch follows this list).
• Support the sunsetting of legacy SQL-based infrastructure and SSRS, ensuring data continuity and stakeholder readiness.
• Create and manage notebooks (e.g., Fabric Notebooks, Databricks) for data transformation using Python, SQL, and Spark.
• Build and deliver curated datasets and analytics models to support Power BI dashboards and reports.
• Develop dimensional and real-time data models for analytics use cases.
• Collaborate with data analysts, stewards, and business stakeholders to deliver fit-for-purpose data assets.
• Apply data governance policies including row-level security, data masking, and classification in line with Microsoft Purview or Unity Catalog.
• Implement monitoring, logging, and CI/CD automation for data workflows using Azure DevOps.
• Provide support during data migration and cutover events, ensuring minimal disruption.
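For illustration only, a Bronze-to-Silver medallion step as it might appear in a Fabric or Databricks notebook is sketched below. This is a minimal sketch, not DECAL's actual pipeline: the schemas, tables, and columns (bronze.enrollments_raw, silver.enrollments, enrollment_id, enrolled_on) are hypothetical placeholders.

```python
# Minimal Bronze -> Silver medallion step (all table/column names hypothetical).
# Assumes a Spark session with Delta Lake support, as provided by Fabric
# Notebooks or Azure Databricks.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: raw records landed as-is from a legacy SQL source.
bronze = spark.read.table("bronze.enrollments_raw")

# Silver: deduplicated, typed, and quality-filtered copy of the same data.
silver = (
    bronze
    .dropDuplicates(["enrollment_id"])
    .withColumn("enrolled_on", F.to_date("enrolled_on", "yyyy-MM-dd"))
    .filter(F.col("enrollment_id").isNotNull())
)

# Overwrite the Silver Delta table; downstream Gold models read from here.
silver.write.format("delta").mode("overwrite").saveAsTable("silver.enrollments")
```

In practice an incremental MERGE would usually replace the full overwrite; the overwrite simply keeps the sketch short.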
Technical Stack:
• Microsoft Fabric
• Azure Databricks
• SQL Server / SQL Managed Instances
• Power BI (including semantic models and datasets)
• SSRS (for legacy support and decommissioning)
Qualifications:
• Bachelor’s degree in Computer Science, Information Systems, or related field
• 5+ years of experience in data engineering roles, preferably in government or regulated environments
• Proficiency in SQL, Python, and Spark
• Hands-on experience with Microsoft Fabric (Dataflows, Pipelines, Notebooks, OneLake)
• Experience with Power BI data modeling and dashboard development
• Familiarity with data governance tools (Microsoft Purview, Unity Catalog)
• Solid understanding of ETL/ELT pipelines, data warehousing concepts, and schema design (see the dimensional-model sketch after this list)
• Strong communication and collaboration skills.
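To make the schema-design expectation concrete, the sketch below assembles a small Gold-layer star-schema aggregate with PySpark for consumption by a Power BI semantic model. All table, column, and measure names are hypothetical assumptions, not drawn from this engagement.

```python
# Sketch of a Gold-layer dimensional model: a fact table joined to a
# conformed dimension, then aggregated for Power BI (names hypothetical).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

fact = spark.read.table("silver.enrollments")        # grain: one row per enrollment
dim_provider = spark.read.table("silver.providers")  # conformed provider dimension

gold = (
    fact.join(dim_provider, "provider_id", "left")
        .groupBy("provider_name", F.year("enrolled_on").alias("enrollment_year"))
        .agg(F.countDistinct("enrollment_id").alias("enrollment_count"))
)

# Gold tables back Power BI semantic models (e.g., via Direct Lake in Fabric).
gold.write.format("delta").mode("overwrite").saveAsTable("gold.enrollments_by_provider")
```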
Preferred Qualifications:
• Certifications such as Microsoft Certified: Fabric Analytics Engineer or Azure Data Engineer Associate
• Knowledge of CI/CD automation with Azure DevOps
• Familiarity with data security and compliance (e.g., FIPS 199, NIST)
• Experience managing the sunsetting and modernization of legacy reporting systems such as SSRS
Soft Skills:
• Strong analytical thinking and problem-solving abilities
• Ability to collaborate across multidisciplinary teams
• Comfort in fast-paced and evolving technology environments
This role is critical to our shift toward a modern data platform and offers the opportunity to influence our architectural decisions and technical roadmap.
Required/Desired Skills
Skill | Required/Desired | Amount of Experience
---|---|---
Experience in data engineering roles, preferably in government or regulated environments | Required | 5 Years
Hands-on experience with Microsoft Fabric (Dataflows, Pipelines, Notebooks, OneLake) | Required | 5 Years
Experience with Power BI data modeling and dashboard development | Required | 5 Years
Familiarity with data governance tools (Microsoft Purview, Unity Catalog) | Required | 5 Years
Solid understanding of ETL/ELT pipelines, data warehousing concepts, and schema design | Required | 5 Years
Bachelor's degree in Computer Science, Information Systems, or related field | Required | 
Questions
No. | Question
---|---
Question1 | Absences greater than two weeks MUST be approved by CAI management in advance, and contact information must be provided to CAI so that the resource can be reached during his or her absence. The Client has the right to dismiss the resource if he or she does not return to work by the agreed-upon date. Do you agree to this requirement?
Question2 | What is your candidate's email address?
Question3 | If selected for engagement, your candidate's hourly Pay Rate must be at least -. Your candidate can be paid more; however, the hourly SRP Rate cannot exceed -. Do you agree to these requirements?
Question4 | The maximum mark-up for this engagement's SRP rate is 35%. To be competitive on pricing, a mark-up below the 35% threshold is suggested. Do you agree to propose a mark-up at or below 35%?
Question5 | This assignment is contingent upon customer renewal and availability of adequate funding. Do you agree to this requirement?
Question6 | If selected for engagement, your candidate will be expected to start no later than 2 weeks (10 business days) after the client's selection date. Do you agree to this requirement?