Job Details

DataOps & Build Engineer - Remote - USA

2026-01-16 | Lorven Technologies | All cities, AK
Description:

Role: DataOps & Build Engineer

Data Analytics

Location: Remote - USA

Project Duration: 6 to 9 months (contract)

We are seeking an experienced and visionary DataOps & Build Engineer to lead the architecture and optimization of a next-generation data platform.

This critical role requires 8+ years of expertise to drive technical direction, mentor teams, and automate complex CI/CD pipelines in a fast-paced environment.

You will be instrumental in bridging development and operations to ensure a scalable, high-performance data lifecycle that powers enterprise-level decision-making.

Key Responsibilities:

  • Establish DataOps Framework: Define, document, and champion the organizational framework and guidelines for DataOps, including release management processes, environment promotion strategy, and data quality standards.
  • Best Practice Dissemination: Create and enforce standard operating procedures (SOPs) for data pipeline development, CI/CD, and testing across engineering teams, ensuring consistency and adherence to architectural standards.
  • Data Pipeline Automation: Design and implement robust continuous integration and continuous delivery (CI/CD) pipelines for data code and infrastructure.
  • Workflow Orchestration Implementation: Configure, optimize, and manage the deployment of data workflows using orchestrators such as Dagster or Talend, focusing on automated testing and deployment steps (see the first sketch after this list).
  • Version Control & Repository Management: Enforce best practices for source code management (e.g., Gitflow), branching strategies, and repository organization across all data projects.
  • Infrastructure as Code (IaC): Work with Infrastructure teams to automate provisioning and management of data platform resources efficiently within AWS.
  • Resilience and Failure Recovery: Design and implement automated rollback and self-healing mechanisms within pipelines to quickly recover from transient failures.
  • Monitoring and Logging: Set up comprehensive monitoring, logging, and alerting using cloud-native or other tooling to ensure visibility into pipeline performance and to identify and resolve issues quickly (see the second sketch after this list).
  • Security and Compliance: Ensure data security and compliance by implementing IAM policies, encryption, and other security measures in AWS, adhering to best practices for handling sensitive data.
  • Testing Frameworks: Implement automated testing strategies across the data lifecycle, including unit tests, integration tests, and data quality validation checks (e.g., column integrity, schema drift) to ensure data reliability before deployment.
  • Resource and Cost Optimization: Implement automated policies and monitoring to track and control cloud resource consumption, ensuring that pipelines run efficiently and cost-effectively.
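
For illustration only, the orchestration, resilience, and testing responsibilities above might look something like the following minimal Dagster sketch: one asset with an automatic retry policy for transient failures and a schema check that gates promotion. The bucket path, column names, and retry settings are hypothetical, not part of the role description.

    import pandas as pd
    from dagster import (
        AssetCheckResult,
        Definitions,
        RetryPolicy,
        asset,
        asset_check,
    )

    # Retry transient failures automatically before paging a human.
    @asset(retry_policy=RetryPolicy(max_retries=3, delay=60))
    def customer_orders() -> pd.DataFrame:
        # Hypothetical ingest step; a real asset would read from S3, Glue, or dbt output.
        return pd.read_parquet("s3://example-bucket/orders/latest.parquet")

    # Data quality gate: fail the check if required columns are missing (schema drift).
    @asset_check(asset=customer_orders)
    def orders_schema_check(customer_orders: pd.DataFrame) -> AssetCheckResult:
        expected = {"order_id", "customer_id", "order_total"}
        missing = expected - set(customer_orders.columns)
        return AssetCheckResult(
            passed=not missing,
            metadata={"missing_columns": sorted(missing)},
        )

    defs = Definitions(assets=[customer_orders], asset_checks=[orders_schema_check])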
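
On the monitoring side, a sketch of what a cloud-native alert could look like: a CloudWatch alarm that notifies an SNS topic when pipeline runs report failures. It assumes pipelines publish a custom FailedRuns metric; the namespace, metric name, and topic ARN are placeholders.

    import boto3

    cloudwatch = boto3.client("cloudwatch")

    # Alarm whenever any pipeline run reports a failure within a 5-minute window.
    cloudwatch.put_metric_alarm(
        AlarmName="data-pipeline-failed-runs",
        Namespace="DataPlatform",         # hypothetical custom metric namespace
        MetricName="FailedRuns",          # hypothetical metric emitted by pipeline runs
        Statistic="Sum",
        Period=300,
        EvaluationPeriods=1,
        Threshold=1,
        ComparisonOperator="GreaterThanOrEqualToThreshold",
        TreatMissingData="notBreaching",  # no data means nothing failed
        AlarmActions=["arn:aws:sns:us-east-1:111122223333:data-alerts"],  # hypothetical SNS topic
    )
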
Candidate Profile:
  • 8+ years of hands-on experience in Data Engineering, DevOps, or a dedicated DataOps role, focused heavily on automation and operational excellence
  • Proven experience implementing CI/CD practices specifically for data pipelines and data infrastructure
  • Strong conceptual understanding of data warehousing, ETL/ELT methodologies, and cloud-native architecture
  • Automation-First Mindset: A strong drive to automate repetitive tasks and eliminate manual intervention in the data lifecycle
  • Collaboration: Excellent communication skills, capable of working effectively with Data Engineers, Data Scientists, and Infrastructure teams
  • Insurance industry experience preferred but not mandatory
  • Tools:
    • Cloud Environment: AWS (S3, IAM, VPC, etc.)
    • Pipeline Build: Dagster or Talend
    • Ingest & Transform: dbt Core, AWS Glue, or Flexter
    • Streaming/Integration: Confluent or AWS Streaming Services


Apply for this Job

Please use the APPLY HERE link below to view additional details and application instructions.

Apply Here
