The Data Engineer supports Bechtel’s Infrastructure AI and Data program by designing, building, and optimizing scalable data pipelines that enable analytics, reporting, and AI-driven insights. This role blends hands-on data engineering with technical collaboration, contributing to data architecture decisions while ensuring data quality, performance, and reliability across enterprise platforms.
Responsibilities:
- Design, develop, and maintain complex ETL/ELT pipelines that ingest data from multiple sources
- Optimize data workflows and performance within Azure Databricks environments
- Implement robust data quality checks, validation rules, and monitoring processes (a brief sketch follows this list)
- Collaborate with data architects and functional subject matter experts to gather requirements
- Contribute to data modeling and lakehouse architecture decisions for analytics use cases
- Document data engineering processes and promote best practices across the team
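To give a concrete sense of the data quality work described above, here is a minimal sketch of a validation step, assuming PySpark (the engine behind Azure Databricks). The table layout, column names, and rules are hypothetical illustrations, not Bechtel systems or standards.

```python
# Minimal data-quality validation sketch in PySpark.
# All names (project_id, sensor_reading) are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("quality-check-sketch").getOrCreate()

# Stand-in for freshly ingested records.
df = spark.createDataFrame(
    [("P-100", 42.0), ("P-101", None), ("P-102", -5.0)],
    ["project_id", "sensor_reading"],
)

# Validation rules: readings must be present and non-negative.
violations = df.agg(
    F.sum(F.col("sensor_reading").isNull().cast("int")).alias("null_readings"),
    F.sum((F.col("sensor_reading") < 0).cast("int")).alias("negative_readings"),
).first()

# Fail the run (or route bad rows to quarantine) when any rule is violated.
if violations["null_readings"] or violations["negative_readings"]:
    raise ValueError(f"Data quality check failed: {violations.asDict()}")
```

In a Databricks deployment, rules like these are commonly expressed as Delta Live Tables expectations or run as tasks in a scheduled job, with the results feeding the monitoring processes mentioned above.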
Requirements:
- Bachelor’s degree in computer science or a related field
- 5 or more years of experience in data engineering or a related role
- Strong proficiency in SQL and advanced experience with Python or Scala
- Hands-on experience with cloud-based data platforms, preferably Microsoft Azure
- Solid understanding of data modeling, ETL/ELT processes, and performance tuning
- Ability to troubleshoot complex data workflows and guide technical discussions
Benefits:
- Opportunity to work on global infrastructure and engineering programs
- Exposure to advanced analytics, AI, and enterprise-scale data platforms
- Collaborative environment with global teams and technical experts
- Strong focus on career growth, learning, and long-term professional development
This role offers the chance to help shape enterprise data capabilities that support large-scale, mission-critical projects worldwide.