Data Engineer
Spectraforce
Richmond, Virginia
Job Description
Location: Virginia
Key Responsibilities:
•Design, develop, and maintain data pipelines leveraging Python, Spark/PySpark, and cloud-native services (a brief sketch of such a pipeline appears after this list).
•Build and optimize data workflows, ETL processes, and transformations for large-scale structured and semi-structured datasets.
•Write advanced and efficient SQL queries against Snowflake, including joins, window functions, and performance tuning.
•Develop backend and automation tools using Golang and/or Python as needed.
•Implement scalable, secure, and high-quality data solutions across AWS services such as S3, Lambda, Glue, Step Functions, EMR, and CloudWatch.
•Troubleshoot complex production data issues, including pipeline failures, data quality gaps, and cloud environment challenges.
•Perform root-cause analysis and implement automation to prevent recurring issues.
•Collaborate with data scientists, analysts, platform engineers, and product teams to enable reliable, high-quality data access.
•Ensure compliance with enterprise governance, data quality, and cloud security standards.
•Participate in Agile ceremonies, code reviews, and DevOps practices to ensure high engineering quality.
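For illustration only, here is a minimal PySpark sketch of the kind of pipeline described in the first bullet above; the bucket names, paths, and column names are hypothetical placeholders rather than details of the actual role:

    # Minimal PySpark pipeline sketch: read semi-structured JSON from S3,
    # keep the latest record per key with a window function, write Parquet.
    # All names below are hypothetical placeholders.
    from pyspark.sql import SparkSession, Window
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

    # Semi-structured JSON landed in S3 (hypothetical path).
    raw = spark.read.json("s3://example-raw-bucket/orders/")

    # Deduplicate: keep only the most recent record per order_id.
    latest = Window.partitionBy("order_id").orderBy(F.col("updated_at").desc())
    deduped = (
        raw.withColumn("rn", F.row_number().over(latest))
           .filter(F.col("rn") == 1)
           .drop("rn")
    )

    # Write partitioned Parquet for downstream consumers.
    deduped.write.mode("overwrite").partitionBy("order_date").parquet(
        "s3://example-curated-bucket/orders/"
    )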
Key Requirements and Technology Experience:
•Core skills: Python, Spark/PySpark, Golang, and Java; ability to write and troubleshoot complex SQL queries against Snowflake tables; AWS services including Glue, EC2, and Lambda.
•Proficiency in Python with experience building scalable data pipelines or ETL processes.
•Strong hands-on experience with Spark/PySpark for distributed data processing.
•Experience writing complex SQL queries (Snowflake preferred), including optimization and performance tuning (see the query sketch after this list).
•Working knowledge of AWS cloud services used in data engineering (S3, Glue, Lambda, EMR, Step Functions, CloudWatch, IAM).
•Experience with Golang for scripting, backend services, or performance-critical processes.
•Strong debugging, troubleshooting, and analytical skills across cloud and data ecosystems.
•Familiarity with CI/CD workflows, Git, and automated testing.
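For illustration only, a short sketch of the kind of Snowflake query this calls for, run from Python with the snowflake-connector-python package; the connection parameters, tables, and columns are hypothetical placeholders:

    # Hedged sketch: join two tables and use a window function (QUALIFY)
    # to keep the most recent order per customer. All identifiers and
    # credentials below are placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        user="EXAMPLE_USER",
        password="EXAMPLE_PASSWORD",
        account="example_account",
        warehouse="ANALYTICS_WH",
        database="SALES_DB",
        schema="PUBLIC",
    )

    query = """
        SELECT c.customer_id,
               c.region,
               o.amount,
               o.order_ts
        FROM customers c
        JOIN orders o
          ON o.customer_id = c.customer_id
        QUALIFY ROW_NUMBER() OVER (
            PARTITION BY c.customer_id
            ORDER BY o.order_ts DESC
        ) = 1
    """

    with conn.cursor() as cur:
        cur.execute(query)
        for customer_id, region, amount, order_ts in cur:
            print(customer_id, region, amount)

    conn.close()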
Applicant Notices & Disclaimers
- For information on benefits, equal opportunity employment, and location-specific applicant notices, click here
At SPECTRAFORCE, we are committed to maintaining a workplace that ensures fair compensation and wage transparency in accordance with all applicable state and local laws. This position’s starting pay is $70.00/hr.