Data Engineer
Spectraforce
Alpharetta, Georgia
a day ago
Job Description
Title: Data Engineer
Location: Alpharetta, GA
Note: Hybrid (1-5 days in office; Wednesdays required). Fully onsite for the first 2 weeks.
Duration: 6 months, with likely extensions.
About the role:
- This role supports a new Big Data/cloud-driven initiative, enabling the ingestion, transformation, and management of massive datasets across cloud and on-prem environments. The engineer will support the client's data-driven product strategy, ensuring scalable, accurate, and efficient data pipelines for business and customer-facing platforms.
- The client is a global leader in analytics and information services. The role offers the opportunity to work with cutting-edge Big Data technologies (Databricks, Azure, AWS, Informatica, SSIS). A background in insurance is highly preferred.
- The team supports internal development teams, QA, and business stakeholders building data-driven solutions and products, in a collaborative, innovation-driven environment with exposure to highly visible projects impacting data strategy and customer solutions.
- The role involves designing, building, and maintaining data pipelines, enabling business intelligence and analytics through scalable and accurate data transformation and integration.
Daily breakdown:
- 40% Designing & developing ETL/data pipelines (SSIS, Informatica, Databricks, Python, Java)
- 30% Collaborating with dev/QA/business teams (Agile/Scrum participation, requirement gathering, solutioning)
- 20% Troubleshooting, bug fixes, technical issue resolution
- 10% Documentation, keeping up with new tools/tech
Ranked must-haves:
- ETL/Data Engineering: SSIS, Informatica, Databricks – 5+ years
- Programming: Python and/or Java – 3+ years
- Cloud: Azure and/or AWS – 2–3+ years
- Unix/Linux proficiency – 2+ years
- Git/GitHub/GitLab experience – 1+ year
- Bachelor’s degree in Computer Science, Engineering, or related field preferred.
- Certifications in cloud (Azure/AWS) or Informatica/Databricks a plus.
Work type: Primarily new data pipelines and application build; some maintenance on existing solutions.
Pluses:
Familiarity with C/C++ and additional scripting languages.
Test-driven development knowledge.
Prior experience in highly regulated industries (financial services, healthcare, insurance, legal tech).
Basic Functions:
- Write and review portions of detailed specifications for the development of system components of moderate complexity.
- Complete moderate to complex bug fixes.
- Work closely with other development team members, Quality Analysts and Business to understand product requirements and translate them into software designs.
- Operate in various development environments (Agile, Waterfall, etc.) while collaborating with key stakeholders.
- Resolve technical issues as necessary.
- Keep abreast of new technology developments.
- Make major code and design changes to new and existing products, and act as a primary point of contact for products they develop or own.
Qualifications:
- Proficiency with data manipulation tools such as SSIS and Informatica.
- Ability to work with complex data models.
- Proficiency in Java; proficiency in other development languages including, but not limited to, Python and C/C++.
- Familiarity with Unix/Linux servers and commands.
- Familiarity with industry best practices such as code coverage.
- Hands-on experience with cloud technologies (Azure/AWS).
- Hands-on experience with Databricks.
- Experience working in software development methodologies (e.g., Agile, Waterfall).
- Experience working with Git (GitLab/GitHub).
- Knowledge of test-driven development.
- Ability and desire to learn new processes and technologies.
- Excellent oral and written communication skills.
Applicant Notices & Disclaimers
- For information on benefits, equal opportunity employment, and location-specific applicant notices, click here
At SPECTRAFORCE, we are committed to maintaining a workplace that ensures fair compensation and wage transparency in adherence with all applicable state and local laws. This position’s starting pay is: $60.00/hr.