Job Description
Work Location: Remote
- FL, US
- St. Louis, MO, US
You could be the one who changes everything for our 28 million members by using technology to improve health outcomes around the world. As a diversified, national organization, Centene offers its technology professionals competitive benefits, including a fresh perspective on workplace flexibility.
Position Purpose: Develops and operationalizes data pipelines to make data available for consumption (reports and advanced analytics), including data ingestion, data transformation, data validation / quality, data pipeline optimization, and orchestration. Engages with the DevSecOps Engineer during continuous integration and continuous deployment.
- Designs and implements standardized data management procedures around data staging, data ingestion, data preparation, data provisioning, and data destruction (scripts, programs, automated or automation-assisted processes, etc.)
- Designs, develops, implements, tests, documents, and operates large-scale, high-volume, high-performance data structures for business intelligence analytics
- Designs, develops, and maintains real-time processing applications and real-time data pipelines
- Ensures the quality of technical solutions as data moves across Centene’s environments
- Provides insight into the changing data environment, data processing, data storage, and utilization requirements for the company and offers suggestions for solutions
- Develops, constructs, tests, and maintains architectures using programming languages and tools
- Identifies ways to improve data reliability, efficiency, and quality; uses data to discover tasks that can be automated
- Performs other duties as assigned
- Complies with all policies and standards
Education/Experience: A Bachelor's degree in a quantitative or business field (e.g., statistics, mathematics, engineering, computer science) and 2 – 4 years of related experience, or equivalent experience acquired through accomplishments of applicable knowledge, duties, scope, and skill reflective of the level of this position.
Technical Skills:
- One or more of the following skills are desired.
- Experience with Big Data; Data Processing
- Experience diagnosing system issues, engaging in data validation, and providing quality assurance testing
- Experience with Data Manipulation; Data Mining
- Experience working in a production cloud infrastructure
- Experience with one or more of the following: C# (Programming Language); Java (Programming Language); Programming Concepts; Programming Tools; Python (Programming Language); SQL (Programming Language)
- Knowledge of Microsoft SQL Server; SQL (Programming Language)
Soft Skills:
- Intermediate - Seeks to acquire knowledge in area of specialty
- Intermediate - Ability to identify basic problems and procedural irregularities, collect data, establish facts, and draw valid conclusions
- Intermediate - Ability to work independently
Must-haves
- Golang and ETL experience
Less Common Requirements
- Big Data; Data Processing
Required Skills
- A Bachelor's degree in a quantitative or business field (e.g., statistics, mathematics, engineering, computer science) and 2 – 4 years of related experience, or equivalent experience acquired through accomplishments of applicable knowledge, duties, scope, and skill reflective of the level of this position.
At SPECTRAFORCE, we are committed to maintaining a workplace that ensures fair compensation and wage transparency in adherence to all applicable state and local laws. This position’s starting pay is $43.27/hr.