Snowflake Developer
Spectraforce
Chicago, Illinois


Job Description

Job Title: Snowflake Developer
Work location: Chicago, IL (Hybrid - onsite 3 days per week)

Duration: 9 months

Project Overview:
This role supports new work forthcoming in Risk:
• Migration of legacy data workloads to Snowflake.
• Development to support source system replacements.
• Murex Risk docket.

Looking for an experienced Snowflake developer responsible for enhancing and supporting data ingest/egress integrations with Risk & Compliance applications.

Contractor’s Role:
The role will revolve around building and supporting Snowflake data workloads, as well as working with the current application development team to drive superior business value, enhanced customer experience, and compliance with NT standards.

Experience Level:
Senior resource: 10+ years

Skills/Qualifications (must haves):
• Expertise in design and development with Snowflake and Python.
• Strong data analysis/analytics skills.
• Expertise in SQL.
• Experience with Airflow.
• Strong in PL/SQL and UNIX shell scripting.
• Experience working with XML transformation and consumption of messages from queues will be a plus.
• Hands-on experience improving operational stability via automation (e.g., auto-healing where possible), and raising design changes to data analysts, SMEs, and/or architects, as well as improvement opportunities to application/service managers.
• Willingness to work some off-hours as demanded by projects, operations, and stakeholders, such as releases, testing, and/or critical bug resolutions.

Nice to have:
• Experience with ETL tools such as DataStage.
• Experience with Control-M.

Tasks and Responsibilities:
• Migrate staging and output layers from legacy platforms (e.g., Oracle, SQL Server) to Snowflake-based schemas.
• Design and implement scalable data models in Snowflake, ensuring alignment with business logic and reporting needs.
• Build and maintain Airflow DAGs to automate data movement between Snowflake schemas.
• Use orchestration tools such as Airflow, Azure DevOps, and Control-M to manage deployments and scheduling.
• Ingest data from different sources, such as flat files (CSV, TXT, XML, JSON) and relational databases (Oracle, SQL Server).
• Transform raw data into structured formats based on business requirements, ensuring consistency and accuracy across layers.
• Clean and restructure unformatted or semi-structured files to make them compatible with Snowflake ingestion pipelines.
• Implement error detection and handling logic, including creation of error tables to capture rejected records (e.g., duplicates, schema mismatches).
• Monitor pipeline health and troubleshoot data quality issues proactively.
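The error-table pattern described in the responsibilities above can be sketched in pure Python. This is a minimal illustration only, not the team's actual pipeline; the column names, schema, and rejection rules are hypothetical assumptions:

```python
# Sketch of the error-table pattern: valid rows go to the target
# table, rejected rows (duplicates, schema mismatches) are captured
# in an error table with a rejection reason. All names/rules here
# are illustrative assumptions, not the actual project schema.

EXPECTED_COLUMNS = {"trade_id", "amount"}  # hypothetical schema

def load_with_error_table(rows):
    """Split incoming rows into (valid, errors) lists."""
    valid, errors, seen = [], [], set()
    for row in rows:
        if set(row) != EXPECTED_COLUMNS:
            # Column set does not match the expected schema.
            errors.append({"row": row, "reason": "schema_mismatch"})
        elif row["trade_id"] in seen:
            # Key already loaded in this batch.
            errors.append({"row": row, "reason": "duplicate"})
        else:
            seen.add(row["trade_id"])
            valid.append(row)
    return valid, errors

valid, errors = load_with_error_table([
    {"trade_id": 1, "amount": 100.0},
    {"trade_id": 1, "amount": 100.0},  # duplicate key
    {"trade_id": 2},                   # missing "amount" column
])
```

In a real Snowflake pipeline, `valid` would be loaded into the target schema and `errors` inserted into a dedicated error table for later review, rather than held in memory.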
Applicant Notices & Disclaimers
At SPECTRAFORCE, we are committed to maintaining a workplace that ensures fair compensation and wage transparency in adherence with all applicable state and local laws. This position’s starting pay is $60.00/hr.
