Data Services Cloud Engineer
Spectraforce
Phoenix, Arizona
Remote

Job Description

Job Title: Data Services Cloud Engineer
Duration: 6 months to start (possible temp-to-hire)
Location: Phoenix, AZ / 100% remote (can sit anywhere in the US, but must work Phoenix hours)
Shift/schedule: Monday - Friday, 8am - 5pm PST (preferred).

Interview process: first round is a 1:1 interview with the direct manager, followed by a second-round panel interview with the team (the Sr. Director will join the panel).

Projects to work on:
Roadmap items for the first quarter focus on optimizing the Redshift environment: reviewing table data types to confirm columns and tables are sized correctly, finding opportunities to improve data sharing across clusters, and containing the ever-growing cost of running more and more activity on the clusters by reducing the cost of scans. And much more...
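For illustration, a minimal sketch of the kind of data-type review described above, using the Boto3 Redshift Data API; the cluster, database, and user names are placeholders, not details from this posting:

```python
# Hypothetical sketch: list VARCHAR columns in a Redshift cluster so their
# declared widths can be reviewed against actual data. Identifiers are placeholders.
import boto3

client = boto3.client("redshift-data")

SQL = """
SELECT schemaname, tablename, "column", type
FROM pg_table_def
WHERE type LIKE 'character varying%'
ORDER BY schemaname, tablename;
"""

resp = client.execute_statement(
    ClusterIdentifier="example-cluster",  # placeholder
    Database="dev",                       # placeholder
    DbUser="admin",                       # placeholder
    Sql=SQL,
)
print("statement id:", resp["Id"])  # poll get_statement_result(Id=...) for rows
```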
 
Top 3 Skillsets:
  • Proven experience (3-5 years) with Python for automation tasks, using libraries like Boto3 (a short sketch follows this list).
  • Hands-on expertise (3-5 years) in managing AWS data services, specifically Redshift, DMS, and S3.
  • In-depth experience (2-3 years) with Infrastructure as Code (IaC) using AWS CDK or CloudFormation.
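As a rough sketch of the Boto3 automation referenced above, here is a hypothetical status check on AWS DMS replication tasks (the region is a placeholder):

```python
# Minimal sketch: report the status of DMS replication tasks with Boto3.
import boto3

dms = boto3.client("dms", region_name="us-west-2")  # placeholder region

for task in dms.describe_replication_tasks()["ReplicationTasks"]:
    stats = task.get("ReplicationTaskStats", {})
    print(
        task["ReplicationTaskIdentifier"],
        task["Status"],
        f'{stats.get("FullLoadProgressPercent", 0)}% full load',
    )
```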
Cloud Engineer
 
Are you a skilled Cloud Engineer ready to take ownership of a large-scale AWS data lake environment?
At the client, you’ll join a lean team responsible for hands-on maintenance and performance optimization of its AWS infrastructure, ensuring scalability, security, and efficiency. If you’re enthusiastic about applying your technical expertise in a collaborative and evolving environment, this role is for you.


Key Accountabilities and Priorities:
  • Proactively manage and optimize AWS Redshift clusters for performance and scalability.
  • Implement and monitor AWS DMS for data migration and ongoing replication.
  • Design, develop, and optimize ETL workflows using AWS Glue for data integration.
  • Administer Storage Gateway and Transfer Family to ensure secure and efficient data transfer across environments.
  • Build and maintain serverless solutions using AWS Lambda.
  • Oversee the organization and storage of data within AWS S3 buckets.
  • Develop and manage CI/CD pipelines for automated deployments, using tools like AWS CodePipeline and CDK (see the sketch after this list).
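To ground the IaC side of these responsibilities, a hypothetical AWS CDK v2 (Python) sketch of a small data-lake stack: an S3 bucket plus a Lambda function. Every name here is illustrative, not taken from the posting.

```python
# Hypothetical AWS CDK v2 (Python) sketch: an S3 data-lake bucket and a
# Lambda function, the kind of IaC this role would deploy. Names are placeholders.
from aws_cdk import App, Stack, aws_lambda as _lambda, aws_s3 as s3
from constructs import Construct

class DataLakeStack(Stack):
    def __init__(self, scope: Construct, id: str, **kwargs) -> None:
        super().__init__(scope, id, **kwargs)

        # Versioned, encrypted bucket for raw data-lake objects
        bucket = s3.Bucket(
            self, "RawDataBucket",
            versioned=True,
            encryption=s3.BucketEncryption.S3_MANAGED,
        )

        # Serverless worker that processes new objects (handler path is a placeholder)
        fn = _lambda.Function(
            self, "IngestFn",
            runtime=_lambda.Runtime.PYTHON_3_11,
            handler="ingest.handler",
            code=_lambda.Code.from_asset("lambda"),
        )

        bucket.grant_read(fn)  # least-privilege read access for the function

app = App()
DataLakeStack(app, "DataLakeStack")
app.synth()
```

A pipeline such as CodePipeline would then run `cdk deploy` to roll a stack like this out automatically.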
 
Required Qualifications:
  • Proven hands-on experience with AWS Redshift, DMS, Glue, Lambda, and S3.
  • Expertise in Python programming (3-5 years), especially for AWS automation tasks.
  • Experience designing and deploying IaC with AWS CDK or CloudFormation.
  • In-depth knowledge of data lake architecture and best practices.
  • Track record of optimizing data architectures and resolving performance bottlenecks.
  • Proven ability to troubleshoot issues in complex AWS environments, especially related to storage, transfer, and processing of data.
 
Preferred Qualifications:
  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • AWS certifications, such as AWS Certified Solutions Architect or AWS Certified Data Analytics.
  • Experience building CI/CD pipelines with CodePipeline.
  • Hospitality industry experience is a plus.
 
Keywords:
AWS, Python, IaC, Infrastructure as Code, CloudFormation, CDK, Redshift, Glue, Lambda, DMS, Storage Gateway, Transfer Family, S3
Applicant Notices & Disclaimers
  • For information on benefits, equal opportunity employment, and location-specific applicant notices, click here
 
At SPECTRAFORCE, we are committed to maintaining a workplace that ensures fair compensation and wage transparency in adherence with all applicable state and local laws. This position’s starting pay is $60.00/hr.
