Fabric Developer / Sr. Data Engineer
Spectraforce
Phoenix, Arizona


Job Description

Job Title – Fabric Developer / Sr. Data Engineer
Location – Phoenix, AZ (Onsite 5 days)
Duration – 6 Months
 
Responsibilities:
  • Builds and maintains ELT/ETL pipelines using Microsoft Fabric tools, enabling efficient data ingestion from multiple sources.
  • Applies transformations, cleanses, and enriches data to ensure it is ready for analysis and reporting.
  • Handles large datasets, optimizing storage and retrieval for performance.
  • Implements automation for data processing and integration workflows, reducing manual intervention.
  • Works with Platform Architects to ensure infrastructure supports data requirements.
  • Partners with report developers to ensure that data is in a usable format and ready for analysis.
  • Ensures code reusability and parameterization.
  • Creates interactive, intuitive reports and dashboards using Microsoft Fabric's reporting tools.
 
Qualifications:
  • Data Factory (in Fabric): Designing and orchestrating data ingestion and transformation pipelines (ETL/ELT).
  • Data Engineering Experience (Spark): Using Notebooks (PySpark, Spark SQL, Scala) and Spark Job Definitions for complex data processing, cleansing, enrichment, and large-scale transformations directly on OneLake data.
  • Lakehouse Items: Creating and managing Lakehouse structures (Delta tables, files) as the primary landing and processing zone within OneLake.
  • OneLake / ADLS Gen2: Understanding storage structures, Delta Lake format, partitioning strategies, and potentially managing Shortcuts.
  • Monitoring Hubs: Tracking pipeline runs and Spark job performance.
  • Core Responsibilities (Fabric Context): Building ingestion pipelines from diverse sources; implementing data cleansing and quality rules; transforming raw data into curated Delta tables within Lakehouses or Warehouses; optimizing Spark jobs and data layouts for performance and cost; managing pipeline schedules and dependencies; ensuring data security and governance principles are applied to pipelines and data structures.
  • Excellent communication and collaboration skills.
  • Bachelor’s degree in Computer Science, Engineering, or a related field.
  • Azure/AWS cloud certifications are a plus.
  • Experience in the manufacturing domain is a plus.
  • Self-driven, with the ability to drive projects to delivery.
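To make the cleansing-and-enrichment responsibility above concrete, here is a minimal sketch in plain Python (in Fabric this logic would typically run in a PySpark notebook writing to a Delta table; the field names and quality rules below are invented for illustration):

```python
# Toy sketch of a cleanse/enrich step: drop incomplete records,
# normalize a text field, and add a derived metric. Field names
# ("machine_id", "units", "defects") are hypothetical examples.

def cleanse_and_enrich(rows):
    """Apply data-quality rules and enrichment to raw records."""
    curated = []
    for row in rows:
        # Quality rule: skip records missing required fields.
        if not row.get("machine_id") or row.get("units") is None:
            continue
        units = int(row["units"])
        curated.append({
            "machine_id": row["machine_id"].strip().upper(),  # normalize text
            "units": units,
            # Derived field: defect rate, guarding against division by zero.
            "defect_rate": row.get("defects", 0) / max(units, 1),
        })
    return curated

raw = [
    {"machine_id": " m-01 ", "units": "100", "defects": 5},
    {"machine_id": None, "units": "50"},     # dropped: missing machine_id
    {"machine_id": "M-02", "units": "200"},  # defects defaults to 0
]
print(cleanse_and_enrich(raw))
```

The same pattern — filter on required fields, normalize, derive curated columns — maps directly onto Spark DataFrame transformations over Lakehouse data.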
 
Years of Experience – 9 years
Applicant Notices & Disclaimers
  • For information on benefits, equal opportunity employment, and location-specific applicant notices, click here
 
At SPECTRAFORCE, we are committed to maintaining a workplace that ensures fair compensation and wage transparency in adherence with all applicable state and local laws. This position’s starting pay is $40.00/hr.
