About the Role
We're looking for a motivated contractor to help design, build, and maintain data engineering workflows that transform raw enterprise data into reliable, production-ready tables. This is a great opportunity for someone early in their career who is eager to get hands-on experience with real-world data pipelines and systems.
You'll serve as the systems contact owner for Workplace data, with a primary focus on consolidating raw ingestion tables into clean, production-ready tables that power reporting and analytics.
What You'll Do
Data Engineering & Pipeline Development
- Combine raw ingestion tables into production-ready, business-consumable data models
- Build and maintain data pipelines using Apache Airflow for workflow orchestration
- Process and transform data using Python, pandas, and numpy
- Help ensure data quality, governance, and compliance standards are met across production tables
- Monitor and troubleshoot pipeline health and data freshness
Data Architecture & Modeling
- Support the design and implementation of data models including star schemas and fact/dimension tables
- Write SQL queries to extract, transform, and load data from enterprise databases and data warehouses
- Help maintain documentation on data sources, table lineage, and schema decisions
- Act as the systems contact owner — the go-to person for understanding Workplace data sources and table structures
Stakeholder Collaboration
- Work with business stakeholders and teammates to understand data needs and requirements
- Communicate progress, blockers, and timelines clearly
- Contribute to documentation that helps both technical and non-technical teammates understand the data
Required Qualifications
Technical Skills
- SQL: Solid working knowledge of SQL including joins, aggregations, and filters — window functions and CTEs are a plus
- Python: Working proficiency in Python; familiarity with pandas or numpy is a plus
- Data Concepts: Basic understanding of data modeling, ETL/ELT patterns, and data warehousing concepts
- Pipeline Tools: Exposure to or interest in workflow orchestration tools like Apache Airflow
- Attention to Detail: Strong data quality mindset — able to spot inconsistencies and trace issues upstream
Soft Skills & Work Style
- Curiosity: Eager to learn and grow in a hands-on data environment
- Communication: Able to clearly communicate progress and ask good questions when blocked
- Ownership: Willing to take responsibility for a data domain and see tasks through to completion
- Collaboration: Comfortable working cross-functionally with analytics, engineering, and business teams
- Resourcefulness: Able to problem-solve independently and navigate ambiguity
Nice to Have
- Exposure to dbt (data build tool)
- Familiarity with data warehouse solutions like Snowflake, Hive, or BigQuery
- Any experience with cloud platforms (AWS, GCP, Azure)
- Coursework or projects involving data pipelines, databases, or business intelligence
Benefits Offered: Medical, Dental, Vision, 401(k)
Magnit Global (the operator of this Talent Community) is a global leader in contingent talent services. Our success and our clients’ success are built on a foundation of service excellence. We are an equal opportunity employer, and we do not discriminate on the basis of race, religion, color, national origin, sex, sexual orientation, age, veteran status, disability, genetic information, or any other applicable legally protected characteristic. Qualified applicants with arrest or conviction records will be considered for employment in accordance with applicable law, including the Los Angeles County Fair Chance Ordinance for Employers and the California Fair Chance Act. Unincorporated LA County workers: we reasonably believe that criminal history may have a direct, adverse and negative relationship with the following job duties, potentially resulting in the withdrawal of a conditional offer of employment: protect client-provided property, including hardware (both of which may include data) entrusted to you, from theft, loss or damage; return all portable client computer hardware in your possession (including the data contained therein) upon completion of the assignment; and maintain the confidentiality of client proprietary, confidential, or non-public information. In addition, job duties require access to secure and protected client information technology systems and related data security obligations.