Johnson Controls
Saltillo
**What will you do?**
Join us in the Procurement Execution Center (PEC) as a **Data Engineer Associate**, part of a diverse team of data and procurement professionals.
In this role, you will be responsible for supporting the End-to-End (E2E) management of our data, including ETL/ELT, DW/DL, data staging, data governance, and managing the different layers of data required to enable successful BI, Reporting, and Analytics for the PEC.
This role will work with multiple types of data spanning multiple functional areas of expertise, including Fleet, MRO & Energy, Travel, and Professional Services, among others.
**How will you do it?**
- Deploy data ingestion processes through Azure Data Factory to load data models as required into Azure Synapse.
- Build, design, **and support** ETL/ELT processes with Azure Data Factory (ADF) and/or Python; once deployed, these will need to run on daily and weekly schedules (see the sketch after this list).
- Assemble large data sets that meet functional / non-functional business requirements.
- Develop data models that enable DataViz, Reporting, and Advanced Data Analytics, striving for optimal performance across all models.
- Maintain conceptual, logical, and physical data models along with corresponding metadata.
- Utilize the DevOps pipeline deployment model, including automated testing procedures.
- Execute data stewardship and data quality across our data marts to cleanse, enrich, and correct issues in our data, using knowledge bases and business rules.
- Perform the necessary data ingestion, cleansing, transformation, and coding of business rules to support annual Procurement bidding activities.
- Other Data Engineering duties as assigned.
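As a rough illustration of the Python-side ETL work described above, here is a minimal daily cleanse-and-load sketch using pandas. The file paths, column names, and the single business rule are hypothetical placeholders rather than the PEC's actual sources or rules; in practice such a step would typically be orchestrated through ADF and deployed via the DevOps pipeline.

```python
# Minimal pandas ETL sketch (hypothetical paths, columns, and rule; illustration only).
import pandas as pd

RAW_PATH = "raw/fleet_spend.csv"              # hypothetical landing-zone extract
CURATED_PATH = "curated/fleet_spend.parquet"  # hypothetical curated-layer target (requires pyarrow)

def run_daily_load() -> pd.DataFrame:
    # Extract: read the raw file produced by the upstream ingestion job.
    df = pd.read_csv(RAW_PATH, parse_dates=["invoice_date"])

    # Cleanse: drop exact duplicates and rows missing key business fields.
    df = df.drop_duplicates().dropna(subset=["supplier_id", "amount_usd"])

    # Business rule (hypothetical): flag invoices above an approval threshold.
    df["requires_review"] = df["amount_usd"] > 10_000

    # Load: write to the curated layer consumed by BI, Reporting, and Analytics models.
    df.to_parquet(CURATED_PATH, index=False)
    return df

if __name__ == "__main__":
    run_daily_load()
```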
**What are we looking for?**
- Bachelor's degree in a related field (Engineering, Computer Science, Data Science, or similar).
- 1+ years of experience in data analytics, data engineering, software engineering, or other relevant data roles.
- SQL knowledge and experience working with relational databases.
- Knowledge in DW/DL concepts, data marts, data modeling, ETL/ELT, data quality/stewardship, distributed systems, and metadata management.
- Data manipulation with any programming language (Python + Pandas, SQL, etc.) and/or ETL/ELT development experience (1+ years; SSIS or ADF preferred). A short example follows this list.
- Ability to resolve ETL/ELT problems by proposing and implementing tactical/strategic solutions.
- Experience with object-oriented and/or functional scripting languages: Python, Scala, C#, etc.
- Experience with NoSQL databases is a plus to support the transition from On-Prem to Cloud.
- Excellent problem-solving, critical-thinking, and communication skills.
- Experience with git/repo management **is a plus**.
- Due to the global nature of the role, proficiency in the English language is a must.
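To make the SQL and data-quality expectations above more concrete, below is a minimal sketch of a rule-based quality check against a relational database. sqlite3 is used only as a convenient stand-in engine, and the supplier_invoices table and its columns are hypothetical.

```python
# Minimal data-quality check sketch (hypothetical table and columns; sqlite3 as a stand-in database).
import sqlite3

def count_invalid_rows(conn: sqlite3.Connection) -> int:
    # Count rows that break simple quality rules: missing supplier,
    # non-positive amount, or an invoice dated in the future.
    query = """
        SELECT COUNT(*)
        FROM supplier_invoices
        WHERE supplier_id IS NULL
           OR amount_usd <= 0
           OR invoice_date > DATE('now');
    """
    return conn.execute(query).fetchone()[0]

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE supplier_invoices (supplier_id TEXT, amount_usd REAL, invoice_date TEXT)"
    )
    conn.executemany(
        "INSERT INTO supplier_invoices VALUES (?, ?, ?)",
        [
            ("S-001", 2500.0, "2024-01-05"),   # valid row
            (None, 120.0, "2024-01-06"),       # missing supplier
            ("S-002", -10.0, "2024-01-07"),    # non-positive amount
        ],
    )
    print("Invalid rows:", count_invalid_rows(conn))  # prints 2
```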
Show the company your skills: filling out the application form and adding a personal touch to your cover letter will help the recruiter when choosing a candidate.