24 Feb
Encora
Xico
**Important Information**

Experience: 7+ years
Job Mode: Full-time
Work Mode: Work from home

**Job Summary**

As a Cloud Data Engineer specializing in Fabric OneLake, you will be responsible for designing, building, and managing our cloud-based data infrastructure.
You will play a critical role in the development and optimization of our data lake, ensuring scalability, reliability, and security.
**Responsibilities and Duties**

- Design and implement scalable and secure data solutions using Fabric OneLake technology.
- Develop and maintain scalable data pipelines and architectures that support data ingestion, processing, storage, and delivery across multiple sources and destinations.
- Collaborate with IT and business teams to understand data needs and deliver high-quality, scalable solutions.
- Develop and maintain data pipelines, architectures, and data sets.
- Ensure optimal data delivery architecture for end-to-end data flow from ingestion to analytics.
- Work with stakeholders to assist with data-related technical issues and support data infrastructure needs.
- Create data tools for the analytics and data science teams to assist them in building and optimizing our product.
- Keep our data separated and secure across national boundaries through multiple data centers and Azure regions.
**Qualifications and Skills**

- A bachelor's degree in Computer Science, Engineering, or a related field, or equivalent work experience.
- Proficiency in SQL and Python, and familiarity with other programming languages and frameworks such as Scala, R, or Spark.
- Experience with cloud-based data services and platforms such as Azure, AWS, or GCP, and with data warehouse and ETL tools such as Snowflake, SSIS, or Informatica.
- Knowledge of data modeling, data quality, data governance, and data security best practices and standards.
- Proven experience with Azure Databricks, Azure Data Factory, and other Azure services.
- Strong analytical skills for working with unstructured datasets.
- Experience with big data tools such as Hadoop, Spark, and Kafka.
- Experience with data pipeline and workflow management tools.
- Experience with Azure SQL DB, Cosmos DB, or other database technologies.
- Experience with stream-processing systems.
- Strong project management and organizational skills.
- Strong communication, collaboration, and problem-solving skills, and a passion for learning new technologies and methodologies.
**Preferred Tech Skills**

- Experience with Fabric OneLake development and management.
- Knowledge of networking within Azure Databricks, including VNet settings and firewall rules.
- Ability to set up linked services within Azure Data Factory and execute ADB notebooks.
- Familiarity with on-premises to cloud data migration and managing data across hybrid environments.
- **Cloud Platforms**: Proficient in Azure, including Azure Databricks, Azure Data Factory, and Azure SQL Data Warehouse.
- **Programming Languages**: Strong command of Python, Scala, and SQL.
Show the company your skills: fill out the form and add a personal touch to your cover letter; this will help the recruiter choose the right candidate.