Software Engineering, Data Science
Bogotá, Bogota, Colombia
Posted on Wednesday, September 27, 2023
COME JOIN US AT YUNO!💜
We are seeking a Semi-Sr Data Engineer to join our team. You will play a crucial role in designing, developing, and maintaining our data infrastructure to support the growth and success of our payment processing platform. You will be responsible for ensuring the reliability, scalability, and efficiency of our data systems, and you will work with various data sources, including HDFS, CSV, Parquet, APIs, Web Services, and more.
🟣 What would be my challenge in Yuno?
-Implement any type of data extraction (manual uploads, emails, SFTP, and APIs).
-Build solutions for application integration, task automation, and other relevant data automation using proven design patterns.
-Design and build data processing pipelines for large volumes of data that are performant and scalable.
-Build and maintain the infrastructure required for the extraction, loading, transformation, and storage of data from multiple data sources using custom scripts.
-Collaborate with the team to develop and maintain a robust and scalable data infrastructure to support our data needs.
-Implement and enforce data governance policies and best practices to ensure data quality, security, and compliance.
-Manage and optimize data warehousing solutions for efficient storage and retrieval of data.
-Develop and maintain data lake solutions for storing and managing diverse data types.
-Use big data technologies and frameworks to process and analyze large datasets efficiently.
-Work with distributed data systems and technologies to handle high volumes of data.
-Integrate data from various sources, including HDFS, CSV, Parquet, APIs, Web Services, and more.
🟣 What skills do I need?
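As a hedged illustration (not Yuno's actual stack), the extract-transform-load responsibilities above might look like this minimal Python sketch. The field names (`id`, `amount`) and the in-memory CSV source are hypothetical stand-ins for real payment data feeds:

```python
import csv
import io
import json

def extract_csv(text: str) -> list[dict]:
    """Parse CSV text into a list of row dicts (one per record)."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows: list[dict]) -> list[dict]:
    """Normalize amounts to integer cents and drop incomplete records."""
    out = []
    for row in rows:
        if not row.get("amount"):
            continue  # skip rows missing an amount (a simple data-quality rule)
        out.append({"id": row["id"], "amount_cents": int(float(row["amount"]) * 100)})
    return out

def load(rows: list[dict]) -> str:
    """Serialize to JSON lines, standing in for a warehouse load step."""
    return "\n".join(json.dumps(r) for r in rows)

# Hypothetical feed: one valid row, one incomplete row, one more valid row.
raw = "id,amount\n1,10.50\n2,\n3,3.00\n"
print(load(transform(extract_csv(raw))))
```

In a production pipeline each stage would typically be a separate, monitored task (e.g. orchestrated runs pulling from SFTP or an API rather than an in-memory string), but the extract/transform/load separation shown here is the underlying pattern.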
-Bachelor's degree in Computer Science, Information Technology, or a related field.
-Proven experience as a Data Engineer or similar role in a data-intensive environment.
-Strong proficiency in Python and Scala.
-Knowledge of data infrastructure design and management.
-Familiarity with data governance principles and practices.
-Experience with ETL processes and tools.
-Proficiency in working with Data Warehouses and Data Lakes.
-Familiarity with big data technologies and distributed data systems.
-Ability to integrate data from various sources, including HDFS, CSV, Parquet, APIs, Web Services, etc.