DATA ENGINEER
Job description
At Niuro, we connect elite tech teams with top-tier U.S. companies.
Our mission is to simplify global talent acquisition through innovative solutions that maximize efficiency and quality.
We empower projects with autonomous, high-performance tech teams and offer continuous professional growth opportunities, so everyone involved is set up for success on innovative and challenging projects.
Key Responsibilities
- Ingest, process, and integrate data from multiple systems and platforms using Big Data tools.
- Design and implement data pipelines to transfer data from source to destination systems, ensuring alignment with integration standards and existing frameworks.
- Guarantee high data quality, availability, and compliance across applications and digital products.
- Collaborate with product teams to identify required datasets and evaluate existing sources (e.g., update frequency, data reliability, known issues).
- Contribute to the delivery of scalable, strategic data engineering solutions aligned with our data roadmap.
Requirements
- Experience: 3+ years in data engineering or a similar role.
- Tech Stack:
  - Cloud & Big Data: Proficient with AWS services, including Lambda, S3, and Glue.
  - Data Orchestration: Experience designing and managing workflows using Airflow.
  - Data Modeling & Transformation: Hands-on experience with dbt.
  - Data Warehousing: Solid knowledge of Snowflake.
  - Databases: Experience with MySQL or SQL Server.
  - Programming: Strong Python skills, including scripting, automation, and unit testing.
- Other Skills:
  - Good understanding of data quality and governance practices.
  - Familiarity with various data sources and types in data mining and analytics.
  - Intermediate to advanced English for international team collaboration.
Nice to Have
- Experience deploying dbt in production environments.
- Knowledge of Data Lake and Lakehouse architectures.
- Familiarity with CI/CD tools and Git-based version control (GitHub, GitLab, or Azure DevOps).
- Experience with Docker for containerized data workflows.
What We Offer
- 100% Remote – Work from anywhere in the world.
- Growth Opportunities – Be part of shaping a robust and modern data ecosystem.
Offer details
Company
- Niuro
Location
- Anywhere in Chile
Address
- Not specified
Publication date
- 11/04/2025
Expiration date
- 10/07/2025