PYTHON DEVELOPER
Job description
We are looking for a Python Developer with expertise in handling large datasets and working with modern data platforms.
This role involves data scraping, API integration, and the use of third-party scraping tools to manage records expected to scale into the tens of millions.
Efficient data processing, categorization, and optimization are key components of this project.
Role responsibilities
Data Management & Scraping.
Handle large datasets by developing efficient data pipelines, including scraping data from various sources (APIs, third-party scraping tools), with a focus on extracting reused items from various marketplaces and consolidating the data for further analysis.
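As a rough illustration only (the posting does not name the marketplaces, endpoints, or schema, so every name below is an assumption), a pipeline of this kind typically pages through each source's listings API and normalizes the results into one shared schema:

import requests

def fetch_listings(base_url: str, page: int) -> list[dict]:
    # Hypothetical endpoint and params; real marketplaces and third-party
    # scraping tools each expose their own APIs.
    resp = requests.get(
        f"{base_url}/listings",
        params={"page": page, "category": "reused"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("items", [])

def consolidate(sources: list[str]) -> list[dict]:
    # Normalize records from every source into one common schema
    # so downstream cleaning and categorization see uniform rows.
    records = []
    for base_url in sources:
        page = 1
        while True:
            items = fetch_listings(base_url, page)
            if not items:
                break
            records.extend(
                {
                    "source": base_url,
                    "external_id": item.get("id"),
                    "title": item.get("title"),
                    "price": item.get("price"),
                }
                for item in items
            )
            page += 1
    return records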
Airflow Automation.
Use Apache Airflow to schedule Python jobs for regular data cleaning, optimization, and scraping tasks.
While prior Airflow experience is preferred, strong Python skills are sufficient for quick upskilling.
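A minimal sketch of what such scheduling could look like in Airflow 2.x (the DAG id, schedule, and task callables are illustrative assumptions, not taken from the posting):

from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def scrape_sources():
    # Placeholder: run the scraping/consolidation pipeline.
    pass

def clean_and_optimize():
    # Placeholder: deduplicate, categorize, and optimize stored records.
    pass

with DAG(
    dag_id="daily_marketplace_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow >= 2.4; older versions use schedule_interval
    catchup=False,
) as dag:
    scrape = PythonOperator(task_id="scrape", python_callable=scrape_sources)
    clean = PythonOperator(task_id="clean", python_callable=clean_and_optimize)
    scrape >> clean  # cleaning runs only after scraping succeeds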
Supabase & Data Integration.
Integrate data into Supabase for the US platform, mirroring the existing UK platform setup.
Experience with PostgreSQL and Supabase will be crucial for this task.
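Because Supabase exposes a standard PostgreSQL database, the integration can go through the supabase-py client (or plain psycopg2). A minimal sketch, assuming a hypothetical items table keyed on source and external id:

import os
from supabase import create_client  # supabase-py

client = create_client(
    os.environ["SUPABASE_URL"],
    os.environ["SUPABASE_SERVICE_ROLE_KEY"],
)

# "items" and its columns are invented for illustration; upserting on a
# natural key keeps repeated scrapes from creating duplicate rows.
records = [
    {"source": "marketplace_a", "external_id": "123", "title": "Used bike", "price": 120},
]
client.table("items").upsert(
    records,
    on_conflict="source,external_id",
).execute()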
GCP Experience.
Work within a Google Cloud Platform (GCP) environment, managing cloud storage, computing resources, and database solutions.
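On the storage side, a common touchpoint is Cloud Storage via the official google-cloud-storage client. A minimal sketch (bucket and file names are invented):

from google.cloud import storage

def upload_export(bucket_name: str, local_path: str, blob_name: str) -> None:
    # Uses Application Default Credentials from the GCP environment.
    client = storage.Client()
    client.bucket(bucket_name).blob(blob_name).upload_from_filename(local_path)

# e.g. upload_export("scrape-exports", "items.csv", "exports/items.csv")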
Frontend Collaboration (Next.js).
Collaborate with the frontend team working with Next.js to ensure backend integration aligns with frontend requirements, particularly in mirroring the UK platform's structure.
Role requirements
Python Development.
Strong experience in Python, particularly in data processing, API integration, and managing large datasets.
SQL & PostgreSQL.
Advanced proficiency in SQL, including writing complex queries and managing databases optimized for large-scale data.
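At tens of millions of rows, "complex queries" often means window functions, e.g. deduplicating scraped listings while keeping only the newest copy. A hedged example against the hypothetical items table from above (all names invented):

import os
import psycopg2

# ROW_NUMBER() ranks duplicates per (source, external_id), newest first;
# everything ranked below 1 is deleted via PostgreSQL's ctid system column.
DEDUP_SQL = """
DELETE FROM items
WHERE ctid IN (
    SELECT ctid
    FROM (
        SELECT ctid,
               ROW_NUMBER() OVER (
                   PARTITION BY source, external_id
                   ORDER BY scraped_at DESC
               ) AS rn
        FROM items
    ) ranked
    WHERE rn > 1
);
"""

with psycopg2.connect(os.environ["DATABASE_URL"]) as conn:
    with conn.cursor() as cur:
        cur.execute(DEDUP_SQL)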
Apache Airflow (Preferred).
Experience using Airflow for scheduling, automation, and managing Python-based workflows.
Supabase & PostgreSQL.
Familiarity with Supabase as a backend service and working knowledge of PostgreSQL databases.
Google Cloud Platform (GCP).
Experience using GCP services like Cloud Storage, Cloud Functions, or Cloud SQL in data-heavy environments.
Git & Vercel.
Ability to work with version control (Git) and deploy on platforms like Vercel as part of a modern DevOps pipeline.
Next.js (Bonus).
Although not the primary focus, knowledge of Next.js would be an advantage for smoother backend/frontend integration.
Offer details
Company
- Moventi
Location
- All of Chile
Address
- Not specified
Publication date
- 26/12/2024
Expiration date
- 26/03/2025
RPA Developer (Rocketbot) - Remote
B-tech consulting
Intermediate to advanced Python skills... the automation area at B-tech consulting invites you to join our team of expert RPA developer engineers, in a pleasant work environment with constant challenges and career growth... the work is 100% remote......