We are looking for a Data Developer to join a multidisciplinary team that enjoys challenges and specializes in technological transformation, combining human expertise with AI to create scalable tech solutions.
In Short
Join a multidisciplinary team as a Data Developer.
Work with Data Lake and Delta Lake structures.
Experience with cloud platforms like AWS, Azure, or GCP.
Proficiency in Python, Spark (PySpark), and SQL.
Knowledge of unit testing and version control.
Experience in Agile methodologies.
Familiarity with AWS services such as Glue, Lambda, DynamoDB, S3, and Kinesis is a plus.
Experience with Infrastructure as Code (IaC) and Databricks.
Knowledge of Docker and streaming data flows is advantageous.
On-site presence at the Campinas Metropolitan Region office is mandatory.
Requirements
Solid knowledge of Data Lake / Delta Lake structures.
Experience in cloud environments: AWS (preferred), Azure, or GCP.
Strong skills in Python, Spark (PySpark), and SQL.
Experience with unit testing.
Experience with version control.
Experience in Agile teams.
Benefits
Work in a dynamic and innovative environment.
Opportunity to work with cutting-edge technologies.
Collaborate with a diverse team of professionals.
Engage in continuous learning and development.
Participate in exciting projects with major clients.