As a GCP Data Engineer at Experience.com, you will design, build, and maintain scalable ETL/ELT pipelines, leveraging GCP data analytics tools to process and analyze data.
In Short
Design and maintain ETL/ELT pipelines using PySpark and SQL.
Work on data extraction, transformation, and loading processes.
Utilize GCP tools such as BigQuery and Dataproc for data analysis (an illustrative example follows this list).
Optimize data workflows for performance and reliability.
Collaborate with teams to develop data integration solutions.
Ensure data accuracy and quality through sound database design.
Implement monitoring for data pipelines.
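By way of illustration, a minimal PySpark job of the kind described above might extract raw events from BigQuery, aggregate them, and load the results back via the spark-bigquery connector on Dataproc. The project, table names, column names, and staging bucket below are placeholders, not actual Experience.com resources.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Placeholder identifiers -- replace with real project, dataset, and bucket names.
    SOURCE_TABLE = "my-project.analytics.raw_events"
    TARGET_TABLE = "my-project.analytics.daily_event_counts"
    STAGING_BUCKET = "my-staging-bucket"

    spark = SparkSession.builder.appName("daily-event-counts").getOrCreate()

    # Extract: read raw events from BigQuery via the spark-bigquery connector.
    events = (
        spark.read.format("bigquery")
        .option("table", SOURCE_TABLE)
        .load()
    )

    # Transform: aggregate events per day and event type.
    daily_counts = (
        events
        .withColumn("event_date", F.to_date("event_timestamp"))
        .groupBy("event_date", "event_type")
        .agg(F.count("*").alias("event_count"))
    )

    # Load: write the aggregates back to BigQuery, staging through GCS.
    (
        daily_counts.write.format("bigquery")
        .option("table", TARGET_TABLE)
        .option("temporaryGcsBucket", STAGING_BUCKET)
        .mode("overwrite")
        .save()
    )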
Requirements
4+ years of experience with PySpark and SQL.
Strong proficiency in Python programming.
Knowledge of the GCP data analytics ecosystem.
Experience with Airflow/Cloud Composer for workflow orchestration (a short DAG sketch follows this list).
Strong analytical and problem-solving skills.
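For the orchestration piece, a minimal Airflow/Composer DAG might submit the PySpark job above to an existing Dataproc cluster on a daily schedule. The project, region, cluster name, and GCS path are placeholders chosen for the sketch.

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator

    # Placeholder values -- replace with real project, region, cluster, and GCS path.
    PROJECT_ID = "my-project"
    REGION = "us-central1"
    CLUSTER_NAME = "etl-cluster"
    PYSPARK_URI = "gs://my-bucket/jobs/daily_event_counts.py"

    PYSPARK_JOB = {
        "reference": {"project_id": PROJECT_ID},
        "placement": {"cluster_name": CLUSTER_NAME},
        "pyspark_job": {"main_python_file_uri": PYSPARK_URI},
    }

    with DAG(
        dag_id="daily_event_counts",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # Submit the PySpark ETL job to the existing Dataproc cluster.
        run_etl = DataprocSubmitJobOperator(
            task_id="run_daily_event_counts",
            job=PYSPARK_JOB,
            region=REGION,
            project_id=PROJECT_ID,
        )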
Benefits
Work in a dynamic and innovative environment.
Opportunity to impact customer satisfaction and engagement.
Collaborate with a talented team of professionals.