Our client, a growing startup focused on complex Big Data solutions, is looking for a skilled Big Data Engineer to join their team remotely.
As a Big Data Engineer, you will play an important role in developing data products for advertising on a German media project. Building ETL pipelines with the latest Big Data technologies will also be part of your daily workflow.
WHAT YOU SHOULD HAVE:
- 3+ years of experience as a Big Data Engineer or in a similar role (Data Scientist, Data Analyst);
- 3+ years of experience using Python;
- Proficiency in SQL;
- Experience using or implementing REST APIs to exchange data;
- Practical knowledge of ETL concepts, tools, and technologies (e.g. Airflow, Dataflow) as well as of DWH concepts, architectures, and technologies (e.g. BigQuery, Redshift);
- Experience with Big Data solutions in the cloud (e.g. BigQuery, Composer, Cloud Functions, Cloud Run);
- Experience with Docker and Kubernetes, as well as event streaming technologies (e.g. Kafka, Pub/Sub, Kinesis);
- Knowledge of GitLab;
- At least Intermediate English.
WHAT WE OFFER:
- Great team culture;
- Opportunities for professional and career growth;
- Competitive salary.