If you love working with data in a cool environment… keep reading!
Our client was founded in Switzerland and operates successfully across Germany, Austria, and Switzerland. The company is customer-centric and leverages an ecosystem of key brokers and insurance companies to meet customers' needs for trust and simplicity.
Your mission 🚀
✅ Designing and implementing data-processing pipelines with a powerful ETL stack.
✅ Investigating, evaluating, and proposing different data solutions, e.g., selecting the best database for a given type of data, implementing Spark code to speed up processes, implementing parts of a DAG in different languages if needed, etc.
✅ Improving the performance of ETL DAGs with high-quality code.
✅ Automating the training, building, and deployment of a series of machine learning algorithms used across very different applications.
Perks, my friend!
Flexible working hours, including a remote option on some days
Private Health Insurance
26 working days of vacation + 24th and 31st December
Discounted gym membership (Gymforless)
50% of your public transport costs paid by the company
A stack of the most modern technologies and working gadgets
Free coffee, fruit and snacks
Company merch and tees
Further company benefits – grants for training and coaching
Great offices: sunny terrace, foosball and a massive ping pong table
If this is you – holy cow! 🐄
• 3+ years of experience programming in Python.
• You feel comfortable with different kinds of databases (RDBMS, NoSQL, big-data systems...).
• Experience with ETL, data pre-processing, or data analysis.
• You have experience in container development with Docker.
• Skilled programmer (e.g., unit testing, production/staging experience).
• You have excellent verbal and written communication skills and can communicate effectively with all levels of management as well as business and technical audiences.
• Fluency in English is a must.
• You are a big fan of new technologies, clean architecture, and decoupled code.
If this is also you – JACKPOT! 🍒🍒🍒
• Great if you have experience with Apache Spark & Apache Airflow.
• Nice to have experience connecting multiple types of databases.
• You have experience synchronizing databases through APIs.
• Marvelous if you have experience with Salesforce.
• You have experience creating architectures in AWS with load distributed across multiple EC2 instances.
Let’s have a chat and GetWith us :)