Picnic is data-driven. As one of our engineers, you play an important role in every aspect of our business, from route planning and grocery delivery to supply-chain analysis. You ensure that each level of our operation is supported, adjusted, and predicted with data.
There’s no two ways about it: you’re a Data Wizard / Witch. You surface detailed information and quirky insights while manipulating data. You find what others can’t and glean business opportunities from the numbers. Collaborating with our analysts, you find practical solutions to persistent problems.
By building reliable data pipelines, you enable the team to mine and crunch data. You analyse, experiment with, and promote the statistics that pique your interest. Test, evaluate, and evolve your ideas alongside our dedicated Distribution team.
More interested in customer behaviour? That’s fine – work on in-app analytics to ensure our mobile store remains smooth, speedy, and robust. You have the opportunity to work on what you love while writing the future of in-app grocery shopping!
What you’ll do:
- Design, implement, and maintain scalable data pipelines
- Collaborate with domain experts and analysts to solve data challenges
- Develop advanced data reporting and visualisations
- Apply data modelling methodologies and contribute to a robust data platform
What you’ll need:
- Master’s degree in Computer Science or equivalent
- Experience with SQL and relational databases
- Knowledge of one or more programming languages (Python or Java preferred)
- Strong understanding of data models (Data Vault and Kimball) and data warehouses
- Commitment to excellence, performance, and efficiency
- Critical thinking and initiative, with a hands-on, nothing-is-impossible mindset
- Highly analytical and curious intellect
- Ability to articulate technical problems and projects to teams across the business
Technologies we use:
- Python, Pentaho Data Integration with custom components developed in Java
- Snowflake, PostgreSQL, MongoDB, Tableau
- Spark, Elastic MapReduce, Snowplow, Kinesis
- AWS, Docker, Terraform, Kubernetes, Vault