Crossing the data chasm
ResQ Club is a Finnish zero-waste food marketplace that connects sustainable restaurants, cafes, and grocery stores with consumers who “rescue” fresh, quality food that would otherwise go to waste. Through ResQ’s proprietary location-based mobile and web service, consumers can find and rescue surplus food nearby, allowing partners to drastically reduce their food waste. Every meal purchased via ResQ is one less meal thrown away, helping urban communities waste less and become more sustainable.
ResQ Club required an end-to-end data strategy to overhaul their existing systems into a unified source of truth for the entire business.
After roughly two years of operation, ResQ Club had achieved strong traction and early product-market fit with customers in Finland, and the business was ramping up to scale. The team’s internal data systems were scattered and decentralised: different teams used different data tools, producing conflicting metrics and insights. The organisation had no dedicated data personnel, and the data infrastructure had been assembled ad hoc by software engineers and non-technical data consumers. ResQ Club knew they needed to implement a data visualisation tool (Looker) built on a central source of truth, but they lacked the in-house expertise to do so.
- Overhaul their data strategy and make faster, data-driven decisions
- Create a data infrastructure with a centralised source of truth
- Define and measure core business metrics that would help to scale
- Implement Looker and create a culture of self-service analytics
Phase 1: Investigation & value mapping
We started the engagement by asking key stakeholders about their needs and pain points with data in order to map out the business value of each element of the engagement. After diagnosing data usage among the team, we evaluated the technical architecture of the existing stack.
Phase 2: Events stream
The first step toward a reliable architecture was standardising the data inputs. We partnered with the engineers and data consumers to create business definitions for each metric, and then built an events stream to accurately collect the right data points.
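A standardised event of this kind can be sketched as follows. This is a minimal illustration only: the event name, fields, and `build_event` helper are assumptions for the sketch, not ResQ Club’s actual schema.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

# Illustrative event shape -- field names are assumptions,
# not ResQ Club's actual schema.
@dataclass
class OfferRescuedEvent:
    event_type: str   # e.g. "offer_rescued"
    partner_id: str   # restaurant / cafe / store identifier
    offer_id: str
    price_cents: int
    occurred_at: str  # ISO-8601 UTC timestamp

def build_event(partner_id: str, offer_id: str, price_cents: int) -> dict:
    """Emit one standardised event record for the stream."""
    event = OfferRescuedEvent(
        event_type="offer_rescued",
        partner_id=partner_id,
        offer_id=offer_id,
        price_cents=price_cents,
        occurred_at=datetime.now(timezone.utc).isoformat(),
    )
    return asdict(event)
```

Agreeing on a single event shape up front is what lets every downstream consumer read the same fields the same way.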
Phase 3: Data modelling
With the new events stream pipelines functional, we progressed to modelling. We implemented the data models using Redshift, which converted the raw events into usable data structures that mapped to business use cases.
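Redshift models of this kind are typically written in SQL; the transformation they perform, rolling raw events up into a business-facing metrics table, can be sketched in Python. The event fields and metric names here are hypothetical, chosen only to illustrate the idea.

```python
from collections import defaultdict

def rescues_per_partner(events):
    """Roll raw events up into a per-partner metrics table,
    analogous to what a SQL model in Redshift would produce.
    Field names are illustrative assumptions."""
    table = defaultdict(lambda: {"rescues": 0, "revenue_cents": 0})
    for e in events:
        if e["event_type"] != "offer_rescued":
            continue  # ignore unrelated event types
        row = table[e["partner_id"]]
        row["rescues"] += 1
        row["revenue_cents"] += e["price_cents"]
    return dict(table)

# Hypothetical raw events for demonstration.
raw_events = [
    {"event_type": "offer_rescued", "partner_id": "p1", "price_cents": 500},
    {"event_type": "offer_rescued", "partner_id": "p1", "price_cents": 300},
    {"event_type": "offer_rescued", "partner_id": "p2", "price_cents": 700},
]
```

The value of the modelling layer is exactly this mapping: raw, append-only events in, tidy structures keyed to business questions out.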
Phase 4: Visualisations with Looker
Now that the foundations of the data infrastructure were laid, we connected Looker to the data models for business consumption and visualisations.
The entire ResQ team benefited from the engagement:
- Single source of truth data models housed in Redshift
- Mainstream adoption of Looker throughout the team, with a culture of higher data fluency and data-driven reasoning
- Faster, more confident decision-making by ResQ leaders