Industry: Food Salvaging
Company size: 25-50
Established: 2015
Value: ~10M USD
Location: Helsinki
ResQ Club is a Finnish zero-waste food marketplace that connects sustainable restaurants, cafes, and grocery stores with consumers who “rescue” fresh, quality food that would otherwise be at risk of going to waste. Through ResQ’s proprietary location-based mobile and web service, consumers can find and rescue surplus food nearby, allowing partners to drastically reduce their food waste. Every meal purchased via ResQ is one less meal thrown away, helping urban communities waste less and become more sustainable.
ResQ Club required an end-to-end data strategy to overhaul their existing systems into a unified source of truth for the entire business.
After roughly two years of operation, ResQ Club had achieved strong traction and early product-market fit with customers in Finland, and the business was ramping up to scale. The team’s internal data systems were scattered and decentralised: different teams used different data tools that produced conflicting metrics and insights. The organisation had no dedicated data personnel, and the data infrastructure had been put together on an ad hoc basis by software engineers and non-technical data consumers. ResQ Club knew they needed to implement a data visualisation tool (Looker) on top of a central source of truth, but they did not have the in-house expertise to get there.
“Tasman can be everything that you need from a data perspective. When we reached out to them, we knew where we wanted to get, but we really didn’t know how to get there. They helped us to accelerate our journey and cross the data chasm. We were happy to pay Tasman £10k each month since it helps us make £100k decisions every week. Tasman has done a 10/10 job for us.”
Aku-Jaakko Saukkonen, COO, ResQ Club
We started the engagement by interviewing key stakeholders about their data needs and pain points in order to map out the business value of each element of the work. After diagnosing how the team used data, we evaluated the technical architecture of the existing stack.
The first step towards a reliable architecture was to standardise the data inputs. We partnered with the engineers and data consumers to agree business definitions for each metric, and then built an event stream to collect the right data points accurately.
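To make the idea concrete, here is a minimal sketch of what one standardised event might look like. The event name, fields, and collector endpoint are illustrative assumptions, not ResQ’s actual schema or tooling.

```python
from datetime import datetime, timezone

import requests  # assumes the collector accepts events over HTTP

# Hypothetical collector endpoint; ResQ's actual pipeline may differ.
COLLECTOR_URL = "https://events.example.com/v1/track"


def track_offer_rescued(user_id: str, offer_id: str, partner_id: str, price_eur: float) -> None:
    """Send a single 'offer_rescued' event using an agreed, versioned schema."""
    event = {
        "event": "offer_rescued",       # business-defined event name
        "schema_version": "1.0",        # versioning keeps definitions stable over time
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "properties": {
            "user_id": user_id,
            "offer_id": offer_id,
            "partner_id": partner_id,
            "price_eur": price_eur,
        },
    }
    response = requests.post(COLLECTOR_URL, json=event, timeout=5)
    response.raise_for_status()


if __name__ == "__main__":
    track_offer_rescued(user_id="u_123", offer_id="o_456", partner_id="p_789", price_eur=4.50)
```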
With the new event stream pipelines in place, we moved on to modelling. We built data models in Redshift that transformed the raw events into usable data structures mapped to business use cases.
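As a rough illustration of this step, the sketch below rebuilds one such model by running a SQL transformation against Redshift from Python. The table and column names, and the use of psycopg2 as the client, are assumptions for the example; they are not the actual models built for ResQ.

```python
import psycopg2  # Redshift speaks the PostgreSQL wire protocol

# Illustrative modelling step: aggregate raw rescue events into a daily,
# per-partner table that downstream tools can query directly.
MODEL_SQL = """
CREATE TABLE analytics.daily_rescues AS
SELECT
    DATE_TRUNC('day', event_timestamp) AS rescue_date,
    partner_id,
    COUNT(*)       AS rescued_offers,
    SUM(price_eur) AS rescued_value_eur
FROM raw_events.offer_rescued
GROUP BY 1, 2;
"""


def build_model(dsn: str) -> None:
    """Rebuild the daily_rescues model from the raw event table."""
    conn = psycopg2.connect(dsn)
    try:
        with conn.cursor() as cur:
            cur.execute("DROP TABLE IF EXISTS analytics.daily_rescues;")
            cur.execute(MODEL_SQL)
        conn.commit()
    finally:
        conn.close()


if __name__ == "__main__":
    # Placeholder connection string; supply real cluster credentials in practice.
    build_model("host=example-cluster.redshift.amazonaws.com port=5439 dbname=analytics user=transform password=REDACTED")
```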
With the foundations of the data infrastructure in place, we connected Looker to the data models for business consumption and visualisation.
The entire ResQ team benefited from our work.