Asset 2
ResQ Club

Crossing the data chasm

Context

ResQ Club is a Finnish zero-waste food marketplace that connects sustainable restaurants, cafes, and grocery stores with consumers who “rescue” fresh, quality food that would otherwise go to waste. Partners can drastically reduce their food waste through ResQ’s proprietary location-based mobile and web service, which enables consumers to find and rescue surplus food nearby. Every meal purchased via ResQ is one less meal thrown away, helping urban communities waste less and live more sustainably.

Challenge

After roughly two years of operation, ResQ Club had achieved strong traction and early product-market fit in Finland, and the business was ramping up to scale. Its internal data systems were scattered and decentralised: different teams used different data tools that produced conflicting metrics and insights. The organisation had no dedicated data personnel, and the data infrastructure had been assembled ad hoc by software engineers and non-technical data consumers. ResQ Club knew it needed a data visualisation tool (Looker) built on a central source of truth, but lacked the expertise to get there.

Objectives

ResQ Club required an end-to-end data strategy to turn its existing systems into a unified source of truth for the entire business. Specifically, the team hoped to:
- Overhaul their data strategy and make faster, data-driven decisions
- Create a data infrastructure with a centralised source of truth
- Define and measure the core business metrics needed to scale
- Implement Looker and create a culture of self-service analytics

What We Did

Phase 1: Investigation & value mapping

We started the engagement by interviewing key stakeholders about their data needs and pain points, mapping the business value of each element of the engagement. After diagnosing how the team used data, we evaluated the technical architecture of the existing stack.

Phase 2: Event stream

The first step towards a reliable architecture was to standardise the data inputs. We partnered with the engineers and data consumers to agree a business definition for each metric, then built an event stream to collect exactly the right data points.
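To make this concrete, a standardised event in such a stream might look like the following minimal Python sketch. The event name, fields, and validation rule here are illustrative assumptions, not ResQ's actual schema:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical event schema: the business definition of a "rescue"
# is encoded once, so every producer emits identical fields.
@dataclass(frozen=True)
class OfferRescuedEvent:
    event_type: str
    offer_id: str
    partner_id: str
    price_cents: int   # integer cents avoids floating-point money errors
    occurred_at: str   # ISO-8601 UTC timestamp

def make_offer_rescued(offer_id: str, partner_id: str, price_cents: int) -> OfferRescuedEvent:
    """Build a validated 'offer_rescued' event (illustrative helper)."""
    if price_cents < 0:
        raise ValueError("price_cents must be non-negative")
    return OfferRescuedEvent(
        event_type="offer_rescued",
        offer_id=offer_id,
        partner_id=partner_id,
        price_cents=price_cents,
        occurred_at=datetime.now(timezone.utc).isoformat(),
    )
```

Centralising the definition in one constructor is what keeps different teams from emitting subtly different versions of the same metric.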

Phase 3: Data modelling

With the new event-stream pipelines functional, we progressed to modelling. We implemented data models in Redshift that converted the raw events into usable structures mapped to business use cases.
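The modelling step can be sketched in Python as a roll-up from raw events to a per-partner, per-day fact table; in practice this would be SQL running in Redshift, and the field names below are assumptions for illustration only:

```python
from collections import defaultdict

# Hypothetical raw events as they might land from the stream.
raw_events = [
    {"partner_id": "cafe_1", "date": "2018-05-01", "price_cents": 400},
    {"partner_id": "cafe_1", "date": "2018-05-01", "price_cents": 600},
    {"partner_id": "cafe_2", "date": "2018-05-01", "price_cents": 250},
]

def model_daily_rescues(events):
    """Roll raw events up into a (partner, day) fact table:
    one row per partner per day with rescue count and revenue."""
    table = defaultdict(lambda: {"rescues": 0, "revenue_cents": 0})
    for ev in events:
        key = (ev["partner_id"], ev["date"])
        table[key]["rescues"] += 1
        table[key]["revenue_cents"] += ev["price_cents"]
    return {k: dict(v) for k, v in table.items()}

daily = model_daily_rescues(raw_events)
# daily[("cafe_1", "2018-05-01")] == {"rescues": 2, "revenue_cents": 1000}
```

A table shaped like this, rather than the raw event log, is what a BI tool such as Looker is pointed at: each row answers a business question directly.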

Phase 4: Visualisations with Looker

With the foundations of the data infrastructure laid, we connected Looker to the data models for business consumption and visualisation.

The Result

The entire ResQ team benefited from our work:
- A single source of truth, with data models housed in Redshift
- Mainstream adoption of Looker across the team, with a culture of higher data fluency and data-driven reasoning
- Faster, more confident decision-making by ResQ leaders

Testimonial

“Tasman can be everything that you need from a data perspective. When we reached out to them, we knew where we wanted to go, but we really didn’t know how to get there. They helped us accelerate our journey and cross the data chasm. We were happy to pay Tasman £10k each month because it helps us make £100k decisions every week. Tasman has done a 10/10 job for us.”
Aku-Jaakko Saukkonen, former COO, ResQ Club