We are hiring an Analytics Engineer

19th February 2020

By Thomas In’t Veld.

We’re hiring an Analytics Engineer to join our Data Platform team and help us:

  • Select and set up the data collection stack for our clients
  • Design and develop client-specific SQL data models that produce clean, structured and interpreted data sets for the business and other data functions
  • Build a suite of standardised data models, event grammars, frameworks, and scripts we can use across clients to reduce the time-to-value

We’re looking for candidates who are based in the UK, the Netherlands, or Sweden (any city is fine as we work remotely).

We strongly encourage candidates of all different backgrounds and identities to apply. Each new hire is an opportunity for us to bring in a different perspective, and we are always eager to further diversify our company. Tasman is committed to building an inclusive, supportive place for you to do the best and most rewarding work of your career.

You can apply here or keep reading for more information on the role!


Tasman was founded in 2017 and both our client base and our team are rapidly growing. Our team acts as an interim Data Team for our clients, helping them scale their data capabilities, no matter how early they are in their journey. We combine deep expertise and a suite of best-in-class tools to provide end-to-end support, from strategy through to execution.

We’ve worked hand-in-hand with over 15 high-growth companies in a range of industries. We’ve helped Lifecake to build a product for sharing childhood memories with family, Kaia Health to combine physical, psychological and educational expertise to tackle chronic diseases, and Asana Rebel to support those who want to get fit and start a healthy lifestyle. Other clients include Bleach London, The Business of Fashion, The Collective, Flash Pack, Marco Polo Learning, Gousto, RIXO, and ResQ Club.


The Data Platform team is responsible for data collection and modeling. You’ll join this team, reporting directly to one of our co-founders, Christophe Bogaert, and working closely with our clients and the other teams at Tasman.

As members of the Data Platform team, we get to work with a broad variety of data-generating systems that cover the full spectrum of business functions. For mobile apps and websites our preference is Snowplow, but we also work with GA, Segment, and other vendors. For marketing, we have developed integrations with and standard models for Facebook Ads, Google Ads, Twitter Ads, Apple Search Ads, Mailchimp, and Hubspot. For operations, we often develop models for Salesforce, Shopify, WooCommerce, and Netsuite. We reuse our standard models as much as possible, tailoring them to each client’s unique business setup, and when we come across new platforms we combine analytical techniques and software craftsmanship to add them to our toolkit.

We also work in close collaboration with the Data Analysts and Data Scientists at Tasman to design data models that connect to business needs, both client-specific and generalised. Through a process of ideas and iterations, we extend models elegantly to accommodate additional analytical needs as they arise from data science activities.

Data collection

We work with our clients to select and set up the right tools for event data collection (e.g. Snowplow or Segment; managed or open source) and data extraction (e.g. Fivetran, Stitch, or our own scripts). Our aim is to put in place a robust and scalable data collection stack that meets all the client’s business and legal requirements and sets their internal data team up for long-term success by making it easy to collect high-quality granular data.

Most of the work we do on data collection is client-specific, but we also maintain a common event grammar that we have developed over time and use across clients, as well as a set of scripts to extract data from various sources.
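
As a hedged illustration of what an entry in such an event grammar might look like (the vendor, event name, and properties below are hypothetical, not from a real client), Snowplow-style self-describing events are defined by a JSON Schema along these lines:

```json
{
  "$schema": "http://iglucentral.com/schemas/com.snowplowanalytics.self-desc/schema/jsonschema/1-0-0#",
  "description": "Fired when a user starts the checkout flow (hypothetical example)",
  "self": {
    "vendor": "com.example",
    "name": "checkout_started",
    "format": "jsonschema",
    "version": "1-0-0"
  },
  "type": "object",
  "properties": {
    "cart_value": { "type": "number", "minimum": 0 },
    "currency": { "type": "string", "maxLength": 3 },
    "item_count": { "type": "integer", "minimum": 1 }
  },
  "required": ["cart_value", "currency"],
  "additionalProperties": false
}
```

Versioning and validating events against schemas like this is what keeps a shared grammar reusable across clients while still allowing client-specific extensions.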

Data modeling

We design, implement, document, and maintain a wide range of SQL data models for our clients. These models integrate data from all different sources and generate clean, structured and interpreted data sets that are ready for consumption by the business or by the other data functions. We run these models on cloud-native data warehouses (BigQuery, Redshift, and Snowflake) and we use a new generation of data build tools like Dataform and dbt (managed or open source) to bring a number of best practices from software engineering to data modeling. We also work with our Data Scientists to make their descriptive and predictive models robust and scalable so they can run in a production environment.

As with data collection, most of the work we do on data modeling is client-specific. However, a lot of that work is built on a common set of data models that we have developed over time and use across clients. As we deliver data models for clients, we often find ourselves extending or improving the common data models before we develop the client-specific model. The cumulative effect of these standardisation efforts is that we shorten the time-to-value with each new client we work with.
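
As a hedged sketch of what one of these models might look like (table and column names are illustrative, not from a real client), a dbt-style staging model that turns raw Snowplow events into a clean, deduplicated data set could read:

```sql
-- models/staging/stg_events.sql (hypothetical dbt model)
-- Deduplicates raw events and exposes a tidy, renamed staging table.
with raw_events as (

    select * from {{ source('snowplow', 'events') }}

),

deduplicated as (

    select
        event_id,
        user_id,
        event_name,
        collector_tstamp as event_at,
        row_number() over (
            partition by event_id
            order by collector_tstamp
        ) as row_num
    from raw_events

)

select
    event_id,
    user_id,
    event_name,
    event_at
from deduplicated
where row_num = 1
```

Because the transformation lives in version-controlled SQL rather than in a BI tool, the same pattern ports across BigQuery, Redshift, and Snowflake with minimal changes.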

You will...

  • Help our clients select and set up the right infrastructure for data collection and modeling.
  • Work in close collaboration with the other teams at Tasman to design and develop SQL data models that deliver clean, structured and interpreted data to downstream consumers.
  • Expand and improve the suite of tools that form the foundation of all the client work we do.
  • Get to work with an experienced team. Coaching and mentoring are an important part of the road to success for us.
  • Have the opportunity to become an expert on data collection and modeling. You will develop strong opinions, whilst retaining a certain level of pragmatism, and you will have the opportunity to share those opinions by writing blog posts and presenting at conferences.
  • Help train our clients’ internal data teams to ensure a smooth handover as we near the end of a project.

You are...

  • You have strong SQL experience and understand the difference between queries that work and queries that are maintainable and scalable. Some experience with cloud-native data warehouses is preferred (we use BigQuery, Redshift, and Snowflake). Experience with other languages, especially with R and Python, is a plus but not a requirement.
  • You have at least 2 years of relevant experience. You won’t know how all the systems work on day one so we don’t expect you to hit the ground running, but a good understanding of the fundamentals and experience working in a similar role will help you get up to speed quickly.
  • You are self-motivated and prefer to learn by doing, failing, and trying again. You have a can-do attitude to learning new skills and new tools. Not only do we have to keep abreast of the state-of-the-art, we also have to adapt sometimes to clients’ toolsets.
  • You are a good communicator who is able to work directly with data analysts, data scientists, engineers, product managers, marketers, and executive teams. Being able to communicate with clarity and precision is of critical importance to this role.
  • You are a clear thinker who is comfortable explaining their thinking, discussing different approaches to a problem, and working collaboratively and creatively to identify new solutions.
  • You care about the details and have a mature attitude to documentation, security, and process—all of which are important and inform everything we do.
  • You might have a STEM degree. You might not. That’s not necessarily what we’re looking for. We care about what you can do and how you do it, not about how you got here. A strong track record of conscientious, thoughtful work speaks volumes.
  • You enjoy working across multiple problems with multiple clients at any given moment. To do this effectively, you understand the importance of planning and estimating and know how to balance competing priorities.
  • This is a remote job. You’re free to work where you work best: home office, co-working space, coffee shops. Whilst we currently have a co-working space in London and will soon get one in Amsterdam as well, you should be comfortable working remotely—we all do some or most of the time!

We offer...

  • A competitive package with share options
  • 25 days’ holiday (on top of public holidays)
  • A latest generation MacBook Pro
  • A flexible work environment
  • A budget for home office


Please send an application that speaks directly to this position. Address some of the work we do and tell us about your role in Tasman’s future and Tasman’s role in yours. There’s no benefit to writing a novel. Keep it short and get across what matters to you. Clear communication is crucial in the work we do, and you should think of this as our first impression of you.

We will process applications on an ongoing basis which means there is no deadline. However, we will make an offer as soon as we find the right person for the role so don’t wait too long before applying! At this stage, regrettably, we are also not able to offer internships or visa sponsorships.

Expect us to take one or two weeks to review your application. We’ll let you know whether you’ve advanced to the next stage of the process, which is a phone screen. If that goes well, we’ll proceed to the in-depth interviews: each lasts one hour, is fully remote, and is with one of Tasman’s co-founders.

You can do all of this on our careers page over here: Analytics Engineer.

We look forward to hearing from you!

Tags: jobs, analytics engineer, analytics, data model, sql, python, dataform, dbt, looker, redshift, bigquery