Job Description

Product, Tech & AI · London Marathon Events · Permanent · Hybrid

London Marathon Events (part of the London Marathon Group) works to inspire, champion and increase activity across all ages, abilities and demographics. We do this through our successful event portfolio (which includes the TCS London Marathon and TCS Mini London Marathon, Brighton Marathon, The Big Half, Vitality London 10,000, Vitality Westminster Mile, Standard Chartered Great City Race and Swim Serpentine), through our extensive outreach programmes working with schools and community groups in London and across the UK, and through our network of wholly or partially owned companies, such as Maverick, Run 4 Wales, Loch Ness Marathon Ltd, Caledonian Concepts, Athletic Ventures and Friday Night Lights.
Our events raise millions for charities every year to improve the lives of individuals and communities, and we passionately believe in the power of sport.
Our values are integral to who we are, how we work and what we do:
  • Impact – together we create positive change
  • Innovate – together we enable new ideas
  • Everyone – together we champion inclusivity
  • Customer first – together we go the extra mile for all
Diversity, inclusion and wellbeing are at the heart of everything we do, and we want to ensure they are at the heart of our company. We strongly value diversity within our workforce and recognise that different people bring different perspectives, lived experience, ideas and culture to the company. This difference brings with it great strengths, including diversity of thought.
So, if you’re ready to be a force for good, lead change and want to make a difference to society, keep reading.
The Role
As a Data Engineer, you will be responsible for engineering our data platform, which is essential to the running and evolution of our mass participation event portfolio. This involves working with our SQL databases, coding in Python, integrating data from APIs, building automated ingest and curation data pipelines, and leveraging Azure cloud services to populate our core databases and downstream systems with the accurate, timely data used for key business functions and analytics.
You will report to the Head of Data and join a collaborative team that shares our values and genuinely wants to achieve our organisational vision of ‘Inspiring Activity’. This will involve working closely with and supporting the data needs of various LME teams, including the Marketing Team, Commercial Team, Customer Engagement Team, Operational Event Experience Team, our IT function and our Charities Team, who support the thousands of charities that we work with.
To be successful in this role you will need to be entrepreneurial and curious, able to break down complexity, and have a can-do attitude, partnering with business stakeholders to help create the solutions that power our exceptional portfolio of mass participation events. You will have support from our internal data team and from the third-party and outsourcing partners who collectively help us build and deliver a reliable and robust data platform.
Our data platform is Azure cloud native, leveraging services such as Data Lake Storage Gen2, SQL Database, Data Factory, Container Apps (with Docker), Logic Apps, Function Apps, and Web Apps with API Management and API gateways, plus Log Analytics and Key Vault, and we code in SQL and Python. We have recently introduced Microsoft Fabric to broaden our analytics capabilities, and you would be involved in the buildout of this extension to our platform.
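By way of illustration only (a sketch, not LME's actual code or schema — the field names `participant_id`, `email` and `finish_time_s` are hypothetical), a typical ingest-and-curate step on a platform like this deduplicates and validates raw records before loading them into a curated layer:

```python
def curate_participants(raw_records):
    """Deduplicate and validate raw participant records (e.g. a 'bronze'
    layer), returning cleaned rows ready for a curated ('silver') layer."""
    seen = set()
    curated = []
    for rec in raw_records:
        pid = rec.get("participant_id")
        # Drop rows missing the key, or already seen (duplicates).
        if pid is None or pid in seen:
            continue
        finish = rec.get("finish_time_s")
        # Drop rows with an implausible (non-positive) finish time.
        if finish is not None and finish <= 0:
            continue
        seen.add(pid)
        curated.append({
            "participant_id": pid,
            "email": (rec.get("email") or "").strip().lower(),
            "finish_time_s": finish,
        })
    return curated

raw = [
    {"participant_id": 1, "email": " Runner@Example.com ", "finish_time_s": 9120},
    {"participant_id": 1, "email": "runner@example.com", "finish_time_s": 9120},  # duplicate
    {"participant_id": 2, "email": "pacer@example.com", "finish_time_s": -5},     # invalid
    {"participant_id": 3, "email": "walker@example.com", "finish_time_s": None},
]
rows = curate_participants(raw)
print(rows)  # two clean rows: participants 1 and 3
```

In production, steps like this would typically run inside Data Factory or Container Apps pipelines, reading from and writing to Data Lake Storage rather than in-memory lists.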
Key Responsibilities
  • Collaborate with colleagues to build a best-in-class data platform that provides low-cost, actionable insights and drives data-driven automation and decision-making across the organisation.
  • Architect, maintain and optimise key datasets for use across the company.
  • Design, build, test and deploy software within an Agile Software Development Life-Cycle, leveraging AI tools where appropriate.
  • Produce high-quality, well-tested code that can be deployed using modern CI/CD tooling (we use Azure DevOps).
  • Monitor pipelines, system resource utilisation and availability, and correspond with end users to investigate issues and resolve queries.
  • Peer-review the requirements and delivery of any outsourced work.
  • Understand the concepts and principles of data modelling and produce relevant data models for the subject areas within our business.
  • Review data quality exceptions and resolve data issues, and carry out platform housekeeping tasks.
  • Monitor and implement solutions in accordance with the standards and policies outlined in the firm’s data governance framework.
  • Communicate clearly and consistently with colleagues, business stakeholders and suppliers.
  • Employ a growth mindset: be open to new challenges and to learning new skills. When work needs to be done, you are happy to learn the skills needed to deliver it.
This list of duties and responsibilities is not exhaustive, and the role may also include additional tasks as required.
Required Skills
Technical
  • Proficiency in Python and SQL
  • Expert-level MS Excel (formulas, pivot tables)
  • Demonstrable knowledge and skills in the Azure cloud stack: Data Lake Storage, Data Factory, Container Apps, Logic Apps, Function Apps, Web Apps, API Management, API gateways, Log Analytics, Key Vault and MS SQL Server-related services.
  • Skilled in Microsoft T-SQL (DQL and DDL, including sequences, stored procedures, functions, views, database monitoring and query optimisation)
  • Experience with Microsoft Fabric and the medallion (bronze/silver/gold) data lake architecture
  • Attention to detail and problem-solving abilities.
  • Data visualisation skills
  • Familiarity with MS Power Platform (Power Query, Power Automate) a plus
Leadership Competencies
  • Able to make good decisions and communicate effectively with those from both technical and non-technical backgrounds
  • Works well with others and values different perspectives
  • Focused on customer needs and accountable for actions
  • Builds trust and supports a positive team culture
Required Experience

Essential

  • Software development experience in a commercial, data-centric role, coding daily in Python and SQL
  • Good understanding of database architectures, data modelling, database health
  • Master Data Management principles, including data quality, lineage and taxonomies.
  • Experience working in an Agile Software development environment

Desired

  • Microsoft Azure Data Engineering qualification
  • Dynamics 365 and/or Power BI experience
  • Experience producing business insights through data analytics
  • Experience working in the mass participation sporting event industry OR experience participating in mass participation events