Lead Data Engineer
We are looking for a strong, dynamic, hands-on Lead Data Engineer to join our existing London Market Tech Team, help us create a professional data engineering practice, and get immediately involved in architecting and building a greenfield, cloud-based data analytics platform to support our new data strategy.
We are looking for an enthusiastic, self-motivated individual with a thirst for cloud-agnostic tech, data processing/analytics, machine learning and data governance know-how, who can help grow and lead the overall team in this area.
The successful candidate will help architect and build new data solutions and initial PoCs using cloud-based technologies, specifically on Azure. Previous experience with solutions such as Snowflake, Kafka, Matillion (or similar ELT tooling), Serverless, Spark, Python and R would be extremely useful.
Previous use of Azure Blob Storage, Azure Data Factory and Azure Data Lake would also be helpful, but is not essential. You should certainly have some familiarity with infrastructure technologies such as Terraform and Docker on at least one of the big three cloud providers.
The role requires someone with a strong understanding of how to set up the appropriate tools and technologies in the cloud to securely process data big and small, structured and unstructured – from ingestion, cleansing, transformation, enrichment, outlier/anomaly detection and auto-matching/classification through to the delivery of accurate data for insights, analysis and reporting.
Knowing how to deploy both fast R&D solutions and robust operational/production systems is key, and previous experience of implementing secure, scalable, automated data pipelines and self-service business intelligence solutions is a must.
We need someone who can get things done and deliver value to the business incrementally; who understands what agility really means, keeps things simple and avoids excessive amounts of code wherever possible; who understands cloud best practices and the importance of DevOps; and who ensures that we can easily build, deploy and monitor our solutions when they go into production, keeping an eye on infrastructure costs.
Working alongside the existing tech team, a new Data Governance and Operations Team and our Data Science Team, the candidate will need excellent communication skills and the ability to mentor and educate others in the various data processing technologies we adopt and implement.
As Lead Data Engineer, you will own and be responsible for making technology choices and influencing the overall design and direction of the London Market's digital transformation in relation to the way it processes and utilises its various data assets. In return, we will ensure you can fully self-organise, with autonomy and flexibility in your decision-making.
You will have at least seven years' experience in data engineering, and previous experience of growing and leading a team of data engineering experts.
The role is ideally suited to someone interested in working with the latest cloud technologies in a dynamic environment, tackling high-performance, complex data wrangling challenges across a wide variety of data assets.
Key areas of work:
- Working with senior stakeholders to understand high value business problems that can be solved through the application of data processing and analytical systems
- Helping to architect, setup and support a new cloud-based analytics platform for the business
- Helping to create and manage a specialist Data Engineering practice for the London Market’s business
- Ensuring your team supports the Data Science Team by building data pipelines for specific insights and provides them with an environment for their R&D and experimentation
- Ensuring your team supports the Data Governance and Operations Team by providing robust, secure and scalable production data analytics solutions
- Understanding business requirements, refining them into development tasks and estimating their complexity
- Researching, evaluating and adopting new technologies with a right-tool-for-the-job mentality
- Prototyping, failing fast and iterating
- Focusing on both speed of delivery and quality, with suitable pragmatism – ensuring your solutions are always appropriate and never overly complex or over-engineered
- Progressing projects quickly from PoC through to production
- Communicating and presenting ideas to colleagues in all parts of the wider business
- Recruiting and mentoring the Data Engineering Team, and reviewing its code
What should excite you about the role:
- The business:
- Leading brand across the insurance space
- Diverse product offering across homeowner and commercial insurance
- The working environment:
- Great working space in the heart of the city
- Buy-in from the senior management to implement data analytics
- A collaborative IT department that is using the latest technology
- Global community of analytical professionals
- The role:
- Highly visible role with access to the business leadership
- Continued professional development and exposure to the business
- Competitive compensation that rewards high achievers
Essential Skills / Experience:
- Passion for and experience of building data analytics platforms that deliver business value
- Excellent attention to detail and accuracy is a must
- Excellent troubleshooting skills and the ability to quickly identify and resolve the source of any data errors or performance issues
- Track record of taking initiative and delivering projects end-to-end; clear evidence of being self-driven/ motivated
- Immense curiosity, high energy and desire to go the extra mile to make a difference
- An awareness of trends in the data engineering space
- Leading and nurturing the team
- Ability to set up secure, automated and scalable data platforms/technologies in the cloud (particularly Azure)
- Good working knowledge of various database types and when to use them, e.g. SQL (Snowflake, MS SQL Server, Postgres), columnar, graph, key-value and document
- Solid experience working with at least one NoSQL database, such as Cassandra, Neo4j or Elasticsearch
- Solid experience working with ELT/ETL tools such as NiFi, Airflow, Matillion, SSIS or SnapLogic
- Solid experience of working with and implementing BI platforms such as Tableau, Looker, Sisense, Power BI or Qlik
- Solid understanding of how to incorporate data lineage, discovery and governance within production solutions
- Ability to code in at least two of the following languages: Java, C#, Python, Scala and R
- Excellent working knowledge of SQL and DDL
- Experience of ensuring data is properly backed up, replicated and is always fit for purpose
- Good knowledge of database design patterns – including OLTP, OLAP and overall schema design
Desirable Skills / Experience:
- Excellent knowledge of data streaming and event streaming technologies and patterns
- Good understanding of machine learning – ideally anomaly detection, classification, predictive analytics, NLP and recommendation (when working alongside our Data Scientists)
- Good knowledge of continuous integration, auto-build and auto-deployment technologies such as Jenkins, Flyway and Liquibase
- Good knowledge of infrastructure tooling like Terraform or Docker
- Hands-on experience in the insurance sector, financial services or a data-driven start-up
- Consulting experience
- A degree in an applicable area
Apply now for further information
You can follow Hiscox on LinkedIn, Glassdoor and Instagram (@HiscoxInsurance)