Data Platform Engineer

  • London, United Kingdom
  • Full Time (Permanent)
  • Data Platform

About the role

As a Data Platform Engineer, you'll help shape and scale our data infrastructure, making analytics faster, more reliable, and more cost-efficient. You'll work with AWS, Snowflake, Python, and Terraform, building tooling and onboarding new data sources.

You'll collaborate closely with teams across the business, ensuring our platform is secure, scalable, and easy to use.

Our team mission

We aim to maximise business value by making our analytics more efficient, strengthening Lendable's competitive advantage through the use of data.

This means our Data Platform is designed to help analysts extract insights from data efficiently, while ensuring cost control, compliance, and security.

What will you be doing?

This is a non-exhaustive list of the activities you'll engage in:

  1. Develop new tooling for product teams to boost their efficiency.

  2. Work with the wider team to maintain, evolve, and scale data infrastructure solutions on AWS and Snowflake.

  3. Onboard new ingestion sources and keep them running smoothly through monitoring.

  4. Ensure platform robustness through automated testing and monitoring of data pipelines, in line with Lendable's expected growth.

  5. Collaborate with stakeholders to translate business requirements into scalable technical solutions.

  6. Optimise existing CI/CD pipelines for faster cycle times and increased reliability.

  7. Implement security best practices for data management and infrastructure on cloud platforms.

  8. Design and deploy infrastructure as code to manage cloud resources efficiently.

  9. Assist in the troubleshooting and resolution of production issues to minimise downtime and improve user satisfaction.

What we're looking for

  • Strong software development skills, particularly in Python or a similar language.

  • Solid engineering practices, including automated testing, deployment systems, and configuration as code.

  • Experience building data-intensive applications and containerised services.

  • Experience with cloud services such as AWS, GCP, or equivalent (preference for AWS).

  • Experience with infrastructure as code, preferably Terraform.

  • Knowledge of columnar databases, such as Snowflake.

  • Experience in developing and optimising CI/CD pipelines, with a preference for GitHub Actions.

  • Excellent communication skills for effective collaboration with business analysts and stakeholders, ensuring technical solutions meet business needs.

  • Experience with data ingestion tools, like Fivetran.

Advantageous

  • Exposure to deploying applications with Kubernetes.

  • Experience with data orchestration tools (Airflow, Prefect, etc.).

  • Experience with data observability tools (Monte Carlo, Great Expectations, etc.).

  • Experience with data catalog tools (Amundsen, OpenMetadata, etc.).

Interview Process

  • Call with the talent team

  • Take home task

  • Tech interview

  • CPTO interview