Senior Python Data Engineer (London, UK)

Parkopedia was founded with the mission of answering any parking question, anywhere in the world. Today, Parkopedia is the world’s leading connected car services provider, used by millions of drivers and by organisations such as Apple, TomTom and 20 automotive brands ranging from Audi to Volkswagen.

We are looking for Data Engineers to help support our Data Science team’s ingestion, ETL, infrastructure, and the productionisation of models. The system is currently responsible for making sense of over a billion data points per day.

Why will you want this job? Because you have a deep love for engineering, want to work in a machine learning and data science ecosystem, and get a kick out of delivering great code into production. You'll enjoy being a valued member of a close-knit team where your opinion counts and there is plenty of scope to be creative and come up with new ideas, and you'll feel at home working with extremely bright colleagues for whom learning is a top priority.

We are part of a global team but are based in the London office, which is a short walk from London Bridge.


Day to day, you will be:

  • Developing ultrafast and reliable APIs
  • Designing and deploying big data capable infrastructure and software components using infrastructure as code (AWS CDK)
  • Developing highly scalable big data processing pipelines with Spark
  • Developing streaming processing pipelines with Kafka
  • Championing software best practices, including mentoring junior engineers and data scientists


We are really open to different backgrounds, as what we do is fairly unique; however, you will need the following as a baseline:

  • Minimum of 4 years' professional software or data engineering experience, or a PhD in a relevant subject
  • Strong computer science background
  • Data-oriented engineer, attentive to detail
  • Extensive experience with Python, including thorough knowledge of the Python data science/engineering ecosystem (e.g. pandas, NumPy)
  • Experience with containerisation (e.g. Docker), and container orchestration (e.g. k8s, ECS)

You will also have several of the following skills:

  • Experience with PySpark and working with big data; bonus points if you have worked with geospatial data
  • Experience with AWS; bonus points if you know CDK
  • Experience with API development, ideally FastAPI or Starlette
  • Experience with workflow management tools, ideally Apache Airflow
  • Experience with setting up CI/CD pipelines, ideally in an ML environment
  • Experience with Kafka
  • Experience optimising code initially developed by data scientists and preparing it for production


Parkopedia is committed to building a great work environment for all our employees. Here are just a few of the benefits that we offer:

  • Unlimited annual leave - yup, time off is as important as time in the office; we all need to unwind and recharge our batteries!
  • Flexible working hours
  • Training allowance
  • Annual company retreat
  • Private medical insurance
  • Time off for volunteering
  • Cycle to work scheme
  • Gym membership
  • Eye care and flu vouchers

We are an equal opportunities employer and believe in the power of a diverse and inclusive team. We welcome applications from everyone, regardless of race, sex, disability, religion/belief, sexual orientation or age.
