Junior Data Engineer

  • Data Science
  • Fully Remote
  • 1 month ago
  • UK

Job Information

  • Salary: £30,000–40,000 / Yearly
  • Shift: Morning
  • No. of Openings: 1
  • Job Level: Entry-Level
  • Job Experience: 1–3 Years
  • Job Qualifications: Bachelor’s Degree

Job Description

DataQuarry Analytics Ltd. is a UK-based data science consultancy and product company delivering end-to-end machine learning, data engineering and cloud-analytics solutions for digital-first organisations. Since 2017, our team of data engineers, statisticians, and financial analysts has supported clients across fintech, healthcare, e-commerce and sustainability, transforming raw data into actionable insights through scalable pipelines and ML/AI infrastructure. As a remote-first organisation, we serve customers across Europe and North America, combining statistical rigour with modern MLOps and cloud infrastructure.

Role & Responsibilities

  • Assist in building and maintaining ETL/data-engineering pipelines to ingest, clean, and transform data from clients’ systems into production-ready datasets.
  • Work under supervision on data ingestion, data cleaning, and integration tasks (e.g. SQL, pipelines, scripting).
  • Help integrate data workflows into cloud infrastructure (AWS / Azure / GCP) to support scalable data storage and processing.
  • Support model training / data-preparation tasks for ML/analytics projects — collaborating with data scientists and senior engineers.
  • Write clear documentation for data flows and pipelines, and maintain version control (Git / GitHub / GitLab).
  • Provide support with data governance and privacy compliance (e.g. GDPR-aligned data handling) when required.
  • Collaborate with cross-functional, distributed remote teams — communicate effectively, manage time zones, and contribute to team tasks.

Required Skills & Qualifications

  • Basic familiarity with Python or another scripting language; experience or interest in data processing / data-engineering tasks.
  • Basic knowledge of SQL, data cleaning / ETL pipelines, and data ingestion workflows.
  • Willingness to learn cloud infrastructure (AWS / Azure / GCP) and basic cloud-based data storage / processing.
  • Willingness to follow coding best practices (version control, documentation), collaborate with remote teams, and adapt to varied client domains.
  • Strong communication, adaptability, attention to detail, and time-management — suitable for remote work across time zones.
  • Bachelor’s degree (or equivalent) in Computer Science, Data Science, Statistics, Engineering, Mathematics, or a related field; or a relevant technical diploma / certification.
  • 1–3 years of relevant experience (or the equivalent gained through academic projects, internships, bootcamps, etc.).

 
