Finance
Bulgaria, Poland, Portugal, Romania, Spain, Ukraine
Remote
Senior Data Platform Engineer (Python)
Who we are:
Adaptiq is a technology hub specialising in building, scaling, and supporting R&D teams for high-end, fast-growing product companies across a wide range of industries.
About the Product:
Finaloop is building the data backbone of modern finance — a real-time platform that turns billions of eCommerce transactions into live, trustworthy financial intelligence. We deal with high-volume, low-latency data at scale, designing systems that off-the-shelf tech simply can’t handle. Every line of code you write keeps thousands of businesses financially aware — instantly.
About the Role:
We’re hiring a Senior Data Platform Engineer to build the core systems that move, transform, and power financial data in real time. You’ll be part of the core engineering group building the foundational infrastructure that powers our entire company.
You’ll work closely with senior engineers and the VP of Engineering on high-scale architecture, distributed pipelines, and orchestration frameworks that define how our platform runs.
It’s pure deep engineering — complex, impactful, and built to last.
Key Responsibilities:
- Designing, building, and maintaining scalable data pipelines and ETL processes for our financial data platform
- Developing and optimizing data infrastructure to support real-time analytics and reporting
- Implementing data governance, security, and privacy controls to ensure data quality and compliance
- Creating and maintaining documentation for data platforms and processes
- Collaborating with data scientists and analysts to deliver actionable insights to our customers
- Troubleshooting and resolving data infrastructure issues efficiently
- Monitoring system performance and implementing optimizations
- Staying current with emerging technologies and implementing innovative solutions
Required Competence and Skills:
- 7+ years of experience in Data Engineering or Platform Engineering roles
- Strong programming skills in Python and SQL
- Experience with orchestration platforms and tools (Airflow, Dagster, Temporal, or similar)
- Experience with MPP platforms (e.g., Snowflake, Redshift, Databricks)
- Hands-on experience with cloud platforms (AWS) and their data services
- Understanding of data modeling, data warehousing, and data lake concepts
- Ability to optimize data infrastructure for performance and reliability
- Ability to design, build, and optimize Docker images to support scalable data pipelines
- Familiarity with CI/CD concepts and principles
- Fluent English (written and spoken)
Nice-to-have skills:
- Experience with big data processing frameworks (Apache Spark, Hadoop)
- Experience with stream processing technologies (Flink, Kafka, Kinesis)
- Knowledge of infrastructure as code (Terraform)
- Experience deploying, managing, and maintaining services on Kubernetes clusters
- Experience building analytics platforms or clickstream pipelines
- Familiarity with ML workflows and MLOps
- Experience working in a startup environment or fintech industry
The main components of our current technology stack:
- AWS Serverless, Python, Airflow, Airbyte, Temporal, PostgreSQL, Snowflake, Kubernetes, Terraform, Docker.
Why Us?
We provide 20 days of vacation leave per calendar year (plus the official national holidays of the country you are based in).
We provide full accounting and legal support in all countries where we operate.
We offer a fully remote work model, with a powerful workstation and co-working space available if you need it.
We offer a highly competitive package with yearly performance and compensation reviews.