
Data Engineer (Python, Kafka, Flink)

We’re hiring! Join Itexus today!

We are looking for a Senior Software Engineer / Data Engineer (Python) to join our team and contribute to building and maintaining data pipelines for a large-scale aviation platform.

This is a Data Engineering role focused on pipeline development, not general backend or pure Python development. You will work with real-time and batch data processing, supporting data flow across multiple systems.

In this role, you will primarily focus on developing and maintaining data pipelines using Python, Kafka, Flink, and Kestra, working with predefined schemas and handling large-scale data movement.

About the project:

You will join a long-term project in the aviation industry for a US-based client.

The platform is a cloud-based, AI-driven solution designed to improve aircraft uptime and operational efficiency. It supports airlines and aviation teams worldwide by providing real-time insights, predictive analytics, and data-driven decision-making tools.

The system integrates multiple aviation data sources, processes large volumes of operational and regulatory data, and delivers it through dashboards and analytics tools.

As part of the team, you will work on:

  • Building real-time and batch data pipelines
  • Supporting data flow across multiple systems
  • Ensuring reliable data delivery for analytics and AI use cases
  • Working with a high-load system (~1M records/day)

Location: Poland preferred; open to candidates across Europe

Responsibilities:

  • Build and maintain real-time data pipelines using Kafka, Flink, and Kestra
  • Develop and support batch pipelines using Python
  • Write Python scripts for ETL and real-time integrations (60–70% of the time)
  • Move data from SQL Server to MariaDB, Solr, and the data lake
  • Work with predefined schemas (no schema design required)
  • Maintain and improve existing pipelines (bug fixing, optimization)
  • Write SQL queries (mainly SELECT)
  • Handle data processing in a high-load environment without modern DWH tools
  • Perform data cleanup and maintenance activities
  • Support infrastructure-related tasks (deployment, monitoring, data lifecycle management)

Requirements:

  • 4+ years of experience in Data Engineering
  • Strong proficiency in Python (ETL, data processing)
  • Experience building real-time pipelines (Kafka, Flink, Kestra)
  • Experience building batch pipelines using Python
  • Solid SQL knowledge (SELECT queries)
  • Experience working with high-load systems
  • Ability to work with existing/legacy codebases
  • Understanding of data pipeline architecture and data flow
  • Willingness to work with low-level/manual solutions (no Snowflake/Databricks)
  • English level B2+

Nice to have:

  • Experience with Apache Solr / MariaDB
  • Experience with data lake environments
  • Experience with AWS / Cloudflare / Kubernetes
  • Understanding of data lifecycle management
  • Experience in security-related tasks (XDR, threat remediation)

We would be happy to share more details during our meeting!

Want to join the Itexus team? Go for it!

Contact me for more details

Maria Karseko
HR Specialist
[email protected]
