




Summary: Seeking a Senior Data Integration Developer to implement ETLs using SQL and Python, perform data modeling, and collaborate in an Agile environment.

Highlights:

1. Implement ETLs using SQL and Python
2. Perform data modeling to design efficient and scalable data models
3. Work in an Agile environment, delivering small increments

We are seeking a highly skilled remote **Senior Data Integration Developer** to join our team and build ETLs. As a Senior Data Integration Developer, you will be responsible for gathering business requirements from business analysts, implementing ETLs using SQL and Python, and performing data modeling. You will work in an Agile environment, delivering small increments from a single product backlog. If you are passionate about data engineering and have a proven track record of implementing ETLs using SQL and Python, we invite you to join our team.

**Responsibilities**

* Communicate with business analysts to gather business requirements
* Implement ETLs using SQL and Python
* Perform data modeling to design efficient and scalable data models
* Conduct unit testing to ensure high-quality deliverables
* Work in an Agile environment, delivering small increments from a single product backlog
* Collaborate with cross-functional teams to ensure seamless data integration
* Provide technical guidance and support to junior developers

**Requirements**

* A minimum of 3 years of experience in data integration, with demonstrated expertise in ETL/ELT solutions and Python
* Proven experience implementing ETLs using SQL and Python and delivering high-quality solutions
* Hands-on experience in data modeling, enabling you to design efficient and scalable data models
* Working knowledge of AWS technologies, including PySpark, Glue, Redshift, and Kinesis (good to have)
* Excellent communication skills and strong critical thinking to effectively convey feedback and insights
* Good organizational skills and a detail-oriented mindset, crucial for meticulous testing
* Fluent spoken and written English at an Upper-Intermediate level or higher

**Nice to have**

* Familiarity with data architecture services related to data modeling and AWS big data architecture
* Experience with data delivery services based on ETL tools (PDI, BODS, Glue, Python) and data engineering based on Redshift DB functionality


