




Summary: Seeking a highly proficient Senior Big Data Engineer to develop and implement scalable, resilient systems using data lake solutions and stream processing engines.

Highlights:

1. Develop and implement scalable, resilient systems utilizing data lake solutions
2. Build and maintain data pipelines using stream processing engines
3. Extensive experience in Databricks, Python, AWS, and Data Warehousing

We are seeking a highly proficient **remote Senior Big Data Engineer** with extensive experience in Databricks, Python, AWS, and Data Warehousing. The ideal candidate is well-versed in software development lifecycle (SDLC) methodologies and proficient in advanced SQL coding. As a Senior Big Data Engineer, you will develop and implement scalable, resilient systems that use data lake solutions and stream processing engines to meet business goals.

**Responsibilities**

* Develop and implement scalable, resilient systems utilizing data lake solutions
* Participate in the full software development lifecycle (SDLC) using Agile methodologies
* Build and maintain data pipelines using stream processing engines such as Kafka and Spark
* Perform advanced data analysis and modeling using relevant programming languages
* Ensure data quality and integrity through proper testing and validation techniques

**Requirements**

* Bachelor's or Master's degree in Computer Science, Information Systems, or equivalent
* At least 3 years of relevant experience in Data Software Engineering
* Proficiency with software development lifecycle (SDLC) methodologies such as Agile and test-driven development
* Extensive experience in Databricks, Python, AWS, and Data Warehousing
* Advanced SQL coding skills
* B2+ English level

**Nice to have**

* Experience with Apache Airflow and Apache Spark
* Expertise in relational database design


