Jeavio
Jeavio - Senior Data Engineer - Python/ETL
Job Location
Vadodara, India
Job Description
We are seeking an experienced Senior Data Engineer to join our team. The ideal candidate will have a strong background in data engineering and AWS infrastructure, with hands-on experience building and maintaining data pipelines and the supporting infrastructure components. The role involves using a mix of data engineering tools and AWS services to design, build, and optimize data platforms.

Responsibilities:
- Design, develop, and maintain data pipelines using Airflow and AWS services.
- Implement and manage data warehousing solutions with Databricks and PostgreSQL.
- Automate tasks using Git and Jenkins.
- Develop and optimize ETL processes, leveraging AWS services such as S3, Lambda, AppFlow, and DMS.
- Create and maintain visual dashboards and reports using Looker.
- Collaborate with cross-functional teams to ensure smooth integration of infrastructure components.
- Ensure the scalability, reliability, and performance of data platforms.
- Work with Jenkins for infrastructure automation.

Functional Areas of Expertise:
- Working as a senior individual contributor on a data-intensive project.
- Strong experience building high-performance, resilient, and secure data processing pipelines, preferably using a Python-based stack.
- Extensive experience building data-intensive applications, with a deep understanding of querying and modeling with relational databases, preferably on time-series data.
- Intermediate proficiency in AWS services (S3, Airflow).
- Proficiency in Python and PySpark.
- Proficiency with ThoughtSpot or Databricks.
- Intermediate proficiency in database scripting (SQL).
- Basic experience with Jenkins for task automation.

Nice to Have:
- Intermediate proficiency in data analytics tools (Power BI / Tableau / Looker / ThoughtSpot).
- Experience working with AWS Lambda, Glue, AppFlow, and other AWS transfer services.
- Exposure to PySpark and automation tools like Jenkins or CircleCI.
- Familiarity with Terraform for infrastructure-as-code.
- Experience in data quality testing to ensure the accuracy and reliability of data pipelines.
- Proven experience working directly with U.S. client stakeholders.
- Ability to work independently and take the lead on projects.

Education and Experience:
- Bachelor's or Master's degree in Computer Science or a related field.
- 5+ years of relevant experience.

Skills Needed:
- Databricks
- PostgreSQL
- Python and PySpark
- AWS stack
- Power BI / Tableau / Looker / ThoughtSpot
- Familiarity with Git and/or CI/CD tools

(ref:hirist.tech)
Location: Vadodara, IN
Posted Date: 5/13/2025
Contact Information
Contact: Human Resources, Jeavio