Deqode
AWS Data Engineer - ETL/PySpark
Job Location
Pune, India
Job Description
Here is the job description:
Location: Viman Nagar, Pune
Mode: 5 days working (on-site)

Required Tech Skills:
- Proficient in PySpark and Python, with expertise in developing scalable data processing solutions.
- Strong understanding of data structures, enabling efficient algorithm design and implementation.
- Skilled in SQL query optimization, ensuring performance efficiency in data retrieval and manipulation.
- Expertise in NoSQL databases, including schema design and integration with large-scale applications.
- Solid fundamentals of object-oriented programming (OOP) for building modular and maintainable code.
- Hands-on experience with AWS cloud services, leveraging cloud infrastructure for data processing and analytics.
- Expertise in Big Data technologies, including distributed computing and large-scale data processing.
- Experience with data lakes, covering storage and retrieval of structured and semi-structured data.
- Proficient in AWS Glue for ETL processes, transforming raw data into meaningful insights.
- Skilled in AWS Athena, executing complex queries on large datasets stored in Amazon S3.
- Hands-on experience with AWS Kinesis, managing real-time data ingestion and processing.
- Knowledge of AWS S3, utilizing cloud storage for scalable and cost-effective data solutions.
- Experience in integrating PySpark with AWS services, optimizing performance in data processing workflows (see the sketch after this list).
- Strong problem-solving skills, with the ability to optimize Spark jobs and enhance computational efficiency.
- Capable of designing and developing data-driven solutions that cater to modern business requirements.
(ref:hirist.tech)
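For orientation, the following is a minimal, illustrative sketch of the kind of PySpark-on-AWS ETL workflow the role describes: reading raw data from S3, applying DataFrame transformations, and writing partitioned Parquet that Athena can query. Bucket paths and column names are hypothetical placeholders, and it assumes an environment (such as AWS Glue or EMR) where the S3 connector is already configured; it is not a prescribed implementation for this role.

```python
# Illustrative PySpark ETL sketch (hypothetical buckets and columns).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl-sketch").getOrCreate()

# Extract: read raw semi-structured data from a (hypothetical) data-lake bucket.
raw = spark.read.json("s3://example-raw-bucket/orders/")

# Transform: basic cleansing and a daily aggregate, typical DataFrame work.
orders = (
    raw.filter(F.col("status") == "COMPLETED")
       .withColumn("order_date", F.to_date("created_at"))
)
daily_revenue = (
    orders.groupBy("order_date")
          .agg(F.sum("amount").alias("revenue"))
)

# Load: write partitioned Parquet back to S3, a layout Athena can query directly.
daily_revenue.write.mode("overwrite").partitionBy("order_date") \
    .parquet("s3://example-curated-bucket/daily_revenue/")

spark.stop()
```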
Location: Pune, IN
Posted Date: 5/10/2025
Contact Information
Contact: Human Resources, Deqode