AKSHAYA BUSINESS IT SOLUTIONS PRIVATE LIMITED

Data Engineer - ETL/SQL

Job Location

Bangalore, India

Job Description

Responsibilities:

- Design, build, and maintain robust, scalable ETL/ELT pipelines using programming languages, scripting, and data integration tools to ingest, transform, and load data from diverse sources into data warehouses, data lakes, or other data storage systems.
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and translate them into technical solutions.
- Develop and optimize data models for efficient storage, retrieval, and analysis of large datasets.
- Implement and maintain data quality checks, validation rules, and monitoring processes to ensure the accuracy, consistency, and reliability of data.
- Work with various database technologies, including relational databases (PostgreSQL, MySQL, SQL Server) and NoSQL databases (MongoDB, Cassandra).
- Utilize big data technologies and frameworks such as Hadoop, Spark, Hive, or Kafka to process and analyze large volumes of data.
- Design and implement data solutions on cloud platforms (AWS, Azure, GCP), leveraging cloud-native data services (S3, Redshift, Snowflake, BigQuery, Azure Data Lake Storage, Databricks).
- Automate data workflows, monitoring, and alerting processes using scripting languages and automation tools.
- Troubleshoot and resolve issues related to data pipelines, data quality, and data infrastructure.
- Optimize the performance and cost-efficiency of data processing jobs and storage solutions.
- Create and maintain comprehensive technical documentation for data pipelines, data models, and data infrastructure.
- Stay up to date with the latest trends and advancements in data engineering technologies and best practices.
- Participate in code reviews and provide technical guidance to other data engineers.
- Ensure adherence to data governance policies and security best practices.

Skills:

- Programming Languages: Strong proficiency in at least one programming language commonly used in data engineering, such as Python (with libraries like Pandas and PySpark), Scala, or Java.
- Databases: Expertise in working with relational databases (PostgreSQL, MySQL, SQL Server) and writing complex SQL queries. Experience with NoSQL databases is a plus.
- ETL/ELT Tools: Experience with ETL/ELT tools and frameworks (Talend, Informatica, Apache NiFi, AWS Glue, Azure Data Factory, dbt).
- Big Data Technologies: Hands-on experience with big data technologies and frameworks such as Hadoop, Spark, Hive, or Kafka.
- Cloud Platforms: Proven experience working with at least one major cloud platform (AWS, Azure, or GCP) and its data services.
- Data Warehousing: Strong understanding of data warehousing concepts, data modeling techniques (dimensional modeling), and data lake architectures.
- Data Quality: Experience in implementing data quality checks and validation processes.
- Automation and Scripting: Proficiency in scripting languages (Bash, Python) for automating data workflows.
- Version Control: Proficient in using Git for version control.
- CI/CD (Beneficial): Familiarity with CI/CD pipelines for data engineering workflows is a plus.
- Data Streaming (Beneficial): Experience with real-time data streaming technologies like Kafka or Kinesis is a plus.

Soft Skills:

- Strong analytical and problem-solving skills, with the ability to understand complex data requirements and translate them into technical solutions.
- Excellent communication (verbal and written) and interpersonal skills to collaborate effectively with data scientists, analysts, and business stakeholders.
- Ability to work independently and manage tasks effectively in a fast-paced environment.
- Strong attention to detail and a commitment to data quality and accuracy.
- Ability to learn and adapt to new technologies and frameworks.

Qualifications:

- Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related quantitative field.
- Minimum of 8 years of hands-on experience as a Data Engineer.
- Proven track record of successfully designing, building, and maintaining scalable data pipelines and infrastructure.
- Strong technical skills in programming, databases, and ETL/ELT processes.
- Experience with big data technologies and cloud platforms.
- Excellent problem-solving and communication skills.

Bonus Points:

- Experience with specific cloud data services on AWS, Azure, or GCP.
- Experience with data governance and data catalog tools.
- Familiarity with data science workflows and requirements.
- Relevant cloud or data engineering certifications.
- Experience with data visualization tools (Tableau, Power BI).

(ref:hirist.tech)


Posted Date: 5/15/2025

Contact Information

Contact Human Resources
AKSHAYA BUSINESS IT SOLUTIONS PRIVATE LIMITED

UID: 5153870684
