Jenni Kayne

Senior Data Engineer


Job Location

Los Angeles, CA, United States

Job Description

Jenni Kayne is a California-based lifestyle brand that aims to empower an elevated approach to everyday living. Whether it's our edited style ethos or coveted interiors sensibility, we work hard to create a world that's inviting and intentional. From our stores across the country to our operations and corporate teams, we believe in the power of a workplace that's built on diversity and inclusion—where the varied voices and viewpoints of our community pave the way.

About This Role:

Jenni Kayne is looking for a Senior Data Engineer with expertise in the design, development, testing, and deployment of large Lake House (data lake and enterprise data warehouse) solutions using cloud technologies and a modern data stack. This is an exciting role for individuals looking for an entrepreneurial environment with clear ownership and the opportunity to make a direct business impact.

Role and Responsibilities:

As the Senior Data Engineer, your primary responsibilities include the following:

  • Building data pipelines: Create, maintain, and optimize workloads from development to production for specific business use cases.
  • Responsible for using innovative and modern tools, techniques, and architectures to drive automation of the most common, repeatable data preparation and integration tasks, with the goal of reducing defects and improving productivity.
  • Responsible for the data engineering architecture and framework that seamlessly integrate different business applications and data sources in different formats (API, XML, JSON, CSV, etc.)
  • Create a master data strategy for customers and products by unifying disparate customer and product sources across digital and offline (retail) channels.
  • Develop and follow data integration and data quality standards across all development initiatives according to the organization's policies as well as best practices.
  • Assist with data management infrastructure, data governance, data observability, and integration with metadata management tools and techniques (TBD in the future).
  • Continuously track data consumption in collaboration with the Analytics and Data Science teams to prioritize the highest-impact projects.
  • Triage data issues by analyzing end-to-end data pipelines and working with data analysts and business users to troubleshoot and resolve data quality or pipeline issues.
  • Work in an agile model alongside data architects, data analysts, data scientists, business partners, and other developers to deliver data solutions.
  • Build and continuously manage the data lake and enterprise data-warehouse pipelines and shared transformation libraries for code reusability, speed to market and lineage.

Qualifications:

  • Requires a bachelor's degree or equivalent experience.
  • Requires at least 6 years of prior relevant experience.
  • Hands-on experience with programming languages including SQL and Python on cloud data platforms like Snowflake, Redshift, etc.
  • Strong technical understanding of data modeling (dimensional model), master data management, data integration, data architecture, data warehousing and data quality techniques
  • Working knowledge of Git repositories (Bitbucket, GitHub), CI/CD (Jenkins, etc.), and software development tools, including incident tracking, version control, release management, change management (Atlassian toolset – Jira/Confluence), testing tools and systems, and scheduling software (Airflow)
  • Experience working with popular BI software tools like Looker, Tableau, Qlik, Power BI, etc.
  • Nice to have: Experience with enterprise ELT platforms like Talend and Fivetran, and the flexibility to build an in-house transformation code base using SQL, Python, Airflow, etc.
  • Basic experience working with data governance and data security, specifically partnering with information stewards and privacy and security officers to move data pipelines into production with the appropriate data quality, governance, and security standards and certifications.
  • Adept in agile methodologies and capable of applying DevOps and, increasingly, DataOps principles to data pipelines to improve the integration, reuse, and automation of data flows and to strengthen data trust and democratization.

Physical Requirements:

  • Prolonged periods sitting at a desk and working on a computer
  • Must be able to move and lift heavy objects (15 pounds or more) from time to time as required

Additional Notes:

This job description is not all-inclusive. In addition, Kayne, LLC DBA Jenni Kayne reserves the right to amend this job description at any time. Kayne, LLC DBA Jenni Kayne is committed to a diverse and inclusive work environment.

The annual base salary range for this position is $120,000 - $165,000. The base salary is determined by experience, education, skills, and location.


Posted Date: 12/6/2023

Contact Information

Contact Human Resources
Jenni Kayne


AboutJobs.com does not guarantee the validity or accuracy of the job information posted in this database. It is the job seeker's responsibility to independently review all posting companies, contracts and job offers.