

Learn more about opportunities in Alkeon’s VC Portfolio companies

Data Engineer - Employee Success

Own Company

Software Engineering, Data Science
Hyderabad, Telangana, India
Posted on Mar 18, 2026

Description

Salesforce’s Employee Success (ES) Product Management team is advancing AI-driven data initiatives to expand our data capabilities across all ES functions. A key priority is developing a robust data and AI foundation to support innovation, automation, and data-driven decision-making.

We are seeking a motivated Data Engineer to join our team and help build and maintain the data foundations that power ES analytics and AI-driven agents. You will support the development of data pipelines, assist in data modeling efforts, and help ensure data quality across our Snowflake and Data360 environments. Working alongside senior engineers and ES business partners, you will learn to translate functional requirements into technical solutions while gaining hands-on experience with modern data engineering tools and practices. This is an excellent opportunity for someone early in their career who wants to grow their skills in a fast-paced environment at the intersection of data engineering, people analytics, and AI.

Key Responsibilities

  • Data Engineering: Develop and maintain ETL/ELT pipelines using Apache Airflow and Python under the guidance of senior engineers.

  • Data Foundations: Assist in building and maintaining data models within Snowflake and Salesforce Data360, ensuring data accuracy and consistency.

  • Graph Development: Support the team in maintaining Neo4j data graphs, writing basic Cypher queries, and loading data into graph structures.

  • DevOps & CI/CD: Participate in CI/CD workflows using Git, following team standards for code versioning, testing, and deployment.

  • Data Quality: Execute QA testing on pipelines and datasets, flag anomalies, and help document data lineage and transformation logic.

  • Integration Support: Assist in building and maintaining APIs and data connections between Snowflake, AWS services (S3, Lambda), and other platforms.

  • Documentation: Create and maintain clear technical documentation for pipelines, data models, and processes.

  • Collaboration: Work with senior engineers and business partners to understand requirements and contribute to solution delivery.
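As a rough illustration of the data-quality work described above (not part of the role description), a QA check that scans a batch of records and flags anomalies might look like the following sketch; the field names ("employee_id", "dept") are hypothetical examples, not actual ES schemas.

```python
# Minimal sketch of a batch QA check: flag records with missing keys,
# duplicate keys, or empty required fields, returning (row_index, reason)
# pairs that a pipeline step could log or route for review.

def flag_anomalies(records, key_field="employee_id", required_fields=("dept",)):
    """Return a list of (row_index, reason) tuples for anomalous records."""
    anomalies = []
    seen_keys = set()
    for i, row in enumerate(records):
        key = row.get(key_field)
        if key is None:
            anomalies.append((i, f"missing {key_field}"))
            continue
        if key in seen_keys:
            anomalies.append((i, f"duplicate {key_field}: {key}"))
        seen_keys.add(key)
        for field in required_fields:
            if row.get(field) in (None, ""):
                anomalies.append((i, f"missing {field}"))
    return anomalies

batch = [
    {"employee_id": 1, "dept": "ES"},
    {"employee_id": 1, "dept": "IT"},   # duplicate key
    {"employee_id": 2, "dept": ""},     # empty required field
]
print(flag_anomalies(batch))
# [(1, 'duplicate employee_id: 1'), (2, 'missing dept')]
```

In practice a check like this would run as a task inside an Airflow DAG, with flagged rows written to an audit table rather than printed.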

Required Skills/Experience

  • 5+ years of relevant work experience in information systems or a related field.

  • Foundational proficiency in SQL, Python, Informatica IICS, and dbt.

  • Proficiency in Apache Airflow and Snowflake.

  • Understanding of data modeling concepts (dimensional modeling, Star/Snowflake schema).

  • Understanding of API protocols (REST/JSON) to support Agentic workflows.

  • Familiarity with Git and version control workflows.

  • Exposure to, or willingness to learn, cloud data platforms such as AWS.

  • Strong attention to detail and a quality-first mindset.

  • Curiosity and eagerness to learn — you don't need to know everything, but you should want to figure it out.

  • Good communication skills and a collaborative, team-first attitude.

  • Degree or equivalent relevant experience required. Experience will be evaluated based on the core competencies for the role (e.g. extracurricular leadership roles, military experience, volunteer roles, work experience, etc.)
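For context on the dimensional-modeling concept listed in the requirements above, the classic star-schema shape (a fact table joined to dimension tables on surrogate keys) can be sketched as follows. This uses SQLite purely as a stand-in engine; the table and column names are hypothetical examples.

```python
# Tiny star-schema illustration: one fact table joined to one dimension
# table on a surrogate key, then aggregated by a dimension attribute.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_department (dept_key INTEGER PRIMARY KEY, dept_name TEXT);
    CREATE TABLE fact_headcount (dept_key INTEGER, headcount INTEGER);
    INSERT INTO dim_department VALUES (1, 'Engineering'), (2, 'People');
    INSERT INTO fact_headcount VALUES (1, 120), (1, 30), (2, 45);
""")

# The canonical star-schema query: join fact to dimension, group by the
# descriptive attribute, aggregate the measure.
rows = conn.execute("""
    SELECT d.dept_name, SUM(f.headcount)
    FROM fact_headcount f
    JOIN dim_department d USING (dept_key)
    GROUP BY d.dept_name
    ORDER BY d.dept_name
""").fetchall()
print(rows)
# [('Engineering', 150), ('People', 45)]
```

A snowflake schema differs only in that the dimension tables are further normalized into sub-dimensions, adding joins in exchange for less redundancy.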