
Job Title:

Software Engineer III - Data Engineer

Company: Wayfair

Location: Bangalore, Karnataka

Created: 2026-01-09

Job Type: Full Time

Job Description:

Candidates for this position are preferred to be based in Bangalore and will be expected to comply with their team's hybrid work schedule requirements.

Who We Are:

Wayfair is on a path to be the world's largest online destination for the home. We are the largest tech-first platform in the home category. Our marketplace offers over 30 million products from 23,000 suppliers, and we served 23 million customers in 2024 alone.

Wayfair is investing heavily in building a world-class advertising business, and the Wayfair Advertising team owns features that form the core of our onsite advertising business. We ensure that the millions of ads we serve are relevant to our customers by enabling advertisers and agencies to connect with the right customers at the right time with the right products. We are highly motivated, collaborative, and fun-loving, with an entrepreneurial spirit and a bias for action. With a broad mandate to experiment and innovate, we are growing at an unprecedented rate with a seemingly endless range of new opportunities.

What you'll do:

- Drive the design, development, and launch of new data models, data pipelines, and data products focused on Search and Recommendations.
- Help teams push the boundaries of analytical insights, create new product features using data, and power machine learning models.
- Build cross-functional relationships to understand data needs, build key metrics, and standardize their usage across the organization.
- Utilize current and leading-edge technologies in software engineering, big data, streaming, and cloud infrastructure.
- Be a technical mentor to junior engineers.

We Are a Match Because You Have:

- Bachelor's or Master's degree in Computer Science or a related technical field, or an equivalent combination of education and experience.
- 6+ years of relevant work experience in the data engineering field with web-scale data sets.
- Expertise with big data technologies and tools such as Hadoop, Spark, Hive, Presto, and Airflow.
- Experience with cloud platforms such as GCP, using technologies like BigQuery, Dataproc, GCS, Cloud Composer, and Dataflow, or related technologies in AWS or Azure.
- Comfort designing and implementing data warehouse architecture, OLAP technologies, and star/snowflake schemas to enable self-service tooling.
- Expertise in at least one object-oriented or scripting language (Java, Scala, Python, etc.) and SQL.
- Experience with real-time data streaming tools such as Flink, Kafka, Beam, or similar.
- Experience designing data models for traditional relational databases or big data stores.
- Strong understanding of algorithms, data structures, data architecture, and technical design.
- Excellent communication and presentation skills, strong business acumen, critical thinking, and the ability to work cross-functionally through collaboration with engineering and business partners.
- Experience with domain-driven design, event modeling, and event sourcing is preferred.
