Job Title:
Data Engineer
Company: Creditsafe Technology
Location: Hyderabad, Telangana
Created: 2026-03-09
Job Type: Full Time
Job Description:
Creditsafe, the global business intelligence experts, specialize in business credit checking and all-round B2B solutions. As the world’s most used provider of online business credit reports, we’ve changed the way business information is used worldwide through our passion and drive to deliver superior business data. With credit information on over 430 million businesses across the globe, Creditsafe delivers the most accurate and up-to-date information available in an easy-to-use format for businesses of all sizes. All major credit insurers endorse Creditsafe, meaning our credit scores and limits are among the most trusted in the industry and can predict almost 70% of all insolvencies up to 12 months in advance. Our investment in creating the world’s most predictive scorecard ensures our clients are aware of potential risks in advance, helping them make the right moves to protect their business. We make our company credit reports as simple as possible, so everyone in a business can use them without a financial background. Thanks to our ease of use, international reach and continuous improvement of our products, Creditsafe are proud to maintain a 95% customer retention rate. With 26 offices across 16 countries, Creditsafe offer instant access to company credit reports in over 200 countries worldwide.

Job Summary:
You will work closely with the database and data engineering teams, building systems that facilitate the extraction and transformation of Creditsafe data. The role will define and build data pipelines that enable faster, better, data-informed decision-making both within the business and for Creditsafe customers. This is an opportunity to gain exposure to big data architectures and MPP (massively parallel processing) systems.
Key Responsibilities:
- Provide mentorship to team members by teaching standards for code maintainability and performance.
- Work as part of an Agile team to develop, test and maintain high-quality systems that fulfil business needs.
- Extract data from various sources (for example, relational databases, files and APIs).
- Help evolve our data platform with a view towards growth and high throughput.
- Apply practices such as continuous integration and test-driven development to enable the rapid delivery of working code.
- Design and build metadata-driven data pipelines using Python and SQL, in accordance with guidelines set by the Data Architect.
- Ship medium to large features independently using industry-standard processing patterns.

Required Skills & Qualifications:
- 4–6 years of experience.
- Solid development experience within a commercial environment creating production-grade ETL pipelines in Python.
- Comfortable implementing data architectures in analytical data warehouses such as Snowflake, Redshift or BigQuery (Redshift preferred).
- Hands-on experience with data orchestrators such as Airflow, Prefect, Dagster or Luigi (Airflow preferred).
- Knowledge of Agile development methodologies.
- Awareness of cloud technology, particularly AWS.
- Knowledge of automated delivery processes.
- Experience designing and building autonomous data pipelines.
- Hands-on experience of engineering best practices (handling and logging errors, system monitoring, and building human-fault-tolerant applications).
- Ability to write efficient code, and comfortable undertaking system optimisation and performance-tuning tasks.
- Comfortable working with relational databases such as Oracle, PostgreSQL, MySQL and MariaDB (PostgreSQL preferred).

Benefits:
- Competitive salary.
- Performance bonus scheme.
- 25 days annual leave (plus 10 bank holidays).
- Hybrid working model.
- Healthcare and company pension.
- Global company gatherings and events.
- E-learning and excellent career progression opportunities.
- Gratuity.
- Parents' insurance and accidental insurance.
- Cab facility for women.