Job Title:
Freelance Data Pipeline Engineer – Vector / Security Telemetry
Company: ThreatXIntel
Location: New Delhi, Delhi
Created: 2026-03-17
Job Type: Full Time
Job Description:
Company Description

ThreatXIntel is a growing cybersecurity, IT staffing, and consulting company delivering end-to-end technology and security solutions. Our services include cloud security, web and mobile application security testing, DevSecOps, vulnerability assessments, IT consulting, and professional staffing. We support global corporate clients by hiring and deploying skilled professionals across IT and cybersecurity domains while helping organizations strengthen security, optimize operations, and scale efficiently. ThreatXIntel is committed to enabling business growth through secure, reliable, and high-quality technology solutions.

Role Overview

We are seeking a Freelance Data Pipeline Engineer with strong expertise in Vector to design and implement scalable, modular, and reusable data flow pipelines for large-scale security telemetry environments. The consultant will build platform-agnostic ingestion frameworks capable of handling multi-source telemetry data and integrating with downstream analytics platforms such as Snowflake, Splunk, Azure Data Explorer (ADX), and Log Analytics. This role requires hands-on experience with Vector-based data ingestion pipelines, schema normalization using OCSF, and advanced data transformation and enrichment across large-scale telemetry sources.

Key Responsibilities

Data Pipeline Architecture
- Design and implement scalable data ingestion pipelines using Vector
- Build modular and reusable ingestion frameworks
- Handle ingestion from multiple sources, including Syslog, Kafka, HTTP, Azure Event Hubs, and Blob Storage

Data Processing & Transformation
- Implement data transformation logic, including filtering, enrichment, and dynamic routing
- Support format transformations such as JSON, CSV, XML, and Logfmt

Schema & Data Governance
- Implement schema normalization using the Open Cybersecurity Schema Framework (OCSF)
- Build field-mapping templates and schema validation logic
- Ensure governance and security compliance for telemetry pipelines

Security & Data Integrity
- Implement SSL/TLS security controls and client authentication
- Maintain data lineage, metadata tagging, and correlation IDs
- Ensure minimal data loss, duplication, or transformation drift

Observability & Monitoring
- Integrate pipeline monitoring and anomaly detection
- Implement logging for pipeline failures and transformation errors
- Support observability platforms and operational monitoring

Integration with Analytics Platforms
- Deliver telemetry data into platforms such as Snowflake, Splunk, Azure Data Explorer (ADX), Log Analytics, and Anvilogic

Collaboration & Documentation
- Work closely with security, analytics, and platform engineering teams
- Maintain documentation for ingestion patterns, transformation libraries, and governance standards

Mandatory Skills
- Vector (data pipeline platform)
- Security telemetry data pipelines
- Kafka / event streaming
- Data transformation & enrichment
- OCSF (Open Cybersecurity Schema Framework)
- Data pipeline architecture
- Snowflake / Splunk / ADX integrations
- Python / Groovy / JavaScript scripting
- Data governance & schema normalization
- Observability & pipeline monitoring

Required Experience
- 7+ years of experience in data engineering or security data platforms
- Strong hands-on experience building Vector-based data pipelines
- Experience managing large-scale telemetry ingestion (100+ data sources)
- Experience integrating with security analytics platforms
- Experience designing scalable ingestion frameworks
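To illustrate the kind of work described above, here is a minimal Vector configuration sketch covering multi-source ingestion, a remap transform for enrichment and correlation IDs, and fan-out to an analytics sink. All component names, addresses, topics, and tokens are placeholders, not part of this posting.

```toml
# Hypothetical Vector config: Syslog + Kafka in, normalized JSON out.

[sources.edge_syslog]
type = "syslog"
address = "0.0.0.0:514"
mode = "udp"

[sources.security_kafka]
type = "kafka"
bootstrap_servers = "kafka.internal:9092"   # placeholder broker
group_id = "vector-telemetry"
topics = ["security-events"]

[transforms.normalize]
type = "remap"
inputs = ["edge_syslog", "security_kafka"]
source = '''
# Parse JSON payloads where present and tag a correlation ID for lineage
structured, err = parse_json(.message)
if err == null { . = merge(., structured) }
.correlation_id = uuid_v4()
'''

[sinks.splunk]
type = "splunk_hec_logs"
inputs = ["normalize"]
endpoint = "https://splunk.internal:8088"    # placeholder endpoint
default_token = "${SPLUNK_HEC_TOKEN}"
encoding.codec = "json"
```

A real deployment would add TLS settings on each component and additional sinks (e.g. Snowflake or ADX) fed from the same transform, which is what makes the framework modular and reusable.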
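The field-mapping and schema-validation responsibilities can be sketched in Python. The mapping table, required-field list, and `class_uid` value below are illustrative placeholders, not the full OCSF specification; a production pipeline would load versioned mapping templates instead of hard-coding them.

```python
# Sketch of a field-mapping template for schema normalization.

REQUIRED_FIELDS = {"class_uid", "time", "severity_id"}

FIELD_MAP = {
    # raw source field -> normalized (OCSF-style) field
    "ts": "time",
    "sev": "severity_id",
    "src_ip": "src_endpoint.ip",
}


def normalize(event: dict, class_uid: int) -> dict:
    """Rename raw fields per the template and attach the event class."""
    out = {"class_uid": class_uid}
    for raw_key, value in event.items():
        out[FIELD_MAP.get(raw_key, raw_key)] = value
    return out


def validate(event: dict) -> list[str]:
    """Return the required fields missing from a normalized event."""
    return sorted(REQUIRED_FIELDS - event.keys())


raw = {"ts": 1710000000, "sev": 3, "src_ip": "10.0.0.5", "msg": "login"}
normalized = normalize(raw, class_uid=3002)
print(validate(normalized))  # [] -> event passes schema validation
```

Keeping the mapping declarative (a dict or template file rather than per-source code) is what makes the framework reusable across the 100+ telemetry sources mentioned in the requirements.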