Data Engineer / Sr Data Engineer – Python, SQL, Big Data

4–10 Yrs       Gurgaon


The Company

Our client is a world leader in information services. They combine data, analytics, and expertise to provide solutions for business, finance, and government across the world. The company has 50,000 customers across 140 countries, including 80% of the Fortune Global 500 and 94 of the 100 largest US corporations.

They provide automotive dealers with AI/behavior-prediction analytics software and marketing solutions that improve the vehicle purchase process and its results. The company’s cloud-based technology helps dealers precisely predict automobile-buying behavior and automates the creation of micro-targeted customer communications, leading to proven higher sales and more consistent customer retention.

The Job

The company is seeking to hire a Senior Data Engineer for its software-as-a-service platform, which helps automotive dealerships and sales teams better understand and predict exactly which customers are ready to buy, the reasons why, and the key offers and incentives most likely to close the sale. Its micro-marketing engine then delivers the right message at the right time to those customers, ensuring higher conversion rates and a stronger ROI.


Key responsibilities include:


  • Build and support data ingestion and processing pipelines.
  • This entails extracting, loading, and transforming ‘big data’ from a wide variety of sources, both batch and streaming, using the latest data frameworks and technologies.
  • Build real-time monitoring dashboards and alerting systems.
  • Warehouse this data; build data marts, data aggregations, metrics, KPIs, and business logic that lead to actionable insights into our product efficacy, marketing platform, customer behaviour, retention, etc.
Your Profile
  • Experience in Big Data and Data Engineering.
  • Strong knowledge of advanced SQL, data warehousing concepts, and data mart design.
  • Strong experience with modern data platform components such as Spark, Python, etc.
  • Experience with setting up and maintaining data warehouses (Google BigQuery, Redshift, Snowflake) and data lakes (GCS, AWS S3, etc.).
  • Experience with relational SQL and NoSQL databases, including Postgres and Cassandra / MongoDB.
  • Experience with data pipeline and workflow management tools: Airflow, Dataflow, Dataproc, etc.
  • Experience with distributed version control environments such as Git and Azure DevOps.
  • Experience building Docker images and promoting/deploying them to production; experience with Kubernetes container orchestration (creating pods, ConfigMaps, and deployments) provisioned using Terraform.
  • Should be able to translate business queries into technical documentation.
  • Good to have: experience with one of the cloud providers (GCP, Azure, AWS).
  • Exposure to any Business Intelligence (BI) tool such as Tableau, Dundas, Power BI, etc.
  • Familiarity with Agile software development methodologies.