
Databricks

Cognizant
Full-time
On-site
Telangana
Technology & Engineering


Job Summary

We are seeking a skilled Developer with 4 to 8 years of experience to join our team in a hybrid work model. The ideal candidate will have expertise in Spark in Scala, Delta Sharing, Databricks Unity Catalog administration, and other Databricks tools. A strong background in Property & Casualty Insurance is essential. This role involves working on innovative projects that enhance our data processing capabilities and contribute to our company's success.


Responsibilities

  • Develop and maintain scalable data processing solutions using Spark in Scala to enhance data analytics capabilities (a brief illustrative sketch follows the qualifications below).
  • Implement and manage Delta Sharing to ensure seamless data exchange across platforms.
  • Administer Databricks Unity Catalog to maintain data governance and security.
  • Utilize Databricks CLI for efficient management of Databricks resources and workflows.
  • Integrate Amazon S3 for robust data storage solutions and ensure data accessibility.
  • Leverage Python for scripting and automation to streamline data processing tasks.
  • Design and execute complex queries using Databricks SQL to extract meaningful insights.
  • Optimize data storage and retrieval using Databricks Delta Lake for improved performance.
  • Develop and manage Databricks Workflows to automate data pipelines and processes.
  • Implement PySpark for large-scale data processing and analysis to support business objectives.
  • Collaborate with cross-functional teams to align data solutions with business needs.
  • Ensure data quality and integrity through rigorous testing and validation processes.
  • Stay updated with the latest industry trends and technologies to drive innovation.


Qualifications

  • Possess strong expertise in Spark in Scala and Databricks tools for data processing.
  • Demonstrate proficiency in Python and PySpark for scripting and automation.
  • Have experience with Amazon S3 for data storage and management.
  • Exhibit a solid understanding of the Property & Casualty Insurance domain.
  • Show capability in managing data governance with Databricks Unity Catalog.
  • Display strong analytical skills with Databricks SQL for data insights.
  • Have a proactive approach to learning and implementing new technologies.
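
The following is a minimal illustrative sketch of the kind of Spark in Scala, Amazon S3, and Delta Lake work described in the responsibilities above. All bucket, catalog, table, and column names are hypothetical and are not part of this posting or the actual environment.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.col

    object ClaimsIngest {
      def main(args: Array[String]): Unit = {
        // On Databricks the SparkSession is normally provided by the runtime;
        // the builder mainly matters when running the job outside a notebook.
        val spark = SparkSession.builder()
          .appName("claims-ingest")
          .getOrCreate()

        // Read raw Property & Casualty claims data from S3 (hypothetical bucket and path).
        val claims = spark.read
          .option("header", "true")
          .csv("s3://example-bucket/raw/claims/")

        // Basic cleanup: keep open claims and cast the loss amount to a numeric type.
        val cleaned = claims
          .filter(col("status") === "OPEN")
          .withColumn("loss_amount", col("loss_amount").cast("double"))

        // Persist as a Delta table governed by Unity Catalog (hypothetical catalog and schema).
        cleaned.write
          .format("delta")
          .mode("overwrite")
          .saveAsTable("insurance_catalog.claims.open_claims")

        spark.stop()
      }
    }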