
Databricks

Cognizant
Full-time
On-site
Telangana
Technology & Engineering


Job Summary

We are seeking a highly skilled Sr. Developer with 8 to 12 years of experience to join our team. The ideal candidate will have extensive experience with Spark (Scala), Delta Sharing, Databricks Unity Catalog administration, the Databricks CLI, Delta Live Tables pipelines, Structured Streaming, risk management, Apache Airflow, Amazon S3, Amazon Redshift, Python, Databricks SQL, Databricks Delta Lake, Databricks Workflows, and PySpark. This is a hybrid role with day shifts and no travel required.


Responsibilities

  • Develop and maintain scalable data pipelines using Spark in Scala to ensure efficient data processing and transformation.
  • Implement and manage Delta Sharing to facilitate secure and efficient data sharing across various platforms.
  • Administer Databricks Unity Catalog to ensure proper data governance and access control.
  • Utilize Databricks CLI for seamless integration and automation of Databricks workflows.
  • Design and implement Delta Live Tables pipelines to automate data processing and ensure data quality.
  • Leverage Structured Streaming to handle real-time data processing and analytics.
  • Apply risk management techniques to identify and mitigate potential data-related risks.
  • Integrate and manage Apache Airflow for orchestrating complex data workflows.
  • Utilize Amazon S3 for scalable and secure data storage solutions.
  • Implement data warehousing solutions using Amazon Redshift for efficient data querying and reporting.
  • Develop and maintain Python scripts for data processing and automation tasks.
  • Utilize Databricks SQL for querying and analyzing large datasets.
  • Implement Databricks Delta Lake to ensure data reliability and performance.
  • Manage Databricks Workflows to streamline data processing and analytics tasks.
  • Develop and maintain PySpark applications for large-scale data processing.
  • Collaborate with cross-functional teams to ensure seamless data integration and delivery.
  • Provide technical expertise and support to team members and stakeholders.
  • Ensure all data solutions align with company goals and industry best practices.
  • Continuously monitor and optimize data workflows for performance and efficiency.
  • Stay updated with the latest industry trends and technologies to ensure the company remains competitive.
  • Contribute to the company's mission by delivering high-quality data solutions that drive business success.


Qualifications

  • Must have extensive experience with Spark (Scala), Delta Sharing, Databricks Unity Catalog administration, the Databricks CLI, Delta Live Tables pipelines, Structured Streaming, risk management, Apache Airflow, Amazon S3, Amazon Redshift, Python, Databricks SQL, Databricks Delta Lake, Databricks Workflows, and PySpark.
  • Strong problem-solving skills and the ability to work independently.
  • Excellent communication and collaboration skills.
  • Proven track record of delivering high-quality data solutions.
  • Ability to work in a hybrid work model.
  • Strong understanding of data governance and security best practices.
  • Experience with real-time data processing and analytics.
  • Knowledge of data warehousing and ETL processes.
  • Familiarity with cloud-based data storage and processing solutions.
  • Ability to manage and optimize complex data workflows.
  • Strong attention to detail and commitment to data quality.
  • Ability to stay updated with the latest industry trends and technologies.