Job Summary
We are seeking an experienced Architect with 12 to 16 years of experience to join our team. The ideal candidate will have expertise in Spark in Scala, Delta Sharing, Databricks Unity Catalog administration, and other related technologies. This role requires a deep understanding of data architecture and the ability to design and implement scalable solutions. The position is hybrid, with day shifts and no travel required.
Responsibilities
- Design and implement scalable data architecture solutions using Snowflake and Spark in Scala to meet business requirements and enhance data processing capabilities.
- Oversee the integration of Delta Sharing and Databricks Unity Catalog to ensure seamless data sharing and governance across platforms.
- Provide expertise in the Databricks CLI and Delta Live Tables pipelines to automate and streamline data workflows, improving efficiency and reliability.
- Lead the development of structured streaming solutions to enable real-time data processing and analytics, enhancing decision-making processes.
- Collaborate with cross-functional teams to manage risk and ensure data security and compliance with industry standards.
- Utilize Apache Airflow to orchestrate complex data workflows, ensuring timely and accurate data delivery.
- Implement data storage solutions using Amazon S3 and Amazon Redshift, optimizing for performance and cost-effectiveness.
- Develop and maintain Python scripts to support data processing and analysis tasks, ensuring code quality and maintainability.
- Leverage Databricks SQL and Databricks Delta Lake to perform advanced data analytics and support business intelligence initiatives.
- Manage Databricks Workflows to automate data pipeline processes, reducing manual intervention and increasing productivity.
- Utilize PySpark to process large datasets efficiently, enabling faster insights and data-driven decision-making.
- Collaborate with stakeholders to understand business needs and translate them into technical solutions that drive company objectives.
- Continuously evaluate and adopt new technologies to improve the data architecture and stay ahead of industry trends.
Qualifications
- Possess a strong background in Spark in Scala and related technologies, with proven experience designing data solutions.
- Demonstrate expertise in Delta Sharing and Databricks Unity Catalog administration for effective data management and governance.
- Have hands-on experience with the Databricks CLI and Delta Live Tables pipelines for automating data workflows.
- Show proficiency in structured streaming and risk management to ensure data security and compliance.
- Be skilled in using Apache Airflow, Amazon S3, and Amazon Redshift for data orchestration and storage solutions.
- Exhibit strong programming skills in Python and PySpark for data processing and analysis.
- Have a solid understanding of Databricks SQL and Delta Lake for advanced analytics and business intelligence.