We are seeking an experienced Data Engineer with proven expertise in SAS, Alteryx, and Databricks to join our financial data and analytics team. The ideal candidate will have a solid understanding of banking operations, regulatory requirements, and large-scale data integration, with the ability to design and maintain robust data pipelines that support analytics, machine learning, and reporting for mission-critical financial applications.
Key Responsibilities
Data Pipeline Development & Management
Design, build, and maintain scalable, efficient ETL/ELT pipelines using SAS, Alteryx, and Databricks.
Integrate data from diverse banking systems, transaction platforms, and external data providers.
Ensure data quality, consistency, and accuracy across the organization.
Data Processing & Transformation
Perform large-scale data cleaning, transformation, and enrichment for use in analytics and reporting.
Optimize data workflows using Spark in Databricks for distributed processing.
Develop reusable Alteryx workflows for recurring data processing tasks.
Collaboration & Business Support
Work closely with data scientists, analysts, and business stakeholders to understand requirements and deliver data solutions that enable business insights.
Support the integration of data sources for use in financial risk modeling, fraud detection, customer analytics, and compliance reporting.
Compliance & Governance
Implement and adhere to data governance, security, and privacy best practices in line with banking regulatory requirements (e.g., Basel III, GDPR, PCI-DSS).
Maintain clear documentation of data pipelines, workflows, and processes.
Continuous Improvement
Identify opportunities to automate manual data processes and improve data pipeline efficiency.
Stay up to date with new features and best practices for SAS, Alteryx, and Databricks in financial services.
Qualifications
Education: Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or a related field.
Experience:
5+ years of experience as a Data Engineer or in a similar data-focused role.
Strong hands-on experience with SAS, Alteryx, and Databricks.
Solid understanding of ETL design, data modeling, and data warehouse concepts.
Experience working with financial datasets, transaction processing systems, and banking analytics.
Technical Skills:
Proficiency in SQL for complex queries and performance optimization.
Strong experience with Spark in Databricks for big data processing.
Familiarity with Python, Scala, or R for data manipulation and automation.
Experience with cloud platforms (Azure, AWS, or GCP), ideally Azure Databricks.
Experience with financial reporting and compliance-driven data workflows.
Knowledge of API integration and automation in Alteryx.
Exposure to BI/visualization tools (Tableau, Power BI).
Understanding of real-time data streaming concepts in a financial context.
Soft Skills:
Strong analytical and problem-solving skills.
Excellent communication skills for cross-team collaboration.
Detail-oriented mindset, especially when working with sensitive financial data.
Ability to work independently and under tight deadlines.