We are seeking a highly skilled BI/Data Engineer to join our client's team in Toronto, ON!
The successful candidate will be responsible for designing, developing, testing, and deploying data pipelines within Azure Databricks. This position requires an individual with deep technical proficiency, a solid understanding of data warehouse concepts, and expertise in Python, PySpark, Delta Lake, and SQL.
What the Role Entails:
- Build and maintain data pipelines in Azure Databricks.
- Develop scalable and reliable solutions using Python and PySpark.
- Construct complex SQL queries for data profiling and for efficient data manipulation and retrieval.
- Work with various teams to translate business needs into technical specifications.
- Optimize data processes for maximum efficiency and reliability.
- Ensure data quality and integrity.
- Develop and maintain data models, data flow diagrams, and data dictionaries.
- Participate in all project phases, delivering accurate estimates for analysis, coding, testing and documentation while maintaining effective communication about progress.
- Offer technical expertise to analyze requirements and devise solutions that are practical, scalable, and easy to maintain.
- Promptly identify and report any challenges affecting task completion, along with revised timelines and remediation plans.
- Provide ongoing third-level support of applications to reduce the impact of defects and related incidents.
- Commit to continuous learning and improvement.
- Proactively evaluate system performance and quality, propose enhancements and implement updates for optimal functionality.
- Offer technical expertise in requirements analysis and solution design to ensure the solution is fit for both purpose and use, with scalability and maintainability in mind.
The Successful Candidate will have:
- Bachelor’s or Master’s degree in Computer Science or related field.
- Microsoft certification preferred.
- Minimum 7 years of experience in BI/Data Engineering/Data Warehousing.
- Proven experience with Azure Databricks and other Azure data services.
- Expertise in Python, PySpark, and SQL.
- Knowledge of data modeling, data warehousing, and ETL processes.
- Experience with Power BI.
- Skilled in data security.
- Strong customer focus.
- Demonstrated ability to handle multiple priorities successfully.
- Strong analytical and problem-solving abilities.
- Excellent verbal and written communication skills.
- Property and Casualty insurance background is considered an asset.