About QuadReal Property Group
QuadReal Property Group is a global real estate investment, operating and development company headquartered in Vancouver, British Columbia. Its assets under management total $67.1 billion. From its foundation in Canada as a full-service real estate operating company, QuadReal has expanded its capabilities to invest in equity and debt in both the public and private markets. QuadReal invests directly, via programmatic partnerships and through operating companies in which it holds an ownership interest.
QuadReal seeks to deliver strong investment returns while creating sustainable environments that bring value to the people and communities it serves. Now and for generations to come.
QuadReal: Excellence lives here.
Role Description:
We are growing our Enterprise Data & Analytics (ED&A) team at QuadReal!
As a DevOps/DataOps Engineer at QuadReal, you'll collaborate closely with our growing ED&A team and have a significant impact on the future of the company. The ideal candidate is someone who wants to build, improve, and incorporate technologies that enrich the lives of our community, and who is motivated by delivering customer value.
The Enterprise Data & Analytics (ED&A) team at QuadReal is looking for a brilliant DevOps/DataOps Engineer to design, build, and deploy foundational DevOps capabilities and workflows, empowering Data Engineers, Data Analysts, and Data Scientists across QuadReal with a trusted, relevant, and reliable data analytics platform. As a DevOps/DataOps Engineer, you will create engineering efficiencies as the owner of our enterprise data platform. You will have the opportunity to design and code highly optimized workflows and integrations for continuous integration and continuous deployment. The backbone of your success as a DevOps/DataOps Engineer at QuadReal will be communication, stakeholder management, and a passion for DevOps improvements.
This role can be based out of our Vancouver, Toronto, or Calgary office locations.
Responsibilities
Define and set development, test, release, update, and support processes for DevOps/DataOps operations
Develop, test, and maintain core system components in our Enterprise Data Platform (EDP)
Create and iterate on features and workflows quickly, with scalability in mind
Build robust and maintainable infrastructure and adhere to coding best practices
Unblock, support, and effectively communicate cross functionally in order to achieve results
Leave the code in a better state than when you found it (progressive refactoring)
Manage expectations and collaborate extensively with cross-functional teams to architect DevOps- and DataOps-driven business processes, enabling the design and creation of the best possible solutions to high-value problems
Select and deploy appropriate CI/CD tools
Strive for continuous improvement and build continuous integration, continuous delivery, and continuous deployment (CI/CD) pipelines
Mentor and guide team members
Encourage and build automated processes wherever possible
Work closely with the Security Team to identify and deploy cybersecurity measures, continuously performing vulnerability assessments and risk management
Work closely with the Integration Team and other members of the ED&A Team to ensure data quality from source through to production
Lead incident management and root cause analysis
Participate in discussions with cross-functional teams to understand their requirements, and translate business requirements into corresponding DevOps requirements
Collaborate with other members of the ED&A team to build a secure, robust, scalable, and efficient Enterprise Data Platform (EDP) in alignment with the long-term ED&A roadmap
Monitor and measure customer experience and KPIs
Qualifications
Experience with Azure infrastructure management and configuration, including auto-scaling, ADLS, Azure Blob Storage, Azure Kubernetes Service (AKS), and Azure SQL
Ability to read and understand logs across a variety of tools
Experience with pipeline orchestration tools such as Prefect or Airflow
Understanding of ETL frameworks such as dbt, Spark, and Ray
Ability to code in Python
Strong knowledge of the git version control system
Familiarity with Kubernetes is a plus
Experience with data science frameworks such as pandas and scikit-learn is preferred, but not required
Familiarity with DataOps
Experience working in a fast-paced environment
Self-directed and able to work independently
QuadReal has introduced a mandatory COVID-19 vaccination policy that requires full vaccination against COVID-19 for everyone working in our offices or sites. Accordingly, a successful candidate offered employment at QuadReal will need to provide proof of full vaccination prior to commencing employment, subject to exemptions permitted under applicable employment and human rights legislation.
Want to learn about our end-to-end recruitment process? Click this link for a short video that will take you through each step, so you’ll know exactly what to expect.