About Kinaxis
Elevate your career journey by embracing a new challenge with Kinaxis. We are experts in tech, but it's really our people who give us the passion to always seek better ways of doing things. As such, we're serious about your career growth and professional development, because people matter at Kinaxis.
In 1984, we started out as a team of three engineers. Today, we have grown into a global organization with over 2,000 employees around the world and a brand-new HQ in Kanata North, Ottawa. As one of Canada's Top Employers, we are proud to work with our customers and employees towards solving some of the biggest challenges facing supply chains today.
At Kinaxis, we power the world's supply chains to help preserve the planet's resources and enrich the human experience. As a global leader in end-to-end supply chain management, we enable supply chain excellence for all industries, with more than 40,000 users in over 100 countries. We are expanding our team as we continue to innovate and revolutionize how we support our customers.
Location
This is a remote position. You can work from home and be located anywhere in Canada.
What you will do
* Modernization & Development:
* Transition legacy systems to a cloud-native, modern data stack, leveraging established tools such as Informatica, Airflow, and Postgres alongside modern technologies like Snowflake, dbt, BigQuery, Looker, CI/CD, Git, Databricks, Power BI, Datadog, and Grafana.
* Build innovative, scalable, and reliable data products that support Kinaxis' strategic goals.
* Build and maintain both batch and stream-based data pipelines for a wide variety of use cases, including the analysis of application and infrastructure logging data.
* Build and maintain complex application-to-application data connectors.
* Implement and oversee rigorous data validation, cleansing, and error-handling mechanisms to maintain high data quality and reliability.
* Build data pipelines with reporting and analytics use cases in mind, ensuring transformations support scalable and performant BI consumption.
* Apply practical understanding of BI tool limitations to balance front-end vs back-end business logic and optimize query performance.
* Bring hands-on experience with at least one modern BI tool (e.g., Looker, Power BI, or equivalent), with a deep understanding of its data modeling, performance constraints, and semantic layers, enabling you to design data flows that align with business usage.
* Stay up to date on the latest technology trends and best practices as they relate to your role.
* Collaboration & Solution Design:
* Partner with internal business stakeholders from different business units and cross-functional teams to design and deliver tailored data solutions.
* Collaborate with peers specializing in FinOps, Data Architecture, Data Governance, and Cloud Engineering to address diverse organizational needs.
* Troubleshoot complex data engineering challenges with a focus on scalability and reliability.
* Work closely with BI developers and analysts to ensure engineered datasets support intuitive and performant dashboards, models, and self-service use cases.
* Distill complex projects into bite-sized, actionable stories.
* Provide strategic oversight, lead large-scale initiatives, and help shape technical direction.
* Insights & Enablement:
* Develop robust data models and analytics solutions to drive actionable insights.
* Develop and maintain complex BI models and visually appealing reports/dashboards for executive review.
* Bridge the gap between raw data engineering and end-user reporting needs, proactively shaping data layers that power business outcomes.
* Champion data-driven decision-making by ensuring stakeholders have access to meaningful, high-quality data.
* Storytelling & Influence:
* Deliver compelling presentations and narratives to engage stakeholders and influence leadership.
* Advocate for innovative data strategies that align with Kinaxis' growth and innovation objectives.
Technologies we use
* Visual Studio, JIRA, Confluence, Git (Bitbucket/Stash)
* Datadog, Grafana and Logstash for Observability
* Python for core development
* SQL and dbt for Data Modelling
* Mulesoft and Informatica for Data Integration
* Postgres, SQL Server, Snowflake and Google BigQuery as data stores
* Apache Airflow, Power BI Dataflows, and GitHub Actions for Orchestration
* Power BI and/or Looker for Data Analysis
* Ansible and Terraform for Cloud Infrastructure Automation
* PowerShell for Windows endpoint data collection
* Google Cloud Platform (GCP) and Microsoft Azure for Cloud Infrastructure.
What we are looking for
* 5-7 years in data engineering, analytics engineering, software engineering, or related fields.
* Demonstrated expertise in traditional and modern data systems with hands-on experience building scalable, reliable solutions.
* Deep, hands-on experience with tools such as Snowflake, Databricks, Informatica, dbt, BigQuery, Looker and/or Power BI.
* Expert proficiency in software engineering principles, including CI/CD pipelines, version control (Git), RESTful APIs, and scripting languages (Python, SQL).
* Solid experience with at least one modern BI tool (e.g., Power BI, Looker), with an understanding of how data model design impacts usability, performance, and semantic clarity.
* Familiarity with cloud infrastructure and supporting observability tooling (Logstash, Datadog, Grafana).
* Strong problem-solving abilities with a customer-centric and innovative mindset.
* Exceptional storytelling and presentation skills for engaging stakeholders and influencing leadership.
* Passion for solving complex problems, a collaborative spirit, and a drive to deliver impactful solutions.
* Google Cloud Professional Data Engineer certification.
* Familiarity with infrastructure-as-code (Terraform) and/or configuration-as-code (Ansible).
* Familiarity with FinOps and the ability to implement cost-saving strategies in the cloud is a plus.
* Experience working for a SaaS company is a plus.
#Full-time
#Senior
#LI-Remote
#LI-DN1