All training courses are provided in-house and the courses listed below can be mixed to meet your needs.
Please contact us to discuss what courses would best match your requirements and if you have any questions on specific courses or course categories.
This module aids participants with the practical implementation of a Data Governance Strategy.
All GOC Departments are now, at a high level, developing Data Strategies to align with the Privy Council report “A Data Strategy Roadmap for the Federal Public Service”, but how does this affect individual groups within GOC Departments, Agencies and Crown Corporations?
In this module, you will learn the key principles and best practices that drive Data Strategy and how they can be practically implemented in the workforce to provide a positive return on investment through key data practices, including how to:
- identify data needs and strategies for alignment with departmental strategy
- assess, review and address the needs of stakeholders
- be an ongoing champion of data use
- use data to guide decision making
- effectively prepare to share data both internally and externally
- obtain key insights from data
- use data to review and enforce accountability and responsibility
- connect data functions internally across groups and externally with other government departments (OGDs) and third parties
A department’s data analyses are only as good as the data they access. Essentially, data engineering is the creation of the data pipelines that enable data scientists and analysts to do their work properly. As the workplace becomes more data-focused, data engineering is moving from its traditional home in the IT department directly into business lines. This module will give participants an overview of what data engineers do and how critical data pipelines are.
Participants will learn:
- the importance of a well-designed data pipeline
- strategies for aggregating and combining data sources
- Treasury Board Secretariat guidelines and milestones in the storage and communication of Protected B (and higher-level) information
- what tools data engineers need access to
- approaches to data collection and validation
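To make the pipeline idea concrete, here is a minimal sketch of the extract–validate–transform–load pattern the module discusses. The field names and the in-memory “warehouse” are illustrative assumptions for this sketch, not part of the course material:

```python
# Minimal data-pipeline sketch: extract -> validate -> transform -> load.
# Field names and the in-memory "warehouse" are hypothetical, for illustration only.

def extract(raw_rows):
    """Parse raw CSV-like rows into records."""
    return [dict(zip(["id", "region", "amount"], row.split(","))) for row in raw_rows]

def validate(records):
    """Drop records with missing fields or non-numeric amounts."""
    valid = []
    for r in records:
        if r["id"] and r["region"] and r["amount"].replace(".", "", 1).isdigit():
            valid.append(r)
    return valid

def transform(records):
    """Convert amounts to floats and normalize region codes."""
    return [{**r, "amount": float(r["amount"]), "region": r["region"].strip().upper()}
            for r in records]

def load(records, warehouse):
    """Append clean records to the (stand-in) warehouse."""
    warehouse.extend(records)
    return warehouse

warehouse = []
raw = ["1,on,100.50", "2,qc,", "3,ab,75"]  # second row has a missing amount
load(transform(validate(extract(raw))), warehouse)
```

Each stage has a single responsibility, which is what makes a pipeline testable and maintainable over time.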
This module is a prerequisite for obtaining the course certification. As part of the certification process, participants are expected to choose a topic they learned through their training and present it to a group for critique and feedback. A mini report and a presentation must be submitted to Data Action Lab for review and records.
As the focus on openness and transparency grows, the requirement to push more departmental data to the GOC open data portal becomes crucial. This is a time-consuming process, so it is important for departments to set up a streamlined set of workflows and processes to ensure efficiency and data quality. In this module, participants will get an overview of:
- best practices in publishing open data
- how to ensure consistency in publishing open data sets over time
- standards and protocols in publishing open data
- quality assurance of open data
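As an illustration of the kind of quality assurance covered here, a pre-publication check might verify that a data set has its required columns, consistent row widths, and no empty cells. The column names below are hypothetical, chosen only for this sketch:

```python
# Hypothetical pre-publication checks for an open data set:
# required columns present, consistent row widths, no empty cells.

REQUIRED_COLUMNS = ["fiscal_year", "program", "expenditure"]  # illustrative schema

def qa_report(header, rows):
    """Return a list of quality issues found; an empty list means ready to publish."""
    issues = []
    missing = [c for c in REQUIRED_COLUMNS if c not in header]
    if missing:
        issues.append(f"missing columns: {missing}")
    for i, row in enumerate(rows):
        if len(row) != len(header):
            issues.append(f"row {i}: expected {len(header)} fields, got {len(row)}")
        elif any(cell.strip() == "" for cell in row):
            issues.append(f"row {i}: empty cell")
    return issues

header = ["fiscal_year", "program", "expenditure"]
rows = [["2022-23", "Training", "10000"], ["2023-24", "", "12000"]]
print(qa_report(header, rows))  # flags row 1 for its empty cell
```

Running the same checks before every release is what keeps published data sets consistent over time.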
A Key Performance Indicator (KPI) is a performance metric that allows organizations to understand the relative “health” of a department or group by monitoring activity over time. In a data-focused workplace, all KPIs need to be well defined and monitored over time. Additionally, with the advent of ever-easier-to-use analysis tools and access to broader data, KPIs that were not available in the past are now measurable. In this module, participants will learn:
- how to develop strategies and frameworks for KPI discovery
- the importance of defining KPI calculations
- identification and management of KPI data sources
- about tools for KPI management
- how to publish KPIs through Business Intelligence Dashboards
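To show why defining the KPI calculation matters, here is a minimal sketch of one KPI written down as an explicit, unambiguous computation. The KPI name, record fields, and sample figures are hypothetical:

```python
# A KPI is only useful when its calculation is defined precisely.
# Hypothetical KPI: "average days to close a request", over closed requests only.

def kpi_avg_days_to_close(requests):
    """KPI definition: mean of (closed_day - opened_day), excluding open requests."""
    closed = [r for r in requests if r["closed_day"] is not None]
    if not closed:
        return None  # KPI is undefined when nothing closed in the period
    return sum(r["closed_day"] - r["opened_day"] for r in closed) / len(closed)

requests = [
    {"id": 1, "opened_day": 1, "closed_day": 6},     # 5 days
    {"id": 2, "opened_day": 3, "closed_day": 10},    # 7 days
    {"id": 3, "opened_day": 8, "closed_day": None},  # still open: excluded
]
print(kpi_avg_days_to_close(requests))  # 6.0
```

Writing the rule down (here, “closed requests only”) is exactly the kind of definitional decision that keeps a KPI comparable from month to month.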
This module is designed to complement the courses in DC-15 Visualizing Performance: Best Practices in Management Dashboard Design.
In this module, we interactively walk through a dashboard requirements process, including gathering end-user and organizational requirements, storyboarding, and high-level design.
Poor data dashboard design comes from poor requirements analysis. Typically, dashboard design is left to data visualization experts; providing senior management with a list of best practices helps to build dynamic and effective dashboards from the outset. In this module, participants will learn to:
- effectively engage with end users to properly define the context of the dashboard
- understand the importance of narrative and storyboarding in the dashboard design process
- understand which design elements are key to engaging the end user
- select the right charts for specific types of data and storytelling objectives
In this module, we look at tools to map, at a high level, the data sources used by a department or by a group within a department. This is a critical activity that enables a number of value-added activities, including data management, identification of data relationships as part of data lineage analysis (where data comes from, what happens to it, and where it moves over time), discovery of hidden sensitive data, and consolidation of multiple data sources.
Aimed at non-technical managers, this module will enable participants to:
- create a high level view of the sources of data that support a group within a department
- understand the interrelationships between the data sources
- estimate the level of risk associated with the group’s data management approach
- align data requirements to group business requirements
- explore data analysis approaches that allow a deeper understanding of their data
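A data source map of the kind described above can be sketched as a simple directed graph, with edges pointing from each upstream source to the data sets derived from it. The source names below are hypothetical, invented for this illustration:

```python
# Hypothetical data-source map for a group, as a directed graph:
# each edge points from an upstream source to a data set derived from it.

lineage = {
    "HR_system": ["staffing_report"],
    "finance_db": ["budget_extract"],
    "budget_extract": ["quarterly_dashboard"],
    "staffing_report": ["quarterly_dashboard"],
}

def upstream_sources(dataset, graph):
    """Walk the graph backwards to find every source feeding a data set."""
    parents = {src for src, derived in graph.items() if dataset in derived}
    found = set(parents)
    for p in parents:
        found |= upstream_sources(p, graph)  # recurse to indirect sources
    return found

print(sorted(upstream_sources("quarterly_dashboard", lineage)))
```

Even a map this simple answers lineage questions (“what feeds the dashboard?”) and exposes risk: any data set with many downstream dependents deserves stronger data management.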
All groups within GOC Departments and Agencies rely, to a degree, on rogue data sources (sometimes called black books). These are typically Microsoft Excel spreadsheets built from copied and adjusted enterprise data, home-grown Access databases, or data stored in specific analysis applications. In this module, we identify strategies for:
- performing an audit to create a list of black books
- understanding the forces driving their creation and ongoing maintenance
- reducing their number and relative importance
- migrating to more robust enterprise data sources
The second of two modules providing an overview of standard software tools. This module looks at enterprise tools, including (other tools may be included at the discretion of Data Action Lab or by request):
This module uses the information provided in DC-1 Developing a Data Strategy Roadmap and DC-2 Developing a Data Governance Strategy and walks participants through a predefined checklist to assess where their Department, Group or Line of Business stands in relation to GOC and Departmental expectations.
Upon completion, participants will understand where their gaps are and be able to perform a more comprehensive analysis to come up with an action plan to close them.
The first of two modules providing an overview of standard software tools. This module looks at self-service tools, including (other tools may be included at the discretion of Data Action Lab or by request):
- Code-based analysis tools
- Data visualization and Business Intelligence tools
- Data engineering tools
- Simulation and symbolic mathematical tools
- Statistical packages
In a dynamically changing workplace, it is difficult to know which new and changing roles are required to build a data-focused workplace. In this module, we outline some of the key new roles that may be required and the competencies each role demands. This will aid in the selection, hiring, development, and training of data employees, helping to provide maximum benefit to the organization and growth opportunities for the employees.
We cover several data roles, and take a closer look at competencies, roles and responsibilities for:
- domain experts
- data translators
- data engineers
- data scientists
- analysts (data and business)
- computer scientists
- computer engineers
- AI/ML QC specialists
The third and final module in our Artificial Intelligence (AI) and Machine Learning (ML) project series. In this module, participants are presented with a series of case studies, which are analyzed to derive lessons learned and best practices. The module also provides an opportunity for participants to present information on their own projects for the group to review and provide feedback.
This is the second of three modules on running Artificial Intelligence (AI) and Machine Learning (ML) projects. In this module, you will learn:
- how to understand AI/ML infrastructure requirements
- trade-offs between AI/ML techniques in relation to project requirements
- initial and ongoing risk management
- model management and prioritization
- quality control and assurance activities
- ongoing system management and maintenance
This is the first of three modules on running Artificial Intelligence (AI) and Machine Learning (ML) projects. Compared to regular projects, these projects typically require a different approach to set up and run. Due to the significant amount of discovery required in AI/ML applications, the phased approaches of standard project management frameworks such as PMBOK and PRINCE2 (or similar) usually need to be modified to ensure project success. In the first of these modules, you will learn:
- what makes AI/ML projects different
- how to define requirements for AI/ML projects
- the importance of research and discovery as managed phases of project implementation
- new skills required of project managers
- critical roles in the implementation of AI/ML projects
This module investigates methods to link the results of data analysis with departmental or group objectives. Too often, the norm is “analysis for the sake of analysis”. In this module, we show how to build an analysis framework that aligns with departmental and group objectives. Participants will learn:
- how to precisely articulate business objectives
- to identify what type of analysis will measure the business objective
- how to define what data is required to build the analysis models
- how to monitor the analysis over time and make the linkages between analytics and objectives
- how to take analysis results and make meaningful recommendations
This module provides tips and approaches to help empower a data-focused workplace. It is not sufficient to rely on a small number of individuals to manage and use data – all employees need to understand their role in the creation, analysis, dissemination and curation of organizational data. In this module, you will learn how to:
- create a data-based decision culture where the fundamental objectives are collection, analysis, and deployment of data to make better and more informed decisions
- understand the tools that allow the workforce to enhance their analysis and decision making skills
- understand what specific data training & development is required to upskill the workforce
- recognize the data competencies that are required to run a standard data workplace
- establish the importance of democratizing data, or how to get the right data in front of the right people at the right time
- deploy dynamic risk management in everyday data driven decision making
- find and empower your organization’s data evangelists
- build a reliable and consistent data management process
In the recent Privy Council report “A Data Strategy Roadmap for the Federal Public Service”, a roadmap was laid out for GOC Departments to create Data Strategy and Governance frameworks and to implement them through a robust Data Strategy Roadmap.
This module investigates the impact this has on GOC Departments, and, more importantly, on the groups and lines of business within those Departments.
We provide guidance and recommendations derived from widespread best practices on how Senior Executives and Managers can design and implement activities to:
- create an internal roadmap to implement Departmental Strategy and Data Governance
- establish a decision-making group to enforce departmental mandates, including accountabilities, roles and responsibilities
- identify key roles and responsibilities around data leadership
- understand ethical and secure use of data
- assess the current state of data literacy, skills and competencies
- establish good hiring practices
- review and update data training and development strategies
- understand departmental policy frameworks and how they affect the group / line of business
- assess infrastructure needs
- help foster innovation through pilot projects
- develop a data quality framework