Understanding How to Govern AI Development
Governance responsibilities for designing, building, training, testing, releasing, monitoring and maintaining AI models and systems throughout the development life cycle.
5 minutes
5 Questions
Understanding How to Govern AI Development is a critical competency for AI Governance Professionals, encompassing the frameworks, policies, and practices needed to ensure AI systems are developed responsibly, ethically, and in compliance with applicable regulations.
Concepts covered
- Defining AI System Business Context and Use Case
- Documenting the AI Design and Build Process
- Data Quality, Quantity and Integrity for AI Training
- Unit, Integration and Validation Testing for AI
- Managing Issues and Risks During AI Training and Testing
- Conformity Assessment Requirements for AI Release
- AI Maintenance, Updates and Retraining Schedule
- Security Testing for AI Systems
- Model and Data Drift in Production AI
- Public Disclosures and Transparency Obligations for AI
- AI Impact Assessment in Design
- Requirements Gathering for AI Systems
- AI Architecture and Model Selection
- Human Oversight in AI Design
- Metric and Threshold Evaluation for AI
- Stakeholder Engagement and Feedback in AI Design
- Operational Controls During AI Development
- Probability and Severity Harms Matrix for AI Risk
- Risk Mitigation Hierarchy for AI
- Stakeholder Mapping for AI Risk
- Benchmarking and Pre-Deployment Pilots for AI
- Data Governance for AI Training: Rights and Fit-for-Purpose
- Data Lineage and Provenance for AI
- Performance, Security, Bias and Interpretability Testing
- Documenting the AI Training and Testing Process
- Release Readiness Assessment and Model Cards
- Continuous Monitoring of Production AI Systems
- Audits, Red Teaming and Threat Modeling for AI
- AI Incident Management and Root-Cause Analysis
- Post-Market Monitoring Plans for AI
- Instructions for Use Provided to AI Deployers