Manager - Governance, Risk & Compliance Analyst (Third-Party Risk Analyst) jobs in United States

recruit22 · 1 hour ago

Manager - Governance, Risk & Compliance Analyst (Third-Party Risk Analyst)

Recruit22 is seeking a Manager for Governance, Risk & Compliance (GRC) who will lead the organization's enterprise risk and compliance program. The role focuses on third-party risk management, audit readiness, and policy governance while ensuring compliance with regulatory frameworks and facilitating safe AI adoption.

Information Technology & Services
Note: No H1B

Responsibilities

Define the GRC program strategy, roadmap, and success metrics; align initiatives with organizational risk appetite and business objectives
Establish and continuously improve governance processes and control frameworks; report to leadership and risk committees
Operationalize an enterprise AI governance framework covering model development, procurement, deployment, monitoring, and retirement
Classify AI systems by risk tier (e.g., clinical decision support, operational automation, administrative copilots) and ensure proportional controls are applied
Oversee enterprise risk identification, assessment, and treatment plans; ensure timely remediation tracking and executive reporting
Approve risk ratings and risk acceptance recommendations; escalate material risks and propose mitigation investments
Identify, assess, and document AI-specific risks, including model bias and discrimination, hallucinations and clinical safety risks, model drift and data quality degradation, data leakage and IP exposure, and inappropriate secondary use of data
Define and monitor Key Risk Indicators (KRIs) and Key Control Indicators (KCIs) for AI systems
Lead the third‐party/vendor risk program: methodology, tiering, due diligence, gap analysis, remediation SLAs, and performance metrics
Extend third-party risk management practices to AI vendors and embedded AI capabilities (e.g., EHR-integrated AI, ambient listening tools, SaaS copilots)
Evaluate vendors' model transparency and explainability, training data provenance, security and privacy safeguards, and model update and retraining practices
Partner with Procurement and Legal to ensure AI-specific contractual safeguards (e.g., data usage restrictions, audit rights, indemnification)
Own planning and execution for internal and external audits (e.g., SOC 2, HIPAA, HITRUST), including evidence management, control validation, issues tracking, and management responses
Interpret and translate evolving AI-related regulatory and enforcement expectations into actionable controls, particularly as they intersect with healthcare regulations
Ensure AI use cases comply with patient safety and quality standards, privacy and data-protection obligations, and clinical documentation and auditability requirements
Support internal and external audits by producing AI governance artifacts, risk assessments, control evidence, and model documentation
Maintain continuous audit readiness through control testing, corrective actions, and compliance dashboards
Govern the policy lifecycle (creation, approval, publication, attestation, and exceptions) for security policies, standards, and procedures
Produce executive‐level reporting and risk narratives; respond to security‐related inquiries from internal and external stakeholders
Define and maintain AI policies, standards, and control objectives aligned to responsible AI principles (fairness, transparency, accountability, safety, and privacy)
Partner with Compliance and L&D teams to develop AI awareness and role-based training for clinical, technical, and business users
Own the design, deployment, and maintenance of the GRC platform to support risk registers, control libraries, policy workflows, and continuous monitoring
Define data quality expectations and integrations; ensure platform usability, automation, and authoritative reporting
Provide leadership input to incident response processes and contribute to business continuity/disaster recovery (BC/DR) planning and exercises
Translate post‐incident learnings into control enhancements and program improvements
Build strong partnerships with security engineering, privacy, legal, procurement, infrastructure, and product teams to drive outcomes
Manage external assessors, auditors, and vendors; oversee statement of work, deliverables, and service quality
Define and publish KPI/KRI dashboards (e.g., remediation cycle time, vendor risk posture trends, audit findings burn‐down, policy attestations)
Drive a culture of continuous improvement, simplifying processes and elevating control effectiveness without impeding business agility

Qualifications

GRC program management · Information Security · AI governance · Cybersecurity compliance · Risk assessment · Vendor management · Regulatory frameworks · Machine learning fundamentals · Control frameworks · Audit readiness · Data-driven systems · Stakeholder management · Healthcare regulations · CISM certification · CISA certification · CRISC certification · CGEIT certification · CIPT certification · ISO 27001 certification · Project management · Communication skills

Required

Bachelor's degree in Information Security, Computer Science, Information Technology, or equivalent practical experience
8+ years in Information Security / IT Risk / GRC, including 2+ years in a formal people leadership and/or program management capacity and 2+ years supporting AI, advanced analytics, or data-driven systems
Demonstrated experience operating GRC programs covering cybersecurity risk management, third‐party cyber risk assessment, cybersecurity policy governance, compliance management and audit
Hands‐on familiarity with GRC platforms and application of frameworks and regulations such as HIPAA, PCI DSS, NIST 800, NIST RMF, and SOC 2
Strong conceptual understanding of machine learning and generative AI fundamentals: model training, inference, drift, and retraining; differences between rules-based automation, traditional ML, and generative AI; familiarity with model documentation practices (e.g., model cards), human-in-the-loop controls, and explainability and transparency techniques
Ability to translate technical AI concepts into governance, risk, and compliance language for non-technical stakeholders
Strong communication, stakeholder management, and project/program management skills; proven ability to influence decisions and drive outcomes across cross‐functional teams
Demonstrated consulting mindset and ability to balance innovation enablement with risk mitigation
Hands-on experience assessing technology vendors, preferably including SaaS platforms with embedded AI capabilities
Ability to evaluate AI-related assurances, certifications, and audit artifacts provided by vendors
Ability to manage multiple priorities in a fast‐paced environment and make informed, timely risk‐based decisions

Preferred

8-10+ years of progressive experience in GRC or cybersecurity with 3+ years leading teams and complex compliance initiatives and supporting AI, advanced analytics, or data-driven systems
Experience in healthcare or other regulated industries and with HITRUST certification programs
Experience supporting clinical AI or patient-impacting technologies
Familiarity with emerging AI governance standards and industry guidance
Prior involvement in establishing new governance programs (e.g., AI, cloud, data governance) rather than only operating mature ones
Relevant certifications such as CISM, CISA, CRISC, CGEIT, CIPT, or ISO 27001 Lead Implementer/Lead Auditor
Demonstrated success implementing or maturing GRC platforms, integrating control testing, risk registers, vendor workflows, and executive reporting
Background collaborating on incident response and BC/DR planning and translating outcomes into risk and control improvements

Company

recruit22

We are a forward-thinking and innovative recruitment firm. We offer strategic recruitment solutions using cutting-edge technologies and methodologies.

Funding

Current Stage
Early Stage
Company data provided by crunchbase