Ethical • Legal • Safe • Accountable AI Operations
Founder and Principal Consultant
Founder of ELSA AI, an AI governance advisory for private healthcare.
I help private GP, dental and specialist clinics move from informal AI use to a documented governance position that can be read and challenged by their DPO, board, insurer, MDO and relevant advisers.

Founder-led delivery: governance evidence for private clinics using ambient scribes, Copilot, ChatGPT and shadow AI.
What I do
My work sits at the intersection of cybersecurity, information risk, data protection evidence and practical AI governance. ELSA AI focuses on organisations deploying third-party AI tools, not companies building AI models or medical devices.
Most private clinics do not start with a formal AI programme. They start with informal adoption: one clinician using an ambient scribe, another using ChatGPT to draft correspondence, reception staff testing AI summaries, or AI-enabled features appearing inside existing supplier platforms.
The governance risk is not that AI is being used. The risk is having no documented position when the DPO, board, insurer, MDO or regulator asks how that use is controlled.
ELSA AI helps clinics answer that question with evidence: what AI tools are in use, where patient data may be involved, what vendor evidence exists, whether DPIA review is likely required or strongly indicated, how patients are informed, how outputs are reviewed, and what actions are needed in the next 30 days.
What I bring
I hold the CISM and CRISC certifications from ISACA and have more than two decades of experience across cybersecurity, information risk, governance, assurance and control evidence in regulated environments.
Before founding ELSA AI, I worked across sectors where weak evidence could quickly become a board, regulator, insurer or operational problem. That experience shaped the way ELSA AI works: practical, evidence-led, commercially focused and built for accountable decision-makers.
Selected previous environments include work across organisations in the sectors listed below. That experience informs the approach; it is not an endorsement by those organisations unless separately agreed.
Financial services
National infrastructure
Defence and secure environments
Retail and supply chain
Public sector and manufacturing
Healthcare and patient data
How I work
Across regulated environments, the question that matters is simple: can you show the evidence, or can you only describe the intention?
Decision-makers need documentation they can read, challenge and act on. That discipline now applies to AI governance in private healthcare.
Surface the actual position: declared tools, shadow AI, vendor evidence, existing policies and staff use.
Map gaps to relevant published expectations and governance standards, drawing on guidance from the ICO, CQC, NHS England, MHRA and MDOs, plus jurisdiction-specific sources where applicable.
Identify which decisions belong to the DPO, clinical lead, board, insurer, MDO or appointed safety role. ELSA AI does not own final sign-off.
Produce board-ready evidence in plain English: not abstract AI ethics documents, not generic policy packs and not consultancy theatre.
Why founder-led matters
Every ELSA AI engagement is senior-led by Faisal Ali. The work requires senior judgement applied to the clinic's actual AI tools, actual workflows, actual patient data exposure and actual evidence gaps.
No generic policy packs. No junior-led delivery model. No offshore handover.
ELSA AI is designed for clinic owners, partners, boards, DPOs and clinical leads who need clear evidence they can act on.
Services
AI Governance Diagnostic: 4 working days from completed intake.
Identifies declared and shadow AI use, maps patient data exposure, assesses ambient scribe governance readiness where relevant, and produces a board-ready evidence pack and 30-day priority action plan.
Typical fee: £4,500-£6,500 + VAT.
AI Governance Baseline: 4 to 6 weeks.
Converts Diagnostic findings into a working AI governance baseline structured for board review and adoption: policy, tool register, risk register, DPIA readiness workpack, vendor evidence tracker, patient transparency wording, staff guidance, incident process and board evidence pack.
Typical fee: £14,500-£22,000 + VAT.
Quarterly retainer.
Keeps the AI governance evidence pack current as tools, vendor terms, staff usage, insurer questions and regulatory expectations change.
Typical fee: £950/month or £10,500/year prepaid + VAT.