Security Design Review · Regulation & Compliance · Medical Devices · Published February 6, 2026

Scaling Security Design Reviews in Medical Device Companies: A Modern, Compliant Approach

Executive summary

What this whitepaper covers

Medical device software lives under one of the densest regulatory stacks in technology: FDA premarket and postmarket cybersecurity guidance, FDA Section 524B, ISO 13485, the Quality System Regulation (QSR), IEC 62304, IEC 81001-5-1, HIPAA, and EU MDR. Audits rarely fault medical device companies for skipping security design review (SDR); they fault them for doing SDR inconsistently, with threat models that do not trace cleanly through the Design History File (DHF) to mitigations, tests, and risk acceptance. This whitepaper covers the map of frameworks device teams must satisfy simultaneously, how traceability from architecture to threat to mitigation to acceptance survives an FDA pre-submission, and which parts of the review still require a human expert (risk acceptance, mitigation selection, regulatory interpretation) even in an AI-assisted program. It draws on patterns we have seen across multiple device classes and submission types.

Key findings

What you'll take away

  • The framework map a single SDR needs to satisfy: FDA premarket/postmarket cybersecurity guidance, FDA Section 524B, ISO 13485, QSR, IEC 62304, IEC 81001-5-1, HIPAA, and EU MDR
  • Audits tend to fault inconsistent threat modeling, not missing SDRs; traceability through the Design History File is where most programs fall short
  • DHF traceability chain that survives submission review: architecture → threat → mitigation → test → risk acceptance, every step timestamped and attributed
  • Human experts retain authority over risk acceptance, mitigation selection, and regulatory interpretation, even in an AI-assisted program
  • Evidence generation as a byproduct of the review, not a separate compliance activity, is what keeps the program sustainable across release cycles
Download

Get the full whitepaper

Enter your details and we'll email you the PDF right away.

FAQ

Frequently asked questions

Which frameworks does a single SDR need to satisfy?
FDA premarket and postmarket cybersecurity guidance, FDA Section 524B, ISO 13485, QSR, IEC 62304, IEC 81001-5-1, HIPAA, and EU MDR. Building the program to the highest bar and producing evidence mapped per framework is the pattern that holds up.
What does an auditor actually want to see?
Traceability. Every threat tied back to an architectural element, every mitigation tied to a threat, every test tied to a mitigation, every accepted residual risk tied to a signed decision. Timestamped and attributed, end to end.
Where do human experts stay load-bearing in an AI-assisted SDR program?
Three places. Risk acceptance decisions. Mitigation selection where tradeoffs are device-specific. Regulatory interpretation where new guidance has not yet been operationalized. AI produces the draft; the expert owns the call.
Can evidence generation keep up with release cycles?
Yes, if it is a byproduct of the review rather than a separate activity. The moment evidence generation becomes its own workstream, it falls behind, and the program fails its next audit. Design the program so the DHF artifact is produced as part of the SDR itself.
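One way to make evidence a byproduct is to have the review step itself emit the DHF artifact, rather than reconstructing it later. A minimal hypothetical sketch (the function, fields, and IDs are illustrative assumptions, not a specific tool's API):

```python
import json
from datetime import datetime, timezone

def review_threat(threat_id: str, mitigation_id: str,
                  reviewer: str, decision: str) -> dict:
    """Record an SDR decision; the returned dict IS the DHF evidence artifact,
    produced at review time rather than in a later compliance pass."""
    return {
        "threat_id": threat_id,
        "mitigation_id": mitigation_id,
        "decision": decision,   # e.g. "mitigated" or "accepted-residual"
        "reviewer": reviewer,   # attribution
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

# The reviewer signs the call; the artifact falls out as a side effect.
evidence = review_threat("THR-07", "MIT-03", "risk.owner", "accepted-residual")
print(json.dumps(evidence, indent=2))
```

Because the artifact is emitted at decision time, it is timestamped and attributed by construction, which is exactly the property the audit traceability answer above demands.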