The governed-proof platform
Stop Rebuilding Evidence Every Time a Reviewer Asks
- Reusable from day one: Platform Core builds the governed record that automation, readiness work, and Command extend later. Nothing starts over.
- Reviewer-safe by design: Access logs, reviewer packs, and auditor-safe exports all trace back to the same source record. No shadow copies.
- Current instead of reconstructed: Freshness, ownership, and source context stay visible so the next review starts from living proof, not last quarter's snapshot.
- Command adds in-perimeter proof: Command layers telemetry-backed evidence collection inside your environment boundary only when regulated operations require it.
Live Reviewer-Safe Coverage Across Imported Frameworks
11,444 Reviewer-Shareable Answers
Imported, approved answers that Aurora can reuse in reviewer-safe flows when framework and audience settings allow it.
22,909 Reviewer-Shareable Attachments
Imported attachments already tagged safe for reviewer packaging and governed proof delivery.
601 Customer-Facing Attachments
Imported proof already suitable for customer-visible trust-center and lighter reviewer surfaces.
633 Preserved Blocked Conflicts
Aurora keeps explicit source conflicts visible instead of flattening them into silent proof claims.
Top imported reviewer-safe frameworks
HITRUST_CSF
2,735 answers • 0 attachments
2,735 shareable answers
AUSTRALIAN_ISM
1,130 answers • 0 attachments
1,130 shareable answers
AUSTRALIAN_ISM_IRAP
1,127 answers • 1,081 attachments
1,127 shareable answers • 1,081 shareable attachments
NIST_SP_800_53_REV_5
1,014 answers • 1,014 attachments
1,014 shareable answers • 1,014 shareable attachments
Live proof examples
AI performance, safety, and reliability disclosures
MICROSOFT_SSPA_DPR MICROSOFT_SSPA_DPR-REQ-062
- Define and provide acceptable error ranges for each operational factor that would impact each of the Intended Use(s), and any additional operational factors that would narrow acceptable ranges or lower acceptable error rates (including false positive and false negative error rates) that could impact those Intended Uses.
- Identify operational factors and/or Intended Uses, including quality of system input, use, and operational context, that are critical to manage for reliable and safe use of the system in its deployed context.
- Disclose and document Sensitive Use cases.
- Document implementation of effective controls in the system design to discourage automation bias (the possible tendency to over-rely on outputs produced by the system).
- Document any system limitations, input or output data model limitations, or predictable failures, including uses for which the system was not designed or evaluated that may impact Intended Uses.
- Document implemented mitigations and controls for well-known AI risks, such as inference manipulation ("jailbreaks"), model manipulation (e.g., data poisoning), and inferential information disclosure (e.g., prompt extraction).
- Provide evidence of system accuracy, performance, and the extent to which these results are generalizable across use cases.
Access rights management and review
MICROSOFT_SSPA_DPR MICROSOFT_SSPA_DPR-REQ-036
Supplier demonstrates it has implemented an access rights management plan that includes:
- Access control procedures,
- Identification procedures,
- Lockout procedures after unsuccessful attempts,
- Automatic logoff after inactivity,
- Robust parameters for selecting authentication credentials,
- Deactivation of user accounts (including accounts used by employees or subcontractors) on employment or termination within 48 hours,
- Strong password controls that enforce password length and complexity and prevent reuse, and
- Use of Multi-Factor Authentication (MFA) for identities.
Supplier demonstrates that it has an established process to review user access to Microsoft Personal and Confidential Data, enforcing the principle of least privilege. The process includes:
- Clearly defined user roles,
- Procedures to review and justify approval of access to roles,
- Tests that users within roles with access to Microsoft data have a documented justification for being in the group/role, and
- Strong prohibitions on shared accounts or passwords.
Annual privacy and security training
MICROSOFT_SSPA_DPR MICROSOFT_SSPA_DPR-REQ-003
Annual records of attendance are available and can be provided to Microsoft upon request. Training content is regularly updated and covers privacy and security principles such as incident prevention awareness, including safeguarding passwords, log-in monitoring, risks associated with downloading malicious software, and other relevant security reminders. Documentation of compliance with training requirements includes evidence of training related to privacy regulatory requirements, security obligations, and compliance with applicable contract requirements and obligations. IT staff must receive training for incident response, including simulated events and automated mechanisms that facilitate effective response to crisis situations. If the Microsoft Personal Data Processed by the supplier includes PHI, training content must also cover HIPAA, security reminders, log-in monitoring, safeguarding passwords, and the supplier's permitted uses and disclosures as permitted by the Business Associate Agreement.
One Proof Record, Every Review Motion
How One Approved Source Stays Current Between Reviews
01
Govern the source record
Policies, controls, approvals, and framework mapping start in one maintained proof base. Every downstream surface inherits from here.
02
Attach living evidence
Every artifact carries an owner, freshness window, source context, and control link so proof stays legible between review cycles.
03
Answer from cited proof
Questionnaire responses cite real evidence and reuse approved language. No more starting from blank spreadsheets each cycle.
04
Share through controlled surfaces
Trust Center tiers, reviewer packages, and auditor workspace exports stay connected to the same approved source record.
05
Expand when the burden changes
Layer in automation, readiness records, or in-perimeter Command proof only when the next review actually demands it.
Why One Governed Record Beats Disconnected Tools
Not Automation-Only
Artifact collection without a governed record just creates a faster mess. Aurora layers response, sharing, and readiness on top of the same source of truth.
Not Portal-Only
A portal is only as credible as the record behind it. Aurora maintains the full governed history that backs every reviewer surface, not a static front door.
Not GRC Sprawl
Built for teams that need to show proof fast and repeatedly. One record, one workflow, one reviewer handoff instead of three disconnected systems.
Start with the Plan That Matches Your Evidence Burden
Recommended bundle
Proof Core
Teams replacing spreadsheets or stitching together evidence, answers, and trust sharing by hand.
Recommended bundle
Automation
Teams that need automated evidence freshness and connector-backed proof between reviews.
Share the Review That Keeps Restarting the Same Work
Send us the questionnaire, audit scope, or renewal packet that triggers your next rebuild. We will show you the shortest path to one proof record that survives every cycle.
Start with Platform Core. Add automation, readiness, or Command only when the evidence burden demands it.