The governed-proof platform

Stop Rebuilding Evidence Every Time a Reviewer Asks

Aurora turns scattered policies, controls, evidence, and approved answers into one maintained proof record teams reuse across audits, questionnaires, renewals, and technical reviews.
Maintained proof record
What teams stop rebuilding once governed proof is in place

One record serves operators, reviewers, auditors, and leadership. Every new request starts from maintained proof instead of blank pages.

Live imported proof

Live Reviewer-Safe Coverage Across Imported Frameworks

Aurora carries 11,444 reviewer-shareable answers and 22,909 reviewer-shareable attachments across 127 imported frameworks, ready to ship.

11,444 Reviewer-Shareable Answers

Imported approved answers Aurora can reuse in reviewer-safe flows when the framework and audience settings allow it.

22,909 Reviewer-Shareable Attachments

Imported attachments already tagged safe for reviewer packaging and governed proof delivery.

601 Customer-Facing Attachments

Imported proof already suitable for customer-visible trust-center and lighter reviewer surfaces.

633 Preserved Blocked Conflicts

Aurora keeps explicit source conflicts visible instead of flattening them into silent proof claims.
Top imported reviewer-safe frameworks
HITRUST_CSF
2,735 answers • 0 attachments
2,735 shareable answers
AUSTRALIAN_ISM
1,130 answers • 0 attachments
1,130 shareable answers
AUSTRALIAN_ISM_IRAP
1,127 answers • 1,081 attachments
1,127 shareable answers • 1,081 shareable attachments
NIST_SP_800_53_REV_5
1,014 answers • 1,014 attachments
1,014 shareable answers • 1,014 shareable attachments
Live proof examples
AI performance, safety, and reliability disclosures
MICROSOFT_SSPA_DPR MICROSOFT_SSPA_DPR-REQ-062
• Define and provide acceptable error ranges for each operational factor that would impact each of the Intended Use(s), and any additional operational factors that would narrow acceptable ranges or lower acceptable error rates (including false positive and false negative error rates) that could impact those Intended Uses.
• Identify operational factors and/or Intended Uses, including quality of system input, use, and operational context, that are critical to manage for reliable and safe use of the system in its deployed context.
• Disclose and document Sensitive Use cases.
• Document implementation of effective controls in the system design to discourage automation bias (the possible tendency to over-rely on outputs produced by the system).
• Document any system limitations, input or output data model limitations, or predictable failures, including uses for which the system was not designed or evaluated that may impact Intended Uses.
• Document implemented mitigations and controls for well-known AI risks, such as inference manipulation (“jailbreaks”), model manipulation (e.g., data poisoning), and inferential information disclosure (e.g., prompt extraction).
• Provide evidence of system accuracy, performance, and the extent to which these results are generalizable across use cases.
Access rights management and review
MICROSOFT_SSPA_DPR MICROSOFT_SSPA_DPR-REQ-036
Supplier demonstrates it has implemented an access rights management plan that includes:
• Access control procedures,
• Identification procedures,
• Lockout procedures after unsuccessful attempts,
• Automatic logoff after inactivity,
• Robust parameters for selecting authentication credentials,
• Deactivation of user accounts (including accounts used by employees or subcontractors) within 48 hours of employment termination,
• Strong password controls that enforce password length and complexity and prevent reuse, and
• Use of Multi-Factor Authentication (MFA) for identities.
Supplier also demonstrates an established process to review user access to Microsoft Personal and Confidential Data, enforcing the principle of least privilege. The process includes:
• Clearly defined user roles,
• Procedures to review and justify approval of access to roles,
• Tests that users within roles with access to Microsoft data have a documented justification for being in the group/role, and
• Strong prohibitions on shared accounts or passwords.
Annual privacy and security training
MICROSOFT_SSPA_DPR MICROSOFT_SSPA_DPR-REQ-003
Annual records of attendance are available and can be provided to Microsoft upon request. Training content is regularly updated and covers privacy and security principles, including incident-prevention awareness: safeguarding passwords, log-in monitoring, risks associated with downloading malicious software, and other relevant security reminders. Documentation of compliance with training requirements will include evidence of training on privacy regulatory requirements, security obligations, and applicable contract requirements and obligations. IT staff must receive incident-response training, including simulated events, and automated mechanisms must be in place to facilitate effective response to crisis situations. If the Microsoft Personal Data processed by the supplier includes PHI, training content must include HIPAA training, security reminders, log-in monitoring, safeguarding passwords, and the supplier’s permitted uses and disclosures under the Business Associate Agreement.

Proof graph

How One Approved Source Stays Current Between Reviews

Aurora keeps proof attributable, current, and reviewer-safe so every request starts from maintained evidence.

01
Govern the source record
Policies, controls, approvals, and framework mapping start in one maintained proof base. Every downstream surface inherits from here.
02
Attach living evidence
Every artifact carries an owner, freshness window, source context, and control link so proof stays legible between review cycles.
03
Answer from cited proof
Questionnaire responses cite real evidence and reuse approved language. No more starting from blank spreadsheets each cycle.
04
Share through controlled surfaces
Trust Center tiers, reviewer packages, and auditor workspace exports stay connected to the same approved source record.
05
Expand when the burden changes
Layer in automation, readiness records, or in-perimeter Command proof only when the next review actually demands it.
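To make step 02 concrete: an evidence artifact with an owner, freshness window, source context, and control links can be modeled as a small record whose reusability is decided by its freshness window. This is a minimal illustrative sketch; the field names and the `EvidenceArtifact` class are hypothetical, not Aurora's actual schema.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class EvidenceArtifact:
    # Hypothetical shape of one evidence record (illustrative only).
    name: str
    owner: str                 # accountable person or team for keeping this current
    collected_on: date         # when the artifact was last captured
    freshness_days: int        # how long the artifact stays reviewer-safe
    source_context: str        # where the artifact came from
    control_ids: list[str]     # linked controls in the governed record

    def is_fresh(self, today: date) -> bool:
        """An artifact is reusable only inside its freshness window."""
        return today <= self.collected_on + timedelta(days=self.freshness_days)

artifact = EvidenceArtifact(
    name="access-review-2024-Q2.pdf",
    owner="security-ops",
    collected_on=date(2024, 4, 1),
    freshness_days=90,
    source_context="Okta admin export",
    control_ids=["AC-2", "AC-6"],
)
```

Because each artifact carries its own owner and expiry, a stale record surfaces to the right person before a reviewer sees it, instead of being rediscovered mid-audit.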

Category difference

Why One Governed Record Beats Disconnected Tools

The usual split between automation tools, portals, and GRC platforms creates more rebuilding work than clarity.

Not Automation-Only

Artifact collection without a governed record just creates a faster mess. Aurora layers response, sharing, and readiness on top of the same source of truth.

Not Portal-Only

A portal is only as credible as the record behind it. Aurora maintains the full governed history that backs every reviewer surface, not a static front door.

Not GRC Sprawl

Built for teams that need to show proof fast and repeatedly. One record, one workflow, one reviewer handoff instead of three disconnected systems.

Recommended bundles

Start with the Plan That Matches Your Evidence Burden

Most teams start with Proof Core, then add Automation when evidence freshness and drift become the real constraint.

Recommended bundle

Proof Core

$7,200 / yr

Teams replacing spreadsheets or stitching together evidence, answers, and trust sharing by hand.

Recommended bundle

Automation

$13,200 / yr

Teams that need automated evidence freshness and connector-backed proof between reviews.

Live walkthrough
Share the Review That Keeps Restarting the Same Work
Send us the questionnaire, audit scope, or renewal packet that triggers your next rebuild. We will show the shortest path to one proof record that survives every cycle.
Start with Proof Core. Add automation, readiness, or Command only when the evidence burden demands it.