ThermalLaw Spec Complete

Thermal
Inspection

Utility. Infrastructure. Enterprise Inspection.

Governed findings approval for autonomous thermal inspection. AI proposes anomaly candidates. Operators decide what becomes an official finding. Everything is auditable. Deterministic severity classification, per-candidate approval, and a tamper-evident audit chain — from solar farms to substations to rooftops. No cloud. No trust vacuum.

<20 min per asset
100% audit trail
0 cloud dependency
THERMALLAW FLIGHT COMPLETE · 5 CANDIDATES · LAW 8 APPROVAL REQUIRED · QUEUE OPEN

Approval Queue · 3 pending

INSP-031-001 · Pending · Type: Corrosion · Confidence: 94% · Severity: Significant · Law 8: Awaiting
INSP-031-002 · Pending · Type: Impact · Confidence: 87% · Severity: Moderate · Law 8: Awaiting
INSP-031-003 · Pending · Type: Granule Loss · Confidence: 72% · Severity: Minor · Law 8: Awaiting
The Challenge

Anomaly detection
needs governance.

Autonomous drones remove the pilot from the control loop — but enterprise inspection findings still require documented human authorization. Operators face a trust gap: the drone proposes anomaly candidates based on ML inference, but compliance and insurance require an operator signature on every submitted finding. AI-only solutions create a trust vacuum. Manual inspection is slow, subjective, and impossible to replay. The solution is a governed approval workflow where AI proposes, the operator decides, and the audit trail proves it.

AI-Only
Trust vacuum

AI-flagged items move directly into reports without explicit operator confirmation. No governed review step. No chain of custody for findings decisions. Compliance rejects output.

Manual-Only
Slow & subjective

Inconsistent coverage across dozens of assets. Classification varies between inspectors. No protection from false positives. Impossible to replay or verify.

ThermalLaw
Governed workflow

ML proposes candidates. Principal approves each one. Deterministic severity banding. Sealed audit trail. Replay-verified.

A utility field supervisor managing infrastructure inspection across dozens of distributed assets — solar farms, distribution lines, substation equipment — using an autonomous drone. They are trained on drone operations but are not a dedicated remote pilot. The drone flies its own mission. The operator's job is to authorize it, monitor it, and sign off on what it finds.

With ThermalLaw, the operator authorizes each mission in under 3 minutes. The autonomous flight captures every zone. Edge ML proposes anomaly candidates. The operator reviews each one individually on their iPad — approving or rejecting with typed reasons. The sealed documentation pack is exported before leaving the site. Every finding has a chain of custody. Every decision is auditable.

50–120 assets per shift
<10 sec per candidate review
0 unverified findings
Domain Laws

Three laws.
Extending FlightLaw.

ThermalLaw inherits all eight FlightLaw constraints and adds three domain-specific laws for evidence handling, finding approval, and report gating. FlightLaw violations always take precedence.

EvidenceLaw
Valid finding structure required

Every damage candidate must have a complete evidence structure: georeferenced crop, bounding box, confidence score, severity classification, and zone reference. Incomplete candidates are rejected before they reach the approval queue.
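The evidence contract above can be sketched as a validation gate. This is an illustrative Python sketch only: the field names (`crop_path`, `bbox`, `confidence`, `severity`, `zone`) are assumptions, not ThermalLaw's actual schema, and the shipping product runs on-device in Swift/Core ML.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass(frozen=True)
class DamageCandidate:
    candidate_id: str
    crop_path: Optional[str]                    # georeferenced image crop
    bbox: Optional[Tuple[int, int, int, int]]   # x, y, w, h in pixels
    confidence: Optional[float]                 # 0.0 – 1.0
    severity: Optional[str]                     # computed severity band
    zone: Optional[str]                         # grid zone reference

def evidence_complete(c: DamageCandidate) -> bool:
    """EvidenceLaw sketch: a candidate with any missing field is rejected
    before it can reach the approval queue."""
    return all([
        bool(c.crop_path),
        c.bbox is not None,
        c.confidence is not None and 0.0 <= c.confidence <= 1.0,
        bool(c.severity),
        bool(c.zone),
    ])
```

The gate is a pure predicate: an incomplete structure never enters the queue, so every queued candidate is guaranteed reviewable.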

FindingApprovalLaw
Principal approval gate per candidate

Every damage candidate proposed by onboard ML must be explicitly approved or rejected by the principal. No batch approval. No auto-accept. Each decision is logged with actor attribution and timestamp in the audit trail.
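The per-candidate gate can be sketched as a single-entry decision function. Names here (`Decision`, `ApprovalRecord`, `decide`) are illustrative assumptions, not the product API; the point is the shape of the rule — one candidate per call, attribution and timestamp on every record, and a typed reason required for rejection.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from enum import Enum
from typing import Optional

class Decision(Enum):
    APPROVED = "approved"
    REJECTED = "rejected"

@dataclass(frozen=True)
class ApprovalRecord:
    candidate_id: str
    decision: Decision
    actor: str               # principal identity — attribution is mandatory
    timestamp: str           # UTC, logged at decision time
    reason: Optional[str]    # typed reason, required for rejection

def decide(candidate_id: str, decision: Decision, actor: str,
           reason: Optional[str] = None) -> ApprovalRecord:
    # One candidate per call: there is deliberately no batch entry point.
    if decision is Decision.REJECTED and not reason:
        raise ValueError("rejection requires a typed reason")
    return ApprovalRecord(
        candidate_id=candidate_id,
        decision=decision,
        actor=actor,
        timestamp=datetime.now(timezone.utc).isoformat(),
        reason=reason,
    )
```

Because the record type is frozen, a decision cannot be mutated after the fact — only appended to the audit trail.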

ReportLaw
Output blocked until approval queue clears

The Documentation Pack cannot be generated, exported, or delivered until every candidate in the approval queue has been explicitly resolved. No partial reports. No pending findings in deliverables. The gate is absolute.
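The absolute gate can be sketched in a few lines; `ReportBlocked` and `generate_pack` are hypothetical names for illustration, not ThermalLaw's real interface.

```python
class ReportBlocked(Exception):
    """Raised when export is attempted with unresolved candidates."""

def generate_pack(queue: list) -> dict:
    """ReportLaw sketch: no Documentation Pack while anything is pending.

    Each queue item is a dict with "id" and "status" in
    {"approved", "rejected", "pending"}."""
    pending = [c["id"] for c in queue if c["status"] == "pending"]
    if pending:
        # The gate is absolute — no partial reports, no pending findings.
        raise ReportBlocked(f"{len(pending)} candidate(s) pending approval")
    return {"findings": [c for c in queue if c["status"] == "approved"]}
```

Rejected candidates clear the gate but never appear in the deliverable; they live on only in the audit trail.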

Mission Flow

Seven steps.
Governed end-to-end.

Every thermal inspection follows the same deterministic workflow. ML proposes. The principal decides. The audit trail seals it.

01
Observe

Standardized capture pattern. Grid zones, edges, penetrations, ridges. Coverage requirements defined before launch. Every zone must be imaged or the mission is incomplete.

Target: 100% zone coverage. Gap detection is deterministic, not sampling-based.
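Deterministic gap detection reduces to set difference over zone identifiers — a sketch under the assumption that zones are referenced by ID strings like `Z-01`:

```python
def coverage_gaps(required_zones: set, imaged_zones: set) -> set:
    """Deterministic coverage check: the mission is incomplete unless
    every required zone was imaged. No sampling, no heuristics."""
    return required_zones - imaged_zones
```

An empty result means 100% zone coverage; any non-empty result names the exact zones that still need capture.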
02
Infer

Onboard Core ML model proposes damage candidates. Each candidate includes a bounding box, confidence score, and preliminary classification. All inference runs locally on Apple Silicon.

Tradeoff: edge inference limits model size but guarantees availability on every roof, in every cell dead zone.
03
Explain

Each candidate is presented with its image crop, full-frame context, confidence percentage, and severity band. The principal sees exactly what the model saw and why it flagged it.

Design goal: reduce cognitive load per finding to a single glance. Crop + context + confidence in one view.
04
Approve

The operator explicitly approves or rejects each candidate. No batch operations — no affordance for bulk approval exists. Approval creates an immutable finding record. Rejection requires a typed reason. Neither action can be undone.

< 10 sec per candidate. Typed rejection reasons enable model quality feedback loops. Bulk approval is deliberately absent.
05
Flag

Approved candidates become flagged anomalies in the evidence record. Rejected candidates remain in the audit trail with rejection reason. Nothing is deleted.

Rejections are data, not mistakes. The audit trail records the principal's judgment, not just their approvals.
06
Export

Documentation Pack generated: PDF report, JSON evidence data, georeferenced images. ReportLaw gates export until the approval queue is empty. No pending findings in deliverables.

ReportLaw gate prevents premature delivery. Design goal: signed post-flight report in under 25 minutes.
07
Replay

The entire session can be replayed from the audit log to verify determinism. Same inputs, same outputs. QA and audit teams can independently verify every decision.

Key differentiator: no other inspection product offers deterministic replay of the entire decision chain.
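Replay verification follows from determinism: re-run the classifier over the logged inputs and compare against the logged outputs. A minimal sketch, assuming audit entries carry `confidence`, `area_px`, and `severity` fields (illustrative names, not the real log schema):

```python
def replay_verify(audit_entries: list, classify) -> bool:
    """Re-run the deterministic classifier over logged inputs and check
    every logged output matches — same inputs, same outputs."""
    return all(
        classify(e["confidence"], e["area_px"]) == e["severity"]
        for e in audit_entries
    )
```

`classify` is the same severity-banding function used in the field, so QA can verify the decision chain with nothing but the log and the published thresholds.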
Inspection Architecture

Governed capture.
Structured evidence.

Every roof inspection follows a deterministic grid pattern. Damage candidates are proposed by onboard ML, queued for principal approval, and sealed into the evidence record.

Thermal Inspection · Roof Assessment Workflow
Grid zones Z-01 through Z-08 along the ridge line · Coverage: 85%

DMG-001 · Conf 92% · Area 420px · Z-01 · Severity: Significant · APPROVED by S.SWEENEY, 14:18:33 UTC
DMG-002 · Conf 78% · Area 280px · Z-07 · Severity: Moderate · PENDING
DMG-003 · Conf 55% · Area 140px · Z-09 · Severity: Minor · PENDING
One damage candidate at 38% confidence rejected automatically (< 0.50 floor).

REPORTLAW: EXPORT BLOCKED · 2 candidates pending approval
Severity Banding

Deterministic
classification.

Severity is computed from confidence score and bounding box area. The classification is deterministic — same inputs always produce the same band. No subjective judgment. No override.

Confidence     Area      Severity
≥ 0.85         Any       Significant
0.70 – 0.84    ≥ 200px   Moderate
0.70 – 0.84    < 200px   Minor
0.50 – 0.69    ≥ 500px   Moderate
0.50 – 0.69    < 500px   Minor
< 0.50         Any       Rejected
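The banding table transcribes directly into a pure function — a Python sketch for illustration (the on-device implementation is Swift/Core ML):

```python
def severity_band(confidence: float, area_px: int) -> str:
    """Deterministic severity classification from the banding table.
    Same inputs always produce the same band; no override exists."""
    if confidence < 0.50:
        return "rejected"        # below the confidence floor
    if confidence >= 0.85:
        return "significant"     # any area
    if confidence >= 0.70:
        return "moderate" if area_px >= 200 else "minor"
    # 0.50 – 0.69: larger area threshold applies
    return "moderate" if area_px >= 500 else "minor"
```

Because the function is total and branch-order mirrors the table, the band is reproducible by anyone holding the Methodology Appendix thresholds.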
Deliverable

Documentation Pack.

The final deliverable is a sealed evidence package. Every component is traceable to the audit trail. ReportLaw gates generation until the approval queue is empty.

01
Cover Page

Session metadata. Property address, date, asset identifier, principal name, weather conditions at time of flight, mission hash.

02
Executive Summary

Anomaly count by severity band. Significant, Moderate, Minor tallies. Total candidates proposed vs. approved. Coverage percentage.

03
Coverage Map

Zone completion percentages overlaid on property outline. Gaps identified. Coverage threshold compliance status.

04
Flagged Anomalies

One page per approved finding. Full-frame image, crop, bounding box, confidence score, severity band, zone reference, approval attribution.

05
Methodology Appendix

ML model version, capture parameters, severity banding thresholds, grid specification, audit trail hash. Everything needed for independent verification.

Hardware

Edge-first.
Apple Silicon.

All inference runs locally. No cloud dependency. No data leaves the device until the principal exports the sealed Documentation Pack.

Primary Hardware
Skydio X10 (field validation target)
Also Compatible
PX4 / ArduPilot via MAVLink
ML Inference
Core ML on Apple Silicon
Processing
On-device, edge-first
Cloud
None. Zero dependency.
Export
PDF + JSON + images
Connectivity
MAVLink direct or KMZ export
Where It Runs

Three governed moments.
One authority model.

The ThermalLaw workflow runs inside Watch Station — the principal interface available on iPad, Mac, and iPhone. Each screen corresponds to a governance moment where the system requires an operator decision before state can advance.

01 · Pre-Mission
Mission Authorization

The operator reviews the proposed mission — flight area, target assets, altitude profile, AI risk assessment — and explicitly authorizes. Authorization is logged with timestamp and operator identity. This is the first entry in the chain of custody.

See in Watch Station →
02 · In-Mission / Post-Mission
Anomaly Review Queue

The core governed moment. Each anomaly candidate shows: detection image, AI confidence score, severity band, and asset location. Approve creates an immutable finding. Reject requires a typed reason. No bulk approval affordance exists.

See in Watch Station →
03 · Post-Mission
Findings Report & Audit Trail

The operator reviews approved findings, verifies session replay integrity, and exports the documentation pack: PDF report, structured JSON, detection images, and the complete hash-chained audit log.

See in Watch Station →
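The hash-chained audit log can be sketched as follows. `append_entry` and `chain_valid` are assumed names for illustration; the idea is standard: each entry commits to its predecessor's hash, so altering any earlier decision breaks every hash after it.

```python
import hashlib
import json

GENESIS = "0" * 64  # sentinel hash for the first entry

def append_entry(chain: list, event: dict) -> dict:
    """Append an event to a tamper-evident log: each entry hashes the
    previous entry's hash concatenated with its own payload."""
    prev_hash = chain[-1]["hash"] if chain else GENESIS
    payload = json.dumps(event, sort_keys=True)
    entry_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    entry = {"event": event, "prev": prev_hash, "hash": entry_hash}
    chain.append(entry)
    return entry

def chain_valid(chain: list) -> bool:
    """Recompute every hash from the genesis sentinel forward; any
    mutation of an event or a link fails verification."""
    prev = GENESIS
    for e in chain:
        payload = json.dumps(e["event"], sort_keys=True)
        expected = hashlib.sha256((prev + payload).encode()).hexdigest()
        if e["prev"] != prev or e["hash"] != expected:
            return False
        prev = e["hash"]
    return True
```

Verification needs no secrets and no network: the exported JSON log alone is enough for an auditor to confirm the chain end to end.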
ThermalLaw

Governed inspection.

Every finding proposed. Every decision logged. Every report gated. The audit trail is the product.

Runs on Flightworks Control · Principal interface Watch Station · Primary hardware Skydio X10