Neo¶
Background¶
The Neo wrapper quantifies vacuitic uncertainty by learning a unique signature of the training data and assessing how closely new, production data aligns with this signature. This approach provides insight into data novelty and out-of-distribution conditions by measuring the degree of divergence between the production data signature and the training signature(s). Neo’s design leverages proprietary techniques to ensure that the learned signatures generalize well, preserving robustness even in the face of significant distributional shifts. Unlike traditional methods that require high computational and memory overhead, Neo’s architecture is optimized for efficiency, making it suitable for deployment in environments where resources are limited or real-time inference is required. This wrapper, therefore, allows for rapid, robust assessment of vacuitic uncertainty, enabling more reliable decision-making in uncertain environments.
Usage¶
Wrapping your model with capsa_torch.neo
from torch import nn
from capsa_torch import neo
# Define your model
model = nn.Sequential(...) # note: other (non-sequential) models are also allowed
# Build a wrapper and wrap your model
wrapper = neo.Wrapper()
wrapped_model = wrapper(model)
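For illustration, here is a complete, self-contained version of the above; the network architecture, layer sizes, and input dimensions are arbitrary assumptions for the sketch, not requirements of the wrapper.
import torch
from torch import nn
from capsa_torch import neo

# A small example model -- the architecture and sizes here are illustrative only
model = nn.Sequential(
    nn.Linear(16, 64),
    nn.ReLU(),
    nn.Linear(64, 1),
)

# Build a wrapper and wrap the model
wrapper = neo.Wrapper()
wrapped_model = wrapper(model)

# A dummy batch of 8 samples with 16 features (shapes are assumptions)
input_batch = torch.randn(8, 16)
prediction = wrapped_model(input_batch)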
Calling your wrapped model
# By default, your wrapped model returns a prediction
prediction = wrapped_model(input_batch)
# But if you pass `return_risk=True`, you will also get the vacuitic uncertainty!
prediction, uncertainty = wrapped_model(input_batch, return_risk=True)
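As a sketch of how you might inspect the returned uncertainty, assuming it is a tensor aligned with the prediction's batch dimension (the exact shape may differ in practice):
prediction, uncertainty = wrapped_model(input_batch, return_risk=True)

# Inspect summary statistics of the uncertainty (assumes a tensor-valued output)
print("mean uncertainty:", uncertainty.mean().item())
print("max uncertainty:", uncertainty.max().item())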
Training your wrapped model
Neo-wrapped models must be fine-tuned in order to return accurate measures of uncertainty. The loss used for fine-tuning is the uncertainty produced by the wrapper.
For fine-tuning (training)
Use the return_risk=True argument when training, and use the risk output as the loss.
prediction, risk = wrapped_model(input_batch, return_risk=True)
loss = risk.mean()
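Putting this together, a minimal fine-tuning loop might look like the sketch below. It assumes the wrapped model exposes the usual nn.Module training interface (parameters(), train()); the optimizer choice, learning rate, number of epochs, and train_loader are illustrative assumptions.
import torch

# Hypothetical fine-tuning loop: optimizer, learning rate, epoch count, and
# `train_loader` (an iterable of input batches) are assumptions for illustration
optimizer = torch.optim.Adam(wrapped_model.parameters(), lr=1e-3)

wrapped_model.train()
for epoch in range(10):
    for input_batch in train_loader:
        optimizer.zero_grad()
        prediction, risk = wrapped_model(input_batch, return_risk=True)
        loss = risk.mean()  # the uncertainty produced by the wrapper is the loss
        loss.backward()
        optimizer.step()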
For inference
Use return_risk=True to evaluate vacuitic uncertainty.
prediction, risk = wrapped_model(input_batch, return_risk=True) # shape: [B, …]
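One way to use this at inference time is to flag inputs whose risk exceeds a threshold. The sketch below assumes the wrapped model exposes eval() like a standard nn.Module; the threshold value is a placeholder that would need to be calibrated on held-out data.
import torch

wrapped_model.eval()
with torch.no_grad():
    prediction, risk = wrapped_model(input_batch, return_risk=True)

# Reduce the risk to one value per sample, then flag potentially
# out-of-distribution inputs (THRESHOLD is a placeholder; calibrate it
# on data representative of your deployment)
per_sample_risk = risk.reshape(risk.shape[0], -1).mean(dim=1)
THRESHOLD = 0.5
flagged = per_sample_risk > THRESHOLD
print("flagged samples:", flagged.nonzero().flatten().tolist())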
Examples
API Reference
A TensorFlow implementation is coming soon. Please contact us for inquiries.