# Losses and LoggingLoss
```julia
using EasyHybrid
using EasyHybrid: compute_loss
```

### `EasyHybrid.compute_loss` — Function

```julia
compute_loss(ŷ, y, y_nan, targets, training_loss, agg::Function)
compute_loss(ŷ, y, y_nan, targets, loss_types::Vector, agg::Function)
```

Compute the loss for the given predictions and targets using the specified training loss (or vector of loss types) and aggregation function.
**Arguments:**

- `ŷ`: Predicted values.
- `y`: Target values.
- `y_nan`: Mask for NaN values.
- `targets`: The targets for which the loss is computed.
- `training_loss`: The loss type to use during training, e.g., `:mse`.
- `loss_types::Vector`: A vector of loss types to compute, e.g., `[:mse, :mae]`.
- `agg::Function`: The aggregation function to apply to the computed losses, e.g., `sum` or `mean`.
Returns a single loss value if `training_loss` is provided, or a `NamedTuple` of losses for each type in `loss_types`.
!!! warning
    - `y_nan` is a boolean mask (or a function returning a mask per target) used to ignore missing values.
    - For uncertainty-aware losses, pass target values as `(y_vals, y_sigma)` and write custom losses to accept that tuple.
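For instance, a minimal sketch of building such a mask from data that contains `NaN`s (`y_raw` is our own illustrative name, not part of the API):

```julia
# hypothetical data with a missing entry in :t1
y_raw = Dict(:t1 => [1.1, NaN], :t2 => [0.4, 1.1])

# boolean mask per target: `true` where the value should enter the loss
y_nan(t) = .!isnan.(y_raw[t])
```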
## Tips and quick reference

- Prefer `f(ŷ_masked, y_masked)` for custom losses; `y_masked` may be a vector or `(y, σ)`.
- Use `Val(:metric)` only for predefined `loss_fn` variants (see the sketch after this list).
- Quick calls:
  - `compute_loss(..., :mse, sum)`: predefined metric
  - `compute_loss(..., custom_loss, sum)`: custom loss
  - `compute_loss(..., (f, (arg1, arg2)), sum)`: additional positional arguments
  - `compute_loss(..., (f, (kw=val,)), sum)`: keyword arguments
  - `compute_loss(..., (f, (arg1,), (kw=val,)), sum)`: positional and keyword arguments
  - `compute_loss(..., (y, y_sigma), ..., custom_loss_uncertainty, sum)`: with uncertainties
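To illustrate the `Val(:metric)` pattern, here is a hypothetical sketch of how a predefined metric could be dispatched; the actual `loss_fn` definitions inside EasyHybrid may differ:

```julia
using Statistics: mean

# hypothetical Val-dispatched metric variants, mirroring the pattern above
my_loss_fn(ŷ, y, ::Val{:mse}) = mean(abs2, ŷ .- y)
my_loss_fn(ŷ, y, ::Val{:mae}) = mean(abs, ŷ .- y)

my_loss_fn([1.0, 2.0], [1.1, 1.9], Val(:mse))  # ≈ 0.01
```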
## Simple usage

### Predefined metrics
```julia
# synthetic data
ŷ = Dict(:t1 => [1.0, 2.0], :t2 => [0.5, 1.0])
y(t) = t == :t1 ? [1.1, 1.9] : [0.4, 1.1]
y_nan(t) = trues(2)
targets = [:t1, :t2]
```

```
2-element Vector{Symbol}:
 :t1
 :t2
```

```julia-repl
julia> mse_total = compute_loss(ŷ, y, y_nan, targets, :mse, sum)  # total MSE across targets
0.020000000000000025

julia> losses = compute_loss(ŷ, y, y_nan, targets, [:mse, :mae], sum)  # multiple metrics in a NamedTuple
(mse = (t1 = 0.010000000000000018, t2 = 0.010000000000000005, sum = 0.020000000000000025), mae = (t1 = 0.10000000000000009, t2 = 0.10000000000000003, sum = 0.20000000000000012))
```

### Custom functions, args, kwargs
Custom losses receive masked predictions and masked targets:
```julia
using Statistics: mean

custom_loss(ŷ, y) = mean(abs2, ŷ .- y)
weighted_loss(ŷ, y, w) = w * mean(abs2, ŷ .- y)
scaled_loss(ŷ, y; scale=1.0) = scale * mean(abs2, ŷ .- y)
complex_loss(ŷ, y, w; scale=1.0) = scale * w * mean(abs2, ŷ .- y)
```

Use the variants:
```julia-repl
julia> compute_loss(ŷ, y, y_nan, targets, custom_loss, sum)
0.020000000000000025

julia> compute_loss(ŷ, y, y_nan, targets, (weighted_loss, (0.5,)), sum)
0.010000000000000012

julia> compute_loss(ŷ, y, y_nan, targets, (scaled_loss, (scale=2.0,)), sum)
0.04000000000000005

julia> compute_loss(ŷ, y, y_nan, targets, (complex_loss, (0.5,), (scale=2.0,)), sum)
0.020000000000000025
```

### Uncertainty-aware losses
!!! warning
    This will be supported soon!
Signal uncertainty by providing targets as `(y_vals, y_sigma)` and write the loss to accept that tuple:
```julia
function custom_loss_uncertainty(ŷ, y_and_sigma)
    y_vals, σ = y_and_sigma
    # variance-weighted squared error; the small constant guards against σ = 0
    return mean(((ŷ .- y_vals) .^ 2) ./ (σ .^ 2 .+ 1e-6))
end
```

Top-level usage (both `y` and `y_sigma` can be functions or containers):
```julia
y_sigma(t) = t == :t1 ? [0.1, 0.2] : [0.2, 0.1]
loss = compute_loss(ŷ, (y, y_sigma), y_nan, targets,
                    custom_loss_uncertainty, sum)
```

#### Behavior
- `compute_loss` packs per-target `(y_vals_target, σ_target)` tuples and forwards them to `loss_fn`.
- Predefined metrics use only `y_vals` when a `(y, σ)` tuple is supplied. (TODO)
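Conceptually, the per-target packing amounts to something like the following sketch; this is a hypothetical internal loop for illustration, not the actual EasyHybrid implementation:

```julia
# hypothetical sketch of the per-target packing described above,
# reusing ŷ, y, y_sigma, y_nan, and targets from the examples
for t in targets
    mask = y_nan(t)
    ŷ_t = ŷ[t][mask]
    y_t = (y(t)[mask], y_sigma(t)[mask])  # packed (y_vals_target, σ_target) tuple
    custom_loss_uncertainty(ŷ_t, y_t)     # forwarded to the loss function
end
```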
## LoggingLoss

The `LoggingLoss` helper aggregates per-target loss specifications for training and evaluation.
### `EasyHybrid.LoggingLoss` — Type

```julia
LoggingLoss
```

A structure to define a logging loss function for hybrid models.

**Arguments**

- `loss_types`: A vector of loss specifications (`Symbol`, `Function`, or `Tuple`)
  - `Symbol`: predefined loss, e.g. `:mse`
  - `Function`: custom loss function, e.g. `custom_loss`
  - `Tuple`: function with args/kwargs:
    - `(f, args)`: positional args, e.g. `(weighted_loss, (0.5,))`
    - `(f, kwargs)`: keyword args, e.g. `(scaled_loss, (scale=2.0,))`
    - `(f, args, kwargs)`: both, e.g. `(complex_loss, (0.5,), (scale=2.0,))`
- `training_loss`: The loss specification to use during training (same format as above)
- `agg`: Function to aggregate losses across targets, e.g. `sum` or `mean`
- `train_mode`: If `true`, uses `training_loss`; otherwise uses `loss_types`.
**Examples**

```julia
# Simple predefined loss
logging = LoggingLoss(
    loss_types = [:mse, :mae],
    training_loss = :mse
)

# Custom loss function
custom_loss(ŷ, y) = mean(abs2, ŷ .- y)
logging = LoggingLoss(
    loss_types = [:mse, custom_loss],
    training_loss = custom_loss
)

# With arguments/kwargs
weighted_loss(ŷ, y, w) = w * mean(abs2, ŷ .- y)
scaled_loss(ŷ, y; scale=1.0) = scale * mean(abs2, ŷ .- y)
logging = LoggingLoss(
    loss_types = [:mse, (weighted_loss, (0.5,)), (scaled_loss, (scale=2.0,))],
    training_loss = (weighted_loss, (0.5,))
)
```

Internally, training uses `logging.training_loss` and evaluation uses `logging.loss_types`. Note that `LoggingLoss` can mix symbols and functions.
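A minimal sketch of how `train_mode` selects between the two specifications, assuming the behavior stated in the docstring above:

```julia
# assumed selection logic: training uses `training_loss`, evaluation uses `loss_types`
spec = logging.train_mode ? logging.training_loss : logging.loss_types
loss = compute_loss(ŷ, y, y_nan, targets, spec, sum)
```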
## Loss → train

So, how do you specify your training loss, and the additional metrics given by `loss_types`?
### Default losses

You can select a different training loss and a different vector of additional metrics:
```julia
train(...;
    training_loss = :mae,
    loss_types = [:mse, :mae, :nse]
)
```

### Without additional arguments
Define your own custom function `fn(ŷ, y)` as above and pass it to the corresponding keyword argument:
```julia
train(...;
    training_loss = fn,
    loss_types = [fn, :mae, :nse]
)
```

### With additional arguments
Now your function will have additional positional arguments, i.e. `fn_args(ŷ, y, args...)`:
```julia
train(...;
    training_loss = (fn_args, (args...,)),
    loss_types = [(fn_args, (args...,)), :mae, :nse]
)
```

and possibly keyword arguments, i.e. `fn_args(ŷ, y, args...; kwargs...)`:
```julia
train(...;
    training_loss = (fn_args, (args...,), (kwargs...,)),
    loss_types = [(fn_args, (args...,), (kwargs...,)), :mae, :nse]
)
```
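Putting the pieces together, here is a hypothetical end-to-end specification; `huber` is our own illustrative loss, not part of EasyHybrid, and the same tuples can be passed to `train` via `training_loss`/`loss_types` as shown above:

```julia
using Statistics: mean

# illustrative Huber-style loss with a positional δ and a keyword scale
huber(ŷ, y, δ; scale=1.0) =
    scale * mean(r -> abs(r) <= δ ? r^2 / 2 : δ * (abs(r) - δ / 2), ŷ .- y)

logging = LoggingLoss(
    loss_types = [(huber, (1.0,)), :mse, :mae],
    training_loss = (huber, (1.0,), (scale=2.0,))
)
```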