Losses and LoggingLoss

julia
using EasyHybrid
using EasyHybrid: compute_loss
EasyHybrid.compute_loss Function
julia
compute_loss(ŷ, y, y_nan, targets, training_loss, agg::Function)
compute_loss(ŷ, y, y_nan, targets, loss_types::Vector, agg::Function)

Compute the loss for the given predictions and targets using the specified training loss (or vector of loss types) and aggregation function.

Arguments:

  • ŷ: Predicted values.

  • y: Target values.

  • y_nan: Mask for NaN values.

  • targets: The targets for which the loss is computed.

  • training_loss: The loss type to use during training, e.g., :mse.

  • loss_types::Vector: A vector of loss types to compute, e.g., [:mse, :mae].

  • agg::Function: The aggregation function to apply to the computed losses, e.g., sum or mean.

Returns a single loss value if training_loss is provided, or a NamedTuple of losses for each type in loss_types.


WARNING

  • y_nan is a boolean mask (or function returning a mask per target) used to ignore missing values.

  • For uncertainty-aware losses, pass target values as (y_vals, y_sigma) and write custom losses to accept that tuple.
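For instance, a y_nan mask can be derived directly from observations that contain NaNs. A minimal sketch (y_obs and masked are illustrative names, not part of the EasyHybrid API):

```julia
# Observations with missing values encoded as NaN (illustrative data)
y_obs = Dict(:t1 => [1.1, NaN, 1.9], :t2 => [0.4, 1.1, NaN])

# Per-target boolean mask: true where the value is valid
y_nan(t) = .!isnan.(y_obs[t])

# Only the masked values enter the loss
masked(t) = y_obs[t][y_nan(t)]
```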

Tips and quick reference

  • Prefer f(ŷ_masked, y_masked) for custom losses; y_masked may be a vector or (y, σ).

  • Use Val(:metric) only for predefined loss_fn variants.

  • Quick calls:

    • compute_loss(..., :mse, sum): predefined metric

    • compute_loss(..., custom_loss, sum): custom loss

    • compute_loss(..., (f, (arg1, arg2)), sum): with positional arguments

    • compute_loss(..., (f, (kw=val,)), sum): with keyword arguments

    • compute_loss(..., (f, (arg1,), (kw=val,)), sum): with positional and keyword arguments

    • compute_loss(..., (y, y_sigma), ..., custom_loss_uncertainty, sum): with uncertainties

Simple usage

Predefined metrics

julia
# synthetic data
ŷ = Dict(:t1 => [1.0, 2.0], :t2 => [0.5, 1.0])
y(t) = t == :t1 ? [1.1, 1.9] : [0.4, 1.1]
y_nan(t) = trues(2)
targets = [:t1, :t2]
2-element Vector{Symbol}:
 :t1
 :t2
julia
julia> mse_total = compute_loss(ŷ, y, y_nan, targets, :mse, sum) # total MSE across targets
0.020000000000000025
julia
julia> losses = compute_loss(ŷ, y, y_nan, targets, [:mse, :mae], sum) # multiple metrics in a NamedTuple
(mse = (t1 = 0.010000000000000018, t2 = 0.010000000000000005, sum = 0.020000000000000025), mae = (t1 = 0.10000000000000009, t2 = 0.10000000000000003, sum = 0.20000000000000012))

Custom functions, args, kwargs

Custom losses receive masked predictions and masked targets:

julia
custom_loss(ŷ, y) = mean(abs2, ŷ .- y)
weighted_loss(ŷ, y, w) = w * mean(abs2, ŷ .- y)
scaled_loss(ŷ, y; scale=1.0) = scale * mean(abs2, ŷ .- y)
complex_loss(ŷ, y, w; scale=1.0) = scale * w * mean(abs2, ŷ .- y);

Use variants:

julia
julia> compute_loss(ŷ, y, y_nan, targets, custom_loss, sum)
0.020000000000000025
julia
julia> compute_loss(ŷ, y, y_nan, targets, (weighted_loss, (0.5,)), sum)
0.010000000000000012
julia
julia> compute_loss(ŷ, y, y_nan, targets, (scaled_loss, (scale=2.0,)), sum)
0.04000000000000005
julia
julia> compute_loss(ŷ, y, y_nan, targets, (complex_loss, (0.5,), (scale=2.0,)), sum)
0.020000000000000025

Uncertainty-aware losses

WARNING

This will be supported soon!

Signal uncertainty by providing targets as (y_vals, y_sigma) and write the loss to accept that tuple:

julia
function custom_loss_uncertainty(ŷ, y_and_sigma)
    y_vals, σ = y_and_sigma
    return mean((ŷ .- y_vals).^2 ./ (σ.^2 .+ 1e-6))
end

Top-level usage (both y and y_sigma can be functions or containers):

julia
y_sigma(t) = t == :t1 ? [0.1, 0.2] : [0.2, 0.1]
loss = compute_loss(ŷ, (y, y_sigma), y_nan, targets,
    custom_loss_uncertainty, sum)

Behavior

  • compute_loss packs per-target (y_vals_target, σ_target) tuples and forwards them to loss_fn.

  • Predefined metrics use only y_vals when a (y, σ) tuple is supplied. (TODO)
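Conceptually, the per-target packing can be pictured like this. This is a sketch of the idea, not EasyHybrid's actual internals; pack is a hypothetical helper, and the loss has the same shape as custom_loss_uncertainty above:

```julia
using Statistics

# Uncertainty-aware loss of the same shape as custom_loss_uncertainty
loss_unc(ŷ, (y_vals, σ)) = mean((ŷ .- y_vals).^2 ./ (σ.^2 .+ 1e-6))

# Hypothetical per-target containers (illustrative names)
y_vals  = Dict(:t1 => [1.1, 1.9])
y_sigma = Dict(:t1 => [0.1, 0.2])

# Conceptual packing: the (y_vals_target, σ_target) tuple forwarded to loss_fn
pack(t) = (y_vals[t], y_sigma[t])

loss_t1 = loss_unc([1.0, 2.0], pack(:t1))
```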

LoggingLoss

The LoggingLoss helper aggregates per-target loss specifications for training and evaluation.

EasyHybrid.LoggingLoss Type
julia
LoggingLoss

A structure to define a logging loss function for hybrid models.

Arguments

  • loss_types: A vector of loss specifications (Symbol, Function or Tuple)

    • Symbol: predefined loss, e.g. :mse

    • Function: custom loss function, e.g. custom_loss

    • Tuple: function with args/kwargs:

      • (f, args): positional args, e.g. (weighted_loss, (0.5,))

      • (f, kwargs): keyword args, e.g. (scaled_loss, (scale=2.0,))

      • (f, args, kwargs): both, e.g. (complex_loss, (0.5,), (scale=2.0,))

  • training_loss: The loss specification to use during training (same format as above)

  • agg: Function to aggregate losses across targets, e.g. sum or mean

  • train_mode: If true, uses training_loss; otherwise uses loss_types.

Examples

julia
# Simple predefined loss
logging = LoggingLoss(
    loss_types=[:mse, :mae],
    training_loss=:mse
)

# Custom loss function
custom_loss(ŷ, y) = mean(abs2, ŷ .- y)
logging = LoggingLoss(
    loss_types=[:mse, custom_loss],
    training_loss=custom_loss
)

# With arguments/kwargs
weighted_loss(ŷ, y, w) = w * mean(abs2, ŷ .- y)
scaled_loss(ŷ, y; scale=1.0) = scale * mean(abs2, ŷ .- y)
logging = LoggingLoss(
    loss_types=[:mse, (weighted_loss, (0.5,)), (scaled_loss, (scale=2.0,))],
    training_loss=(weighted_loss, (0.5,))
)

Internally, training uses logging.training_loss and evaluation uses logging.loss_types. Note that LoggingLoss can mix symbols and functions.
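The switch between the two can be pictured like this (a sketch using a plain NamedTuple as a stand-in for the LoggingLoss struct, not its actual field access):

```julia
# Plain NamedTuple stand-in for LoggingLoss (illustrative only)
logging = (loss_types = [:mse, :mae], training_loss = :mse,
           agg = sum, train_mode = true)

# Which specification is active depends on train_mode
active(l) = l.train_mode ? l.training_loss : l.loss_types
```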

Loss → train

So, how do you specify your training loss and the additional metrics given by loss_types?

default losses

You can select a different training loss and a different vector of additional metrics:

julia
train(...;
    training_loss = :mae,
    loss_types = [:mse, :mae, :nse]
    )
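Here :nse refers to the Nash–Sutcliffe efficiency. If a predefined metric does not fit your needs, the same quantity can also be written as a custom function; a minimal sketch, assuming the standard definition and minimizing 1 − NSE (nse_loss is an illustrative name, not the package's implementation):

```julia
using Statistics

# 1 - NSE: zero for a perfect fit, grows with error relative to the
# variance of the observations (illustrative, not EasyHybrid's :nse)
nse_loss(ŷ, y) = sum(abs2, ŷ .- y) / sum(abs2, y .- mean(y))
```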

without additional arguments

Define your own custom function fn(ŷ, y) as above and pass it to the corresponding keyword argument:

julia
train(...;
    training_loss = fn,
    loss_types = [fn, :mae, :nse]
    )

with additional arguments

Now your function takes additional positional arguments, i.e. fn_args(ŷ, y, args...):

julia
train(...;
    training_loss = (fn_args, (args...,)),
    loss_types = [(fn_args, (args...,)), :mae, :nse]
    )

and possibly keyword arguments as well, i.e. fn_args(ŷ, y, args...; kwargs...):

julia
train(...;
    training_loss = (fn_args, (args...,), (kwargs...,)),
    loss_types = [(fn_args, (args...,), (kwargs...,)), :mae, :nse]
    )
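Concretely, such a specification tuple bundles the function with its arguments. Unpacking one by hand looks like this (a sketch; apply_spec is a hypothetical helper, and the actual handling inside EasyHybrid may differ):

```julia
using Statistics

# Custom loss with a positional weight and a keyword scale
complex_loss(ŷ, y, w; scale = 1.0) = scale * w * mean(abs2, ŷ .- y)

# A (f, args, kwargs) specification as passed to training_loss / loss_types
spec = (complex_loss, (0.5,), (scale = 2.0,))

# Hypothetical manual application of the spec to masked data
apply_spec((f, args, kwargs), ŷ, y) = f(ŷ, y, args...; kwargs...)

loss = apply_spec(spec, [1.0, 2.0], [1.1, 1.9])
```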