
EasyHybrid.jl

Simply do

julia
using EasyHybrid
@hybrid YourHybridModelName α
# α is the newly introduced physical parameter to be learned!

See your new model by typing ?YourHybridModelName.

You should see something like:
julia
@doc YourHybridModelName
YourHybridModelName(NN, predictors, forcing, targets, α)

A hybrid model with:

  • NN: a neural network

  • predictors: a tuple of names, e.g. (:clay, :moist)

  • forcing: data names, e.g. (:temp,)

  • targets: data names, e.g. (:ndvi,)

  • Physical parameters: α

All physical parameters are trainable by default. Use ?Lux.Experimental.freeze to make specific parameters non-trainable during training.
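
For example, a minimal sketch of constructing an instance of the generated struct, assuming a small Lux.Chain as the neural network (the Chain architecture and the value 2.0f0 for α are illustrative, not prescribed by EasyHybrid):

julia
using EasyHybrid, Lux

@hybrid YourHybridModelName α

NN = Lux.Chain(Lux.Dense(2 => 16, tanh), Lux.Dense(16 => 1))  # 2 predictors -> 1 output
hm = YourHybridModelName(NN, (:clay, :moist), (:temp,), (:ndvi,), 2.0f0)

hm.predictors  # [:clay, :moist] -- the constructor collects the tuples into vectors
hm.α           # [2.0f0]         -- the scalar parameter is stored as a one-element vector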

What is @hybrid doing?

It defines your model as a subtype of LuxCore.AbstractLuxContainerLayer. Namely,

julia
struct YourHybridModelName{D, T1, T2, T3, T4} <: LuxCore.AbstractLuxContainerLayer{(:NN, :predictors, :forcing, :targets, :α)}
  NN
  predictors
  forcing
  targets
  α
  function YourHybridModelName(NN::D, predictors::T1, forcing::T2, targets::T3, α::T4) where {D, T1, T2, T3, T4}
      new{D, T1, T2, T3, T4}(NN, collect(predictors), collect(forcing), collect(targets), [α])
  end
end

It also sets the initial parameters and states:

julia
function LuxCore.initialparameters(::AbstractRNG, layer::YourHybridModelName)
  ps, _ = LuxCore.setup(Random.default_rng(), layer.NN)
  return (; ps, α = layer.α,) # these parameters are trainable!
end
julia
function LuxCore.initialstates(::AbstractRNG, layer::YourHybridModelName)
  _, st = LuxCore.setup(Random.default_rng(), layer.NN)
  return (; st) # none of the possible additional arguments/variables here are trainable!
end
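
With these two methods in place, LuxCore.setup on the hybrid model collects the NN weights and the physical parameter into ps and the NN states into st. A small sketch, reusing the hypothetical hm instance from the construction example above:

julia
using LuxCore, Random

ps, st = LuxCore.setup(Random.default_rng(), hm)  # hm from the construction sketch above
keys(ps)  # (:ps, :α) -- NN weights plus the trainable physical parameter α
keys(st)  # (:st,)    -- NN states only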
EasyHybrid.@hybrid Macro
julia
@hybrid ModelName α β γ

Macro to define hybrid model structs with arbitrary numbers of physical parameters.

This defines a struct with:

  • Default fields: NN (neural network), predictors, forcing, targets.

  • Additional physical parameters, i.e., α β γ.

Examples

julia
@hybrid MyModel α β γ
@hybrid FluidModel (:viscosity, :density)
@hybrid SimpleModel :a :b

Hence, after specifying the physical parameters, you only need to describe how this model operates.

Definition

Model definition: how the model actually operates.

julia
function (hm::YourHybridModelName)(dataset_keyed, ps, st)
  # data selection
  p = dataset_keyed(hm.predictors)
  x = Array(dataset_keyed(hm.forcing))
  
  # output from Neural network application 
  β, st = LuxCore.apply(hm.NN, p, ps.ps, st.st) 

  # equation! this is where you should focus, your model!
  ŷ = β .* ps.α .^ (0.1f0 * (x .- 15.0f0))

  return (; ŷ), (; β, st) # always output predictions and states as two tuples
end
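
The equation in this example resembles a Q10-style temperature response: the NN output β scales the learnable base ps.α raised to 0.1 * (x - 15). A stand-alone numerical sketch of just that physical part, using plain Float32 arrays and made-up values (no NN, no dataset machinery):

julia
# Physical part only, on plain Float32 arrays (values are illustrative):
q10_response(β, α, temp) = β .* α .^ (0.1f0 .* (temp .- 15.0f0))

β    = Float32[1.2, 0.8]    # stand-ins for NN outputs
α    = 2.0f0                # physical parameter (Q10-like base)
temp = Float32[10.0, 25.0]  # forcing, e.g. temperature
q10_response(β, α, temp)    # element-wise β * α^(0.1 * (temp - 15))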

Loss function

We provide a generic loss function; if you need further adjustments, define a specific one for your hybrid model (a sketch follows the generic implementation below).

Define a custom loss

What is your approach to optimization?

julia
function lossfn(HM::YourHybridModelName, ds_p, (ds_t, ds_t_nan), ps, st)
  targets = HM.targets
  ŷ, _ = HM(ds_p, ps, st)
  y = ds_t(targets)
  y_nan = ds_t_nan(targets)

  loss = 0.0
  for k in targets
      loss += mean(abs2, (ŷ[k][y_nan(k)] .- y(k)[y_nan(k)]))
  end
  return loss
end
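
As an example of such an adjustment, here is a hedged sketch of a custom loss with the same signature that computes a per-target RMSE instead of the summed squared error (the name lossfn_rmse is illustrative):

julia
using Statistics: mean

function lossfn_rmse(HM::YourHybridModelName, ds_p, (ds_t, ds_t_nan), ps, st)
    targets = HM.targets
    ŷ, _ = HM(ds_p, ps, st)
    y = ds_t(targets)
    y_nan = ds_t_nan(targets)

    loss = 0.0f0
    for k in targets
        # same NaN-masking pattern as the generic loss, but per-target RMSE
        loss += sqrt(mean(abs2, ŷ[k][y_nan(k)] .- y(k)[y_nan(k)]))
    end
    return loss
end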