Automation Internals
This page tries to explain how the LMMS code handles automation patterns and controllers.
- Models are the base classes for all data represented by widgets like knobs, sliders, spinboxes...
- Models that can be connected to AutomationClips or Controllers are AutomatableModels.
- An AutomationClip contains a list of all connected AutomatableModels (internally named m_objects). AutomationClip is a TrackContentObject (TCO).
- An AutomationTrack contains the data for one automation track, represented as AutomationTrackView in the UI. This is basically a whole automation track row in the Song Editor.
- An AutomationTrack can contain multiple AutomationClips, represented as AutomationClipViews in the UI (see the sketch below).
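
For orientation, here is a minimal, stripped-down sketch of those relationships. It is not the real LMMS class hierarchy; only the class names and the member name m_objects are taken from the text above, everything else (containers, accessors) is simplified.

```cpp
// Stripped-down sketch of the ownership described above -- not the real LMMS classes.
#include <vector>

class AutomatableModel          // base class for knob/slider/spinbox data
{
public:
    float value() const { return m_value; }
private:
    float m_value = 0.0f;
};

class AutomationClip            // a TrackContentObject holding automation data
{
public:
    void addObject(AutomatableModel* model) { m_objects.push_back(model); }
    const std::vector<AutomatableModel*>& objects() const { return m_objects; }
private:
    std::vector<AutomatableModel*> m_objects;  // all connected models
};

class AutomationTrack           // one automation row in the Song Editor
{
public:
    AutomationClip& createClip() { m_clips.emplace_back(); return m_clips.back(); }
private:
    std::vector<AutomationClip> m_clips;       // a track holds many clips
};

int main()
{
    AutomatableModel knob;
    AutomationTrack track;
    track.createClip().addObject(&knob);       // connect a model to a clip
    return 0;
}
```
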
Controller automation is simple: the Models of the widgets have the controller linked via pointers. A call to AutomatableModel::value() returns

- the Controller's value (recalculated to fit the Model using controllerValue), or else
- the non-automated m_value (see the sketch below).
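
A rough sketch of that lookup order, assuming a toy Controller with a single normalized output; the real AutomatableModel::value() and controllerValue() also deal with scale types, linked models and per-frame offsets, which are omitted here.

```cpp
// Sketch of the value lookup order described above (simplified, not the real code).
#include <cstdio>

class Controller
{
public:
    float currentValue() const { return 0.7f; }  // placeholder 0..1 controller output
};

class AutomatableModel
{
public:
    void setController(const Controller* c) { m_controller = c; }

    float value() const
    {
        if (m_controller)              // a controller is connected:
        {
            return controllerValue();  // use the (rescaled) controller value
        }
        return m_value;                // otherwise: the plain, non-automated value
    }

private:
    // Rescale the controller's 0..1 output into this model's min/max range.
    float controllerValue() const
    {
        return m_min + m_controller->currentValue() * (m_max - m_min);
    }

    const Controller* m_controller = nullptr;
    float m_value = 0.0f;
    float m_min = 0.0f;
    float m_max = 1.0f;
};

int main()
{
    Controller lfo;
    AutomatableModel knob;
    std::printf("%f\n", knob.value());  // 0.0 -- no controller connected yet
    knob.setController(&lfo);
    std::printf("%f\n", knob.value());  // 0.7 -- derived from the controller
    return 0;
}
```
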
Automation through patterns requires pre-calculation by the Song class, since these patterns can be recorded or may require interpolation.
- When an AutomationTrackView::dropEvent occurs,
  - the ID of the AutomatableModel is provided in the drop event,
  - the AutomationTrackView looks up that ID in the Engine::projectJournal to get the AutomatableModel,
  - the AutomationTrack creates a new AutomationClip,
  - the AutomatableModel gets added to the new AutomationClip,
  - the connection is now done (see the sketch below).
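
A compressed sketch of that sequence; a plain map stands in for the project journal, the handler name handleDrop is made up, and the real dropEvent carries more data than just the ID.

```cpp
// Sketch of the drop handling above. The map stands in for the project journal;
// handleDrop is a made-up stand-in for AutomationTrackView::dropEvent.
#include <cstdint>
#include <memory>
#include <unordered_map>
#include <vector>

struct AutomatableModel { /* ... */ };

struct AutomationClip
{
    std::vector<AutomatableModel*> m_objects;
    void addObject(AutomatableModel* m) { m_objects.push_back(m); }
};

struct AutomationTrack
{
    std::vector<std::unique_ptr<AutomationClip>> clips;
    AutomationClip* createClip()
    {
        clips.push_back(std::make_unique<AutomationClip>());
        return clips.back().get();
    }
};

void handleDrop(std::uint32_t droppedId,
                const std::unordered_map<std::uint32_t, AutomatableModel*>& journal,
                AutomationTrack& track)
{
    const auto it = journal.find(droppedId);    // 1. resolve the ID from the drop event
    if (it == journal.end()) { return; }        //    unknown ID: nothing to connect
    AutomationClip* clip = track.createClip();  // 2. the track creates a new clip
    clip->addObject(it->second);                // 3. the model is added -- connected
}

int main()
{
    AutomatableModel model;
    AutomationTrack track;
    const std::unordered_map<std::uint32_t, AutomatableModel*> journal{{42u, &model}};
    handleDrop(42u, journal, track);            // the model now lives in a fresh clip
    return 0;
}
```
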
- When a Song is played, Song::processNextBuffer calls Song::processAutomations, which
  - calls automatedValuesAt, i.e. collects all AutomatableModels at the current position into a QMap<AutomatableModel*, float> by using the AutomationClip members objects() and valueAt(),
  - calls AutomatableModel::setAutomatedValue for each pair of the QMap, i.e. assigns the floats to the AutomatableModels' m_value (see the sketch below).
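
The per-buffer pass can be pictured roughly like this; std::map stands in for the QMap<AutomatableModel*, float> mentioned above, and valueAt() returns a constant instead of doing real interpolation.

```cpp
// Sketch of the Song::processAutomations pass described above.
// std::map stands in for QMap, and valueAt() skips the real interpolation.
#include <map>
#include <vector>

struct AutomatableModel
{
    void setAutomatedValue(float v) { m_value = v; }
    float m_value = 0.0f;
};

struct AutomationClip
{
    std::vector<AutomatableModel*> m_objects;
    const std::vector<AutomatableModel*>& objects() const { return m_objects; }
    float valueAt(int /*position*/) const { return 0.5f; }  // placeholder value
};

// Collect the automated value of every connected model at the given position.
std::map<AutomatableModel*, float>
automatedValuesAt(const std::vector<AutomationClip*>& clips, int position)
{
    std::map<AutomatableModel*, float> values;
    for (const AutomationClip* clip : clips)
    {
        for (AutomatableModel* model : clip->objects())
        {
            values[model] = clip->valueAt(position);
        }
    }
    return values;
}

// Apply the collected values, one model at a time.
void processAutomations(const std::vector<AutomationClip*>& clips, int position)
{
    for (const auto& [model, value] : automatedValuesAt(clips, position))
    {
        model->setAutomatedValue(value);
    }
}

int main()
{
    AutomatableModel model;
    AutomationClip clip;
    clip.m_objects.push_back(&model);
    processAutomations({&clip}, 0);   // model.m_value is now 0.5
    return 0;
}
```
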
Automation is usually persisted by saving an XML node "automationpattern" (inside an "automationtrack" node), which stores all connected models as "object" nodes. Each such node specifies an "id" attribute. AutomatableModels also have such an "id" attribute, so they can be matched through the journalling system. The AutomatableModel itself is usually stored as a node carrying its name, or as a "quoted-model" node with a "quoted-name" attribute.
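
Schematically, the saved structure looks something like the fragment below. Only the node and attribute names mentioned above ("automationtrack", "automationpattern", "object", "id") are taken from the text; the remaining attributes and concrete values are purely illustrative and not copied from a real project file.

```xml
<!-- Illustrative only -- real project files carry many more attributes and value nodes. -->
<automationtrack name="Automation track">
  <automationpattern name="Volume">
    <object id="12345"/>  <!-- journalling ID of a connected AutomatableModel -->
    <object id="12346"/>
  </automationpattern>
</automationtrack>
```
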
When loading a save file, LMMS can load automation patterns and automatable models in arbitrary order. While a pattern's "object" nodes are loaded, the member AutomationClip::m_idsToResolve gets filled with the journalling IDs. After the full song has been loaded, AutomationClip::resolveAllIDs() adds the model pointers to the AutomationClip::m_objects array.
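
A minimal sketch of that two-phase load, with a plain ID-to-model map standing in for the journalling system: IDs are collected while the pattern's object nodes are parsed, and the pointers are filled in only once the whole song is there.

```cpp
// Sketch of the deferred ID resolution described above (not the real code;
// the registry map stands in for the journalling system).
#include <cstdint>
#include <unordered_map>
#include <vector>

struct AutomatableModel { /* ... */ };

struct AutomationClip
{
    std::vector<std::uint32_t> m_idsToResolve;  // filled while loading "object" nodes
    std::vector<AutomatableModel*> m_objects;   // filled after the whole song is loaded

    // Mirrors the role of AutomationClip::resolveAllIDs().
    void resolveIDs(const std::unordered_map<std::uint32_t, AutomatableModel*>& registry)
    {
        for (const std::uint32_t id : m_idsToResolve)
        {
            const auto it = registry.find(id);
            if (it != registry.end())
            {
                m_objects.push_back(it->second);  // ID -> model pointer
            }
        }
        m_idsToResolve.clear();
    }
};

int main()
{
    AutomatableModel model;
    const std::unordered_map<std::uint32_t, AutomatableModel*> registry{{7u, &model}};
    AutomationClip clip;
    clip.m_idsToResolve.push_back(7u);  // recorded while parsing an "object" node
    clip.resolveIDs(registry);          // m_objects now holds &model
    return 0;
}
```
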
Now, when AutomationClipView::paintEvent is called to draw a pattern view, the minimum and maximum values are fetched directly from the first object (i.e. the first AutomatableModel) of the AutomationClip.
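
In sketch form, assuming the clip exposes its first connected model through a firstObject()-style accessor and the min/max are plain floats (the real accessors are templated and the painting code does much more):

```cpp
// Tiny sketch: the view scales its drawing to the range of the first connected model.
#include <vector>

struct AutomatableModel
{
    float minValue() const { return 0.0f; }   // simplified; real accessors are templated
    float maxValue() const { return 1.0f; }
};

struct AutomationClip
{
    std::vector<AutomatableModel*> m_objects;
    const AutomatableModel* firstObject() const { return m_objects.front(); }
};

// Map an automation value into pixel space for drawing (illustrative only).
float valueToY(const AutomationClip& clip, float value, float viewHeight)
{
    const float min = clip.firstObject()->minValue();
    const float max = clip.firstObject()->maxValue();
    return viewHeight * (1.0f - (value - min) / (max - min));
}

int main()
{
    AutomatableModel model;
    AutomationClip clip;
    clip.m_objects.push_back(&model);
    const float y = valueToY(clip, 0.25f, 100.0f);  // -> 75.0
    (void)y;
    return 0;
}
```
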
See also https://docs.lmms.io/developer-guides/core/controlling-playback