Hello, for me Usine/Hollyhock is the tool which moved me from linear studio work in standard DAWs to "live mode".
Lullabies, the album I released (
http://exitab.exitmusic.org/album/lulla ... la-extb067 ), was recorded live. First I prepared a workspace for each track, then I exported the racks and organised them into one big workspace, using the grid to activate/deactivate the racks for each track. A lot of it is based on the euclidean sequencer and on some simple principles of detuned oscillators and noise generators. "Live" in this case means that I am controlling the parameters of these modules in real time, shaping the sound design of each track with volume faders, frequencies etc. If you want to look at it, I have all the workspaces online on GitHub:
https://github.com/drakh/lullabies
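If the euclidean idea is new to you: the principle is just spreading k hits as evenly as possible over n steps, one pattern per drum or bass lane. A rough Python sketch of it (just an illustration, not anything from my workspaces):

[code]
def euclidean_pattern(pulses, steps):
    """Spread `pulses` onsets as evenly as possible over `steps` slots."""
    pattern = []
    bucket = 0
    for _ in range(steps):
        bucket += pulses
        if bucket >= steps:
            bucket -= steps
            pattern.append(1)  # onset
        else:
            pattern.append(0)  # rest
    return pattern

# E(3, 8) gives a rotation of the classic tresillo rhythm
print(euclidean_pattern(3, 8))  # [0, 0, 1, 0, 0, 1, 0, 1]
[/code]

Changing the pulses/steps numbers per lane while it runs is essentially what tweaking a euclidean sequencer live boils down to.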
Recently I prepared a workspace for a live techno performance where I use lots of CircleSeq euclidean sequencers for drums and basslines. The basslines and harmonies are also controlled by the "Harmony Improvisator" VST, and all the needed GUI elements are organised in this monstrous workspace in the interface builder.

I need to map MIDI controllers to it.
Here is a test recording from it, made by controlling the whole thing with just the mouse:
https://soundcloud.com/ark-hd/techno-homework
Another live improvisation was a collaboration with Slavo Krekovic (he was using Max/MSP); it was a reinterpretation of Ladislav Kupkovic's composition Pismena (Letters) (
https://www.mixcloud.com/festivalzvukov ... ts-z-dz-s/ ). My part was based on some oscillators running through formant filters, some granulators (native Hollyhock modules as well as external VSTs) and some speech synthesis VSTs, all controlled from Hollyhock.
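Just to illustrate the formant-filter part (a rough Python sketch, not the actual Hollyhock modules or VSTs, and the formant frequencies are just approximate "ah" values): running a source through a few parallel band-pass filters centred on vowel formant frequencies already gives a vowel-like colour.

[code]
import math

SR = 44100  # sample rate (Hz), assumed

def bandpass_coeffs(f0, q):
    """RBJ-cookbook band-pass biquad coefficients (constant skirt gain)."""
    w = 2 * math.pi * f0 / SR
    alpha = math.sin(w) / (2 * q)
    b = [alpha, 0.0, -alpha]
    a = [1 + alpha, -2 * math.cos(w), 1 - alpha]
    return [x / a[0] for x in b], [x / a[0] for x in a]

def formant_filter(x, formants=(700.0, 1220.0, 2600.0), q=8.0):
    """Run the input through parallel band-passes at rough 'ah' vowel formants."""
    out = [0.0] * len(x)
    for f0 in formants:
        b, a = bandpass_coeffs(f0, q)
        x1 = x2 = y1 = y2 = 0.0
        for n, xn in enumerate(x):
            yn = b[0] * xn + b[1] * x1 + b[2] * x2 - a[1] * y1 - a[2] * y2
            x2, x1 = x1, xn
            y2, y1 = y1, yn
            out[n] += yn
    return out

# e.g. feed it a naive sawtooth oscillator and you get a static vowel colour
saw = [2 * ((110.0 * n / SR) % 1.0) - 1 for n in range(SR)]
vowel = formant_filter(saw)
[/code]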
So, in short: basically I am using Hollyhock as my customised sequencer, feeding VSTis with the different sequencers I made for it (CircleSeq, Neuron, ARGrid, FugueMachine), and sometimes using some simple principles to make drones, controlling it all with lots of MIDI controllers.
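And to show what I mean by "simple principles" for the drones (again just a rough sketch, not one of my patches; the base frequency and detune amounts are arbitrary): a few slightly detuned oscillators summed with a little noise already give a thick, slowly beating drone, and live it is mostly about riding the detune amounts, levels and filter frequencies.

[code]
import math
import random

SR = 44100  # sample rate (Hz), assumed

def drone(base_hz=110.0, detune_cents=(-7.0, 0.0, 5.0, 12.0),
          noise_level=0.02, seconds=2.0):
    """Sum a few slightly detuned sine oscillators plus a quiet noise generator."""
    freqs = [base_hz * 2 ** (c / 1200.0) for c in detune_cents]  # cents -> Hz
    out = []
    for n in range(int(SR * seconds)):
        t = n / SR
        s = sum(math.sin(2 * math.pi * f * t) for f in freqs) / len(freqs)
        s += noise_level * (2 * random.random() - 1)  # white-ish noise on top
        out.append(s)
    return out

samples = drone()  # the beating between the detuned partials is the movement of the drone
[/code]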