Logic Pro 9 Sculpture fader mapping problem

Hi,
I'm finding I/O fader control mismatches in Sculpture.

For example: if I monitor the "Material" fader output, I get "F2 59 xx" and "F2 60 yy".
If I create sliders with those fader numbers (59 & 60), the "Material" joystick does not respond.
"F2 59" produces no apparent control, and "F2 60" moves the "Pick Up 1" control instead.
Pick Up 1 (which responds to F2 60) generates "F2 63", so I tried offsetting the other control by minus 3 to get "56" (instead of 59) ... but no joy.
The problem is not confined to these two parameters; it affects all the other parameters as well.

Is it a known bug?
Has anybody correctly mapped all the Sculpture parameters?

Thanx for any help!
 
If I monitor the "material" fader output I have "F2 59 xx" and "F2 60 yy".
If I create sliders with such fader controls (59 & 60), the "Material" joystick does not respond
If you don't want to use the Controller Assignments for whatever reason but want to control plug-ins directly in the Environment, you need Fader messages. This is a Logic-internal message type, similar to CC but one that works only inside Logic. You have to set the output of your controlling object to "Fader".

When the monitor tells you
F 2 59 (value)
then "F" means "Fader", "2" is the channel and "59" is the controller number.

The channel is determined by the slot number where the plug-in sits. Instrument channel strips have different slot numbers than the other channel strip types.
 
you need Fader messages. This is a Logic internal message type, similar to CC but works only inside Logic. You have to set the output of your controlling object to "Fader".

Very often, controlling plug-ins directly via Fader events does not work as expected, especially with plug-ins that feature more than 128 parameters (e.g. Sculpture or Ultrabeat). The Fader event number you can see at the output of a channel strip object is not necessarily the actual Fader event number controlling the plug-in parameter internally. Furthermore, you can only create Fader events up to number 127 in the Environment, but plug-in parameters can have vastly higher numbers (up to several thousand).

As Peter said: Fader events are an internal data type, and I would strongly recommend using controller assignments instead of dealing with manually created Fader events in the Environment.

Best...

Manfred
 
Hi Peter and Manfred!

When the monitor tells you F 2 59 (value) then "F" means "Fader", "2" is the channel and "59" is the controller number.

I created the fader objects following these rules.

Very often controlling plug-ins directly via Fader events does not work as expected, especially not with plug-ins that feature more than 128 parameters (e.g. Sculpture or Ultrabeat).

Exactly what I found...
I am able to create faders for some other instruments (I didn't try them all) but failed with Sculpture.

I found that Sculpture "lies".
As in my original post: the y axis of the "Material" pad generates "F 2 60 vv" but does not respond to the same message generated by a Logic object (fader or joystick).
That message affects the "Pick Up 1" position instead.

If a picture is worth a thousand words a movie ... ;-)
http://www.riccardoballerini.com/riccardorballerini/temporary.html

My intention was to recreate (with a joystick object) the "Material" behaviour of sculpture and to control the joystick object with the Korg Wavestation's real joystick.

Thank you!!

Riccardo
 
Very often controlling plug-ins directly via Fader events does not work as expected, especially not with plug-ins that feature more than 128 parameters (e.g. Sculpture or Ultrabeat).

Exactly what I found...
I am able to create faders for some other instruments (I didn't try them all) but failed with Sculpture.

I found that Sculpture "lies".
As the original post: the y axis of the "Material" pad generates "F 2 60 vv" but does not respond to the same message generated by a Logic object (fader or joystick).

Well, Sculpture tells the truth, but the Environment is not able to display it properly (it was never meant to). As I said: the Environment can only create/process Fader events with 7-bit numbers, meaning numbers between 0 and 127. I assume (though I don't know for sure) that Logic internally uses (at least) a 14-bit format for parameter numbering, which allows parameter numbers from 0 to 16383.
(Note: the parameter resolution can be even higher; Logic can write automation with a resolution of up to 32-bit float. The actual resolution depends on the respective parameter.)

Now, MIDI (and similar) implementations split the 14-bit data range into two 7-bit messages, called MSB (most significant byte) and LSB (least significant byte). You can think of 14 bits in MSB/LSB form as 128 data blocks with 128 values each: the first byte names the data block, the second byte names the actual value inside that block. The LSB counts from 0-127 repeatedly, while the MSB increments by 1 with every full pass of the LSB. Just think of it like a mileage counter. :)
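The MSB/LSB split described above can be sketched in a few lines (a minimal illustration, assuming the standard MIDI-style 7-bit split; the function names are mine, not Logic's):

```python
def split_14bit(value):
    """Split a 14-bit number (0-16383) into its MSB/LSB pair."""
    msb = value >> 7      # which of the 128 data blocks (0-127)
    lsb = value & 0x7F    # position inside that block (0-127)
    return msb, lsb

def join_14bit(msb, lsb):
    """Recombine an MSB/LSB pair into the full 14-bit number."""
    return (msb << 7) | lsb
```

For example, `split_14bit(443)` gives `(3, 59)`: block 3, value 59, exactly like the mileage-counter digits.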

Now, if a device (or in this case the Environment) cannot read the full 14-bit message, it will respond only to the last byte, the LSB. That means you have the LSB value, but you don't know which of the possible 128 MSB data blocks it comes from. Is it from the lower part of the 14-bit value range, or from the upper? Without the MSB you simply don't know, because a single LSB value can come from anywhere in the possible range of 16384 values.
Or, to take the mileage counter again: think of a 20-year-old car whose 5-digit counter shows 20,000 miles. Most likely the car has actually run 120,000 miles, maybe even 220,000 or more. But you cannot say for sure, because the leftmost digit for the 100,000s of miles is missing.

And I guess this is exactly what happens with Sculpture. If you record automation for the Material pad, you'll see in the automation event list that the actual Fader event numbers are not 59 and 60, but 443 and 444. Now subtract 128 (one MSB block) as many times as possible; that is 3 × 128 (= 384) in this case. The remainder (the LSB value, the rightmost digits of the counter) is exactly 59 and 60. There you go...
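The "subtract 128 as many times as possible" counting above is just integer division and remainder; a quick check with the numbers from this example:

```python
# Full parameter numbers as they would appear in Sculpture's
# automation event list (per the example above)
for full_number in (443, 444):
    msb = full_number // 128   # how many full blocks of 128 fit
    lsb = full_number % 128    # the remainder the Environment monitor shows
    print(f"{full_number} = {msb} * 128 + {lsb}")
# 443 = 3 * 128 + 59
# 444 = 3 * 128 + 60
```

So the monitor's "59" and "60" are only the low bytes of 443 and 444, which is why a plain 7-bit Fader object can never reach the right parameter.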

My intention was to recreate (with a joystick object) the "Material" behaviour of sculpture and to control the joystick object with the Korg Wavestation's real joystick.

Better to use the Wavestation joystick with controller assignments directly. What's the point of using a vector fader object in the Environment just to mirror the Wavestation's joystick movement?

Best...

Manfred
 
Riccardo A. Ballerini said:
If a picture is worth a thousand words a movie ... ;-)
http://www.riccardoballerini.com/ric...temporary.html

Problem #2 ...
... is visible in your movie. If you watch the monitor object while you operate Sculpture's Material pad, you'll see that it goes from 0-100, while your fader is 0-127. Same with the pick position of the string that you operate with the fader: it is fully to the right while your fader is only around 100 or a bit higher. Softsynths have different ranges (at least the Logic synths do), and you have to scale each fader. This means you cannot take just any fader and use it on different parameters: you would need to switch not only the controller numbers but also the fader scaling. The Controller Assignments do the scaling for you, even if you assign a fader to different plug-ins.
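The scaling problem is just a linear rescale; a minimal sketch (my own helper, not anything built into Logic) of mapping a 0-127 fader onto a 0-100 parameter:

```python
def scale_fader(value, in_max=127, out_max=100):
    """Linearly rescale a fader value (0..in_max) to a
    parameter's range (0..out_max), rounding to an integer."""
    return round(value * out_max / in_max)
```

With this, a fader at full throw (127) lands on 100 instead of overshooting, and every parameter with a different range just needs its own `out_max`.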

Problem #3 ...
... if you move a fader somewhere, then switch the plug-in or parameter and move the fader again, it is in another position. When you return to the first parameter and move the fader, the synth suddenly jumps to the current fader value. The Controller Assignments provide something that can be called "soft takeover": the parameter does not change until the fader comes very close to the parameter's current value.
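The soft-takeover idea can be sketched like this (a simplified model of the behaviour described above, not Logic's actual implementation; the class name and threshold are my own choices):

```python
class SoftTakeover:
    """Ignore incoming fader values until the fader comes close
    to the parameter's current value; from then on, follow it."""

    def __init__(self, current, threshold=3):
        self.current = current        # parameter's current value
        self.threshold = threshold    # how close counts as "picked up"
        self.engaged = False

    def move(self, fader_value):
        # Engage once the fader crosses into the pickup window
        if not self.engaged and abs(fader_value - self.current) <= self.threshold:
            self.engaged = True
        if self.engaged:
            self.current = fader_value
        return self.current
```

Until the fader is "picked up", `move()` keeps returning the old parameter value, so switching parameters never causes a sudden jump.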

Both problems can be handled in the Environment, and if you manage to work out the MSB/LSB thing that Manfred explained, you might be able to control more parameters. Switching faders and their scaling can be fun, but I can tell you that programming soft takeover for many faders in the Environment is no fun at all. And I haven't told you about problem #4 yet: recording automation. This is tricky when you operate software faders from hardware; normally they do not write automation that way ...

Many of us have been there. And a lot of people died in the Environment, because it is so dark and there is no water. That's why Apple cannot update it: there are too many dead bodies lying around.

But I would not say that your work doesn't make sense. It is always good to know something about the Environment; the more, the better. Not only does it make you sensitive to MIDI processing, you can also use your knowledge whenever you need to filter, translate, or switch MIDI streams for a certain instrument or effect. Mapping a foot pedal from Volume to Aftertouch, additionally adding a pitch stream, giving it an expression curve and a switch to raise a filter above a certain level is a breeze if you know your stuff. And it is definitely fun to play for an hour with this electronic Lego and get a completely new sound.
 
Thank you for having confirmed my hypotheses! It is a real pleasure to have the chance to speak to people who are so competent and generous with their time. Again, thank you very very much!
 