# Posts tagged ‘real number’

Thanks go to Daniel Pook Kolb.

Given a weight space of real numbers where each weight is a dimension, we can embed any data in this space, be it an input or a pure value. By breaking the direct connection between a weight and its data (a shape), we can have multiple weights drive a single input.

For example, suppose we have a space of two weights, W1 and W2. When W1 is at 1.0 we can drive a face pose/blendshape – let's call this shape S1. Likewise, when W2 is at 1.0 we can drive another shape, S2.
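A minimal sketch of the independent case above, where each weight linearly drives its own shape. The shape deltas here (`s1_delta`, `s2_delta`) are hypothetical offset vectors from the neutral face, not values from any particular rigging API.

```python
def blend(neutral, s1_delta, s2_delta, w1, w2):
    """Apply each shape's delta to the neutral pose, scaled by its weight."""
    return [n + w1 * d1 + w2 * d2
            for n, d1, d2 in zip(neutral, s1_delta, s2_delta)]

neutral = [0.0, 0.0, 0.0]
s1 = [1.0, 0.0, 0.0]   # shape S1 offsets the first component
s2 = [0.0, 1.0, 0.0]   # shape S2 offsets the second component

print(blend(neutral, s1, s2, 1.0, 0.0))  # S1 fully on -> [1.0, 0.0, 0.0]
print(blend(neutral, s1, s2, 1.0, 1.0))  # both on -> [1.0, 1.0, 0.0]
```

With only this linear blend, turning both weights on simply sums the two deltas – which is exactly the gap the combined entry below is meant to fill.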

Treating this as a two-dimensional space, we can place entries anywhere inside it. For example, by combining W1 and W2 (W1 * W2) we can place an entry (a shape) at that point in the space.

Manipulating the weights drives this entry in a non-linear way. For example, when W1 is at 1.0 and W2 is at 1.0, W1 * W2 puts the entry at 1.0. Likewise, when W1 is at 1.0 but W2 is at 0.0, W1 * W2 puts the entry at 0.0.

Greater values of the input weights cause greater influence of the entry at their combination, and this influence grows non-linearly:

| W1   | W2   | Entry (W1 * W2) |
|------|------|-----------------|
| 0.00 | 0.00 | 0.00            |
| 0.25 | 0.25 | 0.06            |
| 0.50 | 0.50 | 0.25            |
| 0.75 | 0.75 | 0.56            |
| 1.00 | 1.00 | 1.00            |
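The values above come straight from the product rule, which can be reproduced in a couple of lines:

```python
# The combined entry's weight is simply the product W1 * W2,
# evaluated here at the same sample points as the table above.
for w in [0.0, 0.25, 0.5, 0.75, 1.0]:
    print(f"W1 {w:.2f}  W2 {w:.2f}  =  Entry {w * w:.2f}")
```

Note that 0.25 * 0.25 = 0.0625 and 0.75 * 0.75 = 0.5625; the table rounds them to two decimal places.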

If we treat this as a corrective system, the entry at the combination is added on top of the entries at W1 and W2. Alternatively, the combined entry can act as an absolute pose if the W1 and W2 entries are driven negatively (subtracted out) while the new entry is driven positively.
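A sketch of the corrective (additive) reading, under the same assumptions as before: `corr` is a hypothetical corrective delta sculpted for the combined pose, and its weight is the product of the two driving weights.

```python
def apply_corrective(neutral, s1_delta, s2_delta, corr_delta, w1, w2):
    """Add the W1 and W2 entries, plus a corrective driven by W1 * W2."""
    w12 = w1 * w2  # combination weight drives the corrective entry
    return [n + w1 * a + w2 * b + w12 * c
            for n, a, b, c in zip(neutral, s1_delta, s2_delta, corr_delta)]

# With both weights at 1.0 the corrective is fully on; with either at 0.0
# it contributes nothing and only the linear entries remain.
print(apply_corrective([0.0], [1.0], [1.0], [0.5], 1.0, 1.0))  # [2.5]
print(apply_corrective([0.0], [1.0], [1.0], [0.5], 1.0, 0.0))  # [1.0]
```

The absolute-pose variant would instead subtract the W1 and W2 deltas by the same product weight while adding the new entry, so that at the corner of the space only the sculpted combined pose remains.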

The weight space can in theory have as many dimensions as needed, with as many entries placed inside it as we like (provided they lie inside the 0 to 1 range).
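The two-weight product generalises directly: a sketch of an entry placed at a combination of N weights, each clamped to the 0 to 1 range as the text requires. The function name is illustrative, not from any rigging toolkit.

```python
import math

def entry_weight(weights):
    """Drive an entry at the combination of N weights: clamp each to
    [0, 1], then take the product."""
    clamped = [min(max(w, 0.0), 1.0) for w in weights]
    return math.prod(clamped)

print(entry_weight([1.0, 1.0, 1.0]))  # 1.0 -> entry fully on
print(entry_weight([0.5, 0.5]))       # 0.25, matching the table earlier
print(entry_weight([1.0, 0.0, 1.0]))  # 0.0 -> any zero weight kills it
```

One property of the product falls out immediately: the more dimensions an entry combines, the faster its influence falls off away from the corner, since each extra factor below 1.0 shrinks the result.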