# HDR Image Assembly

We have several LDR images with different exposures, and we need to combine them into a single HDR image.

Let the LDR images be numbered
$$j=1..n$$
and let the exposure time of image j be
$$\Delta t_j$$

Let the response of a point on the sensor element be the exposure X. We rely on the principle of reciprocity, a physical property of electronic imaging systems; based on it, the exposure can be defined as:  $$X = E \Delta t \\ E \text{ - illuminance,}\: \Delta t \text{ - exposure time}$$

Unfortunately, the pixel value in the photo is not equal to X. Let us denote:
$$\text{the value of pixel}\: i\: \text{in image}\: j = Z_{ij}$$
then
$$Z_{ij} = f(X_{ij}) = f(E_i\Delta t_j)$$
f – the camera response function (0) – converts the exposure to a pixel value

We can recover this function, or it can be assumed to match the sRGB standard gamma correction curve with
$$\gamma = 2.2$$
In this case (when we know this function)
$$f^{-1}(Z_{ij}) = E_i\Delta t_j$$
then
$$E_i = \frac{f^{-1}(Z_{ij})}{\Delta t_j}$$
At this point we run into a problem: it is impossible to restore luminance from a single image, because information is lost in underexposed and overexposed pixels.
In other words, we need to combine information from all images with different weights.
$$w(Z)\: -\: \text{function used to attenuate the contribution of poorly exposed pixels}$$
$$E_i = \frac{\sum\limits_{j=1}^n \frac{w(Z_{ij})\, f^{-1}(Z_{ij})}{\Delta t_j}}{\sum\limits_{j=1}^n w(Z_{ij})}$$

We assume that the LDR images are captured in sRGB, so we can use the standard luminance computation which I described earlier(1).
Also
$$Z_{ij} \in [0;1]$$

```c
// Weight function: attenuates the contribution of pixels near the
// clipped extremes (0 and 1) and peaks at mid-gray (lum = 0.5).
float weight1(float lum)
{
    float res = 1.0 - pow((2.0 * lum - 1.0), 12.0);
    return res;
}
```