lucid.cumsum¶
The cumsum function computes the cumulative sum (inclusive prefix sum) of elements in the input tensor along a specified axis.
Function Signature¶
def cumsum(a: Tensor, axis: int = -1) -> Tensor
Parameters¶
a (Tensor): The input tensor whose cumulative sums are to be computed.
axis (int, optional): The axis along which to perform the cumulative sum. Defaults to the last axis (\(-1\)).
Returns¶
Tensor: A tensor of the same shape as a, where each element at position \(i\) along the specified axis is the sum of all elements from the start of that axis up to and including \(i\). If a requires gradients, the returned tensor will also require gradients.
Forward Calculation¶
For each index \(k\) along the chosen axis:

\[\mathrm{out}_k = \sum_{j=0}^{k} a_j\]

where \(a_j\) are the elements of a along the specified axis.
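As an illustrative sketch (not the library's actual implementation), the forward pass behaves like NumPy's np.cumsum, assuming lucid tensors wrap array data:

```python
import numpy as np

def cumsum_forward(a: np.ndarray, axis: int = -1) -> np.ndarray:
    # Inclusive prefix sum along `axis`: out[k] = a[0] + a[1] + ... + a[k].
    return np.cumsum(a, axis=axis)

out = cumsum_forward(np.array([1, 2, 3, 4]), axis=0)
# out == [1, 3, 6, 10]
```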
Backward Gradient Calculation¶
The Jacobian of the cumulative-sum operation is lower-triangular: \(\partial \mathrm{out}_k / \partial a_i = 1\) for \(i \le k\) and \(0\) otherwise. Hence, for an upstream gradient \(\nabla \mathrm{out}\), the gradient w.r.t. each input element \(a_i\) is

\[(\nabla a)_i = \sum_{k \ge i} (\nabla \mathrm{out})_k\]

i.e. the reverse (suffix) cumulative sum of the upstream gradient along the same axis.
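A minimal sketch of this backward rule, using NumPy as a stand-in for the library's internals: the reverse cumulative sum can be implemented by flipping, taking a prefix sum, and flipping back.

```python
import numpy as np

def cumsum_backward(grad_out: np.ndarray, axis: int = -1) -> np.ndarray:
    # grad_a[i] = sum of grad_out[k] for all k >= i along `axis`:
    # a reverse (suffix) cumulative sum, realized via flip -> cumsum -> flip.
    flipped = np.flip(grad_out, axis=axis)
    return np.flip(np.cumsum(flipped, axis=axis), axis=axis)

# With an all-ones upstream gradient (as produced by out.sum().backward()):
grad_a = cumsum_backward(np.ones(4), axis=0)
# grad_a == [4., 3., 2., 1.]
```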
Example¶
>>> import lucid
>>> a = lucid.Tensor([1, 2, 3, 4], requires_grad=True)
>>> out = lucid.cumsum(a, axis=0)
>>> print(out)
Tensor([ 1 3 6 10], grad=None)
>>> s = out.sum()
>>> s.backward()
>>> print(a.grad)
[4., 3., 2., 1.]
Note
When axis is negative, it counts from the last dimension (e.g., \(axis=-1\) refers to the final axis of a).
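To illustrate the negative-axis convention (shown here with NumPy, whose axis semantics lucid is assumed to follow), \(axis=-1\) on a 2-D input is equivalent to \(axis=1\):

```python
import numpy as np

x = np.array([[1, 2, 3],
              [4, 5, 6]])

# axis=-1 selects the last axis (the columns of each row here),
# which for a 2-D array is the same as axis=1.
assert np.array_equal(np.cumsum(x, axis=-1), np.cumsum(x, axis=1))
# np.cumsum(x, axis=-1) == [[1, 3, 6], [4, 9, 15]]
```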