lucid.clip¶
- lucid.clip(a: Tensor, /, min_value: int | float | complex | None = None, max_value: int | float | complex | None = None) → Tensor¶
The clip function limits the values in the input tensor to a specified range element-wise.
Function Signature¶
def clip(
    a: Tensor,
    min_value: _Scalar | None = None,
    max_value: _Scalar | None = None,
) -> Tensor
Parameters¶
- a (Tensor): The input tensor whose values are to be clipped.
- min_value (_Scalar | None, optional): The lower bound for clipping. If None (default), the minimum of a is used.
- max_value (_Scalar | None, optional): The upper bound for clipping. If None (default), the maximum of a is used.
Returns¶
- Tensor:
A new tensor where each element is clipped to the range \([\text{min\_value}, \text{max\_value}]\). If a requires gradients, the resulting tensor will also require gradients.
Forward Calculation¶
The forward calculation for clip is defined as:

\[
\text{out}_i = \min\left(\max\left(a_i,\ \text{min\_value}\right),\ \text{max\_value}\right)
\]
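This rule can be sketched in plain NumPy (clip_forward is a hypothetical helper for illustration, not lucid's internal code); note how a None bound falls back to the tensor's own extremum, matching the parameter defaults above:

import numpy as np

def clip_forward(a: np.ndarray, min_value=None, max_value=None) -> np.ndarray:
    # A None bound defaults to the array's own extremum, so that side
    # of the range is effectively left unclipped.
    lo = a.min() if min_value is None else min_value
    hi = a.max() if max_value is None else max_value
    return np.minimum(np.maximum(a, lo), hi)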
Backward Gradient Calculation¶
The gradient of the clip function with respect to the input tensor a is:

\[
\frac{\partial \text{out}_i}{\partial a_i} =
\begin{cases}
1 & \text{if } \text{min\_value} \leq a_i \leq \text{max\_value} \\
0 & \text{otherwise}
\end{cases}
\]
This means the gradient is non-zero only for elements within the clipping range.
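As a sketch of this rule (clip_backward is again a hypothetical NumPy helper, not lucid's actual backward pass), the upstream gradient is simply masked:

def clip_backward(a, grad_out, min_value=None, max_value=None):
    # Pass the gradient through where a lies within [min_value, max_value];
    # zero it where the forward pass clipped the value.
    lo = a.min() if min_value is None else min_value
    hi = a.max() if max_value is None else max_value
    return grad_out * ((a >= lo) & (a <= hi))

For example, clip_backward(np.array([-1, 0, 1, 2, 3]), np.ones(5), 0, 2) returns array([0., 1., 1., 1., 0.]).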
Example¶
>>> import lucid
>>> a = lucid.Tensor([-1, 0, 1, 2, 3], requires_grad=True)
>>> out = lucid.clip(a, min_value=0, max_value=2)
>>> print(out)
Tensor([0 0 1 2 2], grad=None)
The function supports tensors of arbitrary shapes:
>>> import lucid
>>> a = lucid.Tensor([[-1, 0, 3], [2, -2, 4]], requires_grad=True)
>>> out = lucid.clip(a, min_value=0, max_value=3)
>>> print(out)
Tensor([[0 0 3]
        [2 0 3]], grad=None)
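Applying the hypothetical clip_backward sketch from above with a unit upstream gradient reproduces the masking behavior on this example:

>>> clip_backward(np.array([[-1, 0, 3], [2, -2, 4]]), np.ones((2, 3)), 0, 3)
array([[0., 1., 1.],
       [1., 0., 0.]])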