
Remove leaky custom_gradient wrapper for softplus (fixes #2008)#2022

Open
pranithreddym wants to merge 2 commits into tensorflow:main from pranithreddym:fix/softplus-memory-leak

Conversation

@pranithreddym


The _stable_grad_softplus custom_gradient closure captured the input
tensor x, which ended up in TF's gradient registry (never cleared),
causing a memory leak on every call. The same issue affected sigmoid.py,
which had an identical copy of the wrapper.

tf.nn.softplus has been numerically stable since 2019-2020:
- Uses log1p via tensorflow/tensorflow@ce92fd6ae5
- Gradient uses sigmoid via tensorflow/tensorflow@862d326bf
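The log1p formulation referenced above can be sketched in plain numpy to show why the naive form overflows while the stable one does not (a minimal illustration of the technique, not TF's actual kernel code):

```python
import numpy as np

def naive_softplus(x):
    # Overflows for large x: np.exp(1000.0) is inf, so the result is inf.
    return np.log(1.0 + np.exp(x))

def stable_softplus(x):
    # The log1p form: softplus(x) = max(x, 0) + log1p(exp(-|x|)).
    # exp(-|x|) is always <= 1, so nothing can overflow.
    return np.maximum(x, 0.0) + np.log1p(np.exp(-np.abs(x)))

x = np.array([-1000.0, 0.0, 1000.0])
with np.errstate(over="ignore"):
    print(naive_softplus(x))   # [0., 0.6931..., inf]
print(stable_softplus(x))      # [0., 0.6931..., 1000.]
```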

The JAX mode already used tf.nn.softplus directly; this change makes the
non-JAX path consistent and removes the TODO(b/155501444) workaround.

Affected files:
- bijectors/softplus.py: replace if/else block with _stable_grad_softplus = tf.nn.softplus
- bijectors/sigmoid.py: same removal of the duplicate custom_gradient wrapper
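In code, the change amounts to something like the following (a hypothetical sketch of the removed pattern, not the exact TFP source; the real wrapper also carried its own stable formulation):

```python
import tensorflow as tf

# Before (sketch): a custom_gradient wrapper whose grad fn closes over x.
# Per the issue, these closures accumulated in TF's gradient registry.
def _leaky_softplus(x):
    @tf.custom_gradient
    def _softplus(x):
        def grad(dy):
            return dy * tf.sigmoid(x)  # closure captures x
        return tf.nn.softplus(x), grad
    return _softplus(x)

# After: tf.nn.softplus is already numerically stable and its gradient
# already uses sigmoid, so the module-level name becomes a plain alias.
_stable_grad_softplus = tf.nn.softplus

x = tf.constant(0.0)
with tf.GradientTape() as tape:
    tape.watch(x)
    y = _stable_grad_softplus(x)
g = tape.gradient(y, x)
print(float(y), float(g))  # softplus(0) = ln 2, grad = sigmoid(0) = 0.5
```

Since the gradient of softplus is exactly sigmoid, the built-in op's registered gradient matches what the custom wrapper computed, with no per-call closure left behind.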
@google-cla

google-cla Bot commented May 11, 2026

Thanks for your pull request! It looks like this may be your first contribution to a Google open source project. Before we can look at your pull request, you'll need to sign a Contributor License Agreement (CLA).

View this failed invocation of the CLA check for more information.

For the most up to date status, view the checks section at the bottom of the pull request.

