Commit
prepare for differentiable topk use for another lib
lucidrains committed Aug 18, 2023
1 parent 8648bed commit 860081f
Showing 2 changed files with 8 additions and 2 deletions.
8 changes: 7 additions & 1 deletion colt5_attention/topk.py
@@ -13,11 +13,17 @@ def topk(
     eps_init = None,
     eps_decay = 1.,
     mask = None,
-    fused = False
+    fused = False,
+    non_differentiable = False
 ):
     """
     differentiable top-k on last dimension
     """
+
+    if non_differentiable:
+        values, indices = torch.topk(x, k = k, dim = -1)
+        return TopkReturn(values, indices, None, None)
+
     assert coor_descent_k_ratio >= 1.
     assert k > 0

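The new `non_differentiable` flag simply short-circuits to a plain top-k before any of the differentiable coordinate-descent machinery runs. A minimal pure-Python sketch of that fast-path pattern (the `TopkReturn` field names and the helper name `topk_last_dim` are illustrative here, not necessarily the library's actual identifiers):

```python
from collections import namedtuple

# Illustrative return type: gradient-related fields are left as None
# when the non-differentiable fast path is taken.
TopkReturn = namedtuple('TopkReturn', ['values', 'indices', 'coor_descent_values', 'gates'])

def topk_last_dim(x, k, non_differentiable=False):
    """Toy stand-in for the library's topk, operating on a flat list of floats."""
    if non_differentiable:
        # fast path: ordinary top-k, no gradient bookkeeping
        order = sorted(range(len(x)), key=lambda i: x[i], reverse=True)[:k]
        values = [x[i] for i in order]
        return TopkReturn(values, order, None, None)
    raise NotImplementedError('differentiable path elided in this sketch')
```

In the real module the fast path calls `torch.topk(x, k = k, dim = -1)` on a tensor instead; the control flow is the same.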
2 changes: 1 addition & 1 deletion setup.py
@@ -3,7 +3,7 @@
 setup(
   name = 'CoLT5-attention',
   packages = find_packages(),
-  version = '0.10.14',
+  version = '0.10.15',
   license='MIT',
   description = 'Conditionally Routed Attention',
   long_description_content_type = 'text/markdown',

1 comment on commit 860081f

@johndpope commented on 860081f Aug 18, 2023

I beseech you to run all your pytorch code through ChatGPT and simply ask:
"can you comment this code?"

This code defines a class Attend that implements attention mechanisms, including Flash Attention. It provides options to handle dropout, causal masking, and the choice between Flash Attention and other attention methods. The code is heavily focused on efficient computation for attention mechanisms in deep learning models.

https://chat.openai.com/share/4cc9a83e-cc31-499f-a3d6-01530112c384

you may also want to check out this sample with sympy + dirac's equation
https://chat.openai.com/c/527d1398-837b-41e9-9839-41603097c50b

UPDATE
I accidentally mistyped the subsequent prompt as
"can you comment ON this code?"

Surprisingly, it gave me this helpful answer back:
[Screenshot from 2023-08-19 06-00-36]
