
I Added Doc-String Into The class. #5293

Merged: 13 commits, Oct 11, 2023
12 changes: 12 additions & 0 deletions src/diffusers/models/unet_2d_blocks.py
@@ -466,6 +466,18 @@ def get_up_block(


class AutoencoderTinyBlock(nn.Module):
"""
Tiny Autoencoder block used in [`AutoencoderTiny`]. It is a mini residual module consisting of plain conv + ReLU blocks.

Args:
in_channels (`int`): The number of input channels.
out_channels (`int`): The number of output channels.
act_fn (`str`):` The activation function to use. Supported values are `relu`, `tanh`, and `sigmoid`.

Output:
A tensor with the same shape as the input tensor, but with the number of channels equal to `out_channels`.
"""
Member:
The indentation seems to be off. Could we fix that? It should be:

"""
Tiny Autoencoder block used in [`AutoencoderTiny`]. It is a mini residual module consisting of plain conv + ReLU blocks.  

Args:
    in_channels (`int`): The number of input channels.
    out_channels (`int`): The number of output channels.
    act_fn (`str`): The activation function to use. Supported values are `"swish"`, `"mish"`, `"gelu"`, and `"relu"`.
    
Returns:
    `torch.FloatTensor`: A tensor with the same shape as the input tensor, but with the number of channels equal to `out_channels`.
"""

The structure of the API documentation follows the conventions described at https://github.com/huggingface/doc-builder#writing-source-documentation.

The supported activation functions are listed in:

def get_activation(act_fn):
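
For illustration, a minimal sketch of what such a dispatcher might look like (hypothetical names and mapping; the actual `get_activation` in diffusers lives in a separate module and may differ in detail):

import torch.nn as nn

# Hypothetical sketch of an activation-name dispatcher; mirrors the idea
# behind diffusers' get_activation, not necessarily its exact contents.
_ACTIVATIONS = {
    "swish": nn.SiLU,
    "mish": nn.Mish,
    "gelu": nn.GELU,
    "relu": nn.ReLU,
}

def get_activation(act_fn: str) -> nn.Module:
    """Map a supported activation name (e.g. "relu") to an nn.Module instance."""
    try:
        return _ACTIVATIONS[act_fn.lower()]()
    except KeyError:
        raise ValueError(f"Unsupported activation function: {act_fn}")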

Let me know if there's anything unclear.

Contributor Author:

I have fixed the indentation problem. The `get_activation` function is defined in a separate file, so I will not explain it here 🤗.

Member:

It's not about explaining that. It's about including the right information. The current statement suggests that we support tanh and sigmoid as activation values, but we clearly do not (as is evident in the get_activation() function). To make that point clear, I provided the reference to get_activation().

Contributor Author:

I'm so sorry, that was my mistake.


    def __init__(self, in_channels: int, out_channels: int, act_fn: str):
        super().__init__()
        act_fn = get_activation(act_fn)
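
For context, here is a self-contained sketch of how the rest of such a residual conv + ReLU block could be assembled around this constructor, reusing the hypothetical get_activation sketch above (an illustrative completion, not necessarily the actual diffusers implementation):

import torch
import torch.nn as nn

class AutoencoderTinyBlock(nn.Module):
    def __init__(self, in_channels: int, out_channels: int, act_fn: str):
        super().__init__()
        act_fn = get_activation(act_fn)  # e.g. nn.ReLU() for act_fn="relu"
        # Main path: 3x3 convolutions interleaved with the activation;
        # padding=1 keeps the spatial dimensions unchanged.
        self.conv = nn.Sequential(
            nn.Conv2d(in_channels, out_channels, kernel_size=3, padding=1),
            act_fn,
            nn.Conv2d(out_channels, out_channels, kernel_size=3, padding=1),
            act_fn,
            nn.Conv2d(out_channels, out_channels, kernel_size=3, padding=1),
        )
        # Skip path: a 1x1 convolution when the channel count changes,
        # otherwise the identity.
        self.skip = (
            nn.Conv2d(in_channels, out_channels, kernel_size=1, bias=False)
            if in_channels != out_channels
            else nn.Identity()
        )
        self.fuse = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Residual fusion: same spatial shape in and out; channels become out_channels.
        return self.fuse(self.conv(x) + self.skip(x))

Consistent with the docstring above, an input of shape (1, 3, 64, 64) with out_channels=64 would produce an output of shape (1, 64, 64, 64).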