Commit
Backport PR #3044 on branch 1.2.x (feat: Add variance to ZINB model) (#3050)

Backport PR #3044: feat: Add variance to ZINB model

Co-authored-by: Ramon Viñas <rvinas@users.noreply.github.com>
meeseeksmachine and rvinas authored Nov 20, 2024
1 parent 8ad0139 commit daf662d
Showing 2 changed files with 4 additions and 3 deletions.
4 changes: 2 additions & 2 deletions CHANGELOG.md
@@ -12,9 +12,9 @@ to [Semantic Versioning]. Full commit history is available in the

 - Added adaptive handling for last training minibatch of 1-2 cells in case of
   `datasplitter_kwargs={"drop_last": False}` and `train_size = None` by moving them into
-  validation set, if available.
-  {pr}`3036`.
+  validation set, if available. {pr}`3036`.
 - Add `batch_key` and `labels_key` to `scvi.external.SCAR.setup_anndata`.
+- Implemented variance of ZINB distribution. {pr}`3044`.

#### Fixed

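The changelog entry above refers to a closed-form variance for the zero-inflated negative binomial. As a sketch (not part of the commit), it follows from the standard mixture moments, writing $\mu$ for the NB mean, $\theta$ for the inverse dispersion, and $\pi$ for the zero-inflation probability, with $X = 0$ w.p. $\pi$ and $X \sim \mathrm{NB}(\mu, \theta)$ w.p. $1 - \pi$:

```latex
\begin{aligned}
\mathbb{E}[X]   &= (1 - \pi)\,\mu \\
\mathbb{E}[X^2] &= (1 - \pi)\bigl(\underbrace{\mu + \mu^2/\theta}_{\operatorname{Var}_{\mathrm{NB}}} + \mu^2\bigr) \\
\operatorname{Var}(X) &= \mathbb{E}[X^2] - \mathbb{E}[X]^2
  = (1 - \pi)\,\mu\left(1 + \frac{\mu}{\theta} + \pi\mu\right)
  = \frac{(1 - \pi)\,\mu\,(\mu + \theta + \pi\mu\theta)}{\theta}
\end{aligned}
```

The final form is term-by-term the expression returned in `_negative_binomial.py` in the second hunk of this commit.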
3 changes: 2 additions & 1 deletion src/scvi/distributions/_negative_binomial.py
@@ -502,7 +502,8 @@ def mean(self) -> torch.Tensor:

     @property
     def variance(self) -> None:
-        raise NotImplementedError
+        pi = self.zi_probs
+        return (1 - pi) * self.mu * (self.mu + self.theta + pi * self.mu * self.theta) / self.theta

     @lazy_property
     def zi_logits(self) -> torch.Tensor:
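The new `variance` property can be sanity-checked numerically. The sketch below is standalone Python (not scvi-tools code): it mirrors the returned expression in a plain function, draws ZINB samples via the Gamma-Poisson mixture for the NB part, and compares the empirical variance to the closed form. The helper names and parameter values are illustrative.

```python
import math
import random

def zinb_variance(mu: float, theta: float, pi: float) -> float:
    """Closed form mirrored from the diff above."""
    return (1 - pi) * mu * (mu + theta + pi * mu * theta) / theta

def sample_zinb(mu: float, theta: float, pi: float, rng: random.Random) -> int:
    """One ZINB draw: zero w.p. pi, otherwise NB via the Gamma-Poisson mixture."""
    if rng.random() < pi:
        return 0
    lam = rng.gammavariate(theta, mu / theta)  # Gamma-distributed Poisson rate
    # Knuth's Poisson sampler (adequate for moderate lam)
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

rng = random.Random(0)
mu, theta, pi = 4.0, 2.0, 0.3
n = 200_000
xs = [sample_zinb(mu, theta, pi, rng) for _ in range(n)]
mean = sum(xs) / n
emp_var = sum((x - mean) ** 2 for x in xs) / (n - 1)
print(f"empirical {emp_var:.2f} vs closed form {zinb_variance(mu, theta, pi):.2f}")
```

With these parameters the closed form gives $(1 - 0.3) \cdot 4 \cdot (1 + 4/2 + 0.3 \cdot 4) = 11.76$, and the empirical variance should land close to it.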
