ValueError: too many values to unpack (expected 3) from ast.predict_celltypes() #23
Hi @jgu13
Hi Kieran! I just wanted to let you know that I modified line 297 of
I tested it on the dataset
I hope this helps!
Yes, that looks right; the original was
Re: cell names, would it make sense to assume the row names match the input, i.e. `type_assignments.index = dset.index`? Finally, the cell types are stored internally, so they shouldn't have to be supplied: `type_assignments.columns = self._type_dset.get_classes() + ["Other"]`, if that makes sense. Thanks for catching this. Would you like to structure this as a pull request to get contribution credit? Otherwise I'm happy to modify it directly.
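The suggestion above can be sketched with plain pandas. Everything here is a toy stand-in: the `classes` list substitutes for whatever `self._type_dset.get_classes()` actually returns, and the DataFrame values are dummies.

```python
import numpy as np
import pandas as pd

# Hypothetical cell-type classes standing in for self._type_dset.get_classes()
classes = ["B cell", "T cell", "Macrophage"]

# dset: the input expression DataFrame (toy values, 4 cells x 2 proteins)
dset = pd.DataFrame(
    np.zeros((4, 2)),
    index=["cell_1", "cell_2", "cell_3", "cell_4"],
    columns=["CD20", "CD3"],
)

# type_assignments: raw assignment probabilities from the model (toy values)
type_assignments = pd.DataFrame(np.full((4, 4), 0.25))

# The suggested fix: reuse the input's row names, and label the columns with
# the internally stored classes plus the catch-all "Other" category.
type_assignments.index = dset.index
type_assignments.columns = classes + ["Other"]

print(type_assignments.columns.tolist())
```

This keeps the output aligned with the caller's cells without asking the user to pass the type names in again.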
I got an `AttributeError` from using `get_classes()`.
This could be caused by that
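One plausible mechanism for that `AttributeError` (an assumption on my part, since the traceback isn't shown above): if the internal dataset attribute is left as `None` after loading a saved model, any call to `get_classes()` on it fails. A toy sketch, not astir's actual implementation:

```python
class TypeDataset:
    """Stand-in for astir's internal type dataset."""
    def get_classes(self):
        return ["B cell", "T cell"]

class Model:
    """Stand-in model; load_model hypothetically drops the internal dataset."""
    def __init__(self):
        self._type_dset = TypeDataset()

    def load_model(self, path):
        # Hypothetical: restoring weights alone does not rebuild _type_dset
        self._type_dset = None

m = Model()
m.load_model("trained_model.hdf5")
try:
    m._type_dset.get_classes()
except AttributeError as e:
    print(e)  # 'NoneType' object has no attribute 'get_classes'
```

If this is the cause, the fix would be to restore (or re-derive) the internal dataset when loading, not just the model parameters.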
Hi,

I am trying to predict cell types by calling `ast.predict_celltypes(dset: pd.DataFrame)`. I first wanted to test the function with the publicly available dataset `basel_22k_subset.h5ad`, which is the one loaded in the `astir_tutorial` Jupyter notebook. I then got the error from line 297, `_, exprs_X, _ = new_dset[:]`, in `celltype.predict()`.

Here is how to reproduce the error: I trained a `CellTypeModel` with the `basel_22k_subset.h5ad` dataset, with the initial parameters:

Then I saved the trained model by calling `ast.save_model('trained_model.hdf5')` and loaded it by calling `ast.load_model('trained_model.hdf5')`.

To convert the `basel_22k_subset.h5ad` dataset into a dataframe, I did

The data frame is properly loaded. When I finally tried to predict cell types by calling `ast.predict_celltypes(df)`, the error was raised.

Any insight would be appreciated.
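For context, the `ValueError` in the title is Python's standard complaint when an iterable yields more items than the unpacking target expects. A minimal standalone illustration (the tuple contents are a hypothetical stand-in for whatever `new_dset[:]` actually returns in astir):

```python
# Reproduce the shape of the error: unpacking a 4-tuple into 3 names.
batch = ("names", "exprs_X", "design", "extra")  # hypothetical 4-item return

try:
    _, exprs_X, _ = batch  # left side expects exactly 3 items
except ValueError as e:
    print(e)  # too many values to unpack (expected 3)
```

This is consistent with the report: the dataset object's `__getitem__` evidently returns a different number of items than line 297 unpacks.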