[pytorch] [java/scala] New version removed the layerImpl/Module conversions, so a layerImpl cannot be converted to Module! #1393
The prototype of the `register_module` method is:

```java
public Module register_module(String name, Module module);
```

In previous versions there were, in addition, specialized methods like:

```java
public LinearImpl register_module(String name, LinearImpl module);
```

but these were a workaround for a bug that has since been fixed, and they are not needed anymore.

```java
LinearImpl fc1 = new LinearImpl(784, 64);
register_module("fc1", fc1);
```
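The point above is that you keep your own typed reference, so the single `Module`-typed overload is enough. A toy plain-Java model of that pattern (the `Module`, `LinearImpl`, and `register_module` names here are illustrative stand-ins, not the real javacpp-pytorch classes):

```java
import java.util.HashMap;
import java.util.Map;

// Toy stand-ins for the binding classes; NOT the real javacpp-pytorch API.
class Module {
    private final Map<String, Module> children = new HashMap<>();

    // Mirrors the single remaining overload: takes and returns Module.
    public Module register_module(String name, Module module) {
        children.put(name, module);
        return module;
    }

    public Module child(String name) { return children.get(name); }
}

class LinearImpl extends Module {
    final int in, out;
    LinearImpl(int in, int out) { this.in = in; this.out = out; }
}

public class RegisterDemo {
    public static void main(String[] args) {
        Module net = new Module();
        // Keep the typed reference yourself; no specialized overload is needed.
        LinearImpl fc1 = new LinearImpl(784, 64);
        net.register_module("fc1", fc1);
        System.out.println(fc1.out);                 // typed access through fc1
        System.out.println(net.child("fc1") == fc1); // registry holds the same object
    }
}
```

Because `fc1` stays typed on the Java side, losing the specialized overload's return type costs nothing.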
It works fine. But now I have another question: in the new version we have the `SequentialImpl` class for the standard PyTorch layers, and we can easily `push_back` them to a `SequentialImpl`; that works. But when I add a user-defined layer or model to the `SequentialImpl`, it does not work.
It can work at first, but the next user-defined model added to the `SequentialImpl` cannot.
So how do I use `SequentialImpl` with a list of user-defined layers or models? Thanks
Now I have implemented a user-defined layer using `SequentialImpl`, named `SeqNow`, like our example code's `Net` model `simpleMnist`. The code runs, but it cannot decrease the loss, and I don't know why.
The console for `seqNow`: ···
For the `Net`, the console shows: Epoch: 1 | Batch: 100 | Loss: 2.2619615 ···
Your ···
Thanks, and I used the same approach to solve it:
···
You just need to register ···
I want to use a reshape or view layer in `SequentialImpl`, but I could not find one. Is there a way to add a reshape layer in the middle of a model?
Yes, what you say is correct.
By the way, in the new version, are `ModuleListImpl` and `ModuleDictImpl` fully implemented for real use in code? Thanks
There are no such modules in libtorch. You'll have to do the reshape in a ···
Not tested, but I think it's usable.
I will test them tomorrow, then give you an answer.
Hi @HGuillemet, in my opinion the `ModuleListImpl` and `ModuleDictImpl` classes are only partially usable.
We cannot invoke the `forward` method, because in the `ModuleListImpl` and `ModuleDictImpl` containers every element is a `Module`, and the `Module` class no longer has a `forward` method. If I try to convert each element back to its original layer type, I also get an error. I don't know how to call `forward` on the element layers of `ModuleListImpl` or `ModuleDictImpl`; if convenient, please give me an example of how to do this. Thanks

```scala
// var count = 0
class ListNow() extends Module {
  // x = seqs.forward(x)
  ···
```
I also found some conversions confusing; for example, a `LinearImpl` can be converted to a `Module`, but a `Module` cannot be converted back to a `LinearImpl`.
But if I write the `forward` method code like that inside `ModuleList`/`ModuleDict`, it runs perfectly.
As you know, we want to iterate over the layer elements of a module list or dict, like Python PyTorch does with a generator, invoking each layer's `forward` method; that is very easy to write.
So, most importantly: how do we iterate over the layer elements of `ModuleList` or `ModuleDict` and invoke each layer's `forward` method?
As you can see, in Python we use a `ModuleList` for a range of layers, like that.
How do I write that in javacpp pytorch?
There is no reason for this. I suspect this is a problem related to Scala.
Note that if you have a Java module, for instance:

```java
LinearImpl linear = new LinearImpl(3, 3);
```

and you pass it to libtorch and get it back:

```java
ModuleListImpl list = new ModuleListImpl();
list.push_back(linear);
Module m = list.get(0);

System.err.println(linear == m);                                // false
System.err.println(linear.equals(m));                           // false
System.err.println(m instanceof LinearImpl);                    // false
System.err.println(linear.address() == m.address());            // false
System.err.println(linear.asModule() == m);                     // false
System.err.println(linear.asModule().equals(m));                // true
System.err.println(linear.asModule().address() == m.address()); // true
```

There is nothing we can do about this.
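The core of this behavior can be modeled in plain Java: two wrapper objects around the same native address are distinct JVM objects, so `==` fails, while an address-based `equals` succeeds. A toy sketch (the `NativeRef` class is an illustrative stand-in, not the real javacpp `Pointer` API):

```java
// Toy model of a native wrapper: each Java object holds a native address.
// Illustrative only; NOT the real javacpp Pointer class.
class NativeRef {
    private final long address;
    NativeRef(long address) { this.address = address; }
    long address() { return address; }

    @Override
    public boolean equals(Object o) {
        // Equality by native address, as javacpp-style wrappers do.
        return o instanceof NativeRef && ((NativeRef) o).address == address;
    }

    @Override
    public int hashCode() { return Long.hashCode(address); }
}

public class WrapperIdentityDemo {
    public static void main(String[] args) {
        long nativeModule = 42L;                   // pretend native pointer value
        NativeRef a = new NativeRef(nativeModule); // wrapper created by Java code
        NativeRef b = new NativeRef(nativeModule); // wrapper created when fetched back

        System.out.println(a == b);      // false: two distinct JVM objects
        System.out.println(a.equals(b)); // true: same underlying native address
    }
}
```

The Java type of the fetched object is also fixed at wrapping time, which is why `instanceof LinearImpl` fails on a value returned as `Module`.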
Also, even in C++, you cannot directly call `forward` on the items of a `ModuleList`.
Oh, I see. If the `Module` class is just a Java wrapper object and it cannot be converted back to the real layer object to invoke its `forward` method, then what is the point of `ModuleDict` and `ModuleList` in PyTorch? Either `Module` needs a `forward` method, or there must be some way to invoke each element layer's `forward` method. What do you think?
I don't think so; maybe in Java the `Module` cannot be converted back to the layer object.
Let me give you one important example: when layers are organized in an array, they share the parent class `Module`, and then they lose their `forward` method. I cannot imagine such a bad thing! Do you think this design is reasonable?
We could probably add a constructor for native modules to downcast from `Module`:

```java
Module m = list.get(0);
LinearImpl linear = new LinearImpl(m);
Tensor output = linear.forward(input);
```

What do you think?
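The proposed downcast constructor can be sketched in plain Java: the subclass constructor rewraps the same native handle, so the typed `forward` becomes callable again. Everything here (`Module`, `LinearImpl`, `forward`) is a toy stand-in for illustration, not the real binding code:

```java
// Toy sketch of a downcast-from-Module constructor; NOT the real javacpp-pytorch code.
class Module {
    protected final long address; // pretend native handle
    Module(long address) { this.address = address; }
}

class LinearImpl extends Module {
    LinearImpl(long address) { super(address); }

    // The proposed constructor: rewrap an untyped Module as a LinearImpl.
    LinearImpl(Module m) { super(m.address); }

    // Typed forward, unavailable on a plain Module.
    double[] forward(double[] input) {
        return input.clone(); // placeholder computation
    }
}

public class DowncastDemo {
    public static void main(String[] args) {
        Module m = new LinearImpl(42L);        // stored as untyped Module, e.g. in a list
        LinearImpl linear = new LinearImpl(m); // rewrap to recover the typed API
        double[] out = linear.forward(new double[] {1.0, 2.0});
        System.out.println(out.length); // prints 2
    }
}
```

The key design point is that no state is copied: the new typed wrapper points at the same native module, so parameters and registration are unaffected.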
I think we need only one step: just add a `forward` method to the `Module` class, like the `Sequential` class has. Only then could we use a `Module` array and call each layer's `forward` in generator style. If we don't, I can only use a switch/case to determine the layer type and then process it, which is not graceful, like this:

```scala
class ModuleListYieldNow() extends Module {
  ···
```
Yes, either you use some Java list or array of ···
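The first option (keeping a plain Java-side list of typed modules and iterating over it) could look like this toy sketch; the `Layer` interface and `ToyLinear` class are illustrative stand-ins, not the real javacpp-pytorch API:

```java
import java.util.ArrayList;
import java.util.List;

// Toy stand-ins; real code would keep typed javacpp-pytorch modules instead.
interface Layer {
    double[] forward(double[] x);
}

class ToyLinear implements Layer {
    private final double scale;
    ToyLinear(double scale) { this.scale = scale; }
    public double[] forward(double[] x) {
        double[] y = new double[x.length];
        for (int i = 0; i < x.length; i++) y[i] = scale * x[i];
        return y;
    }
}

public class ListForwardDemo {
    public static void main(String[] args) {
        // A typed Java-side list keeps forward callable per element,
        // mirroring Python's `for layer in self.layers: x = layer(x)`.
        List<Layer> layers = new ArrayList<>();
        layers.add(new ToyLinear(2.0));
        layers.add(new ToyLinear(3.0));

        double[] x = {1.0};
        for (Layer layer : layers) x = layer.forward(x);
        System.out.println(x[0]); // 1.0 * 2.0 * 3.0 = 6.0
    }
}
```

The Java list exists alongside the native registration: the native container tracks parameters, while the typed list drives the forward pass.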
It may be hard for you to choose the best way to solve it, so for now I will just follow your way, like `new LinearImpl(m)`. Thanks.
The first option already works. So if it's enough, please use it.
Thanks, but I need the second option.
This issue is finally addressed by PR bytedeco/javacpp#700: `t2 = new LinearImpl(m).forward(t)`, where ···
Perfect, javacpp pytorch is now moving toward being a usable tool for Java/Scala/JVM. Waiting for the PR to be merged, thanks @HGuillemet
That pull request has been merged, so this should be working now! |
Perfect, thanks. Waiting for the 1.5.10 release to be published to the Maven repos.
Hi,
when I use the new version pytorch 2.0.1 with Scala 2.11.10, I get a class conversion error. I rewrote the example Java code in Scala, but it does not work, because the new javacpp-pytorch version removed from the `Module` class all the layerImpl-to-`Module` conversions used with the `register_module` method. Why were they removed? Now the error is: ···
How do I solve that error? Do I need to import some implicit conversion dependency in my Scala code?
If my Scala code removes the `asInstanceOf[LinearImpl]` casts, it does not compile. Please help me, thanks.
Dependency:
Code (the example Java code converted to Scala):