Replies: 4 comments 7 replies
-
D'Oh! I had misunderstood the mlx docs. The higher-level pieces such as the NN layers are only implemented in Python, not in C++, or at least not yet. So when porting to a 100% C++ solution, it's necessary to port not only the example code but also some of that higher-level functionality, such as the NN layers.
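To give a sense of what that porting work looks like, here is a minimal sketch of a hand-rolled Linear layer built directly on the mlx::core ops. This is not the real nn.Linear internals; the struct name, shapes, and forward formula are my own choices, and it assumes matmul, random::normal, zeros, and eval in the C++ API behave like their Python counterparts.

```cpp
// Minimal sketch of a hand-rolled Linear layer on top of mlx::core ops.
// Assumes matmul / random::normal / zeros / eval mirror the Python API;
// this is NOT the real nn.Linear implementation.
#include <iostream>
#include "mlx/mlx.h"

namespace mx = mlx::core;

struct Linear {
  mx::array weight;
  mx::array bias;

  Linear(int in_dims, int out_dims)
      : weight(mx::random::normal({in_dims, out_dims})),
        bias(mx::zeros({out_dims})) {}

  // y = x @ W + b
  mx::array forward(const mx::array& x) const {
    return mx::matmul(x, weight) + bias;
  }
};

int main() {
  Linear layer(8, 4);
  auto x = mx::random::normal({2, 8});
  auto y = layer.forward(x);
  mx::eval(y);                  // force the lazy computation
  std::cout << y << std::endl;  // expect a (2, 4) array
  return 0;
}
```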
-
Is the higher-level functionality, such as NN layers, eventually going to be made available in C++?
-
Yes, that's definitely a possibility and something we originally intended. Prioritizing it depends on how useful it would be. The C++ nn API could be pretty similar, with maybe some slight changes to make it more C++ friendly.
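To make the "slight changes to be more C++ friendly" point concrete, here is a purely hypothetical shape such an API could take. None of these types exist in MLX today; the Module/parameters() design is just one option among many.

```cpp
// Hypothetical sketch of a C++-friendly nn-style interface (not part of
// MLX today); shown only to illustrate what "slight changes" might mean.
#include <vector>
#include "mlx/mlx.h"

namespace mx = mlx::core;

// A module owns its parameters and exposes a forward pass.
struct Module {
  virtual ~Module() = default;
  virtual mx::array forward(const mx::array& x) = 0;
  // Expose parameters by pointer so an optimizer could update them in place.
  virtual std::vector<mx::array*> parameters() { return {}; }
};

struct ReLU : Module {
  mx::array forward(const mx::array& x) override {
    return mx::maximum(x, mx::array(0.0f));
  }
};
```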
-
Porting a Python-based example to C++ is feasible, especially given how similar the two APIs are in many places. The work is mostly translating the logic and structure from Python to C++ using the equivalent functions, but the translation won't always be one-to-one because of language differences. Starting a C++ port based on your understanding is a good initiative; just be prepared for nuances that require adjustments along the way. The similar APIs will speed things up, but some adaptation will be needed for idiomatic C++. Best of luck with the port!
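As one concrete illustration of how directly a line can map across, here is a hedged translation of the kind of attention-head reshape you'd find in the llama example. The function and variable names are illustrative rather than taken from the actual example code, and it assumes mlx::core::reshape and transpose accept explicit shape/axes lists like their Python counterparts.

```cpp
// Python (roughly the pattern used in the example):
//   queries = queries.reshape(B, L, num_heads, -1).transpose(0, 2, 1, 3)
// C++ equivalent, assuming reshape/transpose take explicit vectors:
#include "mlx/mlx.h"

namespace mx = mlx::core;

mx::array split_heads(const mx::array& queries, int B, int L, int num_heads) {
  auto q = mx::reshape(queries, {B, L, num_heads, -1});
  return mx::transpose(q, {0, 2, 1, 3});
}
```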
-
Has anybody attempted a port of the Python-based llama example to C++?
My understanding is that because the Python and C++ APIs are very similar, it's straightforward to port from Python to C++. Is that correct?
If so, I'm going to get started on a C++ port.