
Add the presets for Tritonserver #1085

Merged
merged 22 commits on Oct 18, 2021
Conversation

@jackyh (Contributor) commented Sep 17, 2021

Tritonserver is used to deploy ML/DL inference services.

@jackyh (Contributor Author) commented Sep 18, 2021

I still have three kinds of issues here:

  1. About the macro:
    /workspace/upstream/javacpp-presets/tritonserver/target/native/org/bytedeco/tritonserver/linux-x86_64/jnitritonserver.cpp: In function 'jint Java_org_bytedeco_tritonserver_global_tritonserver_TRITONBACKEND_1DECLSPEC(JNIEnv*, jclass)':
    /workspace/upstream/javacpp-presets/tritonserver/target/native/org/bytedeco/tritonserver/linux-x86_64/jnitritonserver.cpp:863:38: error: expected primary-expression before ';' token
    863 | int rval = TRITONBACKEND_DECLSPEC;

This is based on the following C++ code:
#ifdef _COMPILING_TRITONBACKEND
#if defined(_MSC_VER)
#define TRITONBACKEND_DECLSPEC __declspec(dllexport)
#define TRITONBACKEND_ISPEC __declspec(dllimport)
#elif defined(__GNUC__)
#define TRITONBACKEND_DECLSPEC __attribute__((__visibility__("default")))
#define TRITONBACKEND_ISPEC
#else
#define TRITONBACKEND_DECLSPEC
#define TRITONBACKEND_ISPEC
#endif
#else
#if defined(_MSC_VER)
#define TRITONBACKEND_DECLSPEC __declspec(dllimport)
#define TRITONBACKEND_ISPEC __declspec(dllexport)
#else
#define TRITONBACKEND_DECLSPEC
#define TRITONBACKEND_ISPEC
#endif
#endif

I followed this: https://github.com/bytedeco/javacpp/wiki/Mapping-Recipes#ignoring-attributes-and-macros, and added the line:
.put(new Info("TTRITONBACKEND_DECLSPEC", "TRITONBACKEND_ISPEC").cppTypes().annotations())

But this doesn't work here, though it looks like it works in most cases. What's the reason here?

  2. About the struct (class):
    /workspace/upstream/javacpp-presets/tritonserver/target/native/org/bytedeco/tritonserver/linux-x86_64/jnitritonserver.cpp: In function '_jobject* Java_org_bytedeco_tritonserver_global_tritonserver_TRITONBACKEND_1ApiVersion__Ljava_nio_IntBuffer_2Ljava_nio_IntBuffer_2(JNIEnv*, jclass, jobject, jobject)':
    /workspace/upstream/javacpp-presets/tritonserver/target/native/org/bytedeco/tritonserver/linux-x86_64/jnitritonserver.cpp:900:40: error: cannot convert 'TRITONSERVER_Error*' to 'int*' in assignment
    900 | rptr = TRITONBACKEND_ApiVersion((uint32_t*)ptr0, (uint32_t*)ptr1);
    | ~~~~~~~~~~~~~~~~~~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    | |
    | TRITONSERVER_Error*

This is based on the following C++ code:
struct TRITONSERVER_Error; (in tritonserver.h)
and this class is implemented in tritonserver.cc.
I don't know how to handle this.

  3. Enum related:
    In file included from /workspace/upstream/javacpp-presets/tritonserver/target/native/org/bytedeco/tritonserver/linux-x86_64/jnitritonserver.cpp:142:
    /opt/tritonserver/include/triton/core/tritonbackend.h:233:28: note: initializing argument 4 of 'TRITONSERVER_Error* TRITONBACKEND_InputPropertiesForHostPolicy(TRITONBACKEND_Input*, const char*, const char**, TRITONSERVER_DataType*, const int64_t**, uint32_t*, uint64_t*, uint32_t*)'
    233 | TRITONSERVER_DataType* datatype, const int64_t** shape,
    | ~~~~~~~~~~~~~~~~~~~~~~~^~~~~~~~

Lots of errors are about enums, such as with the following C++ code:
typedef enum TRITONSERVER_datatype_enum {
  TRITONSERVER_TYPE_INVALID,
  TRITONSERVER_TYPE_BOOL,
  TRITONSERVER_TYPE_UINT8,
  TRITONSERVER_TYPE_UINT16,
  TRITONSERVER_TYPE_UINT32,
  TRITONSERVER_TYPE_UINT64,
  TRITONSERVER_TYPE_INT8,
  TRITONSERVER_TYPE_INT16,
  TRITONSERVER_TYPE_INT32,
  TRITONSERVER_TYPE_INT64,
  TRITONSERVER_TYPE_FP16,
  TRITONSERVER_TYPE_FP32,
  TRITONSERVER_TYPE_FP64,
  TRITONSERVER_TYPE_BYTES
} TRITONSERVER_DataType;

How do I deal with these cases?
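For context: JavaCPP normally maps a plain C enum like this one to Java int constants, and a TRITONSERVER_DataType* argument would then surface as an IntPointer (or int[] / IntBuffer) in the generated method. A rough sketch of what the default mapping presumably produces, not copied from the actual generated file:

// Sketch of the expected default mapping of TRITONSERVER_datatype_enum
// (values assumed from declaration order)
public static final int
    TRITONSERVER_TYPE_INVALID = 0,
    TRITONSERVER_TYPE_BOOL    = 1,
    TRITONSERVER_TYPE_UINT8   = 2,
    // ... UINT16 through FP64 follow in declaration order ...
    TRITONSERVER_TYPE_BYTES   = 13;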

@saudet (Member) commented Sep 18, 2021

.put(new Info("TTRITONBACKEND_DECLSPEC", "TRITONBACKEND_ISPEC").cppTypes().annotations())

It's "TRITONBACKEND_DECLSPEC", and not "TTRITONBACKEND_DECLSPEC", right?
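In other words, the line should presumably read, with the extra leading "T" removed:

.put(new Info("TRITONBACKEND_DECLSPEC", "TRITONBACKEND_ISPEC").cppTypes().annotations())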

This is based on the following C++ code:
struct TRITONSERVER_Error; (in tritonserver.h)
and this class is implemented in tritonserver.cc.
I don't know how to handle this.

You'll probably need to add that one to the list above as well.
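As a rough sketch of what that could look like (an assumption following the Mapping Recipes, not necessarily the exact Info the presets ended up with), an opaque struct such as TRITONSERVER_Error is usually just given a pointer type:

.put(new Info("TRITONSERVER_Error").pointerTypes("TRITONSERVER_Error"))

JavaCPP then generates an @Opaque class extending Pointer, which is enough for functions that only pass the pointer around.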

  3. Enum related:
    /opt/tritonserver/include/triton/core/tritonbackend.h:233:28: note: initializing argument 4 of 'TRITONSERVER_Error* TRITONBACKEND_InputPropertiesForHostPolicy(TRITONBACKEND_Input*, const char*, const char**, TRITONSERVER_DataType*, const int64_t**, uint32_t*, uint64_t*, uint32_t*)'

Lots of errors are about enums, such as with the following C++ code:
typedef enum TRITONSERVER_datatype_enum { ... } TRITONSERVER_DataType;

How do I deal with these cases?

This error doesn't seem related to the enum, and that should work as is. Could you explain why you believe they are connected?

@jackyh (Contributor Author) commented Sep 18, 2021

.put(new Info("TTRITONBACKEND_DECLSPEC", "TRITONBACKEND_ISPEC").cppTypes().annotations())

It's "TRITONBACKEND_DECLSPEC", and not "TTRITONBACKEND_DECLSPEC", right?

Yes, my mistake, fixed.

This is based on the following C++ code:
struct TRITONSERVER_Error; (in tritonserver.h)
and this class is implemented in tritonserver.cc.
I don't know how to handle this.

You'll probably need to add that one to the list above as well.

This one is gone once the typo above is fixed.

  3. Enum related:
    /opt/tritonserver/include/triton/core/tritonbackend.h:233:28: note: initializing argument 4 of 'TRITONSERVER_Error* TRITONBACKEND_InputPropertiesForHostPolicy(...)'

Lots of errors are about enums, such as with the following C++ code:
typedef enum TRITONSERVER_datatype_enum { ... } TRITONSERVER_DataType;

How do I deal with these cases?

This error doesn't seem related to the enum, and that should work as is. Could you explain why you believe they are connected?

This issue is still there. Since "TRITONSERVER_DataType *" is used as the parameter type, it looks like JavaCPP cannot translate it correctly. When the compiler compiles the generated jnitritonserver.cpp file, the error is:

error: cannot convert 'TRITONSERVER_DataType' {aka 'TRITONSERVER_datatype_enum'} to 'TRITONSERVER_DataType*' {aka 'TRITONSERVER_datatype_enum*'}

I think that if this were not an enum but an int, there wouldn't be a translation issue here.

@saudet (Member) commented Sep 19, 2021

What is the signature in Java for the native wrapper method of TRITONBACKEND_InputPropertiesForHostPolicy()? Please update the files in src/gen on your fork!

@jackyh (Contributor Author) commented Sep 20, 2021

What is the signature in Java for the native wrapper method of TRITONBACKEND_InputPropertiesForHostPolicy()? Please update the files in src/gen on your fork!

Yes, I updated the files in src/gen. The issue is still there.
I checked https://github.com/bytedeco/javacpp/wiki/Mapping-Recipes#specifying-names-to-use-in-java; maybe this is a similar issue, with the difference being a struct/class versus an enum?
When I add something like:
.put(new Info("TRITONSERVER_datatype_enum").pointerTypes("TRITONSERVER_DataType"))
.put(new Info("TRITONSERVER_DataType").valueTypes("TRITONSERVER_DataType").pointerTypes("@Cast(\"TRITONSERVER_DataType*\") PointerPointer", "@ByPtrPtr TRITONSERVER_DataType"))

it still doesn't work, so it looks like this only works for C++ structs/classes?

@saudet (Member) commented Sep 20, 2021

The Java signature of TRITONBACKEND_InputPropertiesForHostPolicy() is incorrect. We're going to need an IntPointer there for this to work. We don't need to do anything special, it will work out of the box as with enum nvinfer1::WeightsRole and the functions in IRefitter and VRefitter from TensorRT:
https://github.com/bytedeco/javacpp-presets/blob/master/tensorrt/src/gen/java/org/bytedeco/tensorrt/nvinfer/IRefitter.java
https://github.com/bytedeco/javacpp-presets/blob/master/tensorrt/src/gen/java/org/bytedeco/tensorrt/nvinfer/VRefitter.java
If you do things the same way as with TensorRT, you will get the same result. Please don't try to do things differently, and make sure that you get the order of the header files right.
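For illustration, once the enum is parsed first, the generated wrapper should presumably end up with a signature along these lines (a sketch with assumed parameter names, not the exact generated code):

public static native TRITONSERVER_Error TRITONBACKEND_InputPropertiesForHostPolicy(
        TRITONBACKEND_Input input, String host_policy_name,
        @Cast("const char**") PointerPointer name,
        @Cast("TRITONSERVER_DataType*") IntPointer datatype,
        @Cast("const int64_t**") PointerPointer shape,
        @Cast("uint32_t*") IntPointer dims_count,
        @Cast("uint64_t*") LongPointer byte_size,
        @Cast("uint32_t*") IntPointer buffer_count);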

@jackyh (Contributor Author) commented Sep 20, 2021

WeightsRole

Looks like I need to modify the generated file to put an IntPointer there, right?

@saudet (Member) commented Sep 20, 2021

No, it works automatically because enum nvinfer1::WeightsRole gets parsed before nvinfer1::IRefitter and nvinfer1::VRefitter.

@Platform(
    value = {"linux-arm64", "linux-ppc64le", "linux-x86_64", "windows-x86_64"},
    compiler = "cpp11",
    include = {"tritonbackend.h", "tritonrepoagent.h", "tritonserver.h"},
@saudet (Member) commented on this code:

Here, "tritonserver.h" should probably be moved before "tritonbackend.h".
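That is, presumably something like:

include = {"tritonserver.h", "tritonbackend.h", "tritonrepoagent.h"},

so that the enums and structs declared in tritonserver.h get parsed before the functions in the other headers that use them.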

@jackyh (Contributor Author) replied:

Here, "tritonserver.h" should probably be moved before "tritonbackend.h".
Yes, it works! It compiles now. Also, I removed the unused generated TensorRT files.

@saudet (Member) replied:

Right, simply remove src/gen and rerun the build to get rid of old files.
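For example, from the tritonserver subdirectory (assuming the usual presets layout, with the mvn invocation taken from the steps discussed below):

rm -rf src/gen
mvn install --projects .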

@jackyh (Contributor Author) commented Oct 11, 2021

@saudet I updated the PR. Again, the places marked with "//jack" are ones I don't know how to deal with. Also, I added the simple.cc file for reference.
One more idea: can we just use the same approach as the presets to transform the file to Java automatically, and then modify the generated file?

@jackyh (Contributor Author) commented Oct 11, 2021

@saudet I need to catch up with the schedule:) I need your help to finish this sample before the end of this week, bro.

@saudet (Member) commented Oct 11, 2021

@saudet I need to catch up with the schedule:) I need your help to finish this sample before the end of this week, bro.

Got it, don't worry, should be able to get this done tomorrow. Are you sure though that simple.cc is a standalone program?

@jackyh (Contributor Author) commented Oct 11, 2021

@saudet I need to catch up with the schedule:) I need your help to finish this sample before the end of this week, bro.

Got it, don't worry, should be able to get this done tomorrow. Are you sure though that simple.cc is a standalone program?

Some time ago, I saw someone do something like:
gcc simple.cc -o simple -ltritonserver

Also, I will try this myself tomorrow.

@jackyh (Contributor Author) commented Oct 11, 2021

You can find some docs here: https://github.com/triton-inference-server/server/blob/main/docs/inference_protocols.md#c-api

And, as you know, the original C++ file is here:
https://github.com/triton-inference-server/server/blob/main/src/servers/simple.cc

@jackyh (Contributor Author) commented Oct 12, 2021

@saudet I just tried it; basically, simple can be built with:

gcc simple.cc -o simple -I/opt/tritonserver/include -I/workspace/tritonserver_21.07_source/server-2.12.0/ -I/usr/local/cuda-11.4/targets/x86_64-linux/include -D TRITON_ENABLE_GPU -D TRITON_MIN_COMPUTE_CAPABILITY=5.3 -L/opt/tritonserver/lib -L/usr/local/lib -L/usr/local/cuda-11.4/targets/x86_64-linux/lib -ltritonserver -lpthread -lcudart -lstdc++

Here's the command line and its output:

root@872b34da9e75:/workspace/tritonserver_21.07_source/server-2.12.0/src/servers# ./simple
-r must be used to specify model repository path
Usage: ./simple [options]
-m <"system"|"pinned"|gpu> Enforce the memory type for input and output tensors. If not specified, inputs will be in system memory and outputs will be based on the model's preferred type.
-v Enable verbose logging
-r [model repository absolute path]

@saudet (Member) commented Oct 12, 2021

Yes, thanks, it looks like it was intended to be used as "simple" standalone sample code, so that looks OK. However, it is quite complex for a "simple" example. I wasn't able to take enough time today, but I should be able to work through that tomorrow.

@jackyh (Contributor Author) commented Oct 12, 2021

Yes, thanks, it looks like it was intended to be used as "simple" standalone sample code, so that looks OK. However, it is quite complex for a "simple" example. I wasn't able to take enough time today, but I should be able to work through that tomorrow.

Thanks, bro! I will also try to add something to Simple.java.

@jackyh (Contributor Author) commented Oct 13, 2021

Yes, thanks, it looks like it was intended to be used as "simple" standalone sample code, so that looks OK. However, it is quite complex for a "simple" example. I wasn't able to take enough time today, but I should be able to work through that tomorrow.

@saudet How about simplifying the simple sample if it's really too long? How about bypassing some of the difficult parts?

@saudet (Member) commented Oct 13, 2021

It's fine, I should have something that runs by the end of the day.

@jackyh (Contributor Author) commented Oct 13, 2021

It's fine, I should have something that runs by the end of the day.

Then I will try to start some discussion about use cases/scenarios/benchmarks in the place you mentioned last time, adding the person you mentioned.

@saudet (Member) commented Oct 13, 2021

Ok, along with a few fixes to the presets, I pushed an initial version of Simple.java that compiles fine and does something when executed with mvn clean compile exec:java -Djavacpp.platform=linux-x86_64, but I didn't test anything, so there are most likely bugs in there. In any case, please give it a try and let me know how far you get with it! Thanks

BTW, that C API is pretty low-level. It would make sense to write on top of that a high-level Java API...

@jackyh (Contributor Author) commented Oct 13, 2021

Ok, along with a few fixes to the presets, I pushed an initial version of Simple.java that compiles fine and does something when executed with mvn clean compile exec:java -Djavacpp.platform=linux-x86_64, but I didn't test anything, so there's most likely bugs in there. In any case, please give it a try and let me know how far you get with it! Thanks

BTW, that C API is pretty low-level. It would make sense to write on top of that a high-level Java API...

OK, will do it tomorrow.

@jackyh (Contributor Author) commented Oct 14, 2021

@saudet errors when compiling:

Downloaded from central: https://repo.maven.apache.org/maven2/commons-codec/commons-codec/1.11/commons-codec-1.11.jar (335 kB at 391 kB/s)
[WARNING]
java.lang.UnsatisfiedLinkError: no jnitritonserver in java.library.path
at java.lang.ClassLoader.loadLibrary (ClassLoader.java:1860)
at java.lang.Runtime.loadLibrary0 (Runtime.java:871)
at java.lang.System.loadLibrary (System.java:1124)
at org.bytedeco.javacpp.Loader.loadLibrary (Loader.java:1738)
at org.bytedeco.javacpp.Loader.load (Loader.java:1345)
at org.bytedeco.javacpp.Loader.load (Loader.java:1157)
at org.bytedeco.javacpp.Loader.load (Loader.java:1133)
at org.bytedeco.tritonserver.global.tritonserver.<clinit> (tritonserver.java:30)
at java.lang.Class.forName0 (Native Method)
at java.lang.Class.forName (Class.java:348)
at org.bytedeco.javacpp.Loader.load (Loader.java:1212)
at org.bytedeco.javacpp.Loader.load (Loader.java:1157)
at org.bytedeco.javacpp.Loader.load (Loader.java:1133)
at org.bytedeco.tritonserver.tritonserver.TRITONSERVER_ResponseAllocatorAllocFn_t.<clinit> (TRITONSERVER_ResponseAllocatorAllocFn_t.java:70)
at Simple.<clinit> (Simple.java:283)
at sun.misc.Unsafe.ensureClassInitialized (Native Method)
at java.lang.invoke.DirectMethodHandle$EnsureInitialized.computeValue (DirectMethodHandle.java:330)
at java.lang.invoke.DirectMethodHandle$EnsureInitialized.computeValue (DirectMethodHandle.java:327)
at java.lang.ClassValue.getFromHashMap (ClassValue.java:229)
at java.lang.ClassValue.getFromBackup (ClassValue.java:211)
at java.lang.ClassValue.get (ClassValue.java:117)
at java.lang.invoke.DirectMethodHandle.checkInitialized (DirectMethodHandle.java:351)
at java.lang.invoke.DirectMethodHandle.ensureInitialized (DirectMethodHandle.java:341)
at java.lang.invoke.DirectMethodHandle.internalMemberNameEnsureInit (DirectMethodHandle.java:291)
at org.codehaus.mojo.exec.ExecJavaMojo$1.run (ExecJavaMojo.java:254)
at java.lang.Thread.run (Thread.java:748)
Caused by: java.lang.UnsatisfiedLinkError: Could not find jnitritonserver in class, module, and library paths.
at org.bytedeco.javacpp.Loader.loadLibrary (Loader.java:1705)
at org.bytedeco.javacpp.Loader.load (Loader.java:1345)
at org.bytedeco.javacpp.Loader.load (Loader.java:1157)
at org.bytedeco.javacpp.Loader.load (Loader.java:1133)
at org.bytedeco.tritonserver.global.tritonserver.<clinit> (tritonserver.java:30)
at java.lang.Class.forName0 (Native Method)
at java.lang.Class.forName (Class.java:348)
at org.bytedeco.javacpp.Loader.load (Loader.java:1212)
at org.bytedeco.javacpp.Loader.load (Loader.java:1157)
at org.bytedeco.javacpp.Loader.load (Loader.java:1133)
at org.bytedeco.tritonserver.tritonserver.TRITONSERVER_ResponseAllocatorAllocFn_t.<clinit> (TRITONSERVER_ResponseAllocatorAllocFn_t.java:70)
at Simple.<clinit> (Simple.java:283)
at sun.misc.Unsafe.ensureClassInitialized (Native Method)
at java.lang.invoke.DirectMethodHandle$EnsureInitialized.computeValue (DirectMethodHandle.java:330)
at java.lang.invoke.DirectMethodHandle$EnsureInitialized.computeValue (DirectMethodHandle.java:327)
at java.lang.ClassValue.getFromHashMap (ClassValue.java:229)
at java.lang.ClassValue.getFromBackup (ClassValue.java:211)
at java.lang.ClassValue.get (ClassValue.java:117)
at java.lang.invoke.DirectMethodHandle.checkInitialized (DirectMethodHandle.java:351)
at java.lang.invoke.DirectMethodHandle.ensureInitialized (DirectMethodHandle.java:341)
at java.lang.invoke.DirectMethodHandle.internalMemberNameEnsureInit (DirectMethodHandle.java:291)
at org.codehaus.mojo.exec.ExecJavaMojo$1.run (ExecJavaMojo.java:254)
at java.lang.Thread.run (Thread.java:748)
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 28.515 s
[INFO] Finished at: 2021-10-14T04:30:58Z
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:3.0.0:java (default-cli) on project simple: An exception occured while executing the Java class. no jnitritonserver in java.library.path: Could not find jnitritonserver in class, module, and library paths. -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException

What's the probable reason?

jackyh reopened this Oct 14, 2021
@jackyh (Contributor Author) commented Oct 14, 2021

@saudet It doesn't work...

my steps:
0) rm -rf /root/.m2/repository

  1. cd /workspace/forSimpleSample/javacpp-presets/tritonserver/
  2. mvn install --projects . -Djavacpp.platform.extension=-gpu
  3. cd platform
  4. mvn install --projects . -Djavacpp.platform.host
  5. cd ../samples
  6. mvn clean compile exec:java -Djavacpp.platform=linux-x86_64

failed with the above error.

@saudet (Member) commented Oct 14, 2021

  1. mvn install --projects . -Djavacpp.platform.extension=-gpu

The presets have no "-gpu" extension, so this won't work; just do mvn install
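In other words, step 2 of the steps above presumably becomes just:

mvn install --projects .

with everything else unchanged.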

@jackyh (Contributor Author) commented Oct 14, 2021

  1. mvn install --projects . -Djavacpp.platform.extension=-gpu

The presets have no "-gpu" extension, so this won't work; just do mvn install

OK, it works:

Downloaded from central: https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-utils/3.0.20/plexus-utils-3.0.20.jar (243 kB at 124 kB/s)
-r must be used to specify model repository path
Usage: java Simple [options]
-m <"system"|"pinned"|gpu> Enforce the memory type for input and output tensors. If not specified, inputs will be in system memory and outputs will be based on the model's preferred type.
-v Enable verbose logging
-r [model repository absolute path]

I will try to add a model_repo for it. :)

@jackyh (Contributor Author) commented Oct 14, 2021

root@872b34da9e75:/workspace/forSimpleSample/javacpp-presets/tritonserver/samples# mvn exec:java -Djavacpp.platform=linux-x86_64 -Dexec.args="-r /workspace/tritonserver_21.07_source/server-2.12.0/docs/examples/model_repository/models"
[INFO] Scanning for projects...
[INFO]
[INFO] ------------------< org.bytedeco.tritonserver:simple >------------------
[INFO] Building simple 1.5.6
[INFO] --------------------------------[ jar ]---------------------------------
[INFO]
[INFO] --- exec-maven-plugin:3.0.0:java (default-cli) @ simple ---
I1014 08:26:20.969143 5194 metrics.cc:290] Collecting metrics for GPU 0: Quadro RTX 5000
...
1 - 1 = 0
1 + 1 = 2
1 - 1 = 0
Releasing buffer org.bytedeco.javacpp.Pointer[address=0x7f8eec007ca0,position=0,limit=0,capacity=0,deallocator=null] of size 64 in CPU for result 'OUTPUT0'
Releasing buffer org.bytedeco.javacpp.Pointer[address=0x7f8eec007c10,position=0,limit=0,capacity=0,deallocator=null] of size 64 in CPU for result 'OUTPUT1'

Success! A milestone hit! :) That's it. @saudet

@jackyh (Contributor Author) commented Oct 16, 2021

@saudet I added the README and some other slight modifications, please review. One more thing: how do I generate the apidoc files?

@jackyh (Contributor Author) commented Oct 16, 2021

There are still a couple of places that need to be updated, at least these files: https://github.com/jackyh/javacpp-presets/blob/master/tritonserver/src/main/java9/module-info.java https://github.com/jackyh/javacpp-presets/blob/master/platform/pom.xml https://github.com/jackyh/javacpp-presets/blob/master/README.md

Done, please review.

@jackyh (Contributor Author) commented Oct 17, 2021

@saudet Looks like it passed CI.

@saudet (Member) commented Oct 17, 2021

Yes, it seems almost ready to be merged. 👍 Do you plan to add support for linux-arm64 and windows-x86_64 as well, or is tritonserver pretty much limited to linux-x86_64 only?

@jackyh (Contributor Author) commented Oct 17, 2021

Yes, it seems almost ready to be merged. 👍 Do you plan to add support for linux-arm64 and windows-x86_64 as well, or is tritonserver pretty much limited to linux-x86_64 only?

Since I don't have a Tegra-based board on hand, I need to check on ARM support later.
For Windows, I'm not familiar with Windows-based development.

@jackyh (Contributor Author) commented Oct 17, 2021

Yes, it seems almost ready to be merged. 👍 Do you plan to add support for linux-arm64 and windows-x86_64 as well, or is tritonserver pretty much limited to linux-x86_64 only?

Since I don't have a Tegra-based board on hand, I need to check on ARM support later. For Windows, I'm not familiar with Windows-based development.

For ARM-based platforms other than Tegra, maybe I can give it a try; let me check the Triton repo, I remember there's some ARM support there.

@saudet (Member) commented Oct 17, 2021

It looks like there are binaries available for download on the release page for both linux-arm64 and windows-x86_64: https://github.com/triton-inference-server/server/releases

So, to build against those platforms as well, we could download them automatically, for example, as part of the cppbuild.sh script. Would you like to work on that right away? Or would you like to merge what we have now and work on that later?

@jackyh (Contributor Author) commented Oct 18, 2021

It looks like there are binaries available for download on the release page for both linux-arm64 and windows-x86_64: https://github.com/triton-inference-server/server/releases

So, to build against those platforms as well, we could download them automatically, for example, as part of the cppbuild.sh script. Would you like to work on that right away? Or would like to merge what we have now and work on that later maybe?

For me, since I don't have those platforms (boards), I prefer to defer ARM and Windows support until later.

saudet merged commit 9a07755 into bytedeco:master on Oct 18, 2021