Submitted by lennart-reiher-ika t3_y8ymlx in MachineLearning

I'm happy to share our latest repository for making the TensorFlow C++ API more accessible!

We now provide a pre-built library and a Docker image for easy installation and usage of the TensorFlow C++ API at https://github.com/ika-rwth-aachen/libtensorflow_cc.

In order to, e.g., run TensorFlow models from C++ source code, one usually needs to build the C++ API in the form of the libtensorflow_cc.so library from source. There is no official release of the library, and the build from source is only sparsely documented.

We try to remedy this situation by providing two main components:

  1. We provide the pre-built libtensorflow_cc.so, including accompanying headers, as a one-command-install deb package.
  2. We provide a pre-built Docker image based on the official TensorFlow Docker image. Our Docker image has both TensorFlow Python and TensorFlow C++ installed.
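
Once the deb package or Docker image is installed, running inference through the C++ API looks roughly like the sketch below. Note that the model path, tensor shape, and input/output node names are placeholders for illustration and are not taken from our example application.

// Minimal sketch: load a SavedModel and run inference with the TensorFlow C++ API.
// The model path, tensor shape, and node names below are placeholders.
#include <iostream>
#include <vector>

#include "tensorflow/cc/saved_model/loader.h"
#include "tensorflow/cc/saved_model/tag_constants.h"
#include "tensorflow/core/framework/tensor.h"
#include "tensorflow/core/public/session.h"

int main() {
  // Load the SavedModel exported e.g. via tf.saved_model.save(...).
  tensorflow::SavedModelBundle bundle;
  tensorflow::Status status = tensorflow::LoadSavedModel(
      tensorflow::SessionOptions(), tensorflow::RunOptions(),
      "/path/to/saved_model", {tensorflow::kSavedModelTagServe}, &bundle);
  if (!status.ok()) {
    std::cerr << "Failed to load model: " << status.ToString() << std::endl;
    return 1;
  }

  // Build a dummy input tensor (batch of one, four features).
  tensorflow::Tensor input(tensorflow::DT_FLOAT, tensorflow::TensorShape({1, 4}));
  input.flat<float>().setZero();

  // Run the session; the node names depend on the exported signature.
  std::vector<tensorflow::Tensor> outputs;
  status = bundle.session->Run({{"serving_default_input:0", input}},
                               {"StatefulPartitionedCall:0"}, {}, &outputs);
  if (!status.ok()) {
    std::cerr << "Inference failed: " << status.ToString() << std::endl;
    return 1;
  }
  std::cout << outputs[0].DebugString() << std::endl;
  return 0;
}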

Try it out yourself by running the example application:

git clone https://github.com/ika-rwth-aachen/libtensorflow_cc.git && \
cd libtensorflow_cc && \
docker run --rm \
    --volume $(pwd)/example:/example \
    --workdir /example \
    rwthika/tensorflow-cc:latest \
        ./build-and-run.sh

While we currently only support x86_64 machines running Ubuntu, this could easily be extended to other operating systems and platforms in the future. With a few exceptions, all TensorFlow versions from 2.0.0 through 2.9.2 are available, and 2.10.0 is coming soon.

If you want to use the TensorFlow C++ API to load, inspect, and run saved models and frozen graphs in C++, we suggest that you also check out our helper library tensorflow_cpp.

Looking forward to hearing some feedback from you, thanks!
Lennart

Comments

bmer t1_it3olbz wrote

Windows support would be great. I've previously tried to get Windows to work, but never managed to resolve the missing symbols issue.

See for example: https://github.com/tensorflow/tensorflow/issues/41904

In the end I used the C API, which works on Linux, Windows, and Mac and is available precompiled from the website.
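
For anyone curious, a minimal sanity check against the precompiled C library looks roughly like this (built as C++ and linked with -ltensorflow; the file name is just an example):

// Minimal sketch: use the precompiled TensorFlow C API (libtensorflow) instead of
// libtensorflow_cc. Build with e.g.: g++ tf_version.cpp -ltensorflow
#include <cstdio>

#include <tensorflow/c/c_api.h>

int main() {
  // TF_Version() is part of the stable C API shipped with the official binary releases.
  std::printf("TensorFlow C library version: %s\n", TF_Version());
  return 0;
}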

soulslicer0 t1_it570nh wrote

The PyTorch C++ API is so much better

Syncopat3d t1_it60d8h wrote

What are some typical use cases for using TF from C++, other than for inference in a C++ app? For inference, at least in TF1, there was tfcompile, which gives you an artifact (a standalone dynamic library that does inference) that is much more lightweight and easier to integrate into a C++ program than TF.

In TF2, IDK whether tfcompile still exists and works or whether there is something similar.

lennart-reiher-ika OP t1_it6dgi9 wrote

I think that inference would still be the main use case, but sure, you could also use it for graph inspection, training, whatever, if you really wanted to.

Looks like tfcompile still exists, but I have never used it myself. Doesn't look to be much better documented than the C++ API itself. The full C++ API of course gives you way more flexibility and doesn't involve this special process of compiling a specific model. We have been pretty happy with our additional wrapper library tensorflow_cpp, allowing us to easily load arbitrary frozen graphs and saved models for inference.
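
For comparison, loading and running a frozen graph with the raw C++ API looks roughly like the sketch below (the graph path and node names are just placeholders); a wrapper mainly saves you from this kind of boilerplate.

// Minimal sketch: run a frozen graph (.pb) with the raw TensorFlow C++ API.
// The graph path and node names are placeholders for illustration.
#include <iostream>
#include <memory>
#include <vector>

#include "tensorflow/core/framework/graph.pb.h"
#include "tensorflow/core/framework/tensor.h"
#include "tensorflow/core/platform/env.h"
#include "tensorflow/core/public/session.h"

int main() {
  // Read the serialized GraphDef from disk.
  tensorflow::GraphDef graph_def;
  tensorflow::Status status = tensorflow::ReadBinaryProto(
      tensorflow::Env::Default(), "/path/to/frozen_graph.pb", &graph_def);
  if (!status.ok()) {
    std::cerr << status.ToString() << std::endl;
    return 1;
  }

  // Create a session and register the graph with it.
  std::unique_ptr<tensorflow::Session> session(
      tensorflow::NewSession(tensorflow::SessionOptions()));
  status = session->Create(graph_def);
  if (!status.ok()) {
    std::cerr << status.ToString() << std::endl;
    return 1;
  }

  // Feed a dummy input and fetch one output node.
  tensorflow::Tensor input(tensorflow::DT_FLOAT, tensorflow::TensorShape({1, 4}));
  input.flat<float>().setZero();
  std::vector<tensorflow::Tensor> outputs;
  status = session->Run({{"input:0", input}}, {"output:0"}, {}, &outputs);
  if (!status.ok()) {
    std::cerr << status.ToString() << std::endl;
    return 1;
  }
  std::cout << outputs[0].DebugString() << std::endl;
  return 0;
}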

lennart-reiher-ika OP t1_it6domt wrote

You're probably right that there would indeed be demand for Windows support. Still, Linux probably covers the majority of potential users, so we would rather prioritize things like ARM support first.

Feel free to contribute though, if you would like to tackle Windows once again! You do seem to have the required Windows experience.

dreugeworst t1_it742kh wrote

I've previously had to recompile the cc library because it wasn't exporting all the symbols needed to use certain parts of the API. I couldn't find it at a glance (sorry if this was listed somewhere obvious), but does this repo expand the exports from the shared object / DLL?

dreugeworst t1_it7lvwh wrote

We were having issues using LoadSavedModel in combination with a SavedModelBundle. Since the latter includes a protobuf::map in its struct definition and the protobuf symbols weren't exported, this caused us problems.

I just tried a minimal example and it worked. It has been 2 years since I came across this issue, so probably something has changed. Although I don't see the relevant symbols added to the version script, the .so file does export the symbols in question; perhaps I was compiling with -fvisibility=hidden back then, or maybe this has been removed by the TF project.
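
For context, here is a rough sketch of the kind of access that pulls in the protobuf symbols (the model path is a placeholder):

// Minimal sketch: the SignatureDef map on SavedModelBundle is a protobuf::Map,
// so iterating it requires the protobuf symbols to be exported from the library.
// The model path is a placeholder.
#include <iostream>

#include "tensorflow/cc/saved_model/loader.h"
#include "tensorflow/cc/saved_model/tag_constants.h"

int main() {
  tensorflow::SavedModelBundle bundle;
  tensorflow::Status status = tensorflow::LoadSavedModel(
      tensorflow::SessionOptions(), tensorflow::RunOptions(),
      "/path/to/saved_model", {tensorflow::kSavedModelTagServe}, &bundle);
  if (!status.ok()) {
    std::cerr << status.ToString() << std::endl;
    return 1;
  }

  // meta_graph_def.signature_def() returns a protobuf::Map<string, SignatureDef>;
  // this is the kind of call that fails to link if those symbols are hidden.
  for (const auto& entry : bundle.meta_graph_def.signature_def()) {
    std::cout << "signature: " << entry.first << std::endl;
  }
  return 0;
}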

Does your CMake support come from the tensorflow project? Bit disappointing to see there is still no tensorflow target. BTW, I think your example meant to use target_include_directories(foo PRIVATE ${TensorFlow_INCLUDE_DIRS})

bmer t1_it96oiz wrote

It's been a while since I've looked at it, so I'm not sure how hard it would be to get working. I only commented since you mentioned that you would support other operating systems. For others interested in cross-platform support, there is also cppflow.
