While working on a Raspberry Pi image that an electrical engineer had previously used to set up all of the hardware dependencies, an error came up when upgrading to TensorFlow. TensorFlow was needed to run a model trained with the Cognitive Services: Custom Vision Service. The error occurred when the script imported NumPy:
numpy/core/_multiarray_umath.cpython-35m-arm-linux-gnueabihf.so: undefined symbol: cblas_sgemm
To remedy this, every installation of NumPy had to be uninstalled. The following commands were run:
- apt-get remove python-numpy
- apt-get remove python3-numpy
- pip3 uninstall numpy
After all three of those commands completed, NumPy was reinstalled using the package provided for Raspbian:
apt-get install python3-numpy
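A quick way to confirm the reinstall fixed the BLAS linkage is a small float32 matrix product, since NumPy routes that operation through cblas_sgemm, the exact symbol that was unresolved before:

```python
# Sanity check that the reinstalled NumPy links correctly against BLAS:
# a float32 matrix product goes through cblas_sgemm, the symbol that
# was missing before the reinstall.
import numpy as np

a = np.ones((2, 3), dtype=np.float32)
b = np.ones((3, 4), dtype=np.float32)
c = a.dot(b)  # would fail here if cblas_sgemm were still unresolved

print(c.shape)         # (2, 4)
print(float(c[0, 0]))  # 3.0
```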
For one project, there was a need for multiple models within the same Python application. These models were trained using the Cognitive Services: Custom Vision Service. There are two steps to using an exported model:
- Prepare the image
- Classify the image
Running multiple models in Python turned out to be fairly straightforward: call tf.reset_default_graph() after saving the loaded session into memory.
After the CustomVisionCategorizer is created, just call score and it will score with the labels in the map.
To see if I could, I put together a cross-communication library for .NET Core and Python applications using Boost.Interprocess, Boost.Python, and Boost.Signals2. The goal was simple: expose the same cross-communication interface to both C# and Python. The approach was to take the Boost.Interprocess condition example and adapt it to expose the interface to the different languages.
First, I needed to create the objects that make up the interface. Four files define these objects:
- shm_remove.hpp – a small lifecycle object that removes the shared buffer when it is destroyed
- TraceQueue.hpp – The shared memory object
- SharedMemoryConsumer.hpp – The subscriber to the shared memory data
- SharedMemoryProducer.hpp – The publisher for the shared memory data
These objects comprise the core interface of the shared memory provider. Next, the memory providers needed to be exposed to multiple languages. There are different ways to do this; SWIG is my usual approach to the task, but in this instance it seemed easy enough to write the bindings by hand.
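The Boost sources themselves aren't reproduced in this excerpt, but the core idea — a named shared-memory block that one side writes and the other attaches to by name — can be sketched with Python's stdlib analogue. The block name and message below are made up, and the real TraceQueue also carries an interprocess mutex and condition variable for synchronization, which this sketch omits:

```python
from multiprocessing import shared_memory

# "Producer": create a named block and write a message into it,
# much as SharedMemoryProducer would with Boost.Interprocess.
producer = shared_memory.SharedMemory(create=True, size=64,
                                      name="trace_queue_demo")
message = b"status: ok"
producer.buf[0] = len(message)               # 1-byte length prefix
producer.buf[1:1 + len(message)] = message

# "Consumer": attach to the same block purely by name, as a second
# process would, and read the message back out.
consumer = shared_memory.SharedMemory(name="trace_queue_demo")
length = consumer.buf[0]
received = bytes(consumer.buf[1:1 + length]).decode()
print(received)  # status: ok

consumer.close()
producer.close()
producer.unlink()
```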
To expose the interface to Python, I needed to create a few classes that present the interface definitions to Boost.Python. The two files are:
- PythonSharedMemoryConsumer.hpp – The Python interface for the SharedMemoryConsumer
- PythonModule.cpp – The file that exposes the module to Python
Together, these two files expose the interface to Python, and the resulting shared library can be used in a Python script simply by importing it.
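The exact binding surface depends on what PythonModule.cpp exports, which isn't shown in this excerpt. Assuming the consumer exposes a callback-registration method (Boost.Signals2 slots map naturally onto Python callables), a consuming script would look roughly like this. The class below is a pure-Python stand-in with assumed names so the sketch runs without the compiled module; a real script would instead import the built shared library:

```python
# Stand-in for the class the Boost.Python module would export; the name
# and methods are assumptions, not the actual PythonSharedMemoryConsumer API.
class SharedMemoryConsumer:
    def __init__(self):
        self._slots = []

    def register(self, callback):
        # In the real binding, this would connect a Boost.Signals2 slot.
        self._slots.append(callback)

    def _on_message(self, message):
        # In the real binding, the shared-memory reader fires this signal.
        for slot in self._slots:
            slot(message)

received = []
consumer = SharedMemoryConsumer()
consumer.register(received.append)
consumer._on_message("hello from shared memory")  # simulate an arriving message
print(received[0])  # hello from shared memory
```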
With the Python portion complete, I needed to expose the shared memory objects to C#. This is easy enough to do by hand if you expose the classes to be used by PInvoke. To accomplish this, I only needed three files:
- NetCoreSharedMemoryProducer.hpp – The .NET Core version of the publisher
- NetCoreSharedMemoryConsumer.hpp – The .NET Core version of the consumer
- NetCoreModule.cpp – The source file exposing the interfaces for PInvoke
Now we need to call that code from C# using PInvoke interop.
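A C# listing is outside the scope of a Python-side sketch, but the mechanics PInvoke relies on — resolving plain C exports in a shared library by name and declaring their signatures before calling — are the same ones Python's ctypes uses. Since the NetCoreModule library's actual exports aren't shown here, libc's strlen stands in to illustrate the binding pattern:

```python
import ctypes
import ctypes.util

# Load a shared library by name, as [DllImport] does in C#. libc stands
# in for the NetCoreModule library, whose exports aren't shown here.
libc = ctypes.CDLL(ctypes.util.find_library("c"))

# Declare the export's signature before calling it, just as a C#
# [DllImport] declaration spells out argument and return types.
libc.strlen.restype = ctypes.c_size_t
libc.strlen.argtypes = [ctypes.c_char_p]

print(libc.strlen(b"shared memory"))  # 13
```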