Retraining or fine-tuning a network in Caffe with images of existing categories
I'm quite new to Caffe, so this could be a nonsense question.
I have trained my network from scratch. It trains well and gets reasonable accuracy in tests. The question is about retraining or fine-tuning this network. Suppose you have new sample images of the same original categories and you want to teach the net with these new images (because, for example, the net fails to predict on these particular images).
As far as I know, it is possible to resume training from a snapshot and solverstate, or to fine-tune using only the weights of the trained model. Which is the best option in this case? Or is it better to retrain the net with the original images and the new ones together?
Think of a possible "incremental training" scheme, because not all the cases for a particular category are available in the initial training. Is it possible to retrain the net with only the new samples? Should I change the learning rate, or keep some parameters fixed, in order to maintain the original prediction accuracy when training with the new samples? The net should behave the same on the original image set after fine-tuning.
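For concreteness, these are the two mechanisms I mean (the solver and snapshot file names below are hypothetical placeholders, not my actual files):

```shell
# Resume training: restores the full solver state (iteration count,
# learning-rate schedule position, momentum history) and continues
# exactly where training stopped.
caffe train -solver solver.prototxt -snapshot net_iter_10000.solverstate

# Fine-tune: copies only the learned weights into the net; the solver
# starts fresh, so base_lr in solver.prototxt can be lowered beforehand.
caffe train -solver solver.prototxt -weights net_iter_10000.caffemodel
```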
Thanks in advance.
See also questions close to this topic
Are there better or similar services to Google Colab?
For a long time I have been doing my experiments on Google Colab and opted not to buy a GPU. But are there better or similar services to Google Colab for deep learning?
Plot validation loss in Tensorflow Object Detection API
I'm using the Tensorflow Object Detection API for detection and localization of one class of object in images. For this purpose I use the pre-trained faster_rcnn_resnet50_coco_2018_01_28 model.
I want to detect under-/overfitting after training the model. I can see the training loss, but after evaluation Tensorboard only shows mAP and Precision metrics, and no loss.
Is it possible to plot the validation loss on Tensorboard too?
Why not solve the hyperparameter λ of L2 regularization in a DNN simultaneously?
Regularization such as weight decay in a DNN is related to inequality-constrained optimization. According to Lagrange multipliers and the Karush-Kuhn-Tucker conditions, there exists a unique λ subject to the KKT conditions. But when we solve the optimization problem, we just set a prior value instead of solving for it. Why?
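The correspondence I have in mind, written out (this is the standard textbook sketch, not specific to any one paper): the constrained and penalized problems are linked through the Lagrangian,

```latex
% Constrained form: minimize the loss subject to a norm budget c
\min_{w} \; L(w) \quad \text{s.t.} \quad \|w\|_2^2 \le c
% Lagrangian, with multiplier \lambda \ge 0
\mathcal{L}(w, \lambda) = L(w) + \lambda \left( \|w\|_2^2 - c \right)
% For fixed \lambda, minimizing over w gives the usual weight-decay objective
\min_{w} \; L(w) + \lambda \|w\|_2^2
```

so every penalty value λ implicitly corresponds to some constraint level c, and since c itself is unknown a priori, λ ends up being treated as a tunable hyperparameter.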
How should LMDB record data be organized so that Caffe's data layer can read them?
I need to create a fast, efficient, low-overhead routine for storing key / value pairs in LMDB for subsequent consumption by Caffe's data layer (i.e., no linking to a bunch of external libraries).
I've reviewed the caffe.proto, caffe.pb.h and caffe.pb.cc files and a handful of others pertaining to Google's protocol buffers to gain an understanding of the Datum class, which is the 'value' in LMDB records.
The best bet for me appears to be an audit of the datum.SerializeToString() method, which takes all the data structures and nested structures comprising Datum and converts them to some sort of string value. However, after plumbing the depths of Google's protobuf, I haven't been able to find where this method is defined.
Can someone point me in the right direction? And obviously if there's a faster / better / cheaper way of understanding how the serialized Datum value should be structured, then I'd definitely be open to it. Thanks.
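For context while digging: the bytes that SerializeToString() emits follow the protobuf wire format, so they can be reproduced by hand. Here is a minimal sketch of how a Datum's fields would be laid out, assuming the standard field numbers from caffe.proto (1=channels, 2=height, 3=width, 4=data, 5=label); the pixel bytes are made up for illustration:

```python
def encode_varint(n):
    # Protobuf base-128 varint: 7 payload bits per byte,
    # high bit set on every byte except the last.
    out = bytearray()
    while True:
        b = n & 0x7F
        n >>= 7
        if n:
            out.append(b | 0x80)
        else:
            out.append(b)
            return bytes(out)

def encode_field(field_number, wire_type, payload):
    # Each field starts with a key varint: (field_number << 3) | wire_type.
    # Wire type 0 = varint, wire type 2 = length-delimited (bytes/string).
    return encode_varint((field_number << 3) | wire_type) + payload

# A tiny fake 'Datum': 3x32x32, 4 made-up data bytes, label 7.
msg = b"".join([
    encode_field(1, 0, encode_varint(3)),                         # channels
    encode_field(2, 0, encode_varint(32)),                        # height
    encode_field(3, 0, encode_varint(32)),                        # width
    encode_field(4, 2, encode_varint(4) + b"\x00\x01\x02\x03"),   # data
    encode_field(5, 0, encode_varint(7)),                         # label
])
print(msg.hex())  # → 0803102018202204000102032807
```

The generated caffe.pb.cc only contains the per-field serialization plumbing; the SerializeToString() entry point itself lives in protobuf's message base classes, which is likely why it is hard to find by grepping the Caffe tree.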
Finding boost-python3 with Anaconda cmake prefix
TL;DR: How do I point cmake at the boost-python3 library? It is not automatically detected by cmake.
I'm trying to build caffe for Python 3.6 using the provided cmake.
My system specs:
- Python 3.6.5, Anaconda custom (64-bit)
- Mac OS 10.13.6
- No CUDA
I've installed boost with brew, e.g.
brew install boost boost-python3
I can see the boost libraries using
find / -name libboost* 2>/dev/null. They occur in three directories:
- /usr/local/lib/ -> symlink to above
- boost-python3 is in
If I run
cmake -DCMAKE_PREFIX_PATH=<anaconda_env_path> -D python_version=3, I get this at the top of the output
-- Boost version: 1.67.0
-- Found the following Boost libraries:
--   system
--   thread
--   filesystem
--   chrono
--   date_time
--   atomic
But further down, I also get
CMake Warning at /Users/Mauceri/anaconda/share/cmake-3.11/Modules/FindBoost.cmake:1723 (message):
  No header defined for python-py365; skipping header check
Call Stack (most recent call first):
  cmake/Dependencies.cmake:157 (find_package)
  CMakeLists.txt:49 (include)
-- Could NOT find Boost
CMake Warning at /Users/Mauceri/anaconda/share/cmake-3.11/Modules/FindBoost.cmake:1723 (message):
  No header defined for python-py36; skipping header check
Call Stack (most recent call first):
  cmake/Dependencies.cmake:164 (find_package)
  CMakeLists.txt:49 (include)
-- Could NOT find Boost
CMake Warning at /Users/Mauceri/anaconda/share/cmake-3.11/Modules/FindBoost.cmake:1723 (message):
  No header defined for python-py3; skipping header check
Call Stack (most recent call first):
  cmake/Dependencies.cmake:164 (find_package)
  CMakeLists.txt:49 (include)
-- Could NOT find Boost
-- Could NOT find Boost
-- Python interface is disabled or not all required dependencies found. Building without it...
Similar to Cmake doesn't find Boost, I have tried adding
to the cmake command. In the resulting output, the following repeats three times with boost_python-py365, boost_python-py36, boost_python-py3, and boost_python:
[ /Users/me/anaconda/share/cmake-3.11/Modules/FindBoost.cmake:1121 ] _boost_TEST_VERSIONS = 1.67.0;1.67;1.66.0;1.66;1.65.1;1.65.0;1.65;1.64.0;1.64;1.63.0;1.63;1.62.0;1.62;1.61.0;1.61;1.60.0;1.60;1.59.0;1.59;1.58.0;1.58;1.57.0;1.57;1.56.0;1.56;1.55.0;1.55;1.54.0;1.54;1.53.0;1.53;1.52.0;1.52;1.51.0;1.51;1.50.0;1.50;1.49.0;1.49;1.48.0;1.48;1.47.0;1.47;1.46.1;1.46.0;1.46
[ /Users/me/anaconda/share/cmake-3.11/Modules/FindBoost.cmake:1123 ] Boost_USE_MULTITHREADED = TRUE
[ /Users/me/anaconda/share/cmake-3.11/Modules/FindBoost.cmake:1125 ] Boost_USE_STATIC_LIBS =
[ /Users/me/anaconda/share/cmake-3.11/Modules/FindBoost.cmake:1127 ] Boost_USE_STATIC_RUNTIME =
[ /Users/me/anaconda/share/cmake-3.11/Modules/FindBoost.cmake:1129 ] Boost_ADDITIONAL_VERSIONS =
[ /Users/me/anaconda/share/cmake-3.11/Modules/FindBoost.cmake:1131 ] Boost_NO_SYSTEM_PATHS =
[ /Users/me/anaconda/share/cmake-3.11/Modules/FindBoost.cmake:1199 ] Declared as CMake or Environmental Variables:
[ /Users/me/anaconda/share/cmake-3.11/Modules/FindBoost.cmake:1201 ] BOOST_ROOT = /usr/local/Cellar/boost/1.67.0_1/include/
[ /Users/me/anaconda/share/cmake-3.11/Modules/FindBoost.cmake:1203 ] BOOST_INCLUDEDIR =
[ /Users/me/anaconda/share/cmake-3.11/Modules/FindBoost.cmake:1205 ] BOOST_LIBRARYDIR =
[ /Users/me/anaconda/share/cmake-3.11/Modules/FindBoost.cmake:1207 ] _boost_TEST_VERSIONS = 1.67.0;1.67;1.66.0;1.66;1.65.1;1.65.0;1.65;1.64.0;1.64;1.63.0;1.63;1.62.0;1.62;1.61.0;1.61;1.60.0;1.60;1.59.0;1.59;1.58.0;1.58;1.57.0;1.57;1.56.0;1.56;1.55.0;1.55;1.54.0;1.54;1.53.0;1.53;1.52.0;1.52;1.51.0;1.51;1.50.0;1.50;1.49.0;1.49;1.48.0;1.48;1.47.0;1.47;1.46.1;1.46.0;1.46
[ /Users/me/anaconda/share/cmake-3.11/Modules/FindBoost.cmake:1306 ] location of version.hpp: /usr/local/Cellar/boost/1.67.0_1/include/boost/version.hpp
[ /Users/me/anaconda/share/cmake-3.11/Modules/FindBoost.cmake:1330 ] version.hpp reveals boost 1.67.0
[ /Users/me/anaconda/share/cmake-3.11/Modules/FindBoost.cmake:1416 ] guessed _boost_COMPILER =
[ /Users/me/anaconda/share/cmake-3.11/Modules/FindBoost.cmake:1426 ] _boost_MULTITHREADED = -mt
[ /Users/me/anaconda/share/cmake-3.11/Modules/FindBoost.cmake:1502 ] _boost_RELEASE_ABI_TAG = -
[ /Users/me/anaconda/share/cmake-3.11/Modules/FindBoost.cmake:1504 ] _boost_DEBUG_ABI_TAG = -d
[ /Users/me/anaconda/share/cmake-3.11/Modules/FindBoost.cmake:1567 ] _boost_LIBRARY_SEARCH_DIRS_RELEASE = /usr/local/Cellar/boost/1.67.0_1/lib;NO_DEFAULT_PATH;NO_CMAKE_FIND_ROOT_PATH
_boost_LIBRARY_SEARCH_DIRS_DEBUG = /usr/local/Cellar/boost/1.67.0_1/lib;NO_DEFAULT_PATH;NO_CMAKE_FIND_ROOT_PATH
[ /Users/me/anaconda/share/cmake-3.11/Modules/FindBoost.cmake:1756 ] Searching for PYTHON_LIBRARY_RELEASE: boost_python-mt-1_67;boost_python-mt;boost_python
[ /Users/me/anaconda/share/cmake-3.11/Modules/FindBoost.cmake:400 ] Boost_LIBRARY_DIR_RELEASE = /usr/local/Cellar/boost/1.67.0_1/lib
_boost_LIBRARY_SEARCH_DIRS_RELEASE = /usr/local/Cellar/boost/1.67.0_1/lib;NO_DEFAULT_PATH;NO_CMAKE_FIND_ROOT_PATH
[ /Users/me/anaconda/share/cmake-3.11/Modules/FindBoost.cmake:1809 ] Searching for PYTHON_LIBRARY_DEBUG: boost_python-mt-d-1_67;boost_python-mt-d;boost_python-mt;boost_python
[ /Users/me/anaconda/share/cmake-3.11/Modules/FindBoost.cmake:400 ] Boost_LIBRARY_DIR_DEBUG = /usr/local/Cellar/boost/1.67.0_1/lib
_boost_LIBRARY_SEARCH_DIRS_DEBUG = /usr/local/Cellar/boost/1.67.0_1/lib;NO_DEFAULT_PATH;NO_CMAKE_FIND_ROOT_PATH
[ /Users/me/anaconda/share/cmake-3.11/Modules/FindBoost.cmake:1883 ] Boost_FOUND = 1
Could NOT find Boost
Boost version: 1.67.0
Boost include path: /usr/local/Cellar/boost/1.67.0_1/include
Could not find the following Boost libraries:
  boost_python
No Boost libraries were found. You may need to set BOOST_LIBRARYDIR to the directory containing Boost libraries or BOOST_ROOT to the location of Boost.
Therefore, I think the boost-python path is what is missing. I also tried adding
-DBOOST_LIBRARYDIR, but that didn't seem to change anything.
The FindBoost documentation contains this comment about boost-python:
Note that Boost Python components require a Python version suffix (Boost 1.67 and later), e.g. python36 or python27 for the versions built against Python 3.6 and 2.7, respectively. This also applies to additional components using Python including mpi_python and numpy. Earlier Boost releases may use distribution-specific suffixes such as 2, 3 or 2.7. These may also be used as suffixes, but note that they are not portable.
I noticed that the boost-python3 libraries had the suffix 37 (libboost_python37.a), so I also tried using an anaconda environment with python 3.7, but the same errors persisted.
How can I get my cmake to find the boost-python3 libraries?
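For reference, the kind of invocation I've been trying, combining the flags mentioned above (`<anaconda_env_path>` is a placeholder as before, and the Cellar path is the one from the logs):

```shell
cmake \
  -DCMAKE_PREFIX_PATH=<anaconda_env_path> \
  -Dpython_version=3 \
  -DBOOST_ROOT=/usr/local/Cellar/boost/1.67.0_1 \
  -DBOOST_LIBRARYDIR=/usr/local/Cellar/boost/1.67.0_1/lib \
  ..
```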
Caffe: second Slice layer not working
In my proto there are 2 Slice layers. The first one works well and appears in the log. The second one seems not to work, and it does not even appear in the log!
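A minimal sketch of the structure I mean (the layer and blob names here are made up, not my actual proto):

```protobuf
layer {
  name: "slice1"          # this one shows up in the net-init log
  type: "Slice"
  bottom: "data"
  top: "a"
  top: "b"
  slice_param { axis: 1 slice_point: 1 }
}
layer {
  name: "slice2"          # this one never appears in the log
  type: "Slice"
  bottom: "b"
  top: "c"
  top: "d"
  slice_param { axis: 1 slice_point: 1 }
}
```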