Conversation
This implements a subset of the changes introduced in #80. It has the advantage that, to me, it looks ready to be merged, whereas #80 still requires some changes and its author has been silent for a while. This PR has the (small) disadvantage that it only includes the Leaky ReLU, not the regular one. Should we merge this instead of #80, or should we wait for action on #80?
So what is going on with ReLU? This seems to be a major feature FANN is lacking. I see the merge is blocked - does anyone know why the merge is still pending?
CMakeLists.txt
Outdated
  configure_file (cmake/config.h.in ${CMAKE_CURRENT_BINARY_DIR}/src/include/config.h)
- include_directories (${CMAKE_CURRENT_BINARY_DIR}/src/include/)
+ include_directories (${CMAKE_CURRENT_BINARY_DIR}/src/include/ ${CMAKE_CURRENT_BINARY_DIR}/src/include/optimized/opencl/)
OpenCL is not used so this shouldn't be here.
@@ -0,0 +1,11 @@
+ </$objtype/mkfile
What's the reason for adding this file?
Portability to Plan 9 I guess
src/CMakeLists.txt
Outdated
FIND_PACKAGE(OpenCL)
IF(OPENCL_FOUND)
  SET(CMAKE_C_FLAGS "${CMAKE_C_FLAGS} -I${OpenCL_INCLUDE_DIR}")
  SET(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -I${OpenCL_INCLUDE_DIR}")
ENDIF(OPENCL_FOUND)
Can this be deleted as it's not used anywhere?
src/CMakeLists.txt
Outdated
# optimized/opencl/fann.c
# optimized/opencl/fann_cl.cpp
# optimized/opencl/fann_cl_kernel.c
# optimized/opencl/fann_cl_train.c
# optimized/opencl/fann_cl_ann.c
# optimized/opencl/fann_cl_run.c
src/CMakeLists.txt
Outdated
ADD_SUBDIRECTORY( include )
#INCLUDE_DIRECTORIES(${CMAKE_CURRENT_SOURCE_DIR}/include ${CMAKE_CURRENT_SOURCE_DIR}/include/optimized/opencl)
src/fann.c
Outdated
#ifdef PLAN9
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/time.h>
#include <time.h>
#include <math.h>
#else
Portability to Plan 9 I guess
  }

  for(; i != num_connections; i += 4)
+ #pragma omp parallel for reduction(+:neuron_sum)
omp shouldn't be added as part of this PR as it's not used in fann.c
LINEAR_PIECE_LEAKY,
LINEAR_PIECE_RECT
@@ -0,0 +1,31 @@
+ </$objtype/mkfile
Again, I'm not sure what the reason is for adding this...?
@echoline I have added a few comments as I saw some recent commits. Is this PR still active? If so, it would also be good to add a test for this.
I have continued making unrelated changes since the initial pull request. I am attaching a patch for leaky and normal ReLUs.
@bukka Hmmm, GitHub seems to have filtered out my email attachment...
Hey gang - long-time (since the early 2000s) FANN user and developer here. I have been watching the ReLU effort for a while now - is there a better GitHub thread for asking some questions about the ReLU changes? I can help with regression testing on my DL380 SAN cluster; I have a 20-core server I run things like this on.
Original patch by @echoline (libfann#105 (comment)) plus a test case.
I'm going to close this - leaky ReLU has been added in a different PR, and the rest is not in a state where it could be merged. Feel free to create a new PR if you would like some parts of this to get integrated.
Leaky rectifying units are a new type of activation function.