How does TensorFlow handle non-differentiable nodes during gradient calculation?

I understand the concept of automatic differentiation, but I couldn't find an explanation of how TensorFlow calculates the error gradient for non-differentiable functions, such as `tf.where` in my loss function or `tf.cond` in my graph. It works just fine, but I would like to understand how TensorFlow backpropagates the error through such nodes, since there is no formula to calculate a gradient for them.
1.6k views, asked by Natjo. 1 answer below.
In the case of `tf.where`, you have a function with three inputs, the condition `C`, the value on true `T` and the value on false `F`, and one output `Out`. The gradient function receives one incoming gradient and has to return three gradients. Currently, no gradient is computed for the condition (that would hardly make sense), so you only need to produce gradients for `T` and `F`. Assuming the inputs and the output are vectors, imagine `C[0]` is `True`. Then `Out[0]` comes from `T[0]`, so its gradient should propagate back, while `F[0]` was discarded, so its gradient should be made zero. If `C[1]` were `False`, then the gradient for `F[1]` should propagate, but not the one for `T[1]`. In short: for `T` you propagate the incoming gradient where `C` is `True` and zero it where `C` is `False`, and the opposite for `F`. If you look at the implementation of the gradient of `tf.where` (the `Select` operation), it does exactly that:
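The registered gradient lives in TensorFlow's `math_grad.py`; the sketch below is close to that implementation, though exact details may differ between versions:

```python
from tensorflow.python.framework import ops
from tensorflow.python.ops import array_ops

@ops.RegisterGradient("Select")
def _SelectGrad(op, grad):
    c = op.inputs[0]                 # the boolean condition C
    x = op.inputs[1]                 # the "value on true" input T
    zeros = array_ops.zeros_like(x)
    # No gradient for the condition; for T, pass the incoming gradient
    # through where C is True and zero it elsewhere; for F, the opposite.
    return (None,
            array_ops.where(c, grad, zeros),   # gradient for T
            array_ops.where(c, zeros, grad))   # gradient for F
```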
Note that the input values themselves are not used in this computation; that part is handled by the gradients of the operations producing those inputs.

For `tf.cond`, the code is a bit more complicated, because the same operation (`Merge`) is used in different contexts, and `tf.cond` also uses `Switch` operations inside. However, the idea is the same. Essentially, a `Switch` operation is used for each input, so the input that was activated (the first if the condition was `True`, the second otherwise) receives the incoming gradient, while the other input gets a "switched off" gradient (like `None`) that does not propagate back any further.
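To see this masking behavior end to end, here is a minimal TF 2.x check (my own example, not part of the original answer):

```python
import tensorflow as tf

c = tf.constant([True, False, True])
t = tf.Variable([1.0, 2.0, 3.0])     # values on true
f = tf.Variable([10.0, 20.0, 30.0])  # values on false

with tf.GradientTape() as tape:
    loss = tf.reduce_sum(tf.where(c, t, f))

grad_t, grad_f = tape.gradient(loss, [t, f])
print(grad_t.numpy())  # [1. 0. 1.] -- gradient flows only where c is True
print(grad_f.numpy())  # [0. 1. 0.] -- gradient flows only where c is False

# Same idea for tf.cond: only the branch that actually ran contributes.
x = tf.Variable(2.0)
with tf.GradientTape() as tape:
    y = tf.cond(x > 0, lambda: x * x, lambda: -x)
print(tape.gradient(y, x).numpy())  # 4.0, i.e. d(x*x)/dx at x=2
```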