Channel: Active questions tagged real-analysis - Mathematics Stack Exchange

Implicit / inverse multivariate differentiation


I'm working on a fairly tricky optimization problem whose constraint functions $g_i$ are not well behaved, which has made it challenging for optimizers to handle directly. Instead of solving the problem as posed, I've transformed it into a much better-behaved one: rather than solving for the original variables directly, we solve for related variables that force all the constraints to have a certain derivative, and then recover the original variables from there.

We end up solving a problem in which we know that, at the current point $(x,y)$,

$u g_x = v g_y$

where $(u, v)$ are the decision variables.

I want to understand how the gradient and Hessian look with respect to $u$ and $v$ rather than $x$ and $y$.
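For context, the first-derivative relation I've been using is the standard inverse-Jacobian formula (a sketch, assuming the map $(x,y) \mapsto (u,v)$ is locally invertible at the current point):

$$
\begin{pmatrix} x_u & x_v \\ y_u & y_v \end{pmatrix}
= \begin{pmatrix} u_x & u_y \\ v_x & v_y \end{pmatrix}^{-1}
= \frac{1}{u_x v_y - u_y v_x}
\begin{pmatrix} v_y & -u_y \\ -v_x & u_x \end{pmatrix}.
$$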

This hasn't been difficult for the first derivatives and the diagonal entries of the Hessian; however, I've been having trouble finding the rule that gives the off-diagonal Hessian entries, i.e. the cross second derivatives:

$$\frac{\partial^2 x}{\partial u\,\partial v}, \qquad \frac{\partial^2 y}{\partial u\,\partial v}.$$

What rule lets me express these inverse second cross derivatives in terms of derivatives of the original function $g$?

The $g_i$ functions take up to 4 variables, although I've posted the 2-variable example for simplicity.





