Is it possible to use Autograd to compute the derivative of a neural network output with respect to one of its inputs?

I have a neural network model that outputs a vector Y of size approximately 4000 from about 9 inputs X. I need to compute the partial derivatives of the output Y with respect to one or two of the inputs, X_1 or X_2.

I currently obtain these derivatives by training two additional neural networks, one for each of X_1 and X_2. They perform reasonably well, but the resulting derivatives are not as accurate as the network that computes Y itself.

I am hoping there is a way to compute the derivatives of the output vector Y with respect to one of the inputs in X directly from the finalised/optimised neural network, so that I do not need to train two additional neural networks just for the derivatives.

Is there a way of doing this with autograd?
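For what it's worth, here is a rough sketch of how this could look with PyTorch's autograd, assuming a PyTorch model. The `model` below is a hypothetical stand-in for the trained network (9 inputs, ~4000 outputs); `torch.autograd.functional.jacobian` returns the full matrix of partial derivatives of Y with respect to X, from which a single column gives dY/dX_1:

```python
import torch

# Hypothetical stand-in for the finalised/optimised network:
# 9 inputs -> ~4000 outputs. Replace with your own trained model.
model = torch.nn.Sequential(
    torch.nn.Linear(9, 64),
    torch.nn.Tanh(),
    torch.nn.Linear(64, 4000),
)

x = torch.randn(9, requires_grad=True)  # the 9 inputs X

# Full Jacobian dY/dX, shape (4000, 9): entry [i, j] is dY_i/dX_j.
jac = torch.autograd.functional.jacobian(model, x)

# Column 0: derivative of every component of Y w.r.t. the first input (X_1).
dy_dx1 = jac[:, 0]

print(jac.shape)     # (4000, 9)
print(dy_dx1.shape)  # (4000,)
```

Note that computing the full Jacobian of a 4000-dimensional output is relatively expensive (one backward pass per output row by default); if only one or two input columns are needed, forward-mode differentiation or passing `vectorize=True` to `jacobian` may be cheaper, but that depends on the model.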
