Added changes to GPI2 class in ops.py #2

Open · wants to merge 3 commits into base: main

Conversation

@mushahidkhan835 (Collaborator) commented Jun 26, 2023

Added changes to the GPI2 class in ops.py so that gradients can be calculated using different interfaces. This commit also includes unit tests for the changes.

@glassnotes (Contributor) left a comment


Thanks @mushahidkhan835! Comments and suggestions within.

I experienced some failures when running the test file, within the test_compute_gradient function. They were different for each interface; I think they were related to the casting before the comparison, which should be solved by switching to qml.math.allclose and/or writing interface-specific test functions.
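A minimal sketch of the kind of interface-agnostic comparison I mean (the gradient value here is a stand-in, not a reference result):

import numpy as np
import pennylane as qml

# Stand-in for whatever tensor/array type the interface under test returns
grad_value = np.array(2.9802322387695312e-08)

# qml.math.allclose dispatches on the tensor type, so no explicit casting is needed
assert qml.math.allclose(grad_value, 0.0, atol=1e-6)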

BTW, I link to it in a comment below, but the qutrit gradient tests are a good reference for structuring these.

Before re-uploading, could you please also format your code using black, for consistency?

black -l 100 file.py

exponent = (0 - 1j) * phi
a = (1 + 0j) / qml.math.sqrt(2)
b = ((0 - 1j) * qml.math.exp(exponent)) / qml.math.sqrt(2)
c = ((0 - 1j) * qml.math.exp(qml.math.conj(exponent))) / qml.math.sqrt(2)
@glassnotes (Contributor):

Suggested change:
- c = ((0 - 1j) * qml.math.exp(qml.math.conj(exponent))) / qml.math.sqrt(2)
+ c = qml.math.conj(b)

Is this correct? If so, you can just use conj directly in the return statement rather than creating a separate variable for it.
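One quick way to answer the question is to check the proposed identity numerically for a real phi; this is just a sanity check, not part of the PR:

import numpy as np

phi = 0.3
exponent = (0 - 1j) * phi
b = ((0 - 1j) * np.exp(exponent)) / np.sqrt(2)
c = ((0 - 1j) * np.exp(np.conj(exponent))) / np.sqrt(2)

# Prints whether c equals conj(b) for this value of phi
print(np.isclose(c, np.conj(b)))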

@@ -89,7 +92,9 @@ class GPI2(Operation):
num_wires = 1
num_params = 1
ndim_params = (0,)

grad_method = "A"
parameter_frequencies = [(1,)]
@glassnotes (Contributor):

Please check the documentation of parameter_frequencies - it has to do with the eigenvalues of the generator, and is not a fixed value for all single-qubit operations. It's important because I believe the parameter-shift machinery uses the frequencies to determine what the shift values and gradient recipe should be.
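For reference, a small sketch of how frequencies follow from a generator's eigenvalues (the eigenvalues below are illustrative, not a claim about the GPI2 generator):

from pennylane.gradients import eigvals_to_frequencies

# The unique positive differences of the eigenvalues give the parameter frequencies
print(eigvals_to_frequencies((-0.5, 0.5)))       # (1.0,)
print(eigvals_to_frequencies((-1.0, 0.0, 1.0)))  # (1.0, 2.0)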

@mushahidkhan835 (Collaborator, author) commented Jul 6, 2023:

I was told that at least one of grad_recipe, parameter_frequencies, or generator should be clearly defined to avoid unexpected behavior during differentiation; see also https://docs.pennylane.ai/en/stable/code/api/pennylane.operation.Operation.html#pennylane.operation.Operation.parameter_frequencies


grad_method = "A"
parameter_frequencies = [(1,)]

@glassnotes (Contributor):

Please define the generator method for the operation.
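For context, a generic sketch of what defining a generator looks like for an Operation of the form U(phi) = exp(i * phi * G); whether GPI2 fits this form, and what its G would be, is not asserted here:

import pennylane as qml

class MyRotation(qml.operation.Operation):
    # Hypothetical single-parameter operation, for illustration only
    num_wires = 1
    num_params = 1

    def generator(self):
        # Hypothetical generator, returned as an operator whose eigenvalues
        # PennyLane can use to work out frequencies and shift rules
        return qml.Hamiltonian([-0.5], [qml.PauliZ(wires=self.wires)])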

@mushahidkhan835 (Collaborator, author):

In the docs and on the forum, I was told that I can define either generator, parameter_frequencies, or grad_recipe.


@pytest.mark.parametrize("phi, expected", [(0, 1/np.sqrt(2) * (np.array([[1, -1j],[-1j, 1]]))), (0.3, 1/np.sqrt(2)*np.array([[1, -1j * np.exp(-1j * 0.3)],[-1j * np.exp(1j * 0.3), 1]]))])
def test_compute_matrix_returns_correct_matrix(self, phi, expected):
    assert qml.math.allequal(GPI2.compute_matrix(phi), expected)
@glassnotes (Contributor):

Suggested change:
- assert qml.math.allequal(GPI2.compute_matrix(phi), expected)
+ assert qml.math.allequal(GPI2.compute_matrix(phi), expected)
+ assert qml.math.allequal(qml.matrix(GPI2)(phi), expected)

Just for good measure!

class TestGPI2:
"""Tests for the Operations ``GPI2``."""

interfaces = [
@glassnotes (Contributor):

I know it's a bit more tedious, but I recommend actually separating the interface tests into individual ones (one per interface) and creating the arrays within, rather than importing the interfaces up top and using this list and the interface_array method. The main reason is that a user does not, in principle, need to have all the interfaces installed; if they don't, running the test file will fail at import.

For an example, take a look at the gradient tests in the parametrized qutrit operations: https://github.com/PennyLaneAI/pennylane/blob/master/tests/ops/qutrit/test_qutrit_parametric_ops.py#L419
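To make this concrete, here is a rough sketch of one per-interface test (autograd), loosely following the qutrit tests linked above. GPI2 is assumed to be importable from the package under test, and the expected value is taken from a finite-difference estimate of the same circuit rather than a hard-coded number:

import pytest
import pennylane as qml
from pennylane import numpy as pnp


class TestGPI2GradientAutograd:
    @pytest.mark.parametrize("phi", [0.0, 0.3, -0.7])
    def test_gradient_autograd(self, phi):
        dev = qml.device("default.qubit", wires=1)

        @qml.qnode(dev, interface="autograd")
        def circuit(x):
            GPI2(x, wires=0)  # assumes GPI2 is imported from the package under test
            return qml.expval(qml.PauliX(0))

        grad = qml.grad(circuit)(pnp.array(phi, requires_grad=True))

        # Central finite difference of the same circuit as a reference value
        h = 1e-6
        expected = (circuit(phi + h) - circuit(phi - h)) / (2 * h)
        assert qml.math.allclose(grad, expected, atol=1e-4)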


@pytest.mark.parametrize("interface", interfaces)
def test_compute_gradient(self, interface):
    phi = 0.3
@glassnotes (Contributor):

In addition to the individual interface tests mentioned above, evaluate the gradient at a couple of different parameter values (0, positive/negative values, and any special cases like pi/2 if relevant).

Also try different diff_methods, like finite-diff and parameter-shift, to make sure the results align (though for the latter you will need to look into the frequencies attribute).

BTW you can stack multiple pytest.mark.parametrizes to get all the different combinations 🙂
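As an illustration, the stacking could look roughly like this (names and values are placeholders):

import numpy as np
import pytest

@pytest.mark.parametrize("phi", [0.0, 0.3, -0.5, np.pi / 2])
@pytest.mark.parametrize("diff_method", ["finite-diff", "parameter-shift"])
def test_gradient(phi, diff_method):
    # pytest runs this once for every (phi, diff_method) combination
    ...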

phi_grad = None
if interface == "jax":
    grad_fcn = jax.grad(qnode)
    assert jax.numpy.isclose(grad_fcn(phi), 4.3661002990646334e-17)
@glassnotes (Contributor):

Presumably all these small numbers should just be 0? In addition to trying different parameter values, also try adjusting the circuit/observable to ensure that some non-zero gradient instances are tested (otherwise, it might just be outputting 0 all the time!)
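For instance, one way to get a non-zero gradient (sketch only, assuming the matrix convention used in this PR) is to measure PauliX, which on |0> should give an expectation close to sin(phi):

import jax
import pennylane as qml

dev = qml.device("default.qubit", wires=1)

@qml.qnode(dev, interface="jax")
def qnode(phi):
    GPI2(phi, wires=0)  # assumes GPI2 is imported from the package under test
    return qml.expval(qml.PauliX(0))

grad_fcn = jax.grad(qnode)
# at phi = 0.3 this should be close to cos(0.3), i.e. clearly non-zero
print(grad_fcn(0.3))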

phi.requires_grad = True
result = qnode(phi)
result.backward()
assert np.isclose(float(phi.grad), 2.9802322387695312e-08)
@glassnotes (Contributor):

I suspect using qml.math.isclose would remove the need for casting?
