
Moving away from targeting black box design #1036

Closed · mrossinek opened this issue Jan 19, 2023 · 1 comment · Fixed by #1084

@mrossinek (Member) commented Jan 19, 2023

What should we add?

TL;DR

I propose that we:

  • phase out the (Minimum)EigensolverFactory classes (with their replacement being better documentation/education of users)
  • no longer enforce the re-usability of algorithm instances (and their subcomponents)

Qiskit Nature provides the glue between quantum computational concepts and classical computational science. As such, there is quite a large variety in the background and knowledge of our users. This does not always make code design decisions easy, and in the past we have added a lot of tools which wrap other components in order to achieve a workflow that takes a problem, places it into an algorithm, and obtains a result. While this overall workflow is good and is also reflected in other Qiskit application modules, we have added a few components that are coupled too tightly, making more complex workflows not always easy to realize. Some examples are discussed below.

This issue gathers some general thoughts on this topic.

  1. We should avoid tight wrapping of components (e.g. QubitConverter wrapping QubitMapper instances) and requiring this wrapping throughout the stack.

This is a general observation which @woodsp-ibm has also brought up in the past (mostly offline), but I would like to reiterate it here and stress it as an important point. Resolving #967 will largely deal with this, but I am explicitly writing it down here for documentation purposes.
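
As a minimal sketch of the difference (assuming the decoupling targeted by #967; the exact signatures may still change, and solver stands in for any MinimumEigensolver instance):

# current, tightly wrapped style: the mapper must be wrapped before use
mapper = JordanWignerMapper()
converter = QubitConverter(mapper)
algo = GroundStateEigensolver(converter, solver)

# proposed, decoupled style: the mapper is accepted directly
algo = GroundStateEigensolver(mapper, solver)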

  2. We should move away from providing black box components.

Quantum computing is simply not yet in a state where "it just works", especially when looking at chemistry applications. More concretely, what I am referring to here is the VQEUCCFactory (and its siblings). This design does not scale to the various options of algorithm and ansatz variants (see e.g. #854). Furthermore, the default provided by that class is only useful for classical simulation purposes, where UCCSD is feasible for smaller systems; for now, a real quantum device will not be able to handle a full UCCSD approach.
Thus, the class itself does not provide a real use case that is aligned with Qiskit's focus on real-device applications. Furthermore, the usefulness of the class decreased significantly when we switched to primitives: compared to the old workflow, in which a single import was sufficient, one now needs to import the ansatz (e.g. UCCSD), the optimizer (e.g. SLSQP), and the primitive (e.g. Estimator) to plug them into the factory, at which point most of the work is already done.
And finally, since problems are now fully defined before being passed to .solve(...), the argument of building out the ansatz at runtime is also no longer relevant.
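
To make the primitives argument concrete, here is a minimal sketch (argument details elided as in the rest of this issue; the constructor signatures follow the current 0.5-style interfaces and may differ): the factory call already requires the user to import and construct every component, so the explicit setup is barely any extra work.

# factory-based setup after the primitives migration: the ansatz, optimizer,
# and primitive are all imported and constructed by the user anyway
solver = VQEUCCFactory(Estimator(), UCCSD(), SLSQP())

# explicit setup: essentially the same amount of code, with every knob visible
solver = VQE(Estimator(), UCCSD(...), SLSQP())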

However, we also cannot simply remove the factory classes. Instead, we need to improve our documentation and other resources which teach users about the necessary steps of setting up an ansatz and algorithm correctly. Properly explaining this will set users up to more easily understand how they can tweak their setup to use a different algorithm, ansatz, etc.

  1. "Algorithm reusability"

With all of the above in mind, below is a possible workflow:

# omitting imports for brevity...

# build the electronic structure problem from a classical driver
problem = PySCFDriver(...).run()

# the mapper is configured with problem-specific information (cf. #1018)
mapper = ParityMapper(num_particles=problem.num_particles)

# the ansatz and its initial state are constructed explicitly
ansatz = UCCSD(num_particles=problem.num_particles, ...)
ansatz.initial_state = HartreeFock(...)

estimator = Estimator(...)
optimizer = SPSA(...)

# the VQE is assembled from its explicit components
vqe = VQE(estimator, ansatz, optimizer)
vqe.initial_point = HFInitialPoint(...).to_numpy_array()

algo = GroundStateEigensolver(mapper, vqe)

result = algo.solve(problem)

The part that has changed the most (e.g. compared to the current README example) is the avoidance of the VQEUCCFactory. Furthermore, this uses the proposed ParityMapper of #1018.

An argument could be made that this workflow disallows the reusability of the algorithm, because both the ParityMapper and the UCCSD ansatz (and, by extension, the VQE) are coupled to the problem size. I would argue that this is not a deal breaker. Let me explain why:

  • the alternative (enforcing reusability no matter the cost) puts a lot of strain on the code design, which impacts maintenance and results in MANY edge cases that are very likely sources of bugs
  • thinking of an actual usage scenario, an algorithm will very likely only be reused where the number of particles (and the other problem properties that matter here) does not change, for example across a dissociation curve. There, the workflow is unaffected (see the sketch after this list).
  • and finally, looking at real-world usage on a real quantum device, it is even likely that within a dissociation curve the user will need to tweak algorithm and/or ansatz settings at each step of the way (e.g. when using AdaptVQE for each data point, unless they want to recompute AdaptVQE every time, they might want to load a stored ansatz for each data point, at which point the ansatz needs to be updated, too)
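
As a rough sketch of the dissociation-curve scenario from the second bullet (imports omitted as in the example above; the molecule, distances, and argument lists are illustrative assumptions, and the HartreeFock initial state and initial point setup are left out for brevity), rebuilding the problem-dependent pieces inside the loop costs only a few lines:

energies = []
for distance in [0.5, 0.75, 1.0, 1.25, 1.5]:
    # every problem-dependent component is simply rebuilt per data point
    problem = PySCFDriver(atom=f"H 0 0 0; H 0 0 {distance}", basis="sto3g").run()

    mapper = ParityMapper(num_particles=problem.num_particles)
    ansatz = UCCSD(problem.num_spatial_orbitals, problem.num_particles, mapper)

    vqe = VQE(Estimator(), ansatz, SPSA())
    result = GroundStateEigensolver(mapper, vqe).solve(problem)
    energies.append(result.total_energies[0])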

And as a final argument: once real quantum computers have matured to a point where black box usage becomes possible, the automation of workflows like the one shown above would not happen at the level of the GroundStateEigensolver (which is where the current VQEUCCFactory approaches things). Instead, one would probably provide black box workflows for different kinds of problems (e.g. dissociation curves, MD, etc.). This would allow the re-use of algorithms and still enable automation of, for example, warm-starting the VQE across multiple data points.


EDIT: a better way of phrasing than "black-box" could have been to refer to "pre-composed objects". I will not update everything above but maybe this helps clarify what I had intended a bit better.

@MarcoBarroca (Contributor) commented:
Just to add to this: I remember that when mentoring for the last IBM Quantum Challenge, some users were confused as to how they could run the VQE with a different ansatz. This was mostly because the notebooks in the challenge suggested creating a VQEUCCFactory() and passing that to the GroundStateEigensolver(). It wasn't clear to them that the solver can also take any MinimumEigensolver object directly, so some were simply trying (unsuccessfully) to pass a different ansatz to the factory. I think this relates to the point that creating factories for every variation of VQE does not scale.
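
For reference, a minimal sketch of that point (imports and arguments elided; EfficientSU2 is only one example of "a different ansatz", and mapper refers to a qubit mapper set up as in the issue above): the GroundStateEigensolver accepts any MinimumEigensolver directly, so no factory is needed to swap the ansatz.

# swap the ansatz by constructing the VQE yourself...
vqe = VQE(Estimator(...), EfficientSU2(...), SLSQP(...))
algo = GroundStateEigensolver(mapper, vqe)

# ...or plug in a completely different MinimumEigensolver
algo = GroundStateEigensolver(mapper, NumPyMinimumEigensolver())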

Phasing out the factories seems like a good idea; personally, I've completely stopped using them since the switch to primitives.
