Integrating Tensorflow and Qiskit for Quantum Machine Learning

Taking a step towards quantum machine learning: in this article, we integrate Qiskit into custom Keras layers.


Two popular integrations of quantum computing packages into standard deep learning libraries already exist:

  1. TensorFlow and Cirq, as TensorFlow Quantum
  2. PyTorch and Qiskit

In this article, we will instead look at integrating Qiskit into custom Keras layers.


One interesting application of quantum machine learning is assisting classical neural networks with quantum layers that perform computations not realisable classically. Recent academic work has stressed applications of _quantum-assisted deep learning_, which can offer complex activations, richer representations, and other salient features not achievable in classical networks.

On the implementation side, this means finding a way to integrate quantum processing into ordinary deep neural networks. Several ways exist to achieve this; here, we discuss wrapping Qiskit code in subclasses of Keras layers. Let’s get started.

Defining the quantum layer

This depends on the specific application. The key point is to keep this layer’s inputs and outputs consistent. With eager execution the default in TensorFlow 2.x, it is natural to fall back on NumPy arrays as the default inputs to and outputs from all quantum layers.
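As a quick illustration of why this contract is convenient (plain TensorFlow, no Qiskit involved): under eager execution, a tensor converts to a NumPy array and back without any session machinery.

```python
import numpy as np
import tensorflow as tf  # TensorFlow 2.x: eager execution is on by default

# Round trip between tf.Tensor and np.ndarray -- this is what lets a
# NumPy-in/NumPy-out quantum layer slot into an otherwise classical model.
t = tf.constant([0.1, 0.2, 0.3, 0.4])
arr = t.numpy()                    # tf.Tensor -> np.ndarray
back = tf.convert_to_tensor(arr)   # np.ndarray -> tf.Tensor
print(type(arr).__name__)  # ndarray
```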

    import numpy as np

    from qiskit import QuantumCircuit, QuantumRegister
    from qiskit.aqua.operators import X, Y, Z
    from qiskit.aqua.operators import StateFn

    QUBITS = 4

    ## four-qubit Pauli observables (^ is the tensor product)
    operatorZ = Z ^ Z ^ Z ^ Z
    operatorX = X ^ X ^ X ^ X
    operatorY = Y ^ Y ^ Y ^ Y

    def quantum_layer(initial_parameters):
        ## expecting parameters to be a numpy array of length QUBITS
        quantumRegister = QuantumRegister(QUBITS)
        quantumCircuit = QuantumCircuit(quantumRegister)

        for i in range(len(initial_parameters)):
            quantumCircuit.ry(initial_parameters[i] * np.pi, i)

        psi = StateFn(quantumCircuit)

        ## two ways of computing <psi|O|psi>; both give the same value
        expectationX = np.real((~psi @ operatorX @ psi).eval())
        expectationY = np.real((~psi @ operatorY @ psi).eval())
        expectationZ = np.real(psi.adjoint().compose(operatorZ).compose(psi).eval())

        expectations = [expectationX, expectationY, expectationZ,
                        expectationX + expectationY + expectationZ]

        return np.array(expectations)

Sample quantum layer

This is an arbitrary quantum layer that takes four inputs and outputs a NumPy array of length 4. We calculate the expectation values of the standard Pauli operators, collect them (together with their sum) in a list, and return it as an array. This layer would change according to the specifics of the underlying application.
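A quantum layer with this NumPy contract can then be wrapped as a Keras layer. The sketch below is a minimal illustration of that pattern, not code from the original article: `quantum_layer_stub` is a hypothetical NumPy stand-in for the Qiskit `quantum_layer` above (same contract: a length-4 array in, a length-4 array out), so the example runs without a quantum backend, and `QuantumLayer` bridges NumPy and TensorFlow with `tf.py_function`.

```python
import numpy as np
import tensorflow as tf

QUBITS = 4

def quantum_layer_stub(params):
    # Hypothetical stand-in for the Qiskit quantum_layer above, so this
    # sketch runs without a quantum backend. Same contract: a length-4
    # numpy array in, a length-4 array out (three "expectations" plus
    # their sum).
    e = np.cos(np.asarray(params, dtype=np.float64) * np.pi)[:3]
    return np.append(e, e.sum())

class QuantumLayer(tf.keras.layers.Layer):
    """Wraps a NumPy-in/NumPy-out quantum layer as a Keras layer."""

    def call(self, inputs):
        def _batch(v):
            # v arrives as an eager tensor; apply the quantum layer row by row
            return np.stack([quantum_layer_stub(row) for row in v.numpy()]
                            ).astype(np.float32)

        out = tf.py_function(_batch, inp=[inputs], Tout=tf.float32)
        out.set_shape([None, 4])  # py_function loses static shape info
        return out

layer = QuantumLayer()
batch = tf.random.uniform((8, QUBITS))
print(layer(batch).shape)  # (8, 4)
```

Note that `tf.py_function` runs the NumPy code outside TensorFlow’s graph, so autodiff cannot see through it; training such a layer means supplying gradients yourself (e.g. via the parameter-shift rule), which is beyond this sketch.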
