How can I set rep_delay in IBM Qiskit Runtime?


I want to speed up the sampling done by qiskit_ibm_runtime.Estimator by reducing the time qubits are allowed to decay back to their ground state between shots. I realise this may introduce errors in the initial state, but it would help me estimate the achievable sampling speed if error rates were not my priority, or if they improved with QPU developments. When not using Qiskit Runtime, rep_delay can be set when running a backend directly:

from qiskit.circuit.library import RealAmplitudes

circuit = RealAmplitudes(num_qubits=2, reps=2)
# backend is an IBM backend obtained earlier; rep_delay is given in seconds
backend.run(circuit, rep_delay=0.00001)
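
Whether a given value is honoured depends on the backend: IBM backends report in their configuration whether the repetition rate is adjustable and what range is allowed. Something like the following shows it (a sketch, assuming backend is the same backend object as above):

cfg = backend.configuration()
# Only backends reporting dynamic_reprate_enabled accept a per-job rep_delay
print(getattr(cfg, "dynamic_reprate_enabled", False))
print(getattr(cfg, "rep_delay_range", None))    # allowed [min, max], in seconds
print(getattr(cfg, "default_rep_delay", None))  # value used when none is given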

Can this delay be set when using the primitive Estimator or Sampler? I tried

from qiskit_ibm_runtime import Sampler, Session, QiskitRuntimeService
from qiskit.circuit.library import RealAmplitudes

circuit = RealAmplitudes(num_qubits=2, reps=2)
service = QiskitRuntimeService(channel="ibm_cloud", token=API_Key_my, instance=crn_pay)

backend = {0: "ibm_algiers", 1: "ibmq_qasm_simulator", 2: "ibmq_manila"}[0]
with Session(service=service, backend=backend) as session:
    job = Sampler(session=session, options={"shots": 2048}).run(circuit, rep_delay=0.00001)

but it failed with the error:

got an unexpected keyword argument 'rep_delay'

1 Answer


rep_delay is not a documented option, but the Sampler still accepts it if you pass it through the execution options:

Options(execution={"rep_delay": your_rep_delay_value})
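For example, something like this should work with the session code from the question (a minimal sketch, assuming the qiskit_ibm_runtime Options class of the V1 primitives and reusing the question's placeholder credentials API_Key_my and crn_pay; note that shots also lives under execution in these options):

from qiskit.circuit.library import RealAmplitudes
from qiskit_ibm_runtime import Options, QiskitRuntimeService, Sampler, Session

circuit = RealAmplitudes(num_qubits=2, reps=2)
circuit.measure_all()  # the Sampler expects circuits with measurements

service = QiskitRuntimeService(channel="ibm_cloud", token=API_Key_my, instance=crn_pay)

# rep_delay is in seconds and must lie within the backend's rep_delay_range
options = Options(execution={"shots": 2048, "rep_delay": 0.00001})

with Session(service=service, backend="ibm_algiers") as session:
    sampler = Sampler(session=session, options=options)
    # RealAmplitudes is parametrized, so bind values when running
    job = sampler.run(circuit, parameter_values=[0.0] * circuit.num_parameters)
    print(job.result())

Keep in mind the value is only respected on backends whose configuration reports dynamic_reprate_enabled, and it must fall inside the backend's rep_delay_range.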