org.soulspace.qclojure.application.algorithm.variational-algorithm

Common infrastructure for variational quantum algorithms (VQE, QAOA, etc.).

This namespace provides reusable components that are shared between different variational quantum algorithms, reducing code duplication and ensuring consistent behavior across algorithms.

Key Features:

  • Generic objective function creation
  • Common optimization method dispatching
  • Shared algorithm structure templates
  • Common result analysis and processing
  • Parameter initialization strategies

Design Principles:

  • Algorithm-agnostic: Works with any parameterized quantum circuit
  • Composable: Functions can be mixed and matched as needed
  • Consistent: Uniform interfaces and error handling
  • Extensible: Easy to add new optimization methods or analysis functions
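
These pieces are designed to compose. The following is a minimal, hypothetical sketch (assuming a va alias for this namespace; hamiltonian, ansatz-fn, and backend are caller-supplied placeholders, and the :shots and :optimization-method option keys are assumptions based on the docstrings below):

(require '[org.soulspace.qclojure.application.algorithm.variational-algorithm :as va])

;; hamiltonian - collection of Pauli terms for the problem (built elsewhere)
;; ansatz-fn   - function from a parameter vector to a quantum circuit
;; backend     - any quantum backend supported by qclojure
(let [objective (va/variational-objective hamiltonian ansatz-fn backend {:shots 1024})
      params    (va/random-parameter-initialization 4)
      result    (va/variational-optimization objective params
                                             {:optimization-method :adam
                                              :max-iterations 500
                                              :tolerance 1e-6})]
  (va/analyze-convergence-history result))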

analyze-convergence-history

(analyze-convergence-history optimization-result)

Analyze convergence from optimization history for any variational algorithm.

This function analyzes the optimization trajectory to provide insights into convergence behavior, energy improvement, and optimization quality.

Parameters:

  • optimization-result: Result map from optimization containing :history

Returns: Map with convergence analysis

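A hypothetical call with an inline history, using the va alias from the overview sketch. The per-step keys follow the shape documented for convergence-monitor below; the keys of the returned analysis map are implementation-specific:

(va/analyze-convergence-history
 {:history [{:iteration 0 :energy -0.90 :parameters [0.10 0.20]}
            {:iteration 1 :energy -1.05 :parameters [0.12 0.18]}
            {:iteration 2 :energy -1.13 :parameters [0.13 0.17]}]})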

analyze-optimization-convergence

(analyze-optimization-convergence optimization-result)

Analyze convergence properties of optimization results.

Parameters:

  • optimization-result: Result map from optimization

Returns: Map with convergence analysis


analyze-parameter-sensitivity

(analyze-parameter-sensitivity sensitivities)

Analyze parameter sensitivity for variational algorithms.

This function identifies which parameters have the most impact on the objective function by computing normalized sensitivities and providing a ranking.

Parameters:

  • sensitivities: Vector of parameter sensitivities from landscape analysis

Returns: Map with sensitivity analysis

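A hypothetical call, using the va alias from the overview sketch. The input is one sensitivity value per parameter, e.g. as produced by analyze-variational-landscape below:

;; Parameters 0 and 2 dominate here; the exact keys of the returned ranking
;; map are implementation-specific.
(va/analyze-parameter-sensitivity [0.42 0.03 0.17 0.008])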

analyze-variational-landscape

(analyze-variational-landscape objective-fn
                               optimal-params
                               &
                               {:keys [perturbation-size]
                                :or {perturbation-size 0.01}})

Analyze the energy landscape around optimal parameters for any variational algorithm.

This function performs parameter sensitivity analysis by perturbing each parameter and measuring the energy change. It provides insights into which parameters have the most impact on the objective function.

Parameters:

  • objective-fn: Objective function to analyze
  • optimal-params: Optimal parameters found by optimization
  • perturbation-size: Size of parameter perturbations for analysis (default: 0.01)

Returns: Map with landscape analysis including gradient norms and parameter sensitivities

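A sketch chaining the landscape analysis into the sensitivity ranking above, using the va alias from the overview sketch. The :sensitivities key is an assumption about the returned map; check the actual return value:

(let [landscape (va/analyze-variational-landscape objective optimal-params
                                                  :perturbation-size 0.05)]
  ;; :sensitivities is an assumed key name for the per-parameter sensitivities;
  ;; objective and optimal-params come from the optimization step.
  (va/analyze-parameter-sensitivity (:sensitivities landscape)))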

convergence-monitor

(convergence-monitor history options)

Monitor variational algorithm convergence with sophisticated stopping criteria.

This function tracks optimization progress and implements intelligent stopping criteria based on energy convergence, gradient norms, and parameter stability. Works with any variational quantum algorithm.

Parameters:

  • history: Vector of optimization steps {:iteration :energy :gradients :parameters}
  • options: Convergence options map
    • :tolerance - Energy convergence tolerance (default: 1e-6)
    • :gradient-tolerance - Gradient norm tolerance (default: 1e-4)
    • :min-iterations - Minimum iterations before convergence checking (default: 10)
    • :patience - Window size for convergence analysis (default: 20)

Returns: Map with convergence analysis and recommendations

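A hypothetical call, using the va alias from the overview sketch. The history entries follow the documented step shape and the option values are the documented defaults:

(va/convergence-monitor
 [{:iteration 0 :energy -0.90 :gradients [0.40 0.10] :parameters [0.10 0.20]}
  {:iteration 1 :energy -1.05 :gradients [0.20 0.05] :parameters [0.12 0.18]}
  {:iteration 2 :energy -1.13 :gradients [0.08 0.02] :parameters [0.13 0.17]}]
 {:tolerance 1e-6
  :gradient-tolerance 1e-4
  :min-iterations 10
  :patience 20})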

enhanced-variational-objective

(enhanced-variational-objective hamiltonian
                                circuit-construction-fn
                                backend
                                execution-options)

Create enhanced variational objective function that provides gradients.

This function creates an enhanced objective that computes both energy and gradients efficiently using the parameter shift rule. The gradient computation is integrated with the result framework to enable sophisticated gradient-based optimization methods.

Parameters:

  • hamiltonian: Hamiltonian to minimize
  • circuit-construction-fn: Function that takes parameters and returns a circuit
  • backend: Quantum backend for circuit execution
  • execution-options: Execution options (can include :parallel? for gradient computation)

Returns: Function that takes parameters and returns {:energy value :gradients [...] :quantum-state state}

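A sketch using the va alias and the hamiltonian / ansatz-fn / backend placeholders from the overview sketch; :parallel? is the execution option mentioned above:

(let [objective (va/enhanced-variational-objective hamiltonian ansatz-fn backend
                                                   {:parallel? true})
      {:keys [energy gradients]} (objective [0.1 0.2 0.3 0.4])]
  ;; energy is the Hamiltonian expectation value; gradients has one entry per
  ;; parameter, computed via the parameter shift rule.
  [energy gradients])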

enhanced-variational-optimization

(enhanced-variational-optimization objective-fn initial-parameters options)

Run variational algorithm optimization with integrated convergence monitoring.

This function wraps optimization methods with intelligent convergence monitoring, allowing for early stopping based on energy changes, gradient norms, and parameter stability. It tracks the full optimization history and provides detailed convergence analysis.

Supports enhanced objectives that provide gradients, falling back to standard optimization for regular objective functions.

Parameters:

  • objective-fn: Objective function to minimize (can be enhanced or standard)
  • initial-parameters: Starting parameter values
  • options: Optimization options including convergence monitoring parameters
    • :optimization-method - Method to use (default: :adam)
    • :max-iterations - Maximum iterations (default: 500)
    • :tolerance - Energy convergence tolerance (default: 1e-6)
    • :gradient-tolerance - Gradient norm tolerance (default: 1e-4)
    • :min-iterations - Minimum iterations before convergence (default: 10)
    • :patience - Convergence analysis window (default: 20)
    • :learning-rate - Learning rate for gradient descent (default: 0.01)

Returns: Map with optimization results and convergence analysis

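A sketch using the va alias from the overview sketch and an objective created by one of the objective constructors above; the option keys and values are the documented ones with their defaults:

(va/enhanced-variational-optimization
 objective
 (va/random-parameter-initialization 4)
 {:optimization-method :adam
  :max-iterations 500
  :tolerance 1e-6
  :gradient-tolerance 1e-4
  :min-iterations 10
  :patience 20
  :learning-rate 0.01})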

random-parameter-initialization

(random-parameter-initialization num-parameters
                                 &
                                 {:keys [range] :or {range [-0.1 0.1]}})

Generate random initial parameters for variational algorithms.

Parameters:

  • num-parameters: Number of parameters to initialize
  • range: Parameter range as [min max] (default: [-0.1 0.1])

Returns: Vector of random initial parameters

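Example calls following the documented signature, using the va alias from the overview sketch:

(va/random-parameter-initialization 6)                             ;; six values in [-0.1 0.1]
(va/random-parameter-initialization 6 :range [0.0 (* 2 Math/PI)])  ;; custom range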

summarize-algorithm-performance

(summarize-algorithm-performance algorithm-result algorithm-name)

Create a summary of variational algorithm performance.

Parameters:

  • algorithm-result: Complete result map from algorithm execution
  • algorithm-name: Name of the algorithm (e.g., 'VQE', 'QAOA')

Returns: Map with performance summary


variational-algorithm-template

(variational-algorithm-template config algorithm-fns)

Template for implementing variational quantum algorithms.

This function provides a common structure that can be used to implement new variational algorithms or refactor existing ones. It handles the common workflow and delegates algorithm-specific tasks to provided functions.

Parameters:

  • config: Algorithm configuration map
  • algorithm-fns: Map of algorithm-specific functions:
    • :hamiltonian-constructor - (fn [config] -> hamiltonian)
    • :circuit-constructor - (fn [config] -> circuit-construction-fn)
    • :parameter-count - (fn [config] -> number)
    • :result-processor - (fn [optimization-result config] -> final-result)

Returns: Complete algorithm result map

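A hypothetical wiring for a VQE-style algorithm, using the va alias from the overview sketch; config, build-hamiltonian, and build-ansatz are placeholder names, not part of this namespace:

(va/variational-algorithm-template
 config
 {:hamiltonian-constructor (fn [config] (build-hamiltonian config))
  :circuit-constructor     (fn [config] (fn [params] (build-ansatz config params)))
  :parameter-count         (fn [config] 8)
  :result-processor        (fn [optimization-result config]
                             (assoc optimization-result :algorithm "VQE"))})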

variational-objective

(variational-objective hamiltonian
                       circuit-construction-fn
                       backend
                       execution-options)

Create a generic objective function for variational quantum algorithms.

This function provides a common interface for creating objective functions that work with both VQE and QAOA (and future variational algorithms). It abstracts the common pattern of:

  1. Convert parameters to quantum circuit
  2. Execute circuit on backend
  3. Extract Hamiltonian expectation value using result-specs
  4. Return energy for optimization

The result extraction infrastructure handles backend capabilities transparently, so this function always uses result-specs for consistent and efficient operation.

Parameters:

  • hamiltonian: Hamiltonian to minimize (collection of Pauli terms)
  • circuit-construction-fn: Function that takes parameters and returns a circuit
  • backend: Quantum backend for circuit execution
  • execution-options: Options for circuit execution (shots, etc.)

Returns: Function that takes parameters and returns energy expectation value

Examples:

;; For VQE:
(variational-objective h2-hamiltonian ansatz-fn backend options)

;; For QAOA:
(variational-objective problem-hamiltonian
                       (partial qaoa-ansatz-circuit problem-h mixer-h num-qubits)
                       backend options)


variational-optimization

(variational-optimization objective-fn initial-parameters options)

Run optimization for variational quantum algorithms using specified method.

This function provides a common interface for optimization that works with both VQE and QAOA. It handles method dispatching and delegates to the appropriate optimization functions from the qopt namespace.

Supported optimization methods:

  • :gradient-descent - Basic gradient descent with parameter shift gradients
  • :adam - Adam optimizer with parameter shift gradients (recommended default)
  • :quantum-natural-gradient - Quantum Natural Gradient using Fisher Information Matrix
  • :nelder-mead - Derivative-free Nelder-Mead simplex method
  • :powell - Derivative-free Powell's method
  • :cmaes - Covariance Matrix Adaptation Evolution Strategy (robust)
  • :bobyqa - Bound Optimization BY Quadratic Approximation (handles bounds well)
  • :gradient - Fastmath gradient-based optimizers
  • :lbfgsb - L-BFGS-B optimization

Parameters:

  • objective-fn: Objective function to minimize
  • initial-parameters: Starting parameter values
  • options: Optimization options map

Returns: Map with optimization results including convergence information

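A sketch selecting a derivative-free method, using the va alias from the overview sketch; the :optimization-method and :max-iterations keys mirror the enhanced variant above and are assumed to apply here as well:

(va/variational-optimization objective initial-params
                             {:optimization-method :cmaes
                              :max-iterations 300})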

zero-parameter-initialization

(zero-parameter-initialization num-parameters)

Generate zero initial parameters for variational algorithms.

Parameters:

  • num-parameters: Number of parameters to initialize

Returns: Vector of zero initial parameters

