org.soulspace.qclojure.application.algorithm.variational-algorithm

Common infrastructure for variational quantum algorithms (VQE, QAOA, etc.).

This namespace provides reusable components that are shared between different variational quantum algorithms, reducing code duplication and ensuring consistent behavior across algorithms.

Key Features:

  • Generic objective function creation
  • Common optimization method dispatching
  • Shared algorithm structure templates
  • Common result analysis and processing
  • Parameter initialization strategies

Design Principles:

  • Algorithm-agnostic: Works with any parameterized quantum circuit
  • Composable: Functions can be mixed and matched as needed
  • Consistent: Uniform interfaces and error handling
  • Extensible: Easy to add new optimization methods or analysis functions

analyze-convergence

(analyze-convergence optimization-result)

Comprehensive convergence analysis for variational quantum algorithms.

This function provides unified convergence analysis by examining both the optimization result metadata and the detailed optimization history. It serves as the primary convergence analysis tool for all variational algorithms.

Use Cases:

  • Post-optimization assessment of convergence quality
  • Debugging optimization problems and parameter tuning
  • Comparing different optimization methods or hyperparameters
  • Research analysis of algorithm behavior across different problems

Parameters:

  • optimization-result: Complete result map from optimization containing:
    • :success, :reason, :iterations, :function-evaluations (metadata)
    • :convergence-history or :history (energy trajectory data)
    • :optimal-energy, :optimal-parameters (final results)

Returns: Comprehensive map with convergence analysis including:

  • Basic convergence status and metadata
  • Energy improvement metrics and statistics
  • Convergence rate and trajectory analysis
  • Gradient-based convergence indicators (when available)
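A hypothetical usage sketch (`objective-fn` and `initial-params` are placeholders; the call chain assumes the result map shape described in the Parameters list above):

```clojure
;; Hypothetical sketch — objective-fn and initial-params are placeholders.
(def opt-result
  (variational-optimization objective-fn initial-params
                            {:optimization-method :adam}))

;; Post-optimization assessment of convergence quality:
(analyze-convergence opt-result)
```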

analyze-parameter-sensitivity

(analyze-parameter-sensitivity sensitivities)

Process and rank parameter sensitivities from landscape analysis.

This function takes raw sensitivity data (typically from analyze-variational-landscape) and provides normalized analysis, ranking, and categorization of parameter importance. It's designed to be used as a post-processing step after landscape analysis.

Use Cases:

  • Identifying the most important parameters for optimization focus
  • Reducing parameter space dimension by eliminating low-sensitivity parameters
  • Ansatz design guidance - understanding which parameter placements matter most
  • Adaptive optimization strategies based on parameter importance
  • Research into parameter efficiency and circuit expressivity

Computational Cost: Low - pure data processing, no additional circuit evaluations.

Parameters:

  • sensitivities: Vector of parameter sensitivities (from analyze-variational-landscape)

Returns: Map with processed sensitivity analysis:

  • :sensitivities - Original sensitivity values
  • :normalized-sensitivities - Normalized to [0,1] range
  • :sensitivity-range - Range between max and min sensitivities
  • :ranked-parameters - Parameters sorted by sensitivity (index, value pairs)
  • :high/low-sensitivity-params - Top/bottom 3 parameters by sensitivity
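The normalization and ranking steps can be sketched in a self-contained way (illustrative only, not the library's implementation):

```clojure
;; Min-max normalization to [0,1] (illustrative).
(defn normalize-sensitivities [sens]
  (let [mn (apply min sens)
        mx (apply max sens)
        r  (- mx mn)]
    (mapv #(if (zero? r) 0.0 (/ (- % mn) r)) sens)))

;; Pairs of [index sensitivity], most sensitive first (illustrative).
(defn rank-parameters [sens]
  (->> (map-indexed vector sens)
       (sort-by second >)))
```

For example, `(rank-parameters [0.2 0.8 0.5])` puts index 1 first and index 0 last.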

analyze-variational-landscape

(analyze-variational-landscape objective-fn
                               optimal-params
                               &
                               {:keys [perturbation-size compute-gradients?]
                                :or {perturbation-size 0.01
                                     compute-gradients? true}})

Analyze the energy landscape around optimal parameters for variational algorithms.

This function performs computational analysis of the parameter space by evaluating the objective function at perturbed parameter values. It provides insights into the local structure of the energy landscape and parameter sensitivity.

Use Cases:

  • Understanding which parameters most affect the objective function
  • Identifying optimization challenges (flat vs steep landscapes)
  • Validating that optimization found a reasonable local minimum
  • Research into ansatz design and parameter initialization strategies
  • Debugging optimization convergence issues

Computational Cost: Medium to High - requires n additional circuit evaluations for finite difference sensitivities, plus optionally 2×n evaluations for gradients.

Parameters:

  • objective-fn: Objective function to analyze (typically the same used in optimization)
  • optimal-params: Optimal parameters found by optimization
  • perturbation-size: Size of parameter perturbations for analysis (default: 0.01)
  • compute-gradients?: Whether to compute gradients via parameter shift (default: true)

Returns: Map with comprehensive landscape analysis:

  • :optimal-energy - Energy at optimal parameters
  • :sensitivities - Finite difference sensitivities for each parameter
  • :most/least-sensitive-parameter - Indices of extreme sensitivity parameters
  • :gradients - Parameter shift gradients (if compute-gradients? true)
  • :gradient-norm - L2 norm of gradient vector (if gradients computed)
  • Metadata about analysis parameters and parameter count
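The per-parameter finite-difference sensitivity can be sketched with a toy objective (an illustration of the idea; the library's internals may differ):

```clojure
;; Central finite difference: |f(θ_i + ε) − f(θ_i − ε)| / (2ε)  (illustrative).
(defn finite-difference-sensitivity [objective-fn params i eps]
  (let [f+ (objective-fn (update params i + eps))
        f- (objective-fn (update params i - eps))]
    (/ (Math/abs (double (- f+ f-))) (* 2.0 eps))))

;; Toy quadratic objective: parameter 1 dominates the landscape.
(let [obj (fn [[a b]] (+ (* a a) (* 10.0 b b)))]
  [(finite-difference-sensitivity obj [1.0 1.0] 0 0.01)
   (finite-difference-sensitivity obj [1.0 1.0] 1 0.01)])
```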

convergence-monitor

(convergence-monitor history options)

Monitor variational algorithm convergence with sophisticated stopping criteria.

This function tracks optimization progress and implements intelligent stopping criteria based on energy convergence, gradient norms, and parameter stability. Works with any variational quantum algorithm.

Parameters:

  • history: Vector of optimization steps {:iteration :energy :gradients :parameters}
  • options: Convergence options map
    • :tolerance - Energy convergence tolerance (default: 1e-6)
    • :gradient-tolerance - Gradient norm tolerance (default: 1e-4)
    • :min-iterations - Minimum iterations before convergence checking (default: 10)
    • :patience - Window size for convergence analysis (default: 20)

Returns: Map with convergence analysis and recommendations
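The energy-based stopping criterion can be sketched as a window test (an illustration of the idea, not the library's exact logic):

```clojure
;; Converged when the energy spread over the last `patience` steps
;; falls below `tolerance` (illustrative sketch).
(defn energy-converged?
  [history {:keys [tolerance patience min-iterations]
            :or   {tolerance 1e-6 patience 20 min-iterations 10}}]
  (and (>= (count history) (max patience min-iterations))
       (let [window (map :energy (take-last patience history))]
         (< (- (apply max window) (apply min window)) tolerance))))
```

A history whose last 20 energies are flat to within the tolerance reports convergence; a still-descending trajectory does not.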


enhanced-variational-optimization

(enhanced-variational-optimization objective-fn initial-parameters options)

Run variational algorithm optimization with integrated convergence monitoring.

This function wraps optimization methods with intelligent convergence monitoring, allowing for early stopping based on energy changes, gradient norms, and parameter stability. It tracks the full optimization history and provides detailed convergence analysis.

Supports enhanced objectives that provide gradients, falling back to standard optimization for regular objective functions.

Parameters:

  • objective-fn: Objective function to minimize (can be enhanced or standard)
  • initial-parameters: Starting parameter values
  • options: Optimization options including convergence monitoring parameters
    • :optimization-method - Method to use (default: :adam)
    • :max-iterations - Maximum iterations (default: 500)
    • :tolerance - Energy convergence tolerance (default: 1e-6)
    • :gradient-tolerance - Gradient norm tolerance (default: 1e-4)
    • :min-iterations - Minimum iterations before convergence (default: 10)
    • :patience - Convergence analysis window (default: 20)
    • :learning-rate - Learning rate for gradient descent (default: 0.01)

Returns: Map with optimization results and convergence analysis


gradient-based-variational-objective

(gradient-based-variational-objective hamiltonian
                                      circuit-construction-fn
                                      backend
                                      execution-options)

Create a variational objective function that provides gradients.

This function creates an objective that computes both energy and gradients efficiently using the parameter shift rule. The gradient computation is integrated with the result framework to enable sophisticated gradient-based optimization methods.

Parameters:

  • hamiltonian: Hamiltonian to minimize
  • circuit-construction-fn: Function that takes parameters and returns a circuit
  • backend: Quantum backend for circuit execution
  • execution-options: Execution options (can include :parallel? for gradient computation)

Returns: Function that takes parameters and returns {:energy value :gradients [...] :quantum-state state}
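For rotation-generated parameters the parameter shift rule has a simple closed form: dE/dθ_i = (E(θ_i + π/2) − E(θ_i − π/2)) / 2. A self-contained sketch of that rule for one parameter (illustrative, not the library source):

```clojure
;; Parameter shift gradient for parameter i (illustrative).
(defn parameter-shift-gradient [energy-fn params i]
  (let [shift (/ Math/PI 2)
        e+ (energy-fn (update params i + shift))
        e- (energy-fn (update params i - shift))]
    (/ (- e+ e-) 2.0)))

;; For E(θ) = cos θ the rule is exact: the gradient equals -sin θ.
(parameter-shift-gradient (fn [[t]] (Math/cos t)) [0.3] 0)
```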


random-parameter-initialization

(random-parameter-initialization num-parameters
                                 &
                                 {:keys [range] :or {range [-0.1 0.1]}})

Generate random initial parameters for variational algorithms.

Parameters:

  • num-parameters: Number of parameters to initialize
  • range: Parameter range as [min max] (default: [-0.1 0.1])

Returns: Vector of random initial parameters
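A self-contained equivalent of the uniform sampling described above (illustrative, not the library source):

```clojure
;; Uniform random parameters in [lo, hi) (illustrative).
(defn rand-params [n [lo hi]]
  (vec (repeatedly n #(+ lo (* (rand) (- hi lo))))))

(rand-params 4 [-0.1 0.1]) ; four values, each in [-0.1, 0.1)
```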


summarize-algorithm-performance

(summarize-algorithm-performance algorithm-result algorithm-name)

Create a high-level performance summary for variational quantum algorithms.

This function provides a concise, standardized summary focused on practical performance metrics and overall algorithm assessment. It's designed for benchmarking, reporting, and quick performance comparison across runs.

Use Cases:

  • Benchmarking different algorithms (VQE vs QAOA) or configurations
  • Performance reporting for research papers or technical documentation
  • Quick assessment of whether an optimization run was successful
  • Comparative analysis across different quantum backends or hardware
  • Automated performance monitoring in production quantum workflows

Parameters:

  • algorithm-result: Complete result map from algorithm execution containing:
    • :convergence-analysis (from analyze-convergence)
    • :optimal-energy, :success, :iterations (optimization results)
    • :total-runtime-ms (timing information)
  • algorithm-name: Name of the algorithm (e.g., 'VQE', 'QAOA', 'QAOA-MaxCut')

Returns: Standardized performance summary with:

  • Algorithm identification and success status
  • Key performance metrics (energy, runtime, efficiency)
  • Qualitative assessments (convergence quality, efficiency score)

variational-algorithm

(variational-algorithm backend options algorithm-fns)

Enhanced template for variational quantum algorithms with advanced features.

This enhanced version supports gradient-enhanced objectives, advanced convergence monitoring, and sophisticated optimization strategies required by algorithms like VQE.

Parameters:

  • backend: Quantum backend for circuit execution
  • options: Algorithm options map including advanced optimization settings
    • :optimization-method - Optimization method (default: :adam)
    • :max-iterations - Maximum iterations (default: 500)
    • :tolerance - Convergence tolerance (default: 1e-6)
    • :gradient-tolerance - Gradient norm tolerance (default: 1e-4)
    • :use-enhanced-objective - Whether to use gradient-enhanced objectives (default: auto-detect)
    • :shots - Number of shots for execution (default: 1024)
    • Other algorithm-specific options
  • algorithm-fns: Map of algorithm-specific functions:
    • :hamiltonian-constructor - (fn [config] -> hamiltonian)
    • :circuit-constructor - (fn [config] -> circuit-construction-fn)
    • :parameter-count - (fn [config] -> number)
    • :result-processor - (fn [optimization-result config] -> final-result)

Returns: Complete algorithm result map with enhanced analysis
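A hypothetical wiring sketch (the `my-*` constructor functions are placeholders; only the map keys come from the docstring above):

```clojure
;; Hypothetical sketch — my-hamiltonian-fn etc. are placeholders.
(variational-algorithm backend
                       {:optimization-method :adam
                        :max-iterations 500
                        :shots 1024}
                       {:hamiltonian-constructor my-hamiltonian-fn
                        :circuit-constructor     my-circuit-fn
                        :parameter-count         my-param-count-fn
                        :result-processor        my-result-fn})
```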


variational-objective

(variational-objective hamiltonian
                       circuit-construction-fn
                       backend
                       execution-options)

Create a generic objective function for variational quantum algorithms.

This function provides a common interface for creating objective functions that work with both VQE and QAOA (and future variational algorithms). It abstracts the common pattern of:

  1. Convert parameters to quantum circuit
  2. Execute circuit on backend
  3. Extract Hamiltonian expectation value using result-specs
  4. Return energy for optimization

The result extraction infrastructure handles backend capabilities transparently, so this always uses result-specs for consistent and efficient operation.

Parameters:

  • hamiltonian: Hamiltonian to minimize (collection of Pauli terms)
  • circuit-construction-fn: Function that takes parameters and returns a circuit
  • backend: Quantum backend for circuit execution
  • execution-options: Options for circuit execution (shots, etc.)

Returns: Function that takes parameters and returns energy expectation value

Examples:

;; For VQE:
(variational-objective h2-hamiltonian ansatz-fn backend options)

;; For QAOA:
(variational-objective problem-hamiltonian
                       (partial qaoa-ansatz-circuit problem-h mixer-h num-qubits)
                       backend options)


variational-optimization

(variational-optimization objective-fn initial-parameters options)

Run optimization for variational quantum algorithms using specified method.

This function provides a common interface for optimization that works with both VQE and QAOA. It handles the method dispatching and delegates to the appropriate optimization functions from the qopt namespace.

Supported optimization methods:

  • :gradient-descent - Basic gradient descent with parameter shift gradients
  • :adam - Adam optimizer with parameter shift gradients (recommended default)
  • :quantum-natural-gradient - Quantum Natural Gradient using Fisher Information Matrix
  • :nelder-mead - Derivative-free Nelder-Mead simplex method
  • :powell - Derivative-free Powell's method
  • :cmaes - Covariance Matrix Adaptation Evolution Strategy (robust)
  • :bobyqa - Bound Optimization BY Quadratic Approximation (handles bounds well)
  • :gradient - Fastmath gradient-based optimizers
  • :lbfgsb - L-BFGS-B optimization

Parameters:

  • objective-fn: Objective function to minimize
  • initial-parameters: Starting parameter values
  • options: Optimization options map

Returns: Map with optimization results including convergence information
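A hypothetical end-to-end sketch combining the pieces in this namespace (`hamiltonian`, `ansatz-fn`, and `backend` are placeholders):

```clojure
;; Hypothetical sketch: build an objective, initialize, then optimize.
(let [objective (variational-objective hamiltonian ansatz-fn backend {:shots 1024})
      init      (random-parameter-initialization 6 :range [-0.1 0.1])]
  (variational-optimization objective init
                            {:optimization-method :adam
                             :max-iterations 200
                             :tolerance 1e-6}))
```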


zero-parameter-initialization

(zero-parameter-initialization num-parameters)

Generate zero initial parameters for variational algorithms.

Parameters:

  • num-parameters: Number of parameters to initialize

Returns: Vector of zero initial parameters

