BosonQ Psi

PERFORMANCE SHOWDOWN: GPU-OPTIMIZED QIEO VS. GENETIC ALGORITHM ON BENCHMARK FUNCTIONS



Summary of Results

QIEO vs. GA

The performance of optimization algorithms on highly non-linear, non-convex, and non-separable engineering functions depends on their ability to converge to accurate solutions efficiently, which determines their effectiveness in real-world engineering applications.

 

Brief overview of optimization algorithms in computational intelligence 

Optimization is the mathematical process of finding the best way to perform a task. It involves identifying the most efficient solutions to problems with specific constraints and objectives, such as determining optimal transportation routes or scheduling production to minimize costs and maximize productivity. Other real-world examples include estimating the minimum time for a journey, minimizing the cost of completing a set of tasks, reducing material usage in engineering structures, and optimizing energy consumption.

However, conventional optimization techniques face limitations, such as heavy computational resource requirements, when solving complex multi-objective and multi-constraint optimization problems. Quantum-Inspired Evolutionary Algorithms (QIEAs) address these limitations by mirroring the natural evolution process while leveraging quantum computing principles, and they are driving breakthroughs across various industries.

 

Approaches to Optimization Problems: 

Gradient-Based Approach 

Traditional optimization methods like gradient descent and Newton-Raphson struggle in complex, non-convex landscapes, leading to higher computational costs and longer runtimes. These algorithms depend on gradient information and require continuous, smooth design spaces, assumptions that often do not hold in real-world scenarios.

They are computationally expensive in high-dimensional spaces, while enumeration algorithms struggle with the curse of dimensionality as the number of design variables grows. 
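
As a small illustration of this failure mode (a minimal NumPy sketch, not one of the benchmarked implementations; the step size and starting point are arbitrary), plain gradient descent on the 2-D Rastrigin function settles into the local minimum nearest its starting point rather than the global minimum at the origin:

```python
import numpy as np

def rastrigin(x):
    # Rastrigin function: global minimum f(0, ..., 0) = 0, with a grid of local minima
    return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def rastrigin_grad(x):
    # Analytical gradient of the Rastrigin function
    return 2 * x + 20 * np.pi * np.sin(2 * np.pi * x)

x = np.array([3.2, -2.7])             # arbitrary starting point
for _ in range(5000):                 # fixed-step gradient descent
    x = x - 0.001 * rastrigin_grad(x)

print(x, rastrigin(x))                # stalls near (3, -3) with f well above 0, far from the global optimum
```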

 

Challenges of the Gradient-Based Approach: 

 

  • Traditional optimization methods struggle to find good-quality solutions in complex, non-convex landscapes, leading to higher computational costs and time. 

  • Dependence on gradient information and requirement for continuous and smooth design spaces, which may not be present in real-world scenarios. 

  • Computationally expensive in high-dimensional spaces due to frequent gradient evaluations. 

  • Enumeration algorithms face challenges with the curse of dimensionality as the number of design variables grows. 


Meta-heuristic Approach 

Meta-heuristics like genetic algorithms, particle swarm optimization, and ant colony optimization have emerged as powerful tools for overcoming the limitations of traditional optimization methods. They efficiently find near-optimal solutions for a wide range of problems.  

Meta-heuristics offer scalability for multiple objectives and can adapt to various problem domains, including combinatorial, discrete, and continuous optimization. They demonstrate robustness to imperfect objective functions and can be implemented on distributed and parallel computing infrastructures. 
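
For context, a genetic algorithm evolves a population through selection, crossover, and mutation. The sketch below is a minimal, generic real-coded GA in Python (tournament selection, uniform crossover, Gaussian mutation); it is an illustrative simplification rather than BQP's GPU-optimized CUDA C++ implementation, and all parameter values are placeholders:

```python
import numpy as np

def genetic_algorithm(fitness, dim, pop_size=100, generations=3000,
                      bounds=(-5.12, 5.12), mutation_rate=0.05, tol=1e-3):
    """Minimal real-coded GA: tournament selection, uniform crossover, Gaussian mutation."""
    rng = np.random.default_rng()
    pop = rng.uniform(*bounds, size=(pop_size, dim))
    for gen in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        best = scores.min()
        if best <= tol:                       # convergence criterion on the fitness value
            return pop[scores.argmin()], best, gen
        # Tournament selection: keep the better of two randomly chosen individuals
        i, j = rng.integers(pop_size, size=(2, pop_size))
        parents = np.where((scores[i] < scores[j])[:, None], pop[i], pop[j])
        # Uniform crossover between consecutive parents
        mask = rng.random((pop_size, dim)) < 0.5
        children = np.where(mask, parents, np.roll(parents, 1, axis=0))
        # Gaussian mutation
        mutate = rng.random((pop_size, dim)) < mutation_rate
        children = children + mutate * rng.normal(0, 0.1, size=(pop_size, dim))
        pop = np.clip(children, *bounds)
    scores = np.array([fitness(ind) for ind in pop])
    return pop[scores.argmin()], scores.min(), generations
```

Any objective that maps a NumPy vector to a scalar can be passed as `fitness`, which is how the benchmark functions discussed later plug in.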

Challenges of the Meta-heuristic Approach: 

  • Scaling meta-heuristic algorithms to handle large numbers of design variables, together with the need for hyper-parameter tuning. 

  • Limited feasibility of reaching solutions within acceptable computing times as problem dimensionality grows. 


Potential of Quantum Meta-heuristic Algorithms in Optimization: 

The need for large population sizes and lengthy Finite Element Method (FEM) evaluations highlights the importance of parallelizing meta-heuristics to reduce overall runtime. Quantum meta-heuristic algorithms can mitigate the limitations of conventional meta-heuristics by using independent evolutionary strategies inspired by quantum physics. 


Significance of benchmark functions in assessing optimization algorithms 

An initial focus on benchmark functions is crucial for understanding an algorithm's efficacy before it is applied to real-world optimization challenges. Specifically, BQP has selected challenging multi-modal, multidimensional, and non-separable functions from the CEC2014 benchmark suite, which closely reflect the complexity inherent in real-world optimization problems. 


Importance of using benchmark functions:  

Benchmarking is essential for evaluating the effectiveness of new optimization techniques in engineering. Benchmark functions serve as standardized tests for assessing algorithm performance under controlled conditions. These functions are categorized by problem type, complexity, and characteristics, enabling researchers to compare algorithms and understand their strengths and weaknesses. 


Benchmarking of QIEO

In our QIEO benchmarking, we have chosen three multi-modal functions to evaluate performance:

  • Ackley function: This function features numerous local minima, a narrow global minimum, and a nearly flat region, making it difficult for algorithms to distinguish between optima.

  • Rosenbrock function: This function is characterized by a narrow, parabolic ridge, challenging algorithms to identify promising search directions.

  • Rastrigin function: This highly multimodal function with numerous local minima arranged in a grid presents a significant challenge for optimization algorithms.

These functions are highly nonlinear and non-convex, increasing both function and design space complexity, with a global minimum of zero for all three. Converging to this global minimum is difficult for gradient-based approaches, which often converge to local minima. These standard functions provide valuable insights into the performance characteristics of different algorithms, informing their applicability to real-world problems.
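
For reference, the standard definitions of these three functions fit in a few lines of NumPy (a plain sketch; the benchmarked versions were implemented in CUDA C++):

```python
import numpy as np

def ackley(x):
    # Many local minima around a narrow global minimum at the origin, f(0) = 0
    n = len(x)
    return (-20 * np.exp(-0.2 * np.sqrt(np.sum(x**2) / n))
            - np.exp(np.sum(np.cos(2 * np.pi * x)) / n) + 20 + np.e)

def rosenbrock(x):
    # Narrow parabolic valley; global minimum f(1, ..., 1) = 0
    return np.sum(100 * (x[1:] - x[:-1]**2)**2 + (1 - x[:-1])**2)

def rastrigin(x):
    # Grid of local minima; global minimum f(0, ..., 0) = 0
    return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))
```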


Benchmarking Results: Comparative analysis of performance metrics 


The GA and QIEO algorithms used in these simulations were developed using CUDA C++, leveraging an NVIDIA A100 GPU. Both algorithms used the same encoding of the design space and identical convergence criteria: 

  • Maximum number of generations: 3000 

  • Fitness value tolerance: 1e−8 (relaxed to 1e−3 for Ackley and Rosenbrock, and to 1e−6 for Rastrigin) 


Performance was evaluated by comparing results obtained with varying population sizes across 30 independent trials. Two metrics were used (a minimal evaluation-harness sketch follows the list): 

  • Accuracy: The ability to reach the fitness value within the given tolerance. 

  • Convergence rate: The number of generations required to reach the specified accuracy. 
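
As a sketch of how these two metrics can be collected (assuming an optimizer with the interface of the GA sketch shown earlier, which returns the best solution, its fitness, and the generations used; the defaults are placeholders):

```python
import numpy as np

def evaluate(optimizer, fitness, dim, pop_size, trials=30, tol=1e-3, max_gen=3000):
    """Run repeated trials and report accuracy and convergence-rate statistics.

    `optimizer` is any callable returning (best_solution, best_fitness, generations),
    e.g. the GA sketch above or a QIEO implementation with the same interface.
    """
    gens, hits = [], 0
    for _ in range(trials):
        _, best, gen = optimizer(fitness, dim, pop_size=pop_size,
                                 generations=max_gen, tol=tol)
        hits += best <= tol          # accuracy: did this trial reach the tolerance?
        gens.append(gen)             # convergence rate: generations needed to reach it
    return hits / trials, float(np.mean(gens)), float(np.std(gens))
```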


Algorithm Assessment 

The results demonstrate that GA consistently requires larger population sizes than QIEO to satisfy the convergence criteria. For example, on the ten-dimensional Ackley function, GA required a population of 2000, while QIEO converged with a population of only 100. Similar trends were observed for the Rosenbrock and Rastrigin functions. 

QIEO also generally required fewer generations to reach the desired accuracy, even with the same population size. This is evident from the lower maximum points in the box-and-whisker plots for QIEO. 


Performance Evaluation Findings 

Analysis of the total number of function evaluations (population size multiplied by average convergence rate) revealed that GA required significantly more evaluations than QIEO to reach the same fitness level. For example, GA required 12 times more evaluations than QIEO for the Ackley function. Similar results were found for the Rastrigin (5.1 times more) and Rosenbrock (2.2 times more) functions. 
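
The total-work comparison is a single product, population size × average generations to converge. The snippet below illustrates it using the Ackley population sizes reported above and hypothetical generation counts chosen purely for illustration (the exact averages are not reported here):

```python
# Total work = population size x average number of generations to converge
ga_pop, ga_avg_gens = 2000, 300        # GA population from the 10-D Ackley result; generation count is a placeholder
qieo_pop, qieo_avg_gens = 100, 500     # QIEO population from the same result; generation count is a placeholder

ga_evals = ga_pop * ga_avg_gens        # 600,000 fitness evaluations
qieo_evals = qieo_pop * qieo_avg_gens  # 50,000 fitness evaluations
print(ga_evals / qieo_evals)           # 12.0 with these placeholder numbers, matching the reported 12x gap
```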

QIEO also exhibited less variance in the total number of function evaluations, indicating greater reliability. 


Superiority of QIEO 

QIEO demonstrated several key advantages over GA: 

  • Higher Speedup: QIEO achieved significant speedups: 2.9x (Ackley), 3.9x (Rosenbrock), and 3.84x (Rastrigin). 

  • Smaller Populations: QIEO consistently achieved convergence with smaller population sizes. 

  • Faster Convergence: QIEO generally required fewer generations to reach the desired accuracy. 

  • Enhanced Exploration: QIEO’s use of the Ry gate promotes better exploration, reducing premature convergence. 

Additionally, QIEO's lower variance in convergence rate indicates less dependence on initial conditions than GA. The quantum nature of QIEO allows for a thorough initial exploration of the search space, leading to a more consistent convergence behavior. 


Overall Assessment 

The GPU-optimized QIEO significantly outperforms the GPU-optimized GA on the Ackley, Rastrigin, and Rosenbrock benchmark functions. QIEO required substantially fewer function evaluations (12x fewer for Ackley, 5.1x fewer for Rastrigin, and 2.2x fewer for Rosenbrock) and converged much faster (roughly one-third of the runtime for Ackley and one-fourth for Rosenbrock and Rastrigin) than GA. Furthermore, QIEO exhibited lower variance in fitness outcomes and convergence rates, highlighting its superior reliability. This combination of enhanced reliability, reduced computational effort, and faster optimization makes QIEO particularly valuable for engineering optimization problems, where function evaluation is often the most computationally expensive step. 


Final remarks on the broader implications and future 

QIEO's inherent parallelism, arising from the independent evolution of individuals and the application of rotation gates and measurements, reduces its serial footprint compared to GA. Its use of a single parameter-dependent evolution operator (the Ry gate) balances exploration and exploitation. 

Quantum computing concepts in QIEO enhance global search capability while maintaining local search effectiveness. The mechanism of updating quantum chromosomes based on the best solution from previous iterations guides them toward high-quality solutions while promoting exploration. 
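
A generic quantum-inspired evolutionary update can be sketched in a few lines (based on the standard QIEA formulation in the literature, not BQP's proprietary QIEO implementation; the rotation step and clipping range are placeholders): each gene is a qubit whose angle encodes measurement probabilities, a measurement collapses the population to classical bit strings, and an Ry-style rotation nudges every qubit toward the best chromosome found so far.

```python
import numpy as np

def qiea_step(theta, best_bits, fitness, decode, delta=0.01 * np.pi):
    """One generation of a generic quantum-inspired evolutionary algorithm.

    theta     : (pop_size, n_bits) qubit angles; each qubit is (cos theta, sin theta)
    best_bits : best binary chromosome found so far
    decode    : maps a binary chromosome to a candidate solution
    delta     : Ry rotation step (placeholder value)
    """
    # "Measure" each qubit: probability of observing bit 1 is sin^2(theta)
    bits = (np.random.random(theta.shape) < np.sin(theta) ** 2).astype(int)

    # Evaluate the collapsed classical chromosomes and update the running best
    scores = np.array([fitness(decode(b)) for b in bits])
    gen_best = bits[scores.argmin()]
    if fitness(decode(gen_best)) < fitness(decode(best_bits)):
        best_bits = gen_best

    # Ry-style rotation: move each qubit's angle toward the best chromosome's bit
    # (simplified rotation rule; the full lookup-table version also compares bits and fitness)
    theta = theta + delta * np.where(best_bits == 1, 1.0, -1.0)
    theta = np.clip(theta, 0.0, np.pi / 2)   # keep both amplitudes non-negative
    return theta, best_bits
```

In the full algorithm described in the QIEA literature, the rotation direction and magnitude typically also depend on each individual's measured bit and its fitness relative to the best, via a small lookup table; the sketch collapses that into a single fixed step for brevity.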


Future Studies 

Further research is recommended to explore QIEO's potential for accuracy and efficiency in various function optimization scenarios. Comprehensive evaluations against established methods like GA across a broader range of engineering optimization problems would provide further insights into QIEO's generalizability and effectiveness. 
